ESRI Coordinate Row Filter Workflow

I need to filter a data table of homes to determine whether each home is geospatially located within the boundaries of a series of polygons, and then assign to each property the name of the polygon it falls within.

I am pretty sure I need to use the ESRI Coordinate Row Filter, but I cannot figure out how to associate my data table with the created polygons and determine (filter) whether each home is within each polygon. I can find zero KNIME workflow examples that set up and use the ESRI Coordinate Row Filter, yet I believe this is the exact tool for my situation.

I have a database of homes, each with latitude and longitude coordinates assigned.

After a lot of processing in KNIME, I determined the most homogeneous polygons and drew them in Google Maps.

I downloaded the Google KML files into KNIME using the Read from KML node, and I can recreate the Google map using the Geometries as Map view node.

How do I use all of this data to set up the ESRI Coordinate Row Filter so that it tests each home’s lat/long against each polygon’s boundaries, and then assigns the polygon’s name to each home that falls inside it?

Thank you for any guidance you may have.

I have no experience with Geonodes in KNIME (yet), but have you already taken a look at that?

br

@Daniel_Weikert I had not seen this node in my search. I am investigating it, but it seems to compute the overlap between shapefiles rather than filtering which individual points fall inside the boundaries of each shapefile.

I will check it out. Thanks!

Hello @smithcreed,

I am dealing with a similar problem (a polygonal geofence in JSON; I have to check whether a truck is inside the geofence). Like you, I cannot find workflow examples to help me solve the problem. I have tried both the Intersect and Coordinate Row Filter nodes.
Could you please provide an example of how to do it?

Thank you,
RB

Hello everyone,

I wrote some code in R to calculate whether a point is inside a polygon.
It needs to be adapted to your needs/data, but the main function is ready to use.

I am not a programmer, so if you find bugs or simply find a better way to do it, tell me and I’ll try to fix it. Either way, I hope it can help you save some time.

Ray casting algorithm in R
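
For anyone who cannot open the link, here is a minimal standalone sketch of the standard even-odd ray-casting test in R. It is the textbook algorithm, not necessarily the exact code behind the link, and the function and argument names are my own:

```r
# Even-odd ray-casting test: TRUE if point (px, py) lies inside the
# polygon whose vertices are the ordered vectors poly_x, poly_y.
point_in_polygon <- function(px, py, poly_x, poly_y) {
  n <- length(poly_x)
  inside <- FALSE
  j <- n  # previous vertex index; edge (j -> i) wraps around the polygon
  for (i in seq_len(n)) {
    # Toggle whenever a horizontal ray from the point crosses edge (j -> i)
    if (((poly_y[i] > py) != (poly_y[j] > py)) &&
        (px < (poly_x[j] - poly_x[i]) * (py - poly_y[i]) /
                (poly_y[j] - poly_y[i]) + poly_x[i])) {
      inside <- !inside
    }
    j <- i
  }
  inside
}

# Quick check on a unit square
sq_x <- c(0, 1, 1, 0)
sq_y <- c(0, 0, 1, 1)
point_in_polygon(0.5, 0.5, sq_x, sq_y)  # TRUE
point_in_polygon(1.5, 0.5, sq_x, sq_y)  # FALSE
```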

RB

Hi @lelloba, I am guessing your R code cannot be directly integrated into KNIME? Thanks

I took a quick look at the code; it doesn’t do anything fancy or use random table access, so it should be easy to adapt into a Java Snippet in case you don’t have R.

Hello @smithcreed,

Of course you can! You can use the code in an R Snippet node and generate a true/false column that tells you whether a set of coordinates is inside a certain polygon.
I did it today, but with sensitive data I cannot share. If you need it, I can share the full R code for the snippet, but I need to prepare it first. Let me know!
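
As a rough illustration of that setup, here is a minimal sketch of what such an R Snippet body could look like. It assumes KNIME’s R Snippet conventions (knime.in is the input table as a data frame, knime.out is the output table), hypothetical column names Longitude and Latitude, and uses point.in.polygon() from the sp package in place of a hand-rolled test:

```r
# Sketch of an R Snippet body (not lelloba's actual snippet).
# knime.in / knime.out are the input and output tables exposed by
# KNIME's R Snippet node; the column names below are illustrative.
library(sp)

# Example polygon vertices (replace with your real area/geofence)
poly_x <- c(-105.01, -104.95, -104.95, -105.01)
poly_y <- c(39.52, 39.52, 39.58, 39.58)

# point.in.polygon() returns 0 = outside, 1 = inside,
# 2 = on an edge, 3 = on a vertex
hit <- point.in.polygon(knime.in$Longitude, knime.in$Latitude,
                        poly_x, poly_y)
knime.out <- cbind(knime.in, In_Polygon = hit > 0)
```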

@lelloba If you can still share the code you mention here, and how you set it up in KNIME, I would appreciate it. I am back on this geospatial problem after taking time to complete several other tasks, but I still need to resolve it. Thanks

@lelloba Do you have a functioning workflow you can share? It seems you need two inputs: 1) the shapefiles’ data and their points, and 2) the entities being tested for inclusion in each polygon (an identifier for each, plus the associated latitude and longitude)? Thanks

Hi Creedsmith,

I don’t have a workflow ready right now, but I can make one tomorrow. The one I am using works with sensitive data I cannot share.
Is that OK for you? :slight_smile:

RB

@lelloba Anything you can provide would benefit me greatly. It’s not really your job to create a workflow on my behalf, but if you do I will certainly take it :slight_smile: It will likely benefit many KNIME users and may make you famous. Thanks

Hello Creedsmith,

have a look at this workflow:

A description of the input and output columns is inside the workflow.
I hope it helps. Let me know if it works for you or if you need something else.

Have a nice Sunday,
RB

@lelloba thanks for the workflow. A couple of questions, when you have the time:

  1. This can be any unique identifier I assign?

(screenshot)

  2. These are the lat/longs for each entity?

(screenshot)

  3. These are the shapefiles’ (polygon) series of points?

  4. The real question: if I have 2,000 properties (houses; unique identifiers), each with an associated latitude and longitude, and 70 different polygon shapefile point series, how do I test each of the 2,000 properties for potential inclusion in the 70 polygons? I don’t know R, but it seems you are testing whether one entity is inside or outside a single polygon; binary, true or false? I need to test whether each entity is within any of perhaps 70 different polygons, and do this for perhaps 2,000 entities.

As I stated earlier, my problem is not your problem, so if you don’t easily know the answer, I can restate all of this as a new question on the KNIME forum. Thanks very much :slight_smile:

Hi Creedsmith,

  1. MITTENTE_COD is the identifier of the stop location.
  2. MITTENTE_LATITUDINE and MITTENTE_LONGITUDINE are the results of geocoding the stop location address.
  3. This is the attribute containing the vertices of the geofences in polygonal form.
  4. If you can convert your “the_geom” column to the format of my “Punti R” column, it should be easy. Use a Cross Joiner node to compute all combinations of properties and areas, then identify the correct area for each location by filtering for TRUE values in the “In Poligono” column (see the sketch below). If some properties may not belong to any area, make sure to include them in your output file.
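
To make point 4 concrete, here is a rough sketch of that cross-join-and-filter idea in plain R (in KNIME, the Cross Joiner node produces the property/area pairs and a Row Filter keeps the TRUE rows). All data, column names, and area names below are made up for illustration:

```r
# Test every property against every polygon, then keep the matches.
# Uses sp::point.in.polygon() for the containment test.
library(sp)

properties <- data.frame(
  id  = c("house_1", "house_2"),
  lon = c(-104.98, -104.90),
  lat = c(39.55, 39.55)
)
polygons <- list(  # each polygon is a table of ordered vertices
  area_A = data.frame(x = c(-105.01, -104.95, -104.95, -105.01),
                      y = c(39.52, 39.52, 39.58, 39.58))
)

# All property/polygon combinations (2,000 x 70 = 140,000 rows is small)
combos <- expand.grid(prop = seq_len(nrow(properties)),
                      poly = names(polygons),
                      stringsAsFactors = FALSE)
combos$In_Poligono <- mapply(function(p, g) {
  point.in.polygon(properties$lon[p], properties$lat[p],
                   polygons[[g]]$x, polygons[[g]]$y) > 0
}, combos$prop, combos$poly)

# Keep only TRUE rows: each property paired with the area containing it;
# properties with no TRUE row belong to no area and need separate handling
matches <- combos[combos$In_Poligono, ]
result <- data.frame(id = properties$id[matches$prop], area = matches$poly)
print(result)  # house_1 -> area_A; house_2 matches nothing
```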

Different problems, but their core is the same :grinning:
What do you think about point #4? If you want, we can schedule a call on Monday late afternoon/evening and work on this together.

RB

@lelloba thanks for responding. It’s been several months since I worked on this part of the project. Let me re-construct my data (a lot has changed since I last worked on the GIS portion). I will create some trials, crash and burn a few times, then bother you again when I can precisely explain where I am going wrong.

I did read the information on ray casting, a very smart solution to this problem. If this was all your own idea, brilliant; you should publish or protect it. If you discovered others’ work and applied it, still brilliant. Thanks again!

No bother, don’t worry :slight_smile:
The algorithm is not mine; I found it on YouTube and rewrote it in R.
I hope it will help more people in the future!

Cheers,
RB

@lelloba I’m busy proving to myself how un-smart I am, but I came across this article, which I believe is related to the problem(s) I am trying to solve and may help you some time in the future: Extreme Performance Boost for Knime Spatial Processing - ClearPeaks Blog. Thanks

Very useful! Thank you :slight_smile:

@lelloba So here’s the tiny start of a new workflow I expect to reproduce many times for different locations, once I get this running correctly.

I am also attaching the initial data set (with Lat/Long data) and my initial set of polygons in KML.

Where do you suggest I go from here? Thanks

QUANT_Historical_DSF.xlsx (375.0 KB)

It looks like I cannot upload KML files to the KNIME forum, so here is a Dropbox link (I will remove it in a few days; the analyses to determine the areas took a very long time and used proprietary derived information): Dropbox - Highlands Ranch Colorado.kml - Simplify your life