# Identify rows within X minutes of others

I have the following data and am trying to identify rows whose times fall within +/- x minutes of any other time. I plan to use a group loop to look at each day separately, with a variable to control the time threshold. But I can’t figure out the base calculation: using lag columns only compares against one row above or below, and I need to account for all possibilities.

10:19
10:51
11:21
11:55
12:26
12:58
13:31
14:01
14:03
14:33
15:03
15:32
16:04
17:06
17:35

Obviously any help is appreciated!

Hi there @Blas,

So one row can be part of only one group? And what would the expected result be for the data from your post if you used +/- 45 minutes?

Also check this topic, where a similar requirement was discussed:

Br,
Ivan


Thanks for replying @ipazin.

The classification wouldn’t necessarily be a group, but rather just a flag identifying a row as being within x minutes of any other time in the group. Hopefully that makes sense.

If 45 minutes were to be used, then all of the times would flag. If we used 30 minutes, 10:51, 11:21, 12:26, 12:58, 13:31, 14:01, 14:03, 14:33, 15:03, 15:32, 17:06, and 17:35 would all flag.

Hi @Blas,

I see. This should be pretty easy then. I assume your data is sorted; if not, sort it first (hopefully that is possible in your use case). What you are actually interested in is the time difference to the previous row and to the next row: if the minimum of those two differences is within x, you flag that row. The Date&Time Difference node with the previous-row option gives you both differences, but not in the same row. So a Moving Aggregation node with a Forward window of length 2 and Minimum aggregation gives you a single number per row on which you can base the flag.
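For anyone who wants to check the logic outside KNIME, here is a minimal Python sketch of the same idea, assuming the times are already sorted. The function name `flag_close_rows` is just illustrative, and "within x minutes" is interpreted inclusively (difference <= x); if you want a strict comparison, change `<=` to `<`.

```python
from datetime import datetime, timedelta

def flag_close_rows(times, threshold_minutes):
    """Flag each time that is within +/- threshold of its nearest neighbour.

    Because the list is sorted, only the previous and next rows need to be
    checked, which mirrors the Date&Time Difference + Moving Aggregation
    approach described above.
    """
    parsed = [datetime.strptime(t, "%H:%M") for t in times]
    threshold = timedelta(minutes=threshold_minutes)
    flags = []
    for i, t in enumerate(parsed):
        # Difference to the previous and next row (None at the boundaries).
        prev_diff = t - parsed[i - 1] if i > 0 else None
        next_diff = parsed[i + 1] - t if i < len(parsed) - 1 else None
        diffs = [d for d in (prev_diff, next_diff) if d is not None]
        # Flag the row if the smaller of the two differences is within x.
        flags.append(min(diffs) <= threshold)
    return flags

times = ["10:19", "10:51", "11:21", "11:55", "12:26", "12:58", "13:31",
         "14:01", "14:03", "14:33", "15:03", "15:32", "16:04", "17:06", "17:35"]
for t, flagged in zip(times, flag_close_rows(times, 30)):
    print(t, "FLAG" if flagged else "")
```

Note that the boundary convention (<= vs <) changes which rows flag when a gap is exactly x minutes, so it is worth deciding that explicitly for your use case.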

Hope this makes sense. Just in case, here is a workflow example:
2020_02_10_TimeDiffExample.knwf (14.9 KB)

Br,
Ivan


@ipazin, yes, this does make sense. I had used a series of loops to solve it in the meantime, but this is obviously much more efficient and scalable. Learning about the window length, and how to use it with the Moving Aggregation node, was key. Thank you very much for looking at this for me!


Hi @Blas!