Problem with X-Aggregator node

Hi,
I’m using an X-Aggregator node for a cross-validation loop. The first time I run it, I get this error message:

ERROR X-Aggregator Execute failed: Encountered duplicate row ID “PCK1” at row number 5849

Then, whenever I re-execute the node, I always get this error message:

ERROR X-Aggregator Execute failed: Loop Head claims this is NOT the first iteration but the tail believes it is?!

…and so I can’t go on with my work! Can somebody help me?
Thanks

Did you connect the output of the learner node to the X-Aggregator? That would explain the duplicate row IDs, because the learning sets overlap. The typical use case is to connect the output of the predictor to the aggregation node.

Does that make sense?
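To illustrate the difference outside of KNIME, here is a minimal plain-Python sketch (with hypothetical row IDs, not your actual data): collecting the predictor’s test folds yields each row ID exactly once, while collecting the learner’s training sets yields overlapping, duplicated IDs – exactly the “duplicate row ID” situation.

```python
# Hypothetical 10-row table split into k = 5 disjoint folds.
row_ids = [f"R{i}" for i in range(10)]
k = 5
folds = [row_ids[i::k] for i in range(k)]  # 5 disjoint folds of 2 rows each

collected_test, collected_train = [], []
for i in range(k):
    test_fold = folds[i]
    train_set = [r for j, fold in enumerate(folds) if j != i for r in fold]
    collected_test.extend(test_fold)    # what the predictor emits
    collected_train.extend(train_set)   # what the learner consumes

# Test folds are disjoint: every row ID appears exactly once.
assert len(collected_test) == len(set(collected_test))
# Training sets overlap: each row ID appears k - 1 = 4 times.
assert len(collected_train) == (k - 1) * len(row_ids)
```

So aggregating the predictor output is safe, while aggregating anything derived from the training sets inevitably produces duplicate keys.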

Yes, I have connected the output of the predictor to the X-Aggregator input…but it doesn’t work!

Then I suggest you use a CSV writer and write the intermediate results out to a temporary file (appending to the file if it exists – there is a dedicated option in the dialog). The file should tell you in which of the runs the duplicate key is generated.


… just to make sure we’re on the same page: the cause of the problem is that two different iterations of the loop generate the same set of row IDs, which are collected in the X-Aggregator. However, the collection must not contain duplicates, but in your case it does (in my use cases of the cross-validation looper it works without problems).

Thanks for answering.
I’ve done what you said and used a CSV writer, and now it doesn’t give me that error message but a different one. Now the message is:

WARN NodeContainer Encountered loop-end without corresponding head!

What do you suggest? Thanks

The error message indicates that the loop is not properly closed. It’s hard to help here without seeing your actual workflow. I’ve attached a simple workflow that demonstrates the typical usage scenario (including the CSV writer – you might need to adjust the output path). Maybe you can have a look at it and take it from there.

Well… now it works! I deleted my whole sub-workflow and rebuilt it from the beginning, so now I can go on. Thank you so much for your help! Best regards,

Marco