Variable condition loop end error

Hi Team,
I am using the Variable Condition Loop End node in my workflow.

I am getting the following error. How can I fix this?

ERROR Variable Condition Loop End 0:183 Execute failed: Input table's structure differs from reference (first iteration) table: different column counts 10 vs. 1

Thanks,
Srinivas

Hi @ShinagdeS , this usually happens if you are changing the structure of the table inside the loop.
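
Conceptually, the loop end collects the table produced by each iteration and requires every iteration to have the same columns as the first one. As a rough illustration only (this is not KNIME's actual implementation, just a Python/pandas analogy) of why a changed structure fails:

```python
import pandas as pd

def loop_end_collect(iteration_tables):
    """Rough analogy of a loop end node: concatenate the table from each
    iteration, insisting every iteration matches the first (reference) table."""
    reference_cols = None
    collected = []
    for table in iteration_tables:
        if reference_cols is None:
            reference_cols = list(table.columns)  # first iteration defines the structure
        elif list(table.columns) != reference_cols:
            raise RuntimeError(
                "Input table's structure differs from reference (first iteration) table: "
                f"different column counts {len(table.columns)} vs. {len(reference_cols)}"
            )
        collected.append(table)
    return pd.concat(collected, ignore_index=True)

# Iteration 1 yields 1 column, iteration 2 yields 10 -> fails like the error above.
it1 = pd.DataFrame({"status": ["failed"]})
it2 = pd.DataFrame({f"col{i}": [i] for i in range(10)})
try:
    loop_end_collect([it1, it2])
except RuntimeError as err:
    print(err)  # ... different column counts 10 vs. 1
```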

Can you please share your workflow so we can see what’s happening and what’s causing the problem?

Hello @ShinagdeS,

find a workaround here:

Also, I have added a +1 to the existing ticket to allow changing table specifications on this type of loop end.

Edit: You can also modify the table you create with the Table Creator node to have the same structure (same column names) as the table coming out of the Variable to Table Row (deprecated) node. This is probably easier than the workaround above.

Br,
Ivan

Hi @bruno29a

This is my workflow. I am basically trying to re-execute my workflow if any connection nodes fail.

Thanks,
Srinivas

@bruno29a was referring to a sample workflow you can upload here (I would assume)
@ipazin probably already provided the answer
br

Hi @bruno29a , @ipazin , @Daniel_Weikert.
Thank you for all the help.

This use case workflow worked for me.

But the Amazon S3 Connector node is still failing. In the workflow, I have set it to re-execute up to 10 times if it fails. It does re-execute 10 times, but the Amazon S3 Connector node never succeeds.

One more weird thing I observed: when I open the workflow and execute it manually, it works, but when I schedule the workflow, there is an error. Not sure why this is happening.

I think when I was using legacy nodes in my workflow, it used to run fine.

This is how my workflow looks -

Thank you.
Srinivas

Hi @ShinagdeS, sorry I took some time to reply, I was a bit busy. Yes, I was wondering why you needed the Variable to Table Row, that is, why you were trying to go back to data (the black triangle port) since there was no data to continue with. You could stay with the variable port and use the Catch Errors (Variable port), so I am glad that you are using this method.

In terms of the S3 connector timing out, I did some research on this last night. This seems to be an issue on the S3 side. Even if KNIME had the option of setting a timeout, it would not matter. The timeout is happening on the S3 side, and we (users of S3, and therefore any application that lets us use S3, such as KNIME) have no control over it.

From what I read, this happens not only with large files but also with small files. The only “solution” is to retry. However, retrying may end up costing a lot on the AWS side.
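
For reference, the usual way to keep retries from getting out of hand (in both time and cost) is to cap the number of attempts and back off between them. Here is a minimal Python sketch of that pattern, purely as an illustration: `connect_and_upload` is a hypothetical stand-in for whatever the S3 connector step does, not a real KNIME or AWS API.

```python
import random
import time

def retry_with_backoff(operation, max_attempts=10, base_delay=1.0, max_delay=60.0):
    """Retry a flaky operation (e.g. an S3 connection) a bounded number of
    times, sleeping longer after each failure so the service is not hammered."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception as err:  # in real code, catch only the timeout/connection error
            if attempt == max_attempts:
                raise  # give up after the last attempt
            delay = min(max_delay, base_delay * 2 ** (attempt - 1))
            delay += random.uniform(0, delay / 2)  # jitter so parallel retries don't line up
            print(f"Attempt {attempt} failed ({err}); retrying in {delay:.1f}s")
            time.sleep(delay)

# Hypothetical usage: connect_and_upload stands in for the S3 connector step.
# result = retry_with_backoff(connect_and_upload, max_attempts=10)
```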

Hi @bruno29a, I did not understand what you mean by:

Yes, I was wondering why you needed the Variable to Table Row, that is, why you were trying to go back to data (the black triangle port) since there was no data to continue with.

Hi @ShinagdeS, sorry for the confusion. What I meant was, in your original workflow, you were using the Catch Errors (Data port) instead of the Catch Errors (Variable port), which then made you have to use a Loop End with a data port (data ports are the black triangle ports).

In the end, you did not need to use a Catch Errors with Data port. There is no data to work with in this case. All your data stream “stopped” at the CSV Writer, and you are not using this data stream afterwards.

So, the only way to use a Catch Errors with Data port was to convert the variable to data, which the Variable to Table Row does; but as I explained, I did not see why a Catch Errors with Data port was needed in the first place. The Catch Errors with the Variable port would do, and it would not create the issue of the table structure being modified.

Looking at your new workflow more carefully, it looks like you switched to a recursive loop instead. This is not really a case for a recursive loop; there is no recursion here.

Hi @bruno29a
So basically I was using try-catch to catch the error that happens with the Amazon connector nodes, and re-execute if there was an error.

If I use the normal Variable Condition Loop End node, I get this error:

ERROR Variable Condition Loop End 0:183 Execute failed: Input table’s structure differs from reference (first iteration) table: different column counts 10 vs. 1

I followed this workflow to set up my workflow.

With the help of the recursive loop, the above error was gone.

Hi @ShinagdeS , sorry, I think you missed the point :slight_smile:

I get what you are trying to do and why you are using the Try and Catch; I’m not questioning that. My point was: why were you trying to use a Catch Errors with Data port instead of a Catch Errors with Variable port?

When you used the Catch Errors with Data port, you ended up using the Variable Condition Loop End, because it accepts a Data input port, and as you found out, you got the error complaining about the structure.

You do not have to use a Recursive Loop to get rid of this error. Just because you do not get the error with the Recursive Loop does not mean it is a fix. You’re not fixing the error.

Of course in this case the Recursive Loop is not going to complain, since there is no data processing between the start and end of the loop. You can do the same thing with your first attempt. If you link your Generic Loop Start to your Variable Condition Loop End, you will not get the error that you got.

You can wipe water off your kitchen counter, or you can use a blow torch to dry the water. Both will get rid of the water, but using a blow torch will also destroy your counter. So it’s not because you don’t see the error that you fixed the issue.

Hello @ShinagdeS,

So it is working locally (in the AP) but not when scheduled on KNIME Server? It seems we have then solved both the issue from the original post (proper loop ending) and the workflow design (retrying on failure). That leaves scheduling on KNIME Server not working. I suggest we close this topic and you open a new one in the proper category (KNIME Server). Ok?

Br,
Ivan

Ok @ipazin, I will open a separate topic.

Hello @ShinagdeS,

seems you already have it covered here so no need to open a new topic.

Hope it gets resolved quickly.

Br,
Ivan
