Reading and writing multiple CSVs


I am trying to read in multiple CSV files, execute a process on each, and write the result to a different .csv each time.

I currently have List Files --> Table Row To Variable Loop Start --> CSV Reader --> (my process) --> Joiner (end of my process) --> CSV Writer.

I'm a little unsure where to put the Loop End.

I had tried attaching it to the same Joiner as the CSV Reader, and it would loop that way, but the file read in would contain empty columns where my process had added columns to the previous .csv file.

I also tried putting it after the CSV Writer using the flow variable port, but that didn't seem to do anything.


Any and all help is appreciated.



(Also, where would this save the .csv file? The CSV Writer is using the variable "currentIteration" to name the output file.)
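For reference, the per-file read-process-write pattern the workflow is aiming for looks roughly like this Python sketch. All names here are placeholders (the `input/` folder, the `process` step, and the `output_{iteration}.csv` naming that mimics the `currentIteration` variable), not part of the actual KNIME workflow:

```python
import csv
import glob

def process(rows):
    # Placeholder for the per-file processing step; here it just
    # appends a running row-index column to each row.
    return [row + [str(i)] for i, row in enumerate(rows)]

# Iterate over the input files, mirroring the
# List Files -> Loop Start -> CSV Reader -> (process) -> CSV Writer chain.
for iteration, path in enumerate(sorted(glob.glob("input/*.csv"))):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    out_rows = process(rows)
    # Like naming the output file with the "currentIteration" variable:
    with open(f"output_{iteration}.csv", "w", newline="") as f:
        csv.writer(f).writerows(out_rows)
```

The key point is that each iteration both reads and writes inside the loop body, so the loop end must come after the writer, not before it.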

You can see an answer for your question in a recent post of mine.

Hi Karelman, 

Thank you for your response. I have been to that post, but I was not able to understand how the input (file name) was passed to the CSV Reader when it does not have an input port.


I will try again.

You might want to watch my mini tutorial here:

Hi Iris,

Thanks for pointing out the training material! However, I have a somewhat more complex setup, and I'm really scratching my head on this… I guess I'm missing something obvious.

The main issue is that I have multiple parallel flows, each ending in a different CSV Writer.

I also tried with a Variable Loop End.

Due to the data volume, I want to iterate over the various documents I'm parsing. Hence, it seems that I can't set up an iteration mechanism that triggers all the parallel flows.

I want to keep the loop over the metanodes, so that I can reduce the data volume at an early stage.

What would be the best iteration setup for this problem?
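As a rough illustration of the control flow being asked about here (one loop iteration reading a document once, then fanning out to several parallel branches, each with its own writer), a Python sketch follows. The branch functions, folder names, and output naming are all hypothetical placeholders:

```python
import csv
import glob

def branch_a(rows):
    # Placeholder for the first parallel flow, e.g. keep only the first column.
    return [row[:1] for row in rows]

def branch_b(rows):
    # Placeholder for the second parallel flow, e.g. reverse each row.
    return [row[::-1] for row in rows]

# One loop iteration reads a document once, then triggers every
# parallel branch; each branch ends in its own writer.
branches = {"a": branch_a, "b": branch_b}

for iteration, path in enumerate(sorted(glob.glob("docs/*.csv"))):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    for name, fn in branches.items():
        with open(f"out_{name}_{iteration}.csv", "w", newline="") as f:
            csv.writer(f).writerows(fn(rows))
```

The reduction still happens early, since each branch works on the rows of a single document per iteration rather than on all documents at once.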

Kind regards,


For a solution, see "Meta Node Within Loop".

