I am struggling with a workflow that creates large temp files, which max out my disk space and stop the workflow.
I am running a recursive loop over name records read from a database. The loop is built around a cross join: it takes the first record and cross joins it against the rest of the table, checking for matches. Matches are determined by a rule engine that has 4 different result options; the rows are then split between match and no match. The no-match rows have their column names cleaned up and are sent back to the beginning of the loop to be cross joined with the next row.
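To make the process concrete, here is a minimal Python sketch of the loop as I've described it. This is only an illustration of the logic, not my actual workflow: `match_rule` is a hypothetical stand-in for the rule engine, and the record shape is assumed.

```python
def match_rule(a, b):
    # Hypothetical stand-in for the rule engine:
    # toy rule = case-insensitive exact name match.
    return a["name"].lower() == b["name"].lower()

def dedupe_loop(records):
    # Take the first record, compare it against all remaining records
    # (the "cross join"), keep the matches with it, and feed the
    # no-match rows back to the top of the loop.
    groups = []
    remaining = list(records)
    while remaining:
        head, rest = remaining[0], remaining[1:]
        matches = [r for r in rest if match_rule(head, r)]        # match branch
        remaining = [r for r in rest if not match_rule(head, r)]  # no-match branch loops back
        groups.append([head] + matches)
    return groups

rows = [{"name": "Ann"}, {"name": "ann"}, {"name": "Bob"}]
print(dedupe_loop(rows))
```

Each pass shrinks the remaining set only by the matched rows, so with mostly-unique names the loop still does close to n²/2 comparisons overall.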
Stepping through the process and checking file sizes along the way shows that this loop is what creates the large temp files. About 10k iterations produced roughly a 5GB temp file, and I believe it had run through about 40-50k iterations when it maxed out my storage at about 200GB. If I reset the recursive loop, the files are gone.
Ideally I want to be able to run this over 360k rows; at that scale a naive cross join implies on the order of 360,000² / 2 ≈ 65 billion pairwise comparisons. What is causing these large files, and is there a way around them?
All suggestions appreciated! Thanks!