Transpose node is getting very slow

I have a table of 8000 rows and 600 columns that I transpose iteratively in a loop before doing other data manipulation. I noticed that in the first few hundred iterations, the transpose runs quite fast (one or two seconds), but then it gets slower and slower. I have enough memory (Xmx 8000mb).
The only parameter I see in the node is the table chunk size, which I keep at 10.
Is there a way to make the execution of this node faster?
Thanks,

Hi there @zizoo,

are you using the latest version? How many iterations do you have? Are you transposing 8,000 rows and 600 columns in each iteration?

Maybe it would be best if you shared this example workflow so someone could take a look.

Br,
Ivan

Hi @ipazin,
I have 8000 iterations for a manipulation on a table of 8000 rows and 600 columns.
I have the latest version.
How can I choose the optimum value of the table chunk size in the Transpose node? Is higher always better?

Hi there @zizoo,

if it is the same table every iteration, why not transpose it first and then loop over it 8000 times? What loop are you using?
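The idea of hoisting the transpose out of the loop can be sketched in Python with pandas (a stand-in for the KNIME table, since KNIME workflows are graphical; the table dimensions are taken from the question, the rest is a hypothetical example):

```python
import numpy as np
import pandas as pd

# Dummy table standing in for the KNIME input: 8000 rows x 600 columns.
df = pd.DataFrame(np.random.rand(8000, 600),
                  columns=[f"col{i}" for i in range(600)])

# Slow pattern: transpose the full table inside every iteration.
# for col in df.columns:
#     row = df.T.loc[col]        # full 8000x600 transpose, once per iteration
#     process(row)

# Faster pattern: transpose once, then loop over the resulting rows.
transposed = df.T                 # one 8000x600 transpose
results = [row.sum() for _, row in transposed.iterrows()]
```

The work done per iteration drops from one full-table transpose to a simple row lookup, which is why doing the transpose once up front pays off.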

Br,
Ivan

Hi @ipazin
I am using a Column List Loop. In each iteration, I take one column of the big table and, in the same iteration, work on that column as a row after transposing the full table.
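In pandas terms (again just a sketch of the pattern described, not the actual KNIME nodes), the per-iteration transpose may be unnecessary: row i of the transposed table is simply column i of the original, so the same values can be read directly without transposing at all:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.rand(8000, 600),
                  columns=[f"col{i}" for i in range(600)])

# The described loop: transpose the whole table in each iteration
# just to get one column as a row (col_name is a hypothetical example).
col_name = "col42"
row_via_transpose = df.T.loc[col_name]   # expensive: full transpose first

# Equivalent values without any transpose.
row_direct = df[col_name]

assert (row_via_transpose.values == row_direct.values).all()
```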

Hi there @zizoo,

I think I get it now :slight_smile:

Can you share example for someone to check it out?

Br,
Ivan


Hi @zizoo,

If I understood your case correctly, you want to iterate over the columns of a table (8000 rows and 600 columns) using a Column List Loop. That should be 600 iterations, not 8000.

For your question:

How can I choose the optimum value of the table chunk size in the Transpose node? Is higher always better?

This depends on the data in your table, for example, on how big the individual columns are (number of rows, data type, etc.). If all of your columns are numeric and you have just 8000 rows, you can transpose them all at once (chunk size = 600).

Another reason why your later iterations might be slower is if the columns further down the list contain a different type or size of data.
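To see why a small chunk size hurts, here is a minimal sketch of what a chunked transpose does (a hypothetical model, not the Transpose node's actual implementation): with chunk size c, the input is scanned roughly ceil(columns / c) times, so chunk size 10 on a 600-column table means 60 passes over all 8000 rows, while chunk size 600 means a single pass.

```python
import math
import numpy as np

def transpose_chunked(table, chunk_size):
    """Transpose `table` one column-chunk at a time.
    Each chunk costs one full pass over the input rows, which is
    why a small chunk size multiplies the scanning cost."""
    n_rows, n_cols = table.shape
    out = np.empty((n_cols, n_rows), dtype=table.dtype)
    passes = 0
    for start in range(0, n_cols, chunk_size):
        passes += 1  # one pass over all rows per chunk
        out[start:start + chunk_size, :] = table[:, start:start + chunk_size].T
    return out, passes

table = np.random.rand(8000, 600)
_, passes_small = transpose_chunked(table, 10)    # 60 passes
_, passes_big = transpose_chunked(table, 600)     # 1 pass
assert passes_small == math.ceil(600 / 10) and passes_big == 1
```

The trade-off is memory: larger chunks mean more of the table held at once, which matters for very large inputs.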

It would be nice if you could share an example workflow so that we can understand the problem better. You can also take a look at the attached example workflow for ideas.

transpose_loop_example.knwf (22.8 KB)

Best,
Temesgen


Thanks @ipazin.
Unfortunately I cannot share the data, as it is private and I don't own it.
When I used a table chunk size of 8000 instead of 10, it became much faster, but I am not sure whether this will work with a table of millions of rows of numeric data.

Hi there,

I see. Can you simulate it with dummy data?

Br,
Ivan

This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.