I am planning to use KNIME mainly as an ETL tool to populate data into a company data warehouse, either in the local (desktop) version or in the Server version.
Do you know what kind of volume or performance limitations you have experienced when working with large volumes of structured data? I mean, what data volumes have your workflows handled, and on what Server or local computer hardware?
@fmc00006 I can offer this article describing my experience.
Mostly the question is how large the data is, how powerful the system KNIME is running on is, and how fast the connection to the server hosting the database is.
And if your hardware has plenty of RAM and a fast SSD, you can process even a billion+ rows locally in seconds. I often work with 100M+ rows on a ThinkPad with just 16 GB RAM and a Ryzen 5 7535U CPU - all queries (including parametric queries, window functions, etc.) complete within seconds.
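For anyone curious what that kind of local workload looks like, here is a minimal Python sketch using DuckDB as the local SQL engine (the post above doesn't name a specific engine, and the table and column names are made up for illustration): it generates ~100M synthetic rows and runs a window-function query entirely on the local machine.

```python
# Minimal sketch, assuming DuckDB as the local engine; 'sales',
# 'customer_id', and 'amount' are hypothetical names for illustration.
import time

import duckdb

con = duckdb.connect()  # in-memory database

# Build a 100M-row table of (customer_id, amount) with DuckDB's range().
con.execute("""
    CREATE TABLE sales AS
    SELECT (i % 1000000)   AS customer_id,
           random() * 100  AS amount
    FROM range(100000000) t(i)
""")

start = time.time()
# Window function: keep the top 3 amounts within each customer partition.
result = con.execute("""
    SELECT customer_id,
           amount,
           rank() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS rnk
    FROM sales
    QUALIFY rnk <= 3
""").df()
print(f"{len(result)} rows in {time.time() - start:.1f}s")
```

On hardware like the laptop described above, a query like this typically finishes in seconds because the engine scans columnar data in parallel; the same principle applies whether the local processing happens inside or alongside a KNIME workflow.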