Analytical Modelling - Computing Power and Hardware

Hi there.

I am using KNIME for chemical analytical modelling. So far I have run KNIME on a standard local computer with 8 GB of RAM. I now want to scale up my models and will need a lot more computing power (how much exactly I cannot predict yet, because the modelling process isn't finished). My old models used 600 MB of RAM, which might increase exponentially with my new approaches too.
I want to buy new hardware to reduce the calculation time, but I am unsure about the setup. I want to keep everything local and don't want to run the modelling process on external servers.

Now I would like to know whether KNIME can make use of graphics cards, or whether I should work with multiple CPUs instead. Any experience with your own setups would be very helpful.

Thanks in advance!

Hi @tu72jik,
welcome to the KNIME Forum! What kind of models are you running? Are you using KNIME native learner nodes or are you training the models in a Python or R learner?
Kind regards
Alexander

Hi Alexander,
Thanks for your reply.
I have only used KNIME native learner nodes and R learner nodes (like R Learner and R Predictor) so far. I have not been planning to train models in a Python learner yet.
Do the native learner nodes and the R learner nodes benefit from graphics cards, or do they only run on CPUs?

Best regards
Judith

Hello @tu72jik,
the native KNIME Learner nodes do not make use of a graphics card. In that case I suggest a fast CPU and as much memory as you can afford :wink:
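One related thing worth checking: KNIME will only use as much Java heap as the -Xmx entry in your knime.ini allows, so after adding RAM you may want to raise that value. A minimal sketch of the relevant line (the file sits in your KNIME installation folder; 32g is just an illustrative value for a machine with plenty of memory):

```
-Xmx32g
```

Restart KNIME after changing it so the new limit takes effect.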
Kind regards
Alexander

Hi Alexander,

Okay, thanks so much! Then I will focus on CPUs and memory. :slight_smile:
One last question: which nodes benefit from graphics cards? Just so I know for future modelling and workflows.

Best regards!

Hi @tu72jik,
as far as I know, only the Deep Learning nodes benefit from a good graphics card.
Kind regards
Alexander

Anything deep-learning related, e.g. the Keras/TensorFlow nodes if set up correctly, or the Deeplearning4J nodes.
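If you ever go down that road, a quick sanity check that the Python environment KNIME's deep learning integration points to actually sees your GPU could look like this (assuming a TensorFlow 2.x install with the CUDA libraries in place):

```python
# Quick sanity check: does TensorFlow see a CUDA-capable GPU?
# Run this in the Python/conda environment that KNIME's deep learning
# integration is configured to use.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
if gpus:
    print("TensorFlow sees", len(gpus), "GPU(s):", gpus)
else:
    print("No GPU visible - training will fall back to the CPU.")
```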

In that regard, one thing that doesn't work yet but would be a nice feature: the XGBoost nodes could also use a GPU if one is available.

Hint: if you buy a GPU, buy one from NVIDIA and not AMD. Otherwise you will enter a world of hurt if your interest is deep learning.

EDIT:

Basically, if you are doing deep learning you will need an NVIDIA GPU; if not, focus on CPU cores and speed (some things are single-threaded by nature). Also, I/O matters for KNIME, so having the workspace on an SSD will help as well.
