After much ado, I have set up my new machine to use CUDA via DL4J in KNIME, but I noticed that classifications run on the GPU do not match the CPU results when I select the GPU option for DL4J.
I have confirmed that CUDA 8.0 is installed and works. I am running on an Nvidia RTX 2060 GPU, and Task Manager shows some GPU utilization during training. However, training puts the entire training set into a single class.
If I run it on the CPU, the examples seem to run correctly. If I monitor the backprop loss, the coefficients coming off the GPU are all zeros, versus actual values from the CPU.
Is there something I am missing?
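For anyone else debugging this, here is a minimal sketch of confirming which CUDA toolkit `nvcc` reports. The parsing helper and the sample string are illustrative, not KNIME-specific:

```python
import re

def cuda_release(nvcc_output):
    """Extract the CUDA toolkit release (e.g. '8.0') from `nvcc --version` output."""
    match = re.search(r"release (\d+\.\d+)", nvcc_output)
    return match.group(1) if match else None

# In practice, capture the real output with:
#   subprocess.check_output(["nvcc", "--version"], text=True)
# A sample of the relevant line (the version shown is illustrative):
sample = "Cuda compilation tools, release 8.0, V8.0.61"
print(cuda_release(sample))  # "8.0"
```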
This seems similar to another problem posted here, where someone installed an older driver version and got theirs to work. I am running Windows 10 with an RTX 2060, so that old driver won't work for me.
Does DL4J in KNIME support CUDA 10, or is that on the roadmap? According to the DL4J page, the current release is compatible with 10.1.
The last thing I can think to try is to install CUDA 10 and just cross my fingers and hope it works.
OK, I updated to CUDA 10.1 last night and reran the example workflows. CPU usage is down and GPU usage is up, but the GPU barely breaks 10%. Is this expected?
I’m sorry that you are having trouble with the DL4J Integration. Unfortunately, the Integration uses an old DL4J version that should only support CUDA 8, and we do not plan to update it in the near future. Did you get it to work with CUDA 10.1?
Regarding the GPU usage, that depends greatly on the network and data used. However, we have had some problems with the GPU usage of the DL4J Integration in general.
Would it be an option for you to switch to our Keras Integration? We recommend using the Keras Integration for deep learning if possible. With Keras, GPU usage should be fine (assuming there are no bottlenecks in your network).
Thanks for the reply. I have not used Keras. One of the blog posts said that it can only run on the GPU on Linux - is this accurate?
GPU support with Keras works on both Windows and Linux. However, I’m not sure about Mac.
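Once the Keras environment is set up, you can check whether the TensorFlow backend actually sees the GPU. The sketch below keeps the check in a pure helper so it is easy to test; the TF 1.x `device_lib` call in the comment is an assumption about the backend version the Keras Integration uses:

```python
def has_gpu(device_names):
    """Return True if any reported TensorFlow device name is a GPU."""
    return any("GPU" in name.upper() for name in device_names)

# In a real TF 1.x session you would build the list like this:
#   from tensorflow.python.client import device_lib
#   device_names = [d.name for d in device_lib.list_local_devices()]
# Sample of what the list looks like when a GPU is visible:
sample = ["/device:CPU:0", "/device:GPU:0"]
print(has_gpu(sample))  # True
```

If this prints False in your environment, the problem is the TensorFlow installation (e.g. CPU-only build or driver mismatch) rather than KNIME itself.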
Ugh… It would be great to have everything pre-installed. I am flailing around in Python trying to get these things to work.
This might help (in case you haven’t found it yourself already): https://docs.knime.com/2018-12/deep_learning_installation_guide/index.html#keras-integration
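In case it saves you a step: the guide boils down to creating a dedicated conda environment and pointing KNIME's Python preferences at it. The commands below are a sketch; the environment name and the unpinned packages are my placeholders, and the guide lists the exact package versions you should use:

```shell
# Create a conda environment for the KNIME Keras Integration
# (name and packages illustrative -- pin the versions from the guide).
conda create -y -n py35_knime python=3.5 keras-gpu h5py
# Then select this environment in KNIME under
# File > Preferences > KNIME > Python Deep Learning.
```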
We plan to simplify installing the Python environment in future versions of KNIME.