I have the same problem as described in this post.
I’m trying to activate GPU support to run the examples in chapters 7 & 8 in the Codeless Deep Learning book. The models run, but don’t use the GPU. I have CUDA installed. My Keras and Tensorflow environments are attached. I’m an analyst, not a computer expert. Any help would be appreciated.
Keras Environment.txt (7.3 KB)
Tensorflow Environment.txt (8.4 KB)
The Keras and TensorFlow 2 GPU environments were created via KNIME's "create new environment" option, and CUDA is installed. Hardware: i7 laptop, 8 GB RAM, GeForce 520M GPU (quite an old machine, but it still has a pulse).
Here is the GPU and CPU utilization:
Here are the DL Python settings:
By the way, I'm not a data scientist or programmer, just an ordinary industrial engineer trying new things.
The GeForce 520M only has compute capability 2.1, but recent tensorflow-gpu releases seem to require at least 3.5 (see GPU support | TensorFlow). You will probably need a beefier GPU.
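To make the point above concrete, here is a minimal pure-Python sketch of the compute-capability check. The 3.5 threshold comes from TensorFlow's published minimum for recent tensorflow-gpu builds; the small GPU table is an illustrative sample I've added (not an exhaustive database), with the GT 520M's 2.1 matching the value quoted above.

```python
# Minimal sketch: does a GPU's CUDA compute capability meet the
# minimum that recent tensorflow-gpu builds require (3.5)?
# The lookup table below is illustrative, not exhaustive.

MIN_TF_COMPUTE_CAPABILITY = (3, 5)

# Card name -> (major, minor) compute capability; sample entries only.
KNOWN_GPUS = {
    "GeForce GT 520M": (2, 1),   # Fermi-era laptop GPU, as in this thread
    "GeForce GTX 1050": (6, 1),  # example of a card that clears the bar
}

def supports_recent_tensorflow(capability):
    """Return True if (major, minor) meets the minimum TF requires."""
    return capability >= MIN_TF_COMPUTE_CAPABILITY

for name, cap in KNOWN_GPUS.items():
    verdict = ("supported" if supports_recent_tensorflow(cap)
               else "too old for recent tensorflow-gpu")
    print(f"{name}: compute capability {cap[0]}.{cap[1]} -> {verdict}")
```

Running this prints that the GT 520M falls below the threshold, which matches the behavior described in the original post: the workflow runs, but TensorFlow silently falls back to the CPU.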