I hope you’re all doing well. I’m currently learning about deep learning, specifically image classification, using the cats and dogs classification workflow.
I’ve been having difficulties installing the TensorFlow and Keras libraries. I’ve attempted the installation via KNIME by creating a new environment, trying both the CPU and GPU options, but I’m consistently facing errors such as “library not installed correctly” or “Keras library missing”.
I’ve also tried a manual installation using the Conda browser, but I still encounter the same error messages.
These issues are primarily related to the installation of the Keras and TensorFlow libraries. On top of that, I’m experiencing another problem with the workflow itself, specifically with the Keras Network Learner node.
Previously, the node would still execute while showing a message about a missing library, but now I’m unable to execute the workflow at all because of a different error message:
" Errors loading flow variables into node : Training data converter ‘org.knime.ip.dl.DLImgPlusValueToFloatTensorConverterFactory’ of network input ‘input_1_0:0’ could not be found. Are you missing a KNIME extension? "
I would greatly appreciate any guidance or suggestions on resolving these issues. If anyone has encountered similar problems or has insights into potential solutions, thank you in advance for your assistance.
@Grayfox unfortunately the setup of Keras and TensorFlow is still a challenge. I have built a Cats and Dogs example, adapted from another KNIME workflow, that uses Conda Environment Propagation (best to download the whole workflow group).
I have also tried to write down the process of setting up a deep learning environment with KNIME. It might need some effort:
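For orientation, a minimal environment definition for KNIME’s Keras integration might look like the sketch below. The pinned versions are assumptions based on what the Keras nodes have commonly required, not taken from the article, so treat the article’s YAML files as the authoritative reference:

```yaml
# keras_env.yml -- sketch of a conda environment for KNIME's Keras integration.
# All version pins are assumptions; check the article's YAML for the real ones.
name: py3_knime_keras
channels:
  - defaults
dependencies:
  - python=3.7        # pin Python so conda cannot silently downgrade it
  - pandas            # used by KNIME's Python integration for data transfer
  - keras=2.2.4       # version the KNIME Keras nodes were built against
  - tensorflow=1.12   # CPU backend; swap in tensorflow-gpu for a GPU setup
  - h5py<3.0          # newer h5py releases break loading of Keras models
```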
Thank you very much for your help. I tried your workflow and it works correctly, especially the Conda environment part.
Later on, I wanted to create my own environment following your tutorial and using the configuration from the YAML file for Windows.
Unfortunately, when I tried to activate the environment, I received an error message. Here is part of the error message:
C:\Users\moham>IF "0" == "1" (
IF "" == "" SET "CMAKE_GENERATOR_PLATFORM=x64"
IF "" == "" SET "CMAKE_GENERATOR_TOOLSET=v141"
)
C:\Users\moham>pushd C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\
The system cannot find the path specified.
C:\Users\moham>CALL "VC\Auxiliary\Build\vcvars64.bat" -vcvars_ver=14.16
The system cannot find the path specified.
C:\Users\moham>if 1 NEQ 0 (if "" == "" (CALL "VC\Auxiliary\Build\vcvars64.bat" ) )
The system cannot find the path specified.
If I understand correctly, activating the environment with Conda makes use of Visual Studio, which I don’t think is installed on my PC.
Should I install the corresponding version, namely Visual Studio 2017 Enterprise?
Maybe that’s why creating the environment from KNIME isn’t working correctly?
In any case, thank you for your help and your article, which has been very useful.
@Grayfox I do not think Visual Studio has anything to do with it. You could use it to code, or maybe you have it connected to an existing Python environment. I would use a basic Miniforge to manage Python packages and environments.
You would tell KNIME where your environment is. You would only need to activate it via the command line if you want to use other tools like Jupyter notebooks.
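As a rough sketch of those steps on Windows (the environment and file names here are placeholders, not from the article):

```
:: Create the environment from a YAML file (run in the Miniforge/Anaconda Prompt).
conda env create -f keras_env.yml

:: Activating is only needed for command-line work such as Jupyter;
:: KNIME just needs the environment selected in its Python/Keras preferences.
conda activate py3_knime_keras

:: Quick sanity check that the libraries import with the expected versions.
python -c "import keras, tensorflow; print(keras.__version__, tensorflow.__version__)"
```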
I have found out why I could not manage to install the Keras libraries with the YAML file.
I had copied the code from the screenshot in your article, but when I checked the YAML file of the environment you created, it contained more packages. So I copied all of the libraries into a new file and created my environment from that file.
But it seems to be unstable: sometimes the Python version drops from 3.7 to 3.6 and the Keras library is no longer available. When that happens, I have to remove the environment and create a new one from the YAML file.
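For reference, the remove-and-recreate cycle might look like this (the environment name is a placeholder):

```
:: Remove the broken environment completely, then rebuild it from the YAML file.
conda env remove -n py3_knime_keras
conda env create -f keras_env.yml

:: Confirm that the solver kept the pinned Python version.
conda run -n py3_knime_keras python --version
```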
@Grayfox glad it does work. The difference might be that a typical YAML file holds ‘broader’ packages, each of which can pull in several smaller dependencies. You can list them all, but then something might change, and it is often better to let conda work out the current combination again with just some basic constraints like python=3.7. Also, if you use a YAML file across operating systems, the resolved packages can differ.
Of course you can also choose to have every detail written down in the configuration/YAML. This is what Conda Environment Propagation does, but then you get the very exact combination, possibly including version numbers, if that is what you want.
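Conda itself can produce both flavors of file, which might make the trade-off clearer (environment name again a placeholder):

```
:: Fully pinned snapshot: every package with its exact version and build.
:: OS-specific, similar in spirit to what Conda Environment Propagation records.
conda env export -n py3_knime_keras > environment_full.yml

:: Only the packages you explicitly requested, with your own constraints.
:: Portable across operating systems; conda re-solves the dependencies.
conda env export -n py3_knime_keras --from-history > environment_minimal.yml
```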
With KNIME and deep learning you will have to find the right combination. The DL packages themselves seem to change heavily all the time, and the KNIME nodes have to catch up.