Keras training context compatibility issue

Hi,

I was able to connect the other Keras nodes to my custom Keras Learner by following the instructions from: Compatibility with Keras nodes. But I’m running into this issue: “ERROR ModalContext Node Loading model settings failed: org.knime.dl.keras.tensorflow.core.training.DLKerasTensorFlowDefaultTrainingContext cannot be cast to org.activpred.dl.keras.core.training.DLKerasTrainingContext”
I want to add some new features to the Keras Learner node, and I need both Keras implementations of this node available for my team in the workbench. I also want the installation of my custom node to be easy. What are the best practices for doing this?

Thank you,
Mihai

Hi Mihai,

Could you please provide your knime.log file and/or a stack trace that shows where this error is being thrown?

You typically write your own extension as described in this guide. The section “Deploy your Extension” of the guide describes how to manually create a local Update Site, which may already be enough to deploy the extension within your team (by distributing it as an archive file).

Marcel


Hi Marcel,

I moved the [AP] Keras Network Learner node into a new plugin and copied the code from the org.knime.dl.keras plugin.
I no longer get the issue the way I had it when I wrote the post. Instead, after redoing my Eclipse setup, I am getting this warning:

A few days ago I created an [AP1] Keras Network Learner node inside the org.knime.dl.keras plugin and deployed it as described in the guide; the node was working fine without any issues. However, I need to keep the original Keras Network Learner provided by KNIME unchanged alongside my custom Keras Learner.
I created new optimizer classes inside the DLKerasOptimizer.java file for my modified optimizers, but they are not loaded in the [AP1] Keras Network Learner node. I also created a new DEFAULT_CFG_KEY variable for my optimizers, but that didn’t work either.

What is the recommended way to create a new node for my case?

  1. Creating a new plugin and copying the files over from the old plugin?
  2. Creating a new node inside the existing plugin?

Regards,
Mihai

Hi @mihais1,

If you copy classes into your plugin, the OSGi / Eclipse RCP framework treats them as completely separate from the original ones. This is what causes the error message in your screenshot.
Have you tried defining a dependency on the Keras plugin and directly importing the classes you want to use? That way they remain compatible.
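
For reference, such a dependency would go into your plugin’s MANIFEST.MF, roughly like this (the version range is only an example, use whatever matches the KNIME version you build against):

```
Require-Bundle: org.knime.dl.keras;bundle-version="[4.2.0,5.0.0)"
```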

A second option would be to define your plugin as a fragment of the original Keras plugin; that way you can use all the classes provided by that plugin.
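
A minimal sketch of the fragment’s MANIFEST.MF could look like this (the symbolic name and version are just placeholders):

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: AP Keras Extensions
Bundle-SymbolicName: org.activpred.dl.keras.extensions
Bundle-Version: 1.0.0.qualifier
Fragment-Host: org.knime.dl.keras
```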

Be advised that this might lead to problems after upgrades, as we can’t guarantee that the internal structure of plugins stays the same between KNIME AP versions.

best,
Gabriel

Hi @gab1one,

I managed to add my optimizers to my custom node by creating the node inside the org.knime.dl.keras plugin. I created the APDLKerasOptimizer interface and extended the DLKerasOptimizer interface in order to ensure backward compatibility.
But now I am facing a new problem regarding the deployment of my new Keras Network Learner node. After I deployed the node by creating the feature and update site projects, installing it into my local KNIME installation gave me an error that required uninstalling the existing “KNIME Deep Learning - Keras Integration”. I uninstalled the existing KNIME Deep Learning extension and installed my AP Deep Learning version, but with that installation the TensorFlow nodes were removed.

I tried to create a ‘Feature Patch’ project instead of the ‘Feature’ type, adding the org.knime.dl.keras plugin as a dependency, but this setup didn’t work because of this error: “"AP Deep Learning" is not applicable to the current configuration and will not be installed.” I am using Eclipse IDE for RCP and RAP Developers, Version: 2020-03 (4.15.0). Here is the fragment wizard:

I also tried to use this approach, but it didn’t work because the org.knime.dl.keras.base.nodes.learner.view.* packages are not exposed by the org.knime.dl.keras plugin.

Thank you very much for your suggestions!

Best regards,
Mihai

Hi @mihais1,

Can you post that error please? Also, when it tells you to uninstall the existing deep learning extension, can you select “Show full error and build my own solution” and post that error? That should tell us what is going on.

best
Gabriel

Hi @gab1one,

Here are the screenshots:

Best regards,
Mihai

Hi @gab1one,

I tried to include only the org.knime.dl.keras plugin in my feature and I got almost the same error:

I tried the Feature Patch wizard applied only to org.knime.dl.feature.dl.keras and I got this:

Best regards,
Mihai

Hi @mihais1,

Where does the KNIME Deep Learning Core plugin with a build-time qualifier from today come from? It looks to me like you are trying to install the same plugin in two different versions, which means it is either your self-built version of org.knime.dl, or you named your fragment plugin that way.
Can you try naming your fragment plugin something different, e.g. org.activeprod.dl.keras, and only put that fragment into the feature? Also make sure you set a version range on the fragment host so you are not pinning the plugin version.
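
For the version range, the Fragment-Host header could look roughly like this (shown here with the Keras plugin as host; the exact range is only an example and depends on the KNIME version you build against):

```
Fragment-Host: org.knime.dl.keras;bundle-version="[4.2.0,5.0.0)"
```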

best,
Gabriel

Hi @gab1one,

Yesterday I merged the latest updates from the Bitbucket repo (branch releases/2020-07) in order to be on version 4.2.1. I think the KNIME Deep Learning Core plugin comes from my build from today.

I think the Fragment Project wizard is broken in my Eclipse; it creates the project without the fragment.xml file.

Best regards,
Mihai

Hi @mihais1,

This sounds to me like you are trying to do something that cannot work: shipping your own version of the org.knime.dl plugin that is also supposed to be compatible with the one shipped by KNIME.

As I understand your use case, you need to add your own node that uses code from the dl / keras integration. If you put that code into a fragment project that defines org.knime.dl as its host, and only include that fragment in your feature, then you should be able to install that feature without issues.
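
In the feature.xml that boils down to listing only your fragment, marked with fragment="true", and none of the KNIME plugins themselves; roughly like this (id, label and versions are placeholders):

```
<feature id="org.activpred.dl.feature" label="AP Deep Learning" version="1.0.0.qualifier">
   <!-- ship only the fragment; the host plugin stays the one installed with KNIME -->
   <plugin
         id="org.activpred.dl.keras.extensions"
         version="0.0.0"
         fragment="true"
         unpack="false"/>
</feature>
```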

best,
Gabriel

Hi @gab1one,

Sorry for the delayed answer; I was struggling with the Fragment Project setup, which I had never used before. I think there is a problem with it: when I create a new Fragment Project, it never creates the fragment.xml file, and the dependencies of the host plugin cannot be seen in the fragment.
I was able to create and deploy a fragment with org.knime.dl.keras as the host plugin and to install it without errors in my local KNIME installation, but the [AP] Keras Network Learner node is not available in the node repository. When I run KNIME from within Eclipse, the node is available in the node repository.
I also tried to create the fragment with org.knime.dl as the host plugin, but I got circular dependency errors when adding org.knime.dl.keras as a dependency of the fragment in order to use the code from the dl / keras integration.

https://wiki.eclipse.org/FAQ_What_is_a_plug-in_fragment%3F

Best regards,
Mihai

This sounds like the correct setup, but there might be activation errors. To investigate this, start the KNIME AP where you installed your custom fragment from the command line with the extra argument -console, e.g. like this: ./knime -console. This will open the OSGi console in that terminal once the AP has started.
Enter ss there to see a list of all installed bundles, then locate your fragment and check its status and id.
If its status is resolved, then there are probably startup errors. Enter start $id to force the OSGi platform to start the plugin; it will most likely fail with some errors. Please post them here.

best,
Gabriel

Hi @gab1one,

Here are the status and error from my fragment:

Best regards,
Mihai

Hi @mihais1,

OK, this looks a bit too complicated to solve via the forum; I’ll send you an email to set up a call.

In the call we figured out that the problem was that the fragment.xml file was not listed in the build.properties. This means the fragment gets installed, but the nodes are not discovered, because there is no node definition to be read from a fragment.xml file.
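
For anyone finding this later: make sure the fragment’s build.properties lists fragment.xml in bin.includes so it ends up in the built jar, roughly like this (source and output folders are just the wizard defaults):

```
source.. = src/
output.. = bin/
bin.includes = META-INF/,\
               .,\
               fragment.xml
```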

best,
Gabriel
