Keras Workflow Automation

Hello,

The new KNIME 3.6.1 now runs much better with the Keras integration, and I haven't received any backend error today. :grinning:

My question now is: what is the right approach to set the optimizer as a flow variable in the Keras Network Learner? I want to test different configurations that are stored in an Excel file and loop through each setting that I define there. Here is a snippet from that file.
[screenshot of the Excel file]

When I try to set that in the Keras Network Learner node, it shows the following error.

I also tried different strings, e.g. [RMSPROP, RMSProp, rmsprop].

Ralph

Hi Ralph,

The flow variable support of the Keras learner is still work in progress, unfortunately.
You can specify the optimizer via its internal id, e.g., org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasRMSProp for RMSProp, but the node will then complain that its parameters (e.g., “rmsprob_lr”) are not set. You have to set each of them explicitly via the node’s Flow Variables tab while the optimizer is actually selected.
The problem is that you cannot do that for more than one optimizer (at least I’m not aware of any way), because the moment you switch to a different optimizer, with different parameter names, the flow variable mappings of the old parameters are dismissed.
You could use one Keras Learner node per optimizer and loop over different parametrizations of that optimizer, if this helps.
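To make the "one Learner node per optimizer" idea concrete, here is a minimal pure-Python sketch of how the Excel settings could be grouped before looping. The rows and column names (`optimizer`, `rmsprop_lr`, `adam_lr`) are assumptions, since the original screenshot is not visible; the point is only that each optimizer gets its own group of parametrizations, which would then feed its own dedicated Keras Network Learner node via flow variables.

```python
from collections import defaultdict

# Hypothetical rows read from the Excel file: an optimizer id plus
# that optimizer's own parameters (parameter names differ per optimizer,
# which is exactly why one Learner node per optimizer is needed).
configs = [
    {"optimizer": "DLKerasRMSProp", "rmsprop_lr": 0.001},
    {"optimizer": "DLKerasRMSProp", "rmsprop_lr": 0.0001},
    {"optimizer": "DLKerasAdam", "adam_lr": 0.001},
]

# Group the parameter settings by optimizer id.
by_optimizer = defaultdict(list)
for row in configs:
    by_optimizer[row["optimizer"]].append(
        {k: v for k, v in row.items() if k != "optimizer"}
    )

# Each key corresponds to one dedicated Keras Learner node; the list of
# parameter dicts is what a loop over that node would iterate through.
for opt, params in by_optimizer.items():
    print(opt, params)
```

In a KNIME workflow this grouping would be done with a Group Loop Start (or a Row Filter per optimizer), with the per-row values exposed as flow variables inside each loop.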

The ids of all optimizers are:

  • org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasAdadelta
  • org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasAdam
  • org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasAdamax
  • org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasAdagrad
  • org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasNadam
  • org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasRMSProp
  • org.knime.dl.keras.core.training.DLKerasOptimizer.DLKerasStochasticGradientDescent

Marcel
