Could someone please provide a possible explanation of the Parameter Optimization Loop's behavior?
The Parameter Optimization Loop has 2 parameters defined by the user for the RProp MLP Learner: "topology" (the number of layers) and "nodes".
Only the topology variable is specified as a flow variable for the RProp MLP Learner; a message appears saying that the "hiddenlayer" parameter is controlled by a variable. The number of hidden neurons per layer, on the other hand, is set directly in the RProp MLP Learner node, with a value of 3 selected.
After the loop run, the user checks the Parameter Optimization Loop End node (right-click, then select All Parameters), and it shows that the loop actually employed the variable "nodes" to train the model.
Therefore, even though the user-defined parameter "nodes" was not passed to other nodes as a flow variable, it appears to have been used to train the model (7 nodes instead of 3).
Is this the expected behavior of the Optimization Loop? I have checked the node description and example workflows on the Hub, but unfortunately haven't found an answer.
Thanks in advance
Hi, and welcome to the KNIME Community!
The Parameter Optimization Loop is not aware of whether or not you actually use the parameters you define; it simply assumes that you do. In your case you are using the Brute Force strategy, so the optimization loop tries every possible combination of the defined parameters. What you see as output in the Parameter Optimization Loop End node is just the parameter combination for each loop iteration. That does not mean the RProp MLP Learner is using the "nodes" flow variable for the number of hidden neurons per layer; how would it? If you define a test flow variable, you'll see that one in the output as well, even though it isn't used anywhere.
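To make the idea concrete, here is a small Python analogy (not KNIME code; `train_model`, the parameter names, and the scores are all made up for illustration). The brute-force loop records every parameter combination it generates, even when the "model" only consumes one of the parameters, just like the loop-end report listing "nodes" although the learner never received it:

```python
from itertools import product

def train_model(topology):
    # Stand-in "model": only "topology" affects the result.
    # "nodes" never reaches this function, mirroring a learner
    # where only one flow variable is actually connected.
    return topology * 10  # dummy accuracy score

param_grid = {
    "topology": [1, 2, 3],
    "nodes": [3, 5, 7],
}

results = []
for topology, nodes in product(param_grid["topology"], param_grid["nodes"]):
    score = train_model(topology)  # "nodes" is ignored by the model
    # The loop still logs "nodes" for every iteration,
    # because it assumes all defined parameters are in use.
    results.append({"topology": topology, "nodes": nodes, "score": score})

for row in results:
    print(row)
```

Note that for a fixed "topology" the score is identical across all values of "nodes", which is the telltale sign that a listed parameter had no effect on training.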
So I suggest removing any flow variables that are not used inside the optimization loop, both to avoid confusion and to reduce execution time.
Thank you for the clarification! Indeed it reports all parameters; after multiple runs I got an idea of how it works.
Glad to hear that @alex_nest_9845.
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.