H2O with cross-validation and hyperparameters

Good morning!!!

I’m building a workflow in KNIME. I’m optimising the parameters that give the best accuracy for this model, and I also do cross-validation on my dataset. I don’t understand why, but I received a message when executing the Loop End node (corresponding to the parameter optimisation loop). The message says "Cannot read the array length because 'localBest' is null". How can this be possible?


Thank you in advance!!!

Hello helfortuny,

Can you please provide a sample workflow, so that we can try to understand the problem better?

In the meantime, can you try using the Parameter Optimization Loop End node instead of the deprecated Loop End node?

Also, if you would like, KNIME has a parameter optimization component with cross-validation functionality, which you can use directly by referring to our example space on parameter optimization.


Thank you very much!!!

I do have a question regarding the Parameter Optimisation (Table) node.

When doing the cross-validation for each set of parameters, is the Performance Metric in the configuration the mean over all the iterations of the cross-validation? So if I want to maximise the F-measure, the result is the mean of the F-measure obtained in each iteration, right?

Thank you again.

Hello,

The Parameter Optimisation (Table) component uses the KNIME cross-validation nodes, consisting of the X-Partitioner and X-Aggregator nodes, which work as follows:

"For K-Fold cross-validation, each row in the input is used in K-1 iterations for training the model and in only one iteration for prediction.
The top output of the X-Aggregator provides you with the prediction for each row from the iteration where it was not used to train the model.
The bottom output gives you the error rates for each iteration, i.e. how the trained model performed on the rows that were not used to train it in this iteration."
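Outside of KNIME, the same idea can be sketched in plain Python. This is purely illustrative and not how the component is implemented internally; the dataset, model, parameter values, and the choice of scikit-learn are all placeholders. It shows a parameter sweep where each candidate is scored by the mean F-measure over the K folds, i.e. each row is used for training in K-1 folds and predicted exactly once, and the per-fold metrics are averaged:

```python
# Minimal sketch (assumptions: toy data, RandomForest, n_estimators as the
# tuned parameter) of "parameter optimisation with K-fold cross-validation,
# objective = mean F-measure over folds".
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import KFold

X, y = make_classification(n_samples=300, random_state=0)  # placeholder data
param_grid = [10, 50, 100]   # hypothetical candidate values for n_estimators
K = 5                        # number of cross-validation folds

best_param, best_score = None, -np.inf
for n_estimators in param_grid:
    fold_scores = []
    # Each row lands in the test set of exactly one fold and in the
    # training set of the other K-1 folds.
    for train_idx, test_idx in KFold(n_splits=K, shuffle=True,
                                     random_state=0).split(X):
        model = RandomForestClassifier(n_estimators=n_estimators,
                                       random_state=0)
        model.fit(X[train_idx], y[train_idx])
        preds = model.predict(X[test_idx])
        fold_scores.append(f1_score(y[test_idx], preds))  # per-fold metric

    mean_score = np.mean(fold_scores)  # objective = mean over the K folds
    if mean_score > best_score:
        best_param, best_score = n_estimators, mean_score

print(f"best n_estimators={best_param}, mean F-measure={best_score:.3f}")
```

The per-fold scores in the sketch play the role of the X-Aggregator's bottom output, and averaging them per parameter set corresponds to the "mean over the cross-validation iterations" asked about above.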

Best,
Keerthan

