Dear everyone,
I'm applying XGBoost to an imbalanced dataset. Has anyone found a good method for tuning the model and finding the sweet spots for its parameters?
Have you tried the Parameter Optimization Loop nodes in KNIME for this type of task? There are several examples of their use available on the Workflow Hub, but this is probably a good one to start with.
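Conceptually, the brute-force mode of those loop nodes evaluates every combination in a parameter grid and keeps the best-scoring one. A minimal Python sketch of that idea is below; `dummy_objective` is a hypothetical stand-in for training and scoring the XGBoost model on each parameter combination (in a real workflow the score would be something like AUC on a validation set):

```python
from itertools import product

def grid_search(objective, grid):
    """Brute-force search: evaluate the objective at every point in the
    parameter grid and keep the best-scoring combination (higher is better).
    This mirrors what the Parameter Optimization Loop Start/End nodes do
    in their brute-force mode."""
    best_params, best_score = None, float("-inf")
    names = list(grid)
    for values in product(*grid.values()):
        params = dict(zip(names, values))
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical objective: peaks at max_depth=6, learning_rate=0.1.
def dummy_objective(p):
    return -abs(p["max_depth"] - 6) - abs(p["learning_rate"] - 0.1)

grid = {"max_depth": [3, 6, 9], "learning_rate": [0.01, 0.1, 0.3]}
best, score = grid_search(dummy_objective, grid)
print(best)  # -> {'max_depth': 6, 'learning_rate': 0.1}
```

The hillclimbing strategy the nodes also offer works differently (it steps from a starting point toward better neighbors), but the loop-start/loop-end wiring in the workflow is the same.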
Dear Scott…
Thank you for the good solution!
I tried what you explained, but I keep getting this error: ERROR Gradient Boosted Trees Predictor (deprecated) 0:82 Execute failed: ("ArrayIndexOutOfBoundsException"): null
I've done a lot of trial and error tonight and I'm unsure why it fails. I tried both the brute-force and hillclimbing strategies…
I just changed the node and it's still the same scenario. I'm unfortunately not in a position where I can upload data. However, I tried copying the same "param loop start" and "param loop end" nodes into the example workflow and set up XGBoost the same way, and there it doesn't fail.
Hey again… I managed to build a workflow that reproduces the problem with a dataset you are allowed to look at. Would you be OK with me e-mailing it to you?
Sorry, I had a bit of travel scheduled and have been away from the forum for a couple of days. Please feel free to email the workflow to me at scott.fincher@knime.com.