I am new to KNIME. Recently, I trained a classifier using SVM. My data is imbalanced and skewed toward one out of four labels, so the classifier still seems to be misclassifying a lot of the samples as that majority label. Is there a way to configure the SVM to penalize misclassification of the minority classes more heavily?
Your question about class imbalance is a common one that comes up from time to time. You might check the threads I’ve linked below:
Having said that, are you restricted to using SVM only, or can you try other algorithms? You’ll see that SMOTE is discussed in the threads above, but some folks really advocate against it in favor of something like XGBoost with class weights applied.
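To illustrate the class-weighting idea directly on an SVM, here is a minimal scikit-learn sketch (e.g. something you could run in KNIME’s Python Script node). The synthetic dataset and all parameter values are assumptions standing in for your real table; the key point is `class_weight="balanced"`, which raises the misclassification penalty for the rare classes.

```python
# Hedged sketch: class-weighted SVM with scikit-learn.
# The synthetic data below is an assumption standing in for the real table.
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Imbalanced 4-class problem: one class dominates, as in the question.
X, y = make_classification(
    n_samples=2000, n_classes=4, n_informative=6,
    weights=[0.7, 0.1, 0.1, 0.1], random_state=0,
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" scales the penalty C per class by
# n_samples / (n_classes * n_samples_in_class), so errors on the
# rare classes cost more than errors on the majority class.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```

The same `class_weight` argument also accepts an explicit dict (e.g. `{0: 1, 1: 5, ...}`) if you want to tune the penalties per class yourself.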
@acsmtl If you still want to use Scale Positive Weight with XGBoost classification, you could convert your 4-class problem into four 2-class (one-vs-rest) problems: train four separate XGBoost models, each classifying one “positive” class against the other three grouped together as the complementary “negative” class. In that case, you would need to set Scale Positive Weight specifically for each of the four models. In the end, you could compare the P(class = positive) returned by every model to assign each sample its final predicted class among your 4 classes.
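The one-vs-rest scheme above can be sketched in a few lines of Python. To keep the example self-contained, `LogisticRegression` with an explicit positive-class weight stands in for an XGBoost model with `scale_pos_weight` (same idea: weight ≈ n_negative / n_positive per binary model); swap in `xgboost.XGBClassifier` if you have it installed. The data is again synthetic and assumed.

```python
# Hedged sketch of the one-vs-rest decomposition described above.
# LogisticRegression with a per-class weight is a stand-in for
# XGBoost's scale_pos_weight; the dataset is a synthetic assumption.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(
    n_samples=2000, n_classes=4, n_informative=6,
    weights=[0.7, 0.1, 0.1, 0.1], random_state=0,
)

models = {}
for cls in np.unique(y):
    y_bin = (y == cls).astype(int)                 # this class vs. the other 3
    w = (y_bin == 0).sum() / (y_bin == 1).sum()    # per-model positive weight
    models[cls] = LogisticRegression(class_weight={0: 1.0, 1: w}, max_iter=1000)
    models[cls].fit(X, y_bin)

# Final label = the class whose model returns the highest P(positive).
classes = np.array(sorted(models))
proba = np.column_stack([models[c].predict_proba(X)[:, 1] for c in classes])
pred = classes[proba.argmax(axis=1)]
```

Note that the four probabilities are not calibrated against each other, so if you need well-calibrated scores you may want to normalize them (or calibrate each model) before taking the argmax.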