Solutions to "Just KNIME It!" Challenge 23

This thread is for posting solutions to “Just KNIME It!” Challenge 23, which starts a series of four challenges on data classification. :brain:

Here is the challenge: Just KNIME It! | KNIME

Feel free to link your solution from KNIME Hub as well!

Have an idea for a challenge? We’d love to hear it! :heart_eyes: Tell us all about it here.

1 Like

Hi,

Here is my solution for the Just KNIME It! 23 challenge.

3 Likes

Hi! I did basically the same, but used the Rule Engine node to make it easier to see which rows represent churn and which do not.
I also added a Model Writer node to make deployment easier.

3 Likes

Here is my solution for Challenge 23
Total nodes used: 5
Accuracy: 93%

Posted this also on my blog and LinkedIn along with other KNIME challenges and personal projects of mine
Blog: Just KNIME It! – Challenge 23: Modeling Churn Predictions – Part 1 – My DAN
LinkedIn: My DAN on LinkedIn: Just KNIME It! Challenge 23: Modeling Churn Predictions - Part 1

#justknimeit


4 Likes

This is my submission for the Churn prediction challenge.

I added an ROC curve, as @MarioNasser did before. I wrote my cross-validated model to a file and used it on the test data to compare the non-cross-validated ROC against the cross-validated one. I got a marginally better result.
Notes:
The Simple Regression Tree node did not work, and I don’t understand why.
I would have liked to see the results from multiple scorers side by side, but I don’t know how.
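For readers who want a code analogue of the comparison above, here is a rough scikit-learn sketch: AUC from a single train/test split versus mean AUC over 10-fold cross-validation. The synthetic dataset, model choice, and hyperparameters are stand-ins, not the actual challenge data or KNIME settings.

```python
# Hedged sketch: single-split AUC vs. cross-validated AUC,
# mirroring the ROC comparison described in the post above.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the churn data (the real challenge uses a CSV).
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = DecisionTreeClassifier(max_depth=4, random_state=42)
model.fit(X_train, y_train)

# AUC on a single held-out split.
single_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# Mean AUC over 10 folds, playing the role of the X-Partitioner loop.
cv_auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc").mean()

print(f"single split AUC: {single_auc:.3f}")
print(f"10-fold CV AUC:   {cv_auc:.3f}")
```

Whether cross-validation looks marginally better or worse will depend on the split; the point is only that both numbers come from the same model family.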

3 Likes

Hi everyone,
Here is my solution.
Accuracy:

  • 93.3% with pruning
  • 91.2% without pruning
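As a hedged sketch of the pruned-vs-unpruned comparison, here is the same experiment in scikit-learn. Note the assumption: KNIME’s Decision Tree Learner offers MDL pruning, while scikit-learn exposes cost-complexity pruning via `ccp_alpha`, so this stands in for, rather than reproduces, the KNIME setting.

```python
# Hedged sketch: accuracy of a pruned vs. an unpruned decision tree
# on synthetic data (the real challenge uses the churn CSV).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

unpruned = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
# ccp_alpha > 0 enables cost-complexity pruning (a stand-in for MDL pruning).
pruned = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_tr, y_tr)

print(f"unpruned accuracy: {unpruned.score(X_te, y_te):.3f}")
print(f"pruned accuracy:   {pruned.score(X_te, y_te):.3f}")
print(f"tree sizes: {unpruned.tree_.node_count} vs {pruned.tree_.node_count} nodes")
```

The pruned tree is always at most as large as the unpruned one; whether it also scores better depends on how much the full tree overfits.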
2 Likes

My best result is 94.003%. Challenge accepted!

KnimeIT_23

3 Likes

Hello KNIMErs, Here is my solution for Challenge 23

2 Likes

My submission for Challenge 23

2 Likes

Hi,

Here’s my solution.
I did two versions of the challenge: (1) a simple prediction model as described in the challenge, and (2) an advanced one using the AutoML component to benchmark several models and pick the one with the highest accuracy. VERY COOL.

Summary:
Decision Tree: 93% accuracy
AutoML (Gradient Boosted Trees): 95%
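The AutoML idea above can be sketched by hand in scikit-learn: fit a few candidate models and keep whichever scores best on held-out data. The candidate set, data, and seeds here are illustrative assumptions, not what the AutoML component actually searches.

```python
# Hedged sketch: a minimal "pick the best model" loop, standing in for
# the AutoML benchmark described in the post above.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

candidates = {
    "Decision Tree": DecisionTreeClassifier(random_state=1),
    "Random Forest": RandomForestClassifier(random_state=1),
    "Gradient Boosted Trees": GradientBoostingClassifier(random_state=1),
}

# Fit each candidate and record its test accuracy.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in candidates.items()}
best = max(scores, key=scores.get)
print(f"best model: {best} ({scores[best]:.3f})")
```

Real AutoML also tunes hyperparameters and uses cross-validation rather than a single split, so treat this as the skeleton of the idea only.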

/cheers

6 Likes

Hello,

here is my solution for this challenge:

Have a nice weekend,
RB

2 Likes

Hi, here is my solution.

1 Like

3 Likes

Here’s my solution, along with parameter settings and accuracy statistics.




REF Challenge 23.knwf (453.3 KB)

1 Like

Here is my solution: jKi-23 – KNIME Hub

1 Like

knime://My-KNIME-Hub/Users/jefleisc/Public/jefleisc-knime_challenge-23

1 Like

Hello dear KNIME users!

Since this topic is about classification with a strong hint about the use of tree methods, I decided to compare three tree-based approaches (I didn’t use XGBoost since some users already tested it):

  • Decision Tree,
  • Gradient Boosted Trees,
  • Random Forest.

I tried to put the emphasis on model understanding/explainability, which is why I added a decision tree view for the Decision Tree and the “Global Feature Importance” component for the Random Forest. I haven’t seen any view for Gradient Boosted Trees, so I’m open to suggestions :slight_smile:

The goal for the team might be good churn prediction performance, but also an understanding of, and insights into, the model. For example, in the random forest “State” is the most important feature. That may lead the team to get to know their customers better and to adapt their strategy in different states…
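For reference, here is a rough scikit-learn analogue of the feature-importance view described above. The feature names are hypothetical stand-ins for the churn columns, and impurity-based importances are only one of several importance measures the KNIME component can compute.

```python
# Hedged sketch: global feature importance from a random forest,
# mirroring the "Global Feature Importance" component mentioned above.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Hypothetical feature names standing in for the churn columns.
names = ["State", "Account Length", "Intl Plan", "Day Mins", "Custserv Calls"]
X, y = make_classification(n_samples=500, n_features=5, random_state=7)

forest = RandomForestClassifier(n_estimators=200, random_state=7).fit(X, y)

# Impurity-based importances, sorted most important first.
ranked = sorted(zip(names, forest.feature_importances_), key=lambda t: -t[1])
for name, imp in ranked:
    print(f"{name:15s} {imp:.3f}")
```

Impurity-based importances can be biased toward high-cardinality features, so permutation importance is worth checking before drawing conclusions like the “State” insight above.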

Here is my workflow: KNIME_Challenge-23 – KNIME Hub

1 Like

Wow! Thanks for going the extra mile! Very cool.

1 Like

Hi guys!
This is my take on this. I chose another model in order to show a quick comparison between them. Hope you find this useful.

Have a nice Sunday!

2 Likes

Great build Martin :ok_hand: :ok_hand:

1 Like