incorrect output values using Keras dense layers

Hi, I am new to KNIME. I built a neural network to predict a numeric output value (full precision, one output) from three inputs. When I use Keras Input, Dense, and output layers with ReLU activation and the Adam optimizer, the predicted output values are all zero. However, using the RProp MLP Learner and Multilayer Perceptron Predictor nodes, I was able to train, test, and get the expected results. I would appreciate some tips on resolving the problem with the Keras layers.
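For reference, here is a minimal stand-alone sketch in Python/Keras of the setup described above, outside KNIME. The layer size and the dummy data are assumptions on my part; the output layer is kept linear since the target is a continuous value.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Three numeric inputs, one full-precision regression output.
model = keras.Sequential([
    keras.Input(shape=(3,)),              # 3 input features
    layers.Dense(16, activation="relu"),  # hidden layer (size assumed)
    layers.Dense(1, activation="linear")  # linear output for regression
])

model.compile(optimizer="adam", loss="mse")

# Dummy data just to confirm the model trains and produces non-zero predictions.
X = np.random.rand(100, 3)
y = X.sum(axis=1)
model.fit(X, y, epochs=20, verbose=0)
print(model.predict(X[:5]))
```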
Thanks.

Hi @Sumans_knime and welcome to the forum.

Any chance you could upload your workflow with some dummy data so the folks here could help troubleshoot more directly?


Hi Scott,

Thanks for your response. Attached are the workflow and the training and testing data. Also included is the output from training with the Tanh activation function.
TM_Try04_keras 1.knwf (35.3 KB)
Training data - Updated03_A_Two_ShapesTogether.csv (174.7 KB)
Testing data - For Test_A_Updated03.csv (14.3 KB)

Output obtained from Keras learner node training - out.xlsx (57.5 KB)

Thanks.
Suman

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.