Numeric columns in DB to Spark converted to Strings

Hello everyone,
I have a problem with the DB to Spark node.

I write my query and extract some double numeric columns,

but after the DB to Spark node they all become strings.

Why does this happen? Can you help me?

thanks Andrea.

Hi @And,

I guess something goes wrong with the type mapping, and every unknown type gets converted to string as a fallback.

Can you describe your setup? Which KNIME version, database, JDBC driver, and Spark version are you using?

Cheers,
Sascha


Hi Sascha,
Yes, this is my setup:

KNIME 4.6.0
Spark version 2.4
Database: Oracle (Oracle Connector node)
JDBC driver: ojdbc8, version 19.3.0

thanks
Andrea.

Hi @And,

A somewhat tricky problem. Spark converts numerics from an Oracle DB to Spark Decimal types, and KNIME does not support them right now. Inside Spark they are still decimals, but every time the data or schema gets exported to KNIME, they become strings. There is no simple fix for this.

What numeric columns do you have in your Oracle DB? Are they integers?

The following code converts two columns (here named GAE_CODE and RAE_CODE) to integers, using code inside a Spark DataFrame Java Snippet:

// DataTypes comes from org.apache.spark.sql.types (add it to the snippet's imports)
return dataFrame1 //
	.withColumn("GAE_CODE", dataFrame1.col("GAE_CODE").cast(DataTypes.IntegerType)) //
	.withColumn("RAE_CODE", dataFrame1.col("RAE_CODE").cast(DataTypes.IntegerType));

Or to convert all decimal columns to doubles:

// Assumes imports: org.apache.spark.sql.Dataset, org.apache.spark.sql.Row,
// and org.apache.spark.sql.types.*
Dataset<Row> newDataFrame = dataFrame1;
for (StructField f : dataFrame1.schema().fields()) {
    // Matches decimal columns whose precision fits into 32 or 64 bits
    if (DecimalType.is32BitDecimalType(f.dataType()) || DecimalType.is64BitDecimalType(f.dataType())) {
        newDataFrame = newDataFrame.withColumn(f.name(), newDataFrame.col(f.name()).cast(DataTypes.DoubleType));
    }
}

return newDataFrame;

Note that converting decimals to doubles can result in a loss of precision.
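To illustrate that caveat with plain Java (independent of Spark, class and value chosen only for demonstration): a decimal with more significant digits than a double can hold (~15-17) does not survive the round trip.

```java
import java.math.BigDecimal;

public class PrecisionLoss {
    public static void main(String[] args) {
        // 19 significant digits: more than a double can represent exactly
        BigDecimal exact = new BigDecimal("12345678901234567.89");
        double approx = exact.doubleValue();
        // The round trip through double yields a different value
        System.out.println(exact.compareTo(BigDecimal.valueOf(approx)) != 0); // true
    }
}
```

If exact values matter (e.g. money columns), casting to a string or scaling to an integer type avoids this.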

Cheers,
Sascha


Hi Sascha,
Fine, thanks for the answer, then I'll try this fix!

Thanks a million!
Andrea!