PostgreSQL and DB to Spark are not connecting.

Hello, KNIME Support Team.

Is there a node that converts data from a PostgreSQL Connector connection directly to Spark, the way CSV to Spark does?

The DB to Spark node's input port does not connect to the PostgreSQL Connector's output; is there another way?


Your answer would be appreciated.

Hi @JaeHwanChoi,

You can use the DB Table Selector between the Connector and the DB to Spark node.
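
Under the hood, DB to Spark performs roughly a Spark JDBC read of the table chosen in the DB Table Selector. A minimal PySpark sketch of the equivalent operation; the URL, table name, and credentials below are placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Roughly what DB to Spark does: load the selected table over JDBC
# into a Spark DataFrame. All connection values are placeholders.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://<host>:5432/<database>")
    .option("dbtable", "<schema>.<table>")
    .option("user", "<user>")
    .option("password", "<password>")
    .option("driver", "org.postgresql.Driver")
    .load()
)
df.show(5)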

Cheers,
Sascha


Hi @sascha.wolke. Thank you for your response.

If I check "Upload local JDBC driver", I get the following messages for the workflow "(1030)DB_to_Spark_PysparkCustomerChurn 2023-10-30 22.46.19":

Container input (variables) 216:2057 - Warning: default variables will be output.
Container input (variable) 216:2085 - Warning: default variable will be output.
Container input (variables) 216:2086 - Warning: default variables will be output.
DB to Spark 216:3070 - Error: Execution failed: Failed to load JDBC data: Connection attempt failed.

If I leave it unchecked, I get these messages for the same workflow "(1030)DB_to_Spark_PysparkCustomerChurn 2023-10-30 22.46.19":

Container input (variables) 216:2057 - Warning: default variables will be output.
Container input (variable) 216:2085 - Warning: default variable will be output.
Container input (variables) 216:2086 - Warning: default variables will be output.
DB to Spark 216:3070 - Error: Execution failed: Failed to load JDBC data: org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.()

Either way, I end up with the error messages shown above.
I added the postgresql.jar file path to the Livy settings, but the error persists.
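
For reference, what I configured amounts roughly to passing the driver jar when the Livy session is created. A minimal sketch via Livy's REST API, assuming direct API access rather than the KNIME Livy node; the host, port, and jar path are placeholders:

import json
import requests

# Hypothetical Livy endpoint; adjust to the actual cluster.
LIVY_URL = "http://<livy-host>:8998"

# The PostgreSQL driver jar must be visible to the Spark driver and
# the executors, otherwise the JDBC load fails on the cluster side.
payload = {
    "kind": "pyspark",
    "jars": ["hdfs:///jars/postgresql-<version>.jar"],  # placeholder path
}

resp = requests.post(
    LIVY_URL + "/sessions",
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
)
print(resp.status_code, resp.json())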

What more settings do I need to add?

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.