DB Connector (SQL Server) to DB to Spark "integrated security" issue

#1

Hi All,

From what I’ve learned, tables from different databases in KNIME can’t be joined without first pulling them into Spark or something similar. DB Connectors using Azure + Spark work great for me, but there’s an incompatibility when I try to connect Spark to a DB Connector that uses SQL Server:

ERROR DB to Spark Execute failed: Failed to load JDBC data: This driver is not configured for integrated authentication. ClientConnectionId:96f56d7a-9c3e-4727-ba36-d58d4736c7ed

SQL Server’s JDBC parameter integratedSecurity=true seems to be required in my setup, but that appears to be exactly what Spark is incompatible with. Does anyone know of a workaround? I’m on KNIME 4.
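For reference, the parameter ends up in the JDBC URL that the connector builds, roughly like this (server and database names are placeholders, not my real setup):

    jdbc:sqlserver://myhost:1433;databaseName=mydb;integratedSecurity=true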


#2

Hi cscheeser,
native authentication (integrated security) will very likely not work when you are using the DB to Spark node. The reason is that each Spark worker, running on a machine within your big data cluster, opens its own connection to your database using the settings from the DB Connector node in KNIME. For more details, see the documentation of the Spark JDBC data source, which is what we use to load the data from a database into Spark. In your case those settings are the JDBC URL including its parameters (e.g. integratedSecurity=true) but no user name and password, so the workers cannot authenticate against SQL Server. That is why I would suggest using user name and password based authentication in this case, if possible.
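To illustrate, this is roughly what the Spark executors do under the hood, written as a small PySpark sketch (host, database, table, and credentials below are placeholders, not values from your workflow):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("db-to-spark-sketch").getOrCreate()

    # Every executor opens its own JDBC connection using exactly these options.
    # Windows-integrated security cannot work here because the connection is made
    # from the cluster nodes, so explicit SQL Server credentials are needed.
    df = (spark.read
          .format("jdbc")
          .option("url", "jdbc:sqlserver://myhost:1433;databaseName=mydb")
          .option("dbtable", "dbo.my_table")
          .option("user", "my_sql_login")       # SQL Server authentication
          .option("password", "my_password")    # instead of integratedSecurity=true
          .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
          .load())

    df.show(5)

In KNIME that corresponds to switching the connector node to user name and password based authentication.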
Bye
Tobias
