Using jars on pyspark (xgboost)

Hi, I am trying to run XGBoost on Spark using the PySpark nodes.

I have my JAR files in my workflow directory, and I also used the export context properties to make sure it reads the JARs from that directory.

It is able to read the Parquet file from the same directory, but when it loads the JARs, it throws the error 'JavaPackage' object is not callable.

Here is the detailed log:
Traceback (most recent call last):
File "C:/Users/mizuakari/AppData/Local/Temp/", line 107, in
predictionCol='prediction', eval_metric = 'auc')
File "C:\Users\mizuakari\Downloads\knime_3.7.2.win32.win32.x86_64\knime_3.7.2\plugins\org.knime.bigdata.spark.local_2.4.1.v201901281147\libs\\", line 110, in wrapper
File "C:\Users\mizuakari\AppData\Local\Temp\knime_localspark_6207716683640157987\spark_local_dir\spark-8b04c409-cb4d-4c93-9586-2be915eab78d\userFiles-9a6dcd79-04ea-46eb-b9f0-4a55e4dcb3b2\\sparkxgb\", line 115, in __init__
File "C:\Users\mizuakari\Downloads\knime_3.7.2.win32.win32.x86_64\knime_3.7.2\plugins\org.knime.bigdata.spark.local_2.4.1.v201901281147\libs\\pyspark\ml\", line 67, in _new_java_obj
TypeError: 'JavaPackage' object is not callable
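For readers hitting the same traceback: this error means py4j could not find the requested class in the JVM behind the Spark context, so it returned a package placeholder instead of a constructor. A minimal, Spark-free sketch of the mechanism (the class stand-in and the XGBoost class path are illustrative, not the real py4j code):

```python
# Stand-in for py4j.java_gateway.JavaPackage: when a JVM class is not
# on the classpath, py4j hands back a dotted-name placeholder like this
# instead of a callable JavaClass wrapper.
class JavaPackage:
    def __init__(self, name):
        self._name = name

    def __getattr__(self, attr):
        # Without the JAR on the classpath, every attribute lookup just
        # yields another JavaPackage -- never a callable class object.
        return JavaPackage(self._name + "." + attr)


jvm = JavaPackage("jvm")
# Mirrors what pyspark's _new_java_obj() does with spark._jvm:
clazz = jvm.ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier
try:
    clazz()  # calling the placeholder fails, just like in the log above
except TypeError as e:
    print(e)  # 'JavaPackage' object is not callable
```

So the traceback points at the JARs not being visible to the Spark JVM, not at a bug in the Python wrapper itself.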

I have also attached a zip file with all the required files.

Looking forward to your assistance.

Best regards,
Mizu~

Hi @Mizunashi92
I think the issue is that the JARs are not present in the current Spark context. Could you try adding them in the Create Local Big Data node, using the custom spark.jars Spark setting?
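For example, the node's custom Spark settings take key/value pairs, and spark.jars accepts a comma-separated list of JAR paths. A sketch (the paths and XGBoost version are placeholders, adjust them to where your JARs actually live):

```
spark.jars: C:/path/to/xgboost4j-0.90.jar,C:/path/to/xgboost4j-spark-0.90.jar
```

With that in place, the JARs are put on the classpath of the JVM that backs the local Spark context, so py4j can resolve the XGBoost classes when the PySpark wrapper instantiates them.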

Best regards,
Mareike

