PySpark problem

When I try using the PySpark Script nodes with the "Create Local Big Data Environment" node, it returns:

Traceback (most recent call last):
  File "C:/Users/FABIEN~1/AppData/Local/Temp/pythonScript_fb21a911_61b2_40cf_98ee_484e55357a1416476276819499058754.py", line 3, in <module>
    from pyspark.mllib.common import py2java, java2py
  File "F:\knime_4.4.1\plugins\org.knime.bigdata.spark.local_4.5.2.v202203041212\libs\pyspark.zip\pyspark\mllib\__init__.py", line 28, in <module>
  File "F:\anaconda3\envs\py3_knime\lib\site-packages\numpy\__init__.py", line 140, in <module>
    from . import _distributor_init
  File "F:\anaconda3\envs\py3_knime\lib\site-packages\numpy\_distributor_init.py", line 34, in <module>
    from . import _mklinit
ImportError: DLL load failed: Le module spécifié est introuvable. (The specified module could not be found.)

Does anybody have a hint about it?

Hi @Fabien_Couprie,

This sounds like a numpy problem with your Anaconda installation. Does numpy work in the Python nodes and outside of KNIME?
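One quick way to check (a minimal sketch, assuming the environment from your traceback is the py3_knime conda env) is to run a short script with that environment's Python interpreter, outside of KNIME:

```python
# check_numpy.py - run with the py3_knime interpreter, e.g.
#   F:\anaconda3\envs\py3_knime\python.exe check_numpy.py
# (interpreter path taken from the traceback; adjust if yours differs)
import numpy

print("numpy version:", numpy.__version__)  # fails here if the MKL DLLs cannot be loaded
print("loaded from:", numpy.__file__)
```

If this already fails with the same "DLL load failed" error, the problem is in the conda environment itself rather than in the KNIME/PySpark integration.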

Not sure how you installed numpy in Anaconda, but this might help: ImportError: DLL load failed for numpy 1.16.1 · Issue #12957 · numpy/numpy · GitHub
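If reinstalling numpy through conda does not help, a common workaround for that kind of issue (a sketch only, assuming the MKL DLLs simply are not on PATH when the PySpark script starts) is to put the environment's Library\bin folder, which holds mkl_rt.dll, on PATH before numpy is imported:

```python
# Hypothetical workaround: make the conda env's Library\bin visible to the DLL loader
# before numpy is imported. The env path below comes from the traceback; adjust as needed.
import os

mkl_dir = r"F:\anaconda3\envs\py3_knime\Library\bin"
if os.path.isdir(mkl_dir):
    os.environ["PATH"] = mkl_dir + os.pathsep + os.environ.get("PATH", "")

import numpy  # should now find the MKL runtime DLLs
print(numpy.__version__)
```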

Cheers
Sascha


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.