Hi,
this question is an extension of my previous one: I cannot install numpy on the company Hadoop cluster.
Can I add my Python library by customizing the Livy session (Create Spark Context (Livy) node)?
Can someone help me?
Thanks a lot in advance,
Giorgio
Hi @salvatorigio,
You might be able to provide an archive with the dependencies using `spark.yarn.dist.archives`. I have never tested this; whether it works may depend on your Livy configuration, since the server either has to allow clients to pass this property or needs to have it set in the Livy server config. A blog post about how to create such an archive can be found here: How to Manage Python Dependencies in Spark
Cheers,
Sascha
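To make this concrete, here is a minimal sketch (untested) of what a Livy session-creation payload with such an archive could look like, assuming your Livy server allows clients to pass Spark properties. The HDFS path, the archive file name, and the `environment` alias are placeholders you would replace with your own; in the Create Spark Context (Livy) node the same key/value pairs would go into the custom Spark settings rather than a raw JSON payload.

```python
import json

# Sketch under assumptions: the archive path and the "environment" alias
# are examples, not values from the original posts.
payload = {
    "kind": "pyspark",
    "conf": {
        # HDFS path to a packed Python environment (e.g. built with
        # conda-pack, as in the linked blog post). The "#environment"
        # suffix is the alias under which YARN unpacks the archive
        # inside each container.
        "spark.yarn.dist.archives":
            "hdfs:///user/giorgio/pyspark_env.tar.gz#environment",
        # Point the executors' Python workers at the interpreter
        # shipped inside the unpacked archive.
        "spark.executorEnv.PYSPARK_PYTHON": "./environment/bin/python",
    },
}

# This JSON would be POSTed to the Livy /sessions endpoint.
print(json.dumps(payload, indent=2))
```

The key idea is that `spark.yarn.dist.archives` ships the archive to every YARN container, and `PYSPARK_PYTHON` then selects the bundled interpreter so that numpy and friends resolve from the archive instead of the cluster nodes.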