I don’t think this is possible with just the node itself. Maybe you could use a PySpark Script Source node to get at the Spark properties that way? But I’ll defer to @sascha.wolke for a more educated answer.
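To illustrate the idea, here is a rough sketch of what a PySpark Script Source body could look like. The variable names are assumptions, not verified against the node's documentation: I'm assuming the node injects a `SparkSession` as `spark` and expects the output DataFrame in `resultDataFrame1`; adjust both to whatever your node version actually uses.

```python
# Hedged sketch of a PySpark Script Source node body (KNIME).
# Assumption: the node provides a SparkSession named `spark` and reads
# the output from a variable named `resultDataFrame1`.

# Collect a few identifying properties of the running Spark context.
sc = spark.sparkContext
props = [
    ("applicationId", sc.applicationId),
    ("appName", sc.appName),
    ("master", sc.master),
]

# Expose them as a small property/value DataFrame; downstream you could
# turn the rows into flow variables (e.g. Spark to Table followed by
# Table Row to Variable) for use in an external termination call.
resultDataFrame1 = spark.createDataFrame(props, ["property", "value"])
```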
I have a situation where I tried to terminate the Spark session with the Destroy Spark Context node, but due to unexpected circumstances the node could not be executed and the workflow was terminated.
Therefore, I want to retrieve the generated session ID and name and force termination externally via the API, which requires the session ID and name values from the Context Settings as shown above.
You mentioned that I can pull that value by creating a DataFrame and connecting it as a flow variable, but that doesn’t work for me. Do you have an example of this?
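As a side note on the external-termination part: if the context was created with the Create Spark Context (Livy) node, a leftover session can usually be killed through Livy's REST API with `DELETE /sessions/{id}`. Below is a minimal stdlib-only sketch; the Livy URL and session ID are placeholders you would fill in from your own context settings.

```python
import urllib.request

def delete_livy_session(livy_url: str, session_id: int) -> urllib.request.Request:
    """Build the DELETE request for Livy's /sessions/{id} endpoint.

    `livy_url` is the Livy server base URL (placeholder, e.g.
    "http://livy-host:8998"); `session_id` is the numeric ID from the
    context settings.
    """
    return urllib.request.Request(
        url=f"{livy_url.rstrip('/')}/sessions/{session_id}",
        method="DELETE",
        headers={"Content-Type": "application/json"},
    )

# To actually terminate the session (needs network access to the Livy host):
# urllib.request.urlopen(delete_livy_session("http://livy-host:8998", 42))
req = delete_livy_session("http://livy-host:8998", 42)
print(req.get_method(), req.full_url)
```

If your Livy server requires authentication (e.g. Kerberos), the request would additionally need the appropriate auth handling.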