Is it possible to add a value to the PYTHONPATH of a Spark worker?
I know that you can go to each worker node, configure spark-env.sh, and do it there, but I want a more flexible approach.
I am trying to use the setExecutorEnv method, but without success:

    from pyspark import SparkConf

    conf = SparkConf().setMaster("spark://192.168.10.11:7077") \
        .setAppName('myname') \
        .set("spark.cassandra.connection.host", "192.168.10.11") \
        .setExecutorEnv('PYTHONPATH', '$PYTHONPATH:/custom_dir_that_I_want_to_append/')
It creates a pythonpath environment variable on each executor, lowercases the name, and does not expand $PYTHONPATH to append my value.
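A partial workaround I can sketch (with the big assumption that the driver's PYTHONPATH matches what the workers need, which is exactly what I am not sure of) is to expand the variable on the driver side, since setExecutorEnv passes the string through literally:

    import os
    from pyspark import SparkConf

    # Workaround sketch: setExecutorEnv does not shell-expand its value,
    # so build the colon-joined string on the driver instead of writing
    # a literal $PYTHONPATH.
    # Assumption: the driver's PYTHONPATH is meaningful for the workers too.
    custom_dir = '/custom_dir_that_I_want_to_append/'
    driver_pp = os.environ.get('PYTHONPATH', '')
    value = driver_pp + ':' + custom_dir if driver_pp else custom_dir

    conf = (SparkConf()
            .setMaster("spark://192.168.10.11:7077")
            .setAppName('myname')
            .setExecutorEnv('PYTHONPATH', value))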
I end up with two different environment variables:

    pythonpath : $PYTHONPATH:/custom_dir_that_I_want_to_append
    PYTHONPATH : /old/path/to_python

The first is created dynamically, and the second already existed before.
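For reference, this is roughly how I check what the executors actually see (a sketch; it assumes a SparkContext named sc is already running):

    # Diagnostic sketch: run a one-partition job and return every env
    # variable whose name is PYTHONPATH in any casing, as seen from
    # inside an executor process.
    def executor_pythonpaths(_):
        import os
        return {k: v for k, v in os.environ.items() if k.lower() == 'pythonpath'}

    print(sc.parallelize([0], numSlices=1).map(executor_pythonpaths).collect())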
Does anyone know how to do this?