Cannot add Spark to PYTHONPATH

I am trying to add Spark to my PYTHONPATH:

    (myenv) me@me /home/me$ set SPARK_HOME="/home/me/spark-1.2.1-bin-hadoop2.4"
    (myenv) me@me /home/me$ set PYTHONPATH=$PYTHONPATH:$SPARK_HOME:$SPARK_HOME/python:$SPARK_HOME/python/build:$SPARK_HOME/bin
    (myenv) me@me /home/me$ python -c 'import sys; print(sys.path)'
    ['', '/home/me/.virtualenvs/default/lib/python2.7', '/home/me/.virtualenvs/default/lib/python2.7/plat-x86_64-linux-gnu', '/home/me/.virtualenvs/default/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/lib/python2.7/lib-old', '/home/me/.virtualenvs/default/lib/python2.7/lib-dynload', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-x86_64-linux-gnu', '/usr/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/local/lib/python2.7/site-packages', '/home/me/.virtualenvs/default/lib/python2.7/site-packages']
    (myenv) me@me /home/me$ python -c 'import pyspark'
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
    ImportError: No module named pyspark
2 answers

I had the same problem, and this fixed it for me.

Just add the following lines to your .bashrc:

    export SPARK_HOME=/path/to/your/spark-1.4.1-bin-hadoop2.6
    export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH
    export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH
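
After adding those lines, open a new shell or run source ~/.bashrc so they take effect. Note that export, unlike the set used in the question, actually puts the variable into the environment that a child python process sees (in bash, set does not export anything). As a quick sanity check, here is a minimal sketch, assuming the directory layout from the lines above ($SPARK_HOME/python plus a bundled py4j zip), that prints whether the Spark path made it into sys.path and whether pyspark imports:

    # sketch: confirm SPARK_HOME/PYTHONPATH reached this interpreter
    import os
    import sys

    spark_home = os.environ.get("SPARK_HOME")
    if spark_home is None:
        sys.exit("SPARK_HOME is not set -- source ~/.bashrc or open a new shell first")

    spark_python = os.path.join(spark_home, "python")
    print("SPARK_HOME  = %s" % spark_home)
    print("on sys.path = %s" % (spark_python in sys.path))

    try:
        import pyspark
        print("pyspark found at %s" % pyspark.__file__)
    except ImportError as exc:
        print("pyspark still not importable: %s" % exc)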

I think you are confusing PYTHONPATH and sys.path. But are you sure you need to change PYTHONPATH at all if pyspark is installed correctly?
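
To illustrate the difference (a minimal sketch, not specific to Spark): PYTHONPATH is an environment variable that the interpreter reads once at startup and merges into sys.path, while sys.path is just a list inside the running interpreter. Launching a fresh python with a modified environment shows the variable turning into a sys.path entry; the directory name below is made up purely for illustration:

    # sketch: PYTHONPATH entries show up in sys.path of a *new* interpreter
    import os
    import subprocess

    env = dict(os.environ)
    env["PYTHONPATH"] = "/tmp/some/extra/dir"  # hypothetical directory, illustration only

    out = subprocess.check_output(
        ["python", "-c", "import sys; print(sys.path)"],
        env=env,
    )
    print(out.decode())  # '/tmp/some/extra/dir' appears near the front of sys.path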

EDIT:

I have not used pyspark myself, but would this help: importing pyspark in python shell?
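
If that refers to adding Spark's Python sources to sys.path at runtime instead of through PYTHONPATH, a minimal sketch of that approach would look like this; it assumes SPARK_HOME points at an unpacked Spark distribution and that the py4j zip sits under python/lib (its exact name differs between Spark releases):

    # sketch: extend sys.path at runtime, then import pyspark
    import glob
    import os
    import sys

    spark_home = os.environ["SPARK_HOME"]
    sys.path.insert(0, os.path.join(spark_home, "python"))
    for zip_path in glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")):
        sys.path.insert(0, zip_path)

    import pyspark
    print(pyspark.__file__)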

