I am trying to add Spark to my Python path:
```
(myenv) me@me /home/me$ set SPARK_HOME="/home/me/spark-1.2.1-bin-hadoop2.4"
(myenv) me@me /home/me$ set PYTHONPATH=$PYTHONPATH:$SPARK_HOME:$SPARK_HOME/python:$SPARK_HOME/python/build:$SPARK_HOME/bin
(myenv) me@me /home/me$ python -c 'import sys; print(sys.path)'
['', '/home/me/.virtualenvs/default/lib/python2.7', '/home/me/.virtualenvs/default/lib/python2.7/plat-x86_64-linux-gnu', '/home/me/.virtualenvs/default/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/lib/python2.7/lib-old', '/home/me/.virtualenvs/default/lib/python2.7/lib-dynload', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-x86_64-linux-gnu', '/usr/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/local/lib/python2.7/site-packages', '/home/me/.virtualenvs/default/lib/python2.7/site-packages']
(myenv) me@me /home/me$ python -c 'import pyspark'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ImportError: No module named pyspark
```
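For context, here is a minimal sketch of what these commands presumably intend, assuming a bash-compatible shell: in bash, `set NAME=value` does not assign the variable at all, and only `export`ed variables are inherited by child processes such as the `python` interpreter. The `$SPARK_HOME/bin` entry is also moved to `PATH`, since it holds executables rather than Python modules.

```bash
# Sketch of the presumed intent (bash assumed, not the original `set` syntax).
# `export` makes the variables visible to the python child process.
export SPARK_HOME="/home/me/spark-1.2.1-bin-hadoop2.4"
export PYTHONPATH="$PYTHONPATH:$SPARK_HOME:$SPARK_HOME/python:$SPARK_HOME/python/build"
export PATH="$PATH:$SPARK_HOME/bin"
```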