Are you on Windows? I have been dealing with the same issues, and several things have helped. In 00-pyspark-setup.py, change this block (point the path at your own Spark folder):
# Configure the environment
import os

if 'SPARK_HOME' not in os.environ:
    print 'SPARK_HOME is not set in the environment'
    os.environ['SPARK_HOME'] = 'C:/spark-1.4.1-bin-hadoop2.6'
You have probably already added SPARK_HOME as an environment variable; if not, this sets it manually.
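For reference, here is a minimal sketch of what the whole 00-pyspark-setup.py can look like on Windows. The py4j zip name is an assumption that matches Spark 1.4.1; check your spark/python/lib folder for the exact version your release ships with.

import os
import sys

# Set SPARK_HOME manually if it is not already in the environment
if 'SPARK_HOME' not in os.environ:
    os.environ['SPARK_HOME'] = 'C:/spark-1.4.1-bin-hadoop2.6'
spark_home = os.environ['SPARK_HOME']

# Make Spark's Python API and the bundled py4j importable
sys.path.insert(0, os.path.join(spark_home, 'python'))
sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))

# Run the shell bootstrap so a SparkContext (sc) is created for you
execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))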
The next thing I noticed is that if you are using IPython 4 (the latest), the profile configuration files are not picked up the way all the tutorials show. You can check whether your configuration files are actually being loaded by adding a print statement to them, or by deliberately breaking them so that an error is raised.
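For example, assuming a profile created with ipython profile create pyspark, a one-line print at the top of the startup script tells you immediately whether it runs:

# first line of ~/.ipython/profile_pyspark/startup/00-pyspark-setup.py
print 'pyspark profile: startup script loaded'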
I am using an older version of IPython (3) and I start it with:
ipython notebook --profile=pyspark
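If you are stuck on IPython 4 / Jupyter, the notebook ignores --profile, so the usual workaround is to let the pyspark launcher start the notebook itself. A sketch for the Windows command prompt, assuming the same Spark path as above and that jupyter is on your PATH:

set PYSPARK_DRIVER_PYTHON=jupyter
set PYSPARK_DRIVER_PYTHON_OPTS=notebook
C:\spark-1.4.1-bin-hadoop2.6\bin\pyspark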