How do I install the Apache Toree Spark kernel for Jupyter in a (ana)conda environment?

I am trying to set up Jupyter support for Spark in a conda environment (created following http://conda.pydata.org/docs/test-drive.html with the Anaconda distribution). I want to use Apache Toree as the Jupyter kernel for this.

Here is what I did after installing Anaconda:

conda create --name jupyter python=3
source activate jupyter
conda install jupyter
pip install --pre toree
jupyter toree install

Everything worked fine until the last command. There I get

PermissionError: [Errno 13] Permission denied: '/usr/local/share/jupyter'

Which begs the question: why is it looking in that directory at all? It should stay inside the environment. So I run

jupyter --paths

and get

config:
    /home/user/.jupyter
    ~/anaconda2/envs/jupyter/etc/jupyter
    /usr/local/etc/jupyter
    /etc/jupyter
data:
    /home/user/.local/share/jupyter
    ~/anaconda2/envs/jupyter/share/jupyter
    /usr/local/share/jupyter
    /usr/share/jupyter
runtime:
    /run/user/1000/jupyter

The config and data paths of the conda environment "jupyter" are listed there. How do I get jupyter toree install to use the environment's data directory (~/anaconda2/envs/jupyter/share/jupyter) instead of /usr/local/share/jupyter?


By default, Jupyter tries to write the kernel spec to a system-wide location. Passing --user installs it into the per-user kernel registry instead (see kernelspec.py):

jupyter toree install --user
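
You can then verify where the kernel spec was written; with --user it should land under ~/.local/share/jupyter/kernels on a typical Linux setup (the kernel directory name, e.g. apache_toree_scala, depends on the Toree version):

$ jupyter kernelspec list
$ ls ~/.local/share/jupyter/kernels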

Running the command with --help shows the relevant option:

$ jupyter toree install --help
A Jupyter kernel for talking to spark

Options
-------

Arguments that take values are actually convenience aliases to full
Configurables, whose aliases are listed on the help line. For more information
on full configurables, see '--help-all'.

--user
    Install to the per-user kernel registry
--replace
    Replace any existing kernel spec with this name.
--sys-prefix
    Install to Python sys.prefix. Useful in conda/virtual environments.
--debug
    set log level to logging.DEBUG (maximize logging output)
--kernel_name= (ToreeInstall.kernel_name)
    Default: 'Apache Toree'
    Install the kernel spec with this name. This is also used as the base of the
    display name in jupyter.
--spark_home= (ToreeInstall.spark_home)
    Default: '/usr/local/spark'
    Specify where the spark files can be found.
--toree_opts= (ToreeInstall.toree_opts)
    Default: ''
    Specify command line arguments for Apache Toree.
--spark_opts= (ToreeInstall.spark_opts)
    Default: ''
    Specify command line arguments to proxy for spark config.
--interpreters= (ToreeInstall.interpreters)
    Default: 'Scala'
    A comma separated list of the interpreters to install. The names of the
    interpreters are case sensitive.
--python_exec= (ToreeInstall.python_exec)
    Default: 'python'
    Specify the python executable. Defaults to "python"
--log-level= (Application.log_level)
    Default: 30
    Choices: (0, 10, 20, 30, 40, 50, 'DEBUG', 'INFO', 'WARN', 'ERROR', 'CRITICAL')
    Set the log level by value or name.
--config= (JupyterApp.config_file)
    Default: ''
    Full path of a config file.

To see all available configurables, use `--help-all`

Examples
--------

    jupyter toree install
    jupyter toree install --spark_home=/spark/home/dir
    jupyter toree install --spark_opts='--master=local[4]'
    jupyter toree install --kernel_name=toree_special
    jupyter toree install --toree_opts='--nosparkcontext'
    jupyter toree install --interpreters=PySpark,SQL
    jupyter toree install --python=python

So jupyter toree install --sys-prefix is what you want for a conda/virtual environment.
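
A minimal end-to-end sketch inside the environment, assuming Spark is unpacked under /opt/spark (adjust --spark_home to wherever your Spark installation actually lives):

source activate jupyter
pip install --pre toree
jupyter toree install --sys-prefix --spark_home=/opt/spark
jupyter kernelspec list    # the Toree kernel should now appear under the env's share/jupyter/kernels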

