I would like to launch a Spark shell with an external package from behind a corporate proxy. Unfortunately, external packages passed via the --packages option are not resolved.
For example, when starting

bin/spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10

the Cassandra connector package is never resolved (the process hangs on the last line):
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
datastax
After some time the connection times out and the output contains error messages similar to this:
:::: ERRORS
Server access error at url https:
When I disconnect the VPN with the corporate proxy, the package is resolved and downloaded immediately.
What I have tried so far:
Exporting the proxy as environment variables:
export http_proxy=<proxyHost>:<proxyPort>
export https_proxy=<proxyHost>:<proxyPort>
export JAVA_OPTS="-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>"
export ANT_OPTS="-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>"
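Note that the JAVA_OPTS and ANT_OPTS lines above only set the plain-HTTP proxy properties, while the failing URL in the error output is an https one. For completeness, a variant that also sets the standard HTTPS system properties (same <proxyHost>/<proxyPort> placeholders) would be:

export JAVA_OPTS="-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>"
export ANT_OPTS="-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>"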
Starting the Spark shell with additional Java options:
bin/spark-shell \
  --conf "spark.driver.extraJavaOptions=-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>" \
  --conf "spark.executor.extraJavaOptions=-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>" \
  --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10
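One variant I have not tried yet: since in client mode the --packages resolution may run in the launcher JVM before spark.driver.extraJavaOptions takes effect, handing the proxy properties directly to the launcher could behave differently. A sketch (assuming the launcher scripts honor SPARK_SUBMIT_OPTS, and including the HTTPS properties since the failing URL is https):

export SPARK_SUBMIT_OPTS="-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>"
bin/spark-shell --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10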
Is there any other configuration option that I am missing?