When I run the spark-shell command, the following error appears:
I downloaded the spark-2.1.1-bin-hadoop2.7.tgz file from http://spark.apache.org/downloads.html, extracted the archive, and copied its contents into the C:\Spark directory. After that I set the environment variables for Spark and the JDK, but I still get this error. Any help would be appreciated.
I am pretty sure that your environment variable JAVA_HOME contains a space, and that is what breaks spark-shell. Please reinstall Java into a directory with no spaces in the path.
The culprit is the following check in bin/spark-class2.cmd, which spark-shell on Windows runs under the covers (through bin/spark-submit2.cmd):

if "x%1"=="x" (

When JAVA_HOME contains a space, spark-class2.cmd receives only the tail of the path after the space as %1, so the check expands to something like:

if "x"Files\Java\jdk1.8.0_45""=="x" (

which is invalid batch syntax and makes the script fail.

In other words, JAVA_HOME must not contain any spaces. Reinstall Java into a path without spaces (for example C:\Java) and point JAVA_HOME there.
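As a quick check (a minimal sketch; C:\Java\jdk1.8.0_161 is just a placeholder for wherever you reinstall Java), you can confirm the space and override JAVA_HOME for the current console before retrying:

:: Show the current value; a space in it is what trips up spark-class2.cmd
echo %JAVA_HOME%

:: Point JAVA_HOME at a space-free install, for this console session only
set JAVA_HOME=C:\Java\jdk1.8.0_161
set PATH=%JAVA_HOME%\bin;%PATH%

:: Retry from the Spark install directory mentioned in the question
C:\Spark\bin\spark-shell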
If you installed Java on Windows and your JAVA_HOME looks like

JAVA_HOME=C:\Program Files (x86)\Java\jdk1.8.0_162\bin

then the space in "Program Files (x86)" is what breaks spark-shell (I hit this on Windows 10 with "C:\Program Files (x86)\Java\jdk1.8.0_162\bin"). Install Java somewhere outside Program Files (x86) and set, for example:

JAVA_HOME=C:\java\jdk1.8.0_171\bin
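To make a value like that stick across new console windows, it can also be set persistently; this is a sketch (the JDK folder is only an example, and note that Spark's Windows launch scripts append \bin\java to JAVA_HOME themselves, so they expect it to point at the JDK root):

:: Persist JAVA_HOME for the current user; takes effect in newly opened consoles
setx JAVA_HOME "C:\java\jdk1.8.0_171"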
By default the Java installer puts everything under C:\Program Files\Java... If JAVA_HOME points there, spark-shell fails because of the space in the path. Reinstall Java into a directory like C:\Java... instead of "C:\Program Files\Java..", update JAVA_HOME to the new location, and spark-shell starts normally. That is what fixed it for me.
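After changing JAVA_HOME, a quick way to verify that a space-free java will actually be picked up (the paths involved are the examples from above):

:: Confirm the java binary resolves under the new JAVA_HOME
"%JAVA_HOME%\bin\java" -version

:: And check which java is found first on PATH
where java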
Set the java path in spark-env.sh to one that contains no "spaces":
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
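On Windows the analogous place would be a conf\spark-env.cmd file, which Spark's launch scripts call if it exists; a minimal sketch, assuming the C:\Spark layout from the question and an example JDK path:

:: C:\Spark\conf\spark-env.cmd
set JAVA_HOME=C:\Java\jdk1.8.0_161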
I fixed this error by using the 8.3 short name "Progra~1" as the abbreviation for "Program Files (x86)":
C:\Progra~1\Java\jdk1.8.0_161
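If you are not sure what the 8.3 short name is on your machine, you can list short names with dir /x (the JDK folder below is just an example):

:: /x shows each folder's 8.3 short name next to its long name
dir /x C:\

:: Then build a JAVA_HOME with no spaces out of the short names
set JAVA_HOME=C:\Progra~1\Java\jdk1.8.0_161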