Spark: attempt to start spark-shell, but get ""cmd" is not recognized as an internal or external command"

I am trying to install Spark on a Windows desktop. Everything should work fine, but when I start spark-shell I get the error: ""cmd" is not recognized as an internal or external command ..."

I installed Scala and the Java JDK, and unpacked the Spark tgz in C:\, but for some reason I cannot start Spark in cmd. Any ideas?
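
For context, this is roughly the failing invocation (the unpack directory name is borrowed from a later answer, so treat it as an assumption for your setup):

    cd C:\spark-2.1.1-bin-hadoop2.7\bin
    spark-shell.cmd
    rem => "cmd" is not recognized as an internal or external command,
    rem    operable program or batch file.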

+5
6 answers

My colleague solved the problem. Although Java seemed to be working fine (link), the Java path that Spark tried to read was incorrect, with an extra \bin at the end. When that was removed, Spark started working! @gonbe, thank you so much for your efforts to help!
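
In other words, assuming the path Spark reads is JAVA_HOME, the fix looks like this (the JDK location is a hypothetical example):

    rem Broken: the value points one level too deep
    set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_131\bin

    rem Fixed: point at the JDK root; Spark appends \bin itself
    set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_131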

+8

(I'm not a Windows Spark user.) Judging from the source code, spark-shell.cmd for Windows expects the cmd command to be available on the PATH.

https://github.com/apache/spark/blob/master/bin/spark-shell.cmd
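
The relevant line re-launches the shell's real work in a fresh cmd process, which is why cmd must be resolvable. As of Spark 2.x it looks roughly like this (it may differ in your version):

    rem bin\spark-shell.cmd delegates to spark-shell2.cmd via a new cmd shell
    cmd /V /E /C "%~dp0spark-shell2.cmd" %*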

Try adding the directory containing cmd.exe to the PATH environment variable? The directory location is shown in the title bar of your screenshot, and the environment variable can be set via the Control Panel.
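
A minimal sketch of that fix, assuming cmd.exe is in the standard C:\Windows\System32 location (use the Control Panel to make the change permanent):

    rem Check whether cmd.exe is currently resolvable
    where cmd

    rem Add the standard system directory to PATH for this session
    set PATH=%PATH%;C:\Windows\System32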

+3

I had a similar error. I fixed it by making the following changes:

  • There were several Java\bin entries in the system PATH, so I trimmed them down to the single Java\bin that matches JAVA_HOME.
  • Added C:\Windows\System32 to the system PATH variable.
  • My JAVA_HOME and java.exe pointed to different places; I fixed them to agree.

Now it works.
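
In cmd terms, those checks and fixes look roughly like this (session-only; make permanent changes via the Control Panel):

    rem Do JAVA_HOME and the java.exe that PATH resolves actually agree?
    echo %JAVA_HOME%
    where java

    rem If not, put %JAVA_HOME%\bin and System32 first for this session
    set PATH=%JAVA_HOME%\bin;C:\Windows\System32;%PATH%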

Thanks guys.

+3

All my variables were fine, so I decided to debug the scripts, and I found the fix in spark-class2.cmd: adding an extra pair of quotes around "%RUNNER%". BEFORE: "%RUNNER%" -Xmx128m -cp "%LAUNCH_CLASSPATH%" .... AFTER: ""%RUNNER%"" -Xmx128m -cp "%LAUNCH_CLASSPATH%" ....
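
For reference, a sketch of the full launcher line in bin\spark-class2.cmd before and after the change (taken from Spark 2.x sources; the exact line may differ in your version):

    rem BEFORE (original line):
    "%RUNNER%" -Xmx128m -cp "%LAUNCH_CLASSPATH%" org.apache.spark.launcher.Main %* > %LAUNCHER_OUTPUT%

    rem AFTER (extra pair of quotes around %RUNNER%):
    ""%RUNNER%"" -Xmx128m -cp "%LAUNCH_CLASSPATH%" org.apache.spark.launcher.Main %* > %LAUNCHER_OUTPUT%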

+1

Check the value of JAVA_HOME and make sure it points to the correct location. Add %JAVA_HOME%\bin to the PATH value. After the change, close the command prompt and reopen it. Run spark-shell and it will work.
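
A sketch of those steps from the command line (the JDK path is a hypothetical example; note that setx truncates values over 1024 characters, so the Control Panel is safer for long PATHs):

    rem Point JAVA_HOME at the JDK root, for this session and future prompts
    set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_131"
    setx JAVA_HOME "%JAVA_HOME%"

    rem Append the JDK bin directory to the user PATH
    setx PATH "%PATH%;%JAVA_HOME%\bin"

    rem Close this prompt, open a new one, then:
    spark-shell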

+1

In my case, I had a similar problem. I had to fix a couple of things.

1 - Check that JAVA_HOME is correct in both places:

(screenshots)

2 - Then I had to change the following lines in the spark-2.1.1-bin-hadoop2.7\bin folder:


  • Add extra quotes around "%RUNNER%", so that it looks like ""%RUNNER%"" (the same fix as in the answer above).
  • Then run .\spark-shell.cmd again.


0
