How do I start the Spark shell on Windows 8.1?

I keep trying to get the Spark shell to work to no avail.

OS: Windows 8.1

Spark: 1.3.1

Java: 8

I have downloaded both the pre-built and the source distributions (the source version I built with Maven and with sbt, the simple build tool). I have tried to solve the problem in three different ways, all without success.

1) From my Spark directory, I try to start the shell with spark-shell.cmd or .\bin\spark-shell.cmd.

Every time, I get this error:

'C:\Program' is not recognized as an internal or external command, operable program or batch file.

Recognizing this as a likely whitespace problem, I tried variants of the command with quotes, full paths, and so on. No results yet.
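For example, variants along these lines (the exact paths are only illustrative):

.\bin\spark-shell.cmd
"C:\spark-1.3.1-bin-hadoop2.6\bin\spark-shell.cmd"
cd C:\spark-1.3.1-bin-hadoop2.6 && bin\spark-shell.cmd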

2) I moved Spark to the root of C:\ (C:\spark-1.3.1-bin-hadoop2.6) so that the path contains no spaces.

Now, when I try to start the shell, I get:

find: 'version': No such file or directory

3) I also tried going through Scala, following some screencasts for Spark and Scala. I installed Scala (2.11.6), but that did not get Spark working either.

In short, I cannot get the Spark shell to start by any of these routes.


In bin\spark-class2.cmd, find the line

set RUNNER="%JAVA_HOME%\bin\java"

and remove the quotes, so it becomes:

set RUNNER=%JAVA_HOME%\bin\java
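After saving the change, a quick check (assuming Spark is unpacked at C:\spark-1.3.1-bin-hadoop2.6, as in the question):

cd C:\spark-1.3.1-bin-hadoop2.6
bin\spark-shell.cmd

If the double quoting was the culprit, the shell should now get past the 'C:\Program' error and reach the scala> prompt.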
  • Move the Spark directory to the root of C:\ (for example C:\spark-1.6.0-bin-hadoop2.6).

  • Make sure the Windows find.exe is on your PATH (see the sketch below).
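A minimal sketch of both steps, assuming the standard location of the Windows find.exe:

rem make sure the Windows find.exe wins over any Unix-style find (Cygwin, Git, GnuWin32) on the PATH
set PATH=C:\Windows\System32;%PATH%
cd C:\spark-1.6.0-bin-hadoop2.6
bin\spark-shell.cmd

The find: 'version' error in the question is what GNU find prints, which suggests a Unix-style find is being picked up ahead of the Windows one.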


The 'C:\Program' is not recognized error comes from the space in C:\Program Files; use the short name C:\Progra~1 instead of C:\Program Files wherever that path appears.
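For example, assuming a JDK 8 installed under Program Files (the exact JDK folder name is only an illustration):

rem 8.3 short name, so JAVA_HOME contains no space
set JAVA_HOME=C:\Progra~1\Java\jdk1.8.0_45
bin\spark-shell.cmd

C:\Progra~1 is the 8.3 short name for C:\Program Files, so the value of JAVA_HOME no longer contains the space that breaks the unquoted path in the scripts.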


Install Spark and Scala directly under C:\, and install Java under C:\ as well, so that none of the paths contain spaces (see the sketch below).
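A sketch of that layout, assuming Java and Scala were reinstalled into folders without spaces (the folder names below are illustrative):

rem everything lives in space-free paths directly under C:\
set JAVA_HOME=C:\Java\jdk1.8.0_45
set SCALA_HOME=C:\scala
cd C:\spark-1.3.1-bin-hadoop2.6
bin\spark-shell.cmd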


I also could not get Spark started on Windows 8.1 at first. I was working through an "Apache Spark Scala" tutorial (found through Safari) that was written against version 2.1.0, while I had 2.1.1. What finally worked for me was downloading 2.1.1 itself and running its .cmd scripts, using the package from https://spark.apache.org/downloads.html.

