I keep trying to get the Spark shell to work to no avail.
OS: Windows 8.1
Spark: 1.3.1
Java: 8
I have both the pre-built and the source packages downloaded (the source version built with Maven and with sbt, the simple build tool). I tried to solve my problem in three different ways, all to no avail.
1) From my Spark directory, I try to start the shell with spark-shell.cmd or .\bin\spark-shell.cmd.
I consistently get this error:
'C:\Program' is not recognized as an internal or external command, operable program or batch file.
Suspecting a whitespace problem in the path, I tried variants of the command with quotes, full paths, and so on. No results yet.
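For what it's worth, here is a small illustration (run under a Unix shell, and only my assumption about the cause) of why cmd.exe would complain about 'C:\Program': an unquoted path containing a space gets split at that space, so only the text before it is looked up as an executable.

```shell
# Assumption: somewhere a path such as "C:\Program Files\Java\..." is
# expanded without quotes, and cmd.exe splits it at the first space.
# Simulating that split with shell suffix removal:
path='C:\Program Files\Java\jdk1.8.0\bin\java.exe'
echo "${path%% *}"    # the token cmd.exe would try to run
# -> C:\Program
```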
2) I then moved Spark to a directory whose path contains no spaces (C:\spark-1.3.1-bin-hadoop2.6).
Running the same commands there, I get a different error:
find: 'version': No such file or directory.
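My guess (an assumption on my part, not something I have confirmed in the scripts) is that Spark's Windows launch scripts parse the Java version with the Windows find filter, along the lines of java -version 2>&1 | find "version". If a Unix-style find (e.g. from Git or GnuWin32) shadows C:\Windows\System32\find.exe on PATH, it treats "version" as a directory to search and fails exactly like this:

```shell
# GNU find treats "version" as a starting-point directory, not as a
# text filter like the Windows find, so it fails when no such
# directory exists (LC_ALL=C for a stable message):
LC_ALL=C find "version" 2>&1 || true
# -> find: 'version': No such file or directory
```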
3) Finally, I tried building a standalone Spark Scala project (following some screencasts). I also installed Scala (2.11.6) separately, but that did not help either.
Whatever I do with Spark, nothing works, with the bundled or the standalone Scala.
Any help would be appreciated.