Why does spark-shell fail with "error: not found: value spark"?

I am using Spark 2.1.0.

When I run spark-shell, I encounter this error:

<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^

What could be the reason? How do I fix it?

+13
7 answers

I ran into the same problem. After investigating, I found that there is a compatibility requirement between the Spark version and the winutils.exe from hadoop-2.x.x.

Use the winutils.exe that matches your Spark build:

hadoop-2.7.1 winutils.exe  for  spark-2.2.0-bin-hadoop2.7
hadoop-2.6.0 winutils.exe  for  spark-1.6.0-bin-hadoop2.6

My environment variables:

SCALA_HOME  : C:\Program Files (x86)\scala2.11.7;
JAVA_HOME   : C:\Program Files\Java\jdk1.8.0_51
HADOOP_HOME : C:\Hadoop\winutils-master\hadoop-2.7.1
SPARK_HOME  : C:\Hadoop\spark-2.2.0-bin-hadoop2.7
PATH        : %JAVA_HOME%\bin;%SCALA_HOME%\bin;%HADOOP_HOME%\bin;%SPARK_HOME%\bin;
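Before launching spark-shell, it is worth confirming that the variables resolve and that winutils.exe is reachable on PATH (echo and where are built-in Windows commands; the paths are the ones listed above):

C:\>echo %HADOOP_HOME%
C:\>where winutils.exe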

Create the C:\tmp\hive directory and open up its permissions with winutils.exe:

C:\Hadoop\winutils-master\hadoop-2.7.1\bin>winutils.exe chmod -R 777 C:\tmp\hive
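You can verify the permissions took effect with winutils' ls subcommand (same binary as above):

C:\Hadoop\winutils-master\hadoop-2.7.1\bin>winutils.exe ls C:\tmp\hive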

Delete the Derby metastore_db directory, if it exists:

C:\Users\<User_Name>\metastore_db
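From a command prompt, that is (rmdir /s /q deletes the whole tree without prompting, so double-check the path and substitute your own account for <User_Name>):

C:\>rmdir /s /q C:\Users\<User_Name>\metastore_db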

Then start the shell again:

C:>spark-shell

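If the shell now starts cleanly, the spark value should be bound. A quick sanity check inside the REPL (standard SparkSession API; exact output depends on your build):

spark.version          // prints the Spark version, e.g. 2.2.0 for this build
spark.range(5).count() // runs a tiny local job to confirm the session works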

+8

The error appears when the Spark session cannot be created for some reason (for example, on Windows, a missing or incompatible winutils.exe, or a locked Derby metastore).

Fix whatever is preventing the session from being created, and the import errors disappear.
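In the meantime, you can usually build the session by hand inside the REPL (a sketch using the standard SparkSession builder API; the app name is arbitrary):

import org.apache.spark.sql.SparkSession

// create the session manually, since the shell failed to bind one
val spark = SparkSession.builder()
  .appName("manual-session")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._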

+2

Since you are running Cloudera, the solution from this GitHub issue worked for me (https://github.com/cloudera/clusterdock/issues/30):

The root user (which the shell runs as when launched) has no user directory in HDFS. If you create one (sudo -u hdfs hdfs dfs -mkdir /user/root, then sudo -u hdfs hdfs dfs -chown root:root /user/root), the error is fixed.

In short: the user running spark-shell needs a home directory in HDFS.
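To confirm the directory is in place and owned correctly (standard HDFS shell):

sudo -u hdfs hdfs dfs -ls /user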

+1

"chmod 777"

C:\tmp\hive

, : -

C:\spark\Hadoop\bin\winutils.exe chmod 777 C:\tmp\hive

http://mytechnologythought.blogspot.com/2017/10/fixed-spark-setup-error-not-found-spark.html

+1

For Ubuntu users:

I had exactly the same error and fixed it in the following way.

If you run spark-shell from a terminal, close the terminal, reopen it, and then restart spark-shell.

0

In my case, on Cloudera, the HIVE service had been removed from the cluster, and spark-shell stopped working. It turned out to be a missing dependency (the removed roles were HIVE METASTORE, HIVESERVER, HOSTS).

Spark depends on the HDFS, YARN and HIVE services being available.

After adding the HIVE service back, everything worked.
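Once the services are back up, a quick round-trip to the metastore from spark-shell confirms the wiring (standard API; "show databases" goes through the Hive metastore and fails if HIVE is still down):

spark.sql("show databases").show()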

0

I had the same error. In my case, the hard drive was almost full. I deleted several large files from the disk, rebooted, and tried again, and it worked. But I do not think this is the cause in every case.

0