I encountered the same problem. After investigating, I found that there is a compatibility requirement between the Spark version and the winutils.exe build from hadoop-2.x.
Matching pairs:

- hadoop-2.7.1 winutils.exe -> spark-2.2.0-bin-hadoop2.7
- hadoop-2.6.0 winutils.exe -> spark-1.6.0-bin-hadoop2.6
Set the following environment variables:

SCALA_HOME : C:\Program Files (x86)\scala2.11.7
JAVA_HOME : C:\Program Files\Java\jdk1.8.0_51
HADOOP_HOME : C:\Hadoop\winutils-master\hadoop-2.7.1
SPARK_HOME : C:\Hadoop\spark-2.2.0-bin-hadoop2.7
PATH : %JAVA_HOME%\bin;%SCALA_HOME%\bin;%HADOOP_HOME%\bin;%SPARK_HOME%\bin;
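If you prefer the command line over the System Properties dialog, a sketch using `setx` (which persists user-level variables; the paths are the ones assumed above) would be:

```shell
:: Sketch: persist the variables for the current user.
:: New values are only visible in consoles opened afterwards.
setx SCALA_HOME "C:\Program Files (x86)\scala2.11.7"
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_51"
setx HADOOP_HOME "C:\Hadoop\winutils-master\hadoop-2.7.1"
setx SPARK_HOME "C:\Hadoop\spark-2.2.0-bin-hadoop2.7"
:: Add the %...%\bin entries to PATH via the dialog rather than setx,
:: since setx would overwrite the existing PATH value.
```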
Next, create the directory C:\tmp\hive and grant it full permissions with winutils:
C:\Hadoop\winutils-master\hadoop-2.7.1\bin>winutils.exe chmod -R 777 C:\tmp\hive
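You can confirm the permissions were applied with winutils' `ls` subcommand (assuming the same winutils.exe build as above):

```shell
:: Verify the permissions on the Hive scratch directory;
:: the listing should show drwxrwxrwx for C:\tmp\hive
C:\Hadoop\winutils-master\hadoop-2.7.1\bin\winutils.exe ls C:\tmp\hive
```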
Then delete the metastore_db directory (the local Derby metastore) if it exists. It is usually created at:
C:\Users\<User_Name>\metastore_db
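From cmd, the stale metastore can be removed like this (a sketch; substitute your actual user name for <User_Name> — spark-shell will recreate the directory on the next start):

```shell
:: Remove the stale Derby metastore directory and its contents
rmdir /s /q C:\Users\<User_Name>\metastore_db
```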
Finally, open a new command prompt and start the Spark shell:
C:\>spark-shell
