Spark SQL 1.5 build error

I installed Spark 1.5 on Ubuntu 14.04 LTS. When I run the build with build/mvn -Dscala-2.11 -DskipTests clean package, I get the following error while compiling the Spark SQL project:

    [error] missing or invalid dependency detected while loading class file 'WebUI.class'.
    [error] Could not access term eclipse in package org,
    [error] because it (or its dependencies) are missing. Check your build definition for
    [error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
    [error] A full rebuild may help if 'WebUI.class' was compiled against an incompatible version of org.
    [error] missing or invalid dependency detected while loading class file 'WebUI.class'.
    [error] Could not access term jetty in value org.eclipse,
    [error] because it (or its dependencies) are missing. Check your build definition for
    [error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
    [error] A full rebuild may help if 'WebUI.class' was compiled against an incompatible version of org.eclipse.
    [warn] 22 warnings found
    [error] two errors found
    [error] Compile failed at Sep 18, 2015 6:09:38 PM [17.330s]
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Spark Project Parent POM ........................... SUCCESS [  6.723 s]
    [INFO] Spark Project Core ................................. SUCCESS [03:07 min]
    ...
    [INFO] Spark Project Catalyst ............................. SUCCESS [ 58.166 s]
    [INFO] Spark Project SQL .................................. FAILURE [ 19.912 s]
    [INFO] Spark Project Hive ................................. SKIPPED
    [INFO] Spark Project Unsafe ............................... SKIPPED
    ...
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------

Below are my environment variables from .bashrc:

    export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64
    export SCALA_HOME=/usr/local/src/scala/scala-2.11.7
    export PATH=$SCALA_HOME/bin:$PATH
    export PATH=/home/ubuntu/apache-maven-3.3.3/bin:$PATH
    export SPARK_HOME=/home/ubuntu/spark-1.5.0
    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
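
For reference, the toolchain the build will pick up can be sanity-checked like this (a minimal sketch; the expected versions are just what the paths above suggest):

    java -version      # expect OpenJDK 1.7.0
    scala -version     # expect Scala 2.11.7
    mvn -version       # expect Apache Maven 3.3.3
    echo $MAVEN_OPTS   # confirm the memory settings are exported in this shell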

Update: I tried re-running with -Ylog-classpath as the error message suggests, but Maven did not accept it:

 Unable to parse command line options: Unrecognized option: -Ylog-classpath 
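
-Ylog-classpath is a scalac flag rather than a Maven option, which is why mvn rejects it. If the build goes through the scala-maven-plugin, one way that may work is forwarding the flag via the plugin's addScalacArgs user property; this property name is an assumption from the plugin's documentation, not something taken from the Spark docs:

    # Forward the scalac flag through scala-maven-plugin (addScalacArgs assumed to be supported)
    build/mvn -Dscala-2.11 -DskipTests -DaddScalacArgs=-Ylog-classpath clean package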
+7
maven build apache-spark apache-spark-sql
5 answers

Per Angelo Genovese's comment, do not include -Dscala-2.11 in the build command.
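
In other words, the command from the question would become simply (same invocation, just without the Scala 2.11 profile flag):

    build/mvn -DskipTests clean package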

+1

Just run ./dev/change-scala-version.sh 2.11 from your Spark directory to switch the whole build to Scala 2.11. Then run mvn (3.3.3+) or make-distribution.sh with the flags you were already using.
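
For example, the full sequence might look like this (a sketch that combines the script above with the build command from the question):

    cd $SPARK_HOME
    ./dev/change-scala-version.sh 2.11               # rewrite the POMs for Scala 2.11
    build/mvn -Dscala-2.11 -DskipTests clean package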

+5

If you don't need spark-sql specifically, simply exclude the SQL-related modules from the build:

mvn clean package -Dscala-2.11 -DskipTests -pl '!sql/core,!sql/catalyst,!sql/hive'
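
Note that excluding modules with a leading `!` in `-pl` needs a reasonably recent Maven (3.2.1 or newer, as far as I know). If you need the SQL modules later, you should be able to build just them plus whatever they depend on using `-am` (also-make); the module list here simply mirrors the command above:

    # Build only the SQL modules and their required dependencies
    build/mvn -DskipTests -pl sql/catalyst,sql/core,sql/hive -am clean package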

+1

I also encountered this problem in a project that I imported into IntelliJ from a Maven pom.xml. My co-worker helped me understand that although <scope>runtime</scope> is fine for most dependencies, this particular dependency needs <scope>compile</scope> (for reasons we don't fully understand):

    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-reflect</artifactId>
      <version>${scala.version}</version>
      <scope>compile</scope>
    </dependency>
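
If you want to see which scala-reflect version and scope Maven actually resolves, the dependency tree can be filtered for it (standard maven-dependency-plugin usage; nothing Spark-specific here):

    # Show where scala-reflect comes from and with what scope
    mvn dependency:tree -Dincludes=org.scala-lang:scala-reflect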
0

This build problem can be solved by first switching the Scala version from 2.10 to 2.11 using the change-scala-version.sh script located at spark-1.6.1/dev/change-scala-version.sh, i.e. by running ./dev/change-scala-version.sh 2.11.

See the link below for more details. http://gibbons.org.uk/spark-on-windows-feb-2016
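
A quick way to confirm the script really switched the build (assuming the artifactIds carry the Scala suffix, as they do in the 1.x POMs):

    cd spark-1.6.1
    ./dev/change-scala-version.sh 2.11
    # Sanity check: the POMs should now reference the 2.11 artifacts
    grep -m1 "<artifactId>spark-parent" pom.xml   # expect spark-parent_2.11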

-1
