Spark: Exception in thread "main" java.lang.VerifyError: class com.fasterxml.jackson.module.scala.ser.ScalaIteratorSerializer

I'm trying to run a simple Spark parallelize call on some data inside a large project, but even with the simplest example I get this error:

Exception in thread "main" java.lang.VerifyError: class com.fasterxml.jackson.module.scala.ser.ScalaIteratorSerializer overrides final method withResolved.(Lcom/fasterxml/jackson/databind/BeanProperty;Lcom/fasterxml/jackson/databind/jsontype/TypeSerializer;Lcom/fasterxml/jackson/databind/JsonSerializer;)Lcom/fasterxml/jackson/databind/ser/std/AsArraySerializerBase; 

The error occurs with any parallelize call, even one as trivial as this one. I don't even know where this error is coming from:

    val conf: SparkConf = new SparkConf().setAppName("IEEG Spark").setMaster("local")
    val sc: SparkContext = new SparkContext(conf)
    val data = Array(1, 2, 3, 4, 5)
    val distVals = sc.parallelize(data)
    distVals.foreach(println)

and here is my Maven pom.xml file:

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
      <groupId>astiefel</groupId>
      <artifactId>ieeg-spark</artifactId>
      <modelVersion>4.0.0</modelVersion>
      <name>Spark IEEG</name>
      <parent>
        <groupId>edu.upenn.cis.ieeg</groupId>
        <artifactId>ieeg</artifactId>
        <version>1.15-SNAPSHOT</version>
      </parent>
      <properties>
        <scala.version>2.10.4</scala.version>
      </properties>
      <dependencies>
        <dependency>
          <groupId>edu.upenn.cis.ieeg</groupId>
          <artifactId>ieeg-client</artifactId>
        </dependency>
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.10</artifactId>
          <version>1.5.0</version>
        </dependency>
        <dependency>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-compiler</artifactId>
          <version>${scala.version}</version>
          <scope>compile</scope>
        </dependency>
        <dependency>
          <groupId>org.scalanlp</groupId>
          <artifactId>breeze_2.10</artifactId>
          <version>0.10</version>
        </dependency>
      </dependencies>
      <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
          <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <executions>
              <execution>
                <goals>
                  <goal>compile</goal>
                  <goal>testCompile</goal>
                </goals>
              </execution>
            </executions>
            <configuration>
              <scalaVersion>${scala.version}</scalaVersion>
              <args>
                <arg>-target:jvm-1.5</arg>
              </args>
            </configuration>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-eclipse-plugin</artifactId>
            <configuration>
              <downloadSources>true</downloadSources>
              <buildcommands>
                <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
              </buildcommands>
              <additionalProjectnatures>
                <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
              </additionalProjectnatures>
              <classpathContainers>
                <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
                <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
              </classpathContainers>
            </configuration>
          </plugin>
        </plugins>
      </build>
      <reporting>
        <plugins>
          <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <configuration>
              <scalaVersion>${scala.version}</scalaVersion>
            </configuration>
          </plugin>
        </plugins>
      </reporting>
    </project>

I'm lost with this error; any tips on where to even start?

1 answer

You can run a simple Spark parallelize test with fewer dependencies by using the spark-assembly-xxx.jar from your Spark installation directly as a system-scoped jar dependency:

    <profile>
      <id>apache-spark/scala</id>
      <dependencies>
        <!-- Apache Spark -->
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-assembly</artifactId>
          <version>1.5.2</version>
          <scope>system</scope>
          <systemPath>${spark.home}/lib/spark-assembly-1.5.2-hadoop2.6.0.jar</systemPath>
        </dependency>
        <dependency>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
          <version>2.10.2</version>
        </dependency>
      </dependencies>
    </profile>
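As background: a VerifyError complaining that ScalaIteratorSerializer "overrides final method withResolved" typically means two incompatible Jackson versions are on the classpath at once — jackson-module-scala from one release compiled against a jackson-databind from another. Since this pom inherits from a parent (and pulls in ieeg-client), it is plausible, though not certain, that one of those brings in a conflicting jackson-databind. A sketch of one possible fix, assuming that diagnosis holds, is to pin Jackson to a single consistent version in dependencyManagement (2.4.4 is the version Spark 1.5.x was built against; verify what is actually on your classpath with `mvn dependency:tree` before copying these numbers):

    <dependencyManagement>
      <dependencies>
        <!-- Force one consistent Jackson version across all transitive
             dependencies; 2.4.4 matches what spark-core_2.10 1.5.0 expects -->
        <dependency>
          <groupId>com.fasterxml.jackson.core</groupId>
          <artifactId>jackson-databind</artifactId>
          <version>2.4.4</version>
        </dependency>
        <dependency>
          <groupId>com.fasterxml.jackson.module</groupId>
          <artifactId>jackson-module-scala_2.10</artifactId>
          <version>2.4.4</version>
        </dependency>
      </dependencies>
    </dependencyManagement>

Because dependencyManagement overrides versions for transitive dependencies as well, this avoids having to add per-dependency exclusions.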
