Is it possible to use json4s 3.2.11 with Spark 1.3.0?

Spark depends on json4s 3.2.10, but that version has a few bugs and I need 3.2.11. I added a json4s-native 3.2.11 dependency to build.sbt and everything compiled fine. But when I spark-submit my JAR, it gives me 3.2.10.

build.sbt

    import sbt.Keys._

    name := "sparkapp"

    version := "1.0"

    scalaVersion := "2.10.4"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"

    libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"

plugins.sbt

    logLevel := Level.Warn

    resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

App1.scala

    import org.apache.spark.rdd.RDD
    import org.apache.spark.{Logging, SparkConf, SparkContext}
    import org.apache.spark.SparkContext._

    object App1 extends Logging {
      def main(args: Array[String]) = {
        val conf = new SparkConf().setAppName("App1")
        val sc = new SparkContext(conf)
        // Print the json4s version actually loaded at runtime
        println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
      }
    }

sbt 0.13.7 + sbt-assembly 0.13.0, Scala 2.10.4

Is there a way to force it to use version 3.2.11?

+5
3 answers

We ran into a problem similar to the one Necro describes, but downgrading from 3.2.11 to 3.2.10 when building the assembly did not resolve it. We solved it (using Spark 1.3.1) by shading the 3.2.11 version inside the job jar:

    assemblyShadeRules in assembly := Seq(
      // Rewrite org.json4s classes in the assembly to a shaded package name
      ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
    )
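Shading works here because sbt-assembly rewrites the bytecode of the classes bundled into the assembly, so application code ends up calling shaded.json4s.* while Spark continues to load its own org.json4s 3.2.10; the two versions no longer collide on the classpath. Note that, if I remember correctly, assemblyShadeRules requires sbt-assembly 0.14.0 or later, which is newer than the 0.13.0 in the question's plugins.sbt.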
+4

I asked the same question on the Spark User mailing list and got two answers on how to make it work:

  • Use spark.driver.userClassPathFirst=true and spark.executor.userClassPathFirst=true. This only works in Spark 1.3, and it will probably require other modifications, such as excluding the Scala classes from your assembly (see the sketch after this list).

  • Rebuild Spark with json4s 3.2.11 (you can change the version in core/pom.xml).
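
For the first option, here is a minimal sketch of setting those flags in code; the two properties exist in Spark 1.3 but are marked experimental, and they can equally be passed to spark-submit with --conf:

    import org.apache.spark.SparkConf

    // Prefer classes from the application jar over Spark's bundled copies,
    // so the 3.2.11 json4s classes win over Spark's 3.2.10
    val conf = new SparkConf()
      .setAppName("App1")
      .set("spark.driver.userClassPathFirst", "true")
      .set("spark.executor.userClassPathFirst", "true")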

Both options work; I prefer the second one.

+3

This is not an answer to your question, but it came up while I was searching for my own problem. I was getting a NoSuchMethodError on formats.emptyValueStrategy.replaceEmpty(value) in json4s's 'render'. The reason was that I was building against 3.2.11 while Spark links 3.2.10. I downgraded to 3.2.10 and the problem disappeared. Your question helped me understand what was going on (that Spark links an incompatible version of json4s), and I was able to work around it, so thanks.
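
If you fix it the same way, one way to make the downgrade explicit in sbt 0.13 is dependencyOverrides. A sketch, assuming json4s is also being pulled into your build transitively by other libraries (dependencyOverrides only affects transitive versions):

    // Force transitive org.json4s dependencies to 3.2.10,
    // the version bundled with Spark 1.3.0
    dependencyOverrides += "org.json4s" %% "json4s-native" % "3.2.10"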

+2

Source: https://habr.com/ru/post/1215931/

