I am currently facing a problem deploying an uber JAR for a Spark Streaming application: the same JAR is present in different versions, which causes Spark to throw runtime exceptions. The library in question is Typesafe Config.
After many attempts, my solution was to resort to shading my version of the dependency so that it would not collide with the one provided by Spark at runtime.
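To give an idea of the setup, a build.sbt along these lines reproduces the situation (the Spark artifact and its version are illustrative, not my exact build; the Config version matches the one I try to shade below):

    libraryDependencies ++= Seq(
      // Spark is provided by the cluster at runtime and ships its own, older Typesafe Config
      "org.apache.spark" %% "spark-streaming" % "1.6.0" % "provided",
      // The Typesafe Config version my code compiles against
      "com.typesafe" % "config" % "1.3.0"
    )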
So I went to the sbt-assembly documentation on shading and found the following example:
assemblyShadeRules in assembly := Seq( ShadeRule.rename("org.apache.commons.io.**" -> " shadeio.@1 ") .inLibrary("commons-io" % "commons-io" % "2.4", ...).inProject )
Trying to shade com.typesafe.config, I applied the following rule in my build.sbt:
assemblyShadeRules in assembly := Seq( ShadeRule.rename("com.typesafe.config.**" -> " shadeio.@1 ").inProject )
I assumed this would rename any reference to Typesafe Config in my project. It does not work as expected: the rule matches several classes in my own project and causes them to be omitted from the uber JAR. I see this when running sbt assembly:
    Fully-qualified classname does not match jar entry:
      jar entry: ***/Identifier.class
      class name: **/Identifier.class
    Omitting ***/OtherIdentifier.class.
    Fully-qualified classname does not match jar entry:
      jar entry: ***\SparkBaseJobRunner$$anonfun$1.class
      class name: ***/SparkBaseJobRunner$$anonfun$1.class
I also tried using:
assemblyShadeRules in assembly := Seq( ShadeRule.rename("com.typesafe.config.**" -> " shadeio.@1 ") .inLibrary("com.typesafe" % "config" % "1.3.0")
This completed the uber JAR build process, but did not have the desired runtime effect.
I'm not sure I fully understand what effect shading has on my build process with sbt.
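To spell out what I expect shading to do: if the rename rule were applied, code like the sketch below (the class name and config key are made up for illustration) should end up referencing the shaded package in the uber JAR instead of com.typesafe.config, so the version bundled with Spark would no longer be picked up.

    // What I write in my project sources:
    import com.typesafe.config.{Config, ConfigFactory}

    object AppSettings {
      // Loads application.conf from the classpath
      val conf: Config = ConfigFactory.load()
      // Illustrative key, not from my real configuration
      val checkpointDir: String = conf.getString("spark.checkpoint.dir")
    }

    // After applying "com.typesafe.config.**" -> "shadeio.@1", I expect the bytecode in
    // the uber JAR to reference shadeio.Config / shadeio.ConfigFactory instead.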
How can I shade com.typesafe.config in my project so that, at runtime, Spark loads my shaded version of the library and the version collision is avoided?
I am using sbt-assembly v0.14.1.