Failed to submit Spring Boot Java application to Spark cluster

I developed a web application with Spring Boot that uses Apache Spark to query data from different data sources (e.g. Oracle). At first I planned to launch the application without the spark-submit script, but it looks like I cannot connect to the master without submitting a jar to the cluster. I have successfully created an uber jar that includes all the dependencies and sub-projects I use, but it seems that Spark does not like Spring Boot applications. When I try to submit the application, Spark shows the following error:

    Exception in thread "main" java.lang.IllegalArgumentException: LoggerFactory is not a Logback LoggerContext but Logback is on the classpath. Either remove Logback or the competing implementation (class org.slf4j.impl.Log4jLoggerFactory loaded from file:/home/rojasmi1/spark/spark-1.4.0/assembly/target/scala-2.10/spark-assembly-1.4.0-hadoop2.2.0.jar). If you are using Weblogic you will need to add 'org.slf4j' to prefer-application-packages in WEB-INF/weblogic.xml Object of class [org.slf4j.impl.Log4jLoggerFactory] must be an instance of class ch.qos.logback.classic.LoggerContext
        at org.springframework.util.Assert.isInstanceOf(Assert.java:339)
        at org.springframework.boot.logging.logback.LogbackLoggingSystem.getLoggerContext(LogbackLoggingSystem.java:151)
        at org.springframework.boot.logging.logback.LogbackLoggingSystem.getLogger(LogbackLoggingSystem.java:143)
        at org.springframework.boot.logging.logback.LogbackLoggingSystem.beforeInitialize(LogbackLoggingSystem.java:89)
        at org.springframework.boot.logging.LoggingApplicationListener.onApplicationStartedEvent(LoggingApplicationListener.java:152)
        at org.springframework.boot.logging.LoggingApplicationListener.onApplicationEvent(LoggingApplicationListener.java:139)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:151)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:128)
        at org.springframework.boot.context.event.EventPublishingRunListener.publishEvent(EventPublishingRunListener.java:100)
        at org.springframework.boot.context.event.EventPublishingRunListener.started(EventPublishingRunListener.java:54)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:277)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:957)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:946)
        at ch.dlx.QubidaOracleConnectorApplication.main(QubidaOracleConnectorApplication.java:12)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

I tried to exclude the slf4j-log4j12 dependency in the pom file, but I still get the same error.
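For reference, slf4j-log4j12 can be pulled in transitively by more than one Spark artifact, so an exclusion on spark-core alone may not be enough; a sketch of applying the same exclusion to spark-sql as well (matching the 1.4.0 coordinates used below). Note that a pom exclusion only affects what goes into your own jar; it cannot remove the Log4J binding that the Spark assembly jar itself puts on the classpath at submit time:

     <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-sql_2.11</artifactId>
         <version>1.4.0</version>
         <scope>provided</scope>
         <exclusions>
             <exclusion>
                 <groupId>org.slf4j</groupId>
                 <artifactId>slf4j-log4j12</artifactId>
             </exclusion>
         </exclusions>
     </dependency>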

The pom file contains the following configuration:

     <?xml version="1.0" encoding="UTF-8"?>
     <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
         <modelVersion>4.0.0</modelVersion>
         <groupId>ch.dlx</groupId>
         <artifactId>qubida-oracle-connector</artifactId>
         <version>0.0.1-SNAPSHOT</version>
         <name>qubida-oracle-connector</name>
         <description></description>

         <properties>
             <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
             <java.version>1.8</java.version>
         </properties>

         <dependencyManagement>
             <dependencies>
                 <dependency>
                     <groupId>org.springframework.boot</groupId>
                     <artifactId>spring-boot-dependencies</artifactId>
                     <version>1.2.5.RELEASE</version>
                     <type>pom</type>
                     <scope>import</scope>
                 </dependency>
             </dependencies>
         </dependencyManagement>

         <dependencies>
             <dependency>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-starter-web</artifactId>
                 <exclusions>
                     <exclusion>
                         <groupId>org.slf4j</groupId>
                         <artifactId>log4j-over-slf4j</artifactId>
                     </exclusion>
                 </exclusions>
             </dependency>
             <dependency>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-starter-tomcat</artifactId>
             </dependency>
             <dependency>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-starter-test</artifactId>
                 <scope>test</scope>
             </dependency>

             <!-- Spark -->
             <dependency>
                 <groupId>org.apache.spark</groupId>
                 <artifactId>spark-core_2.11</artifactId>
                 <version>1.4.0</version>
                 <scope>provided</scope>
                 <exclusions>
                     <exclusion>
                         <groupId>org.slf4j</groupId>
                         <artifactId>slf4j-log4j12</artifactId>
                     </exclusion>
                 </exclusions>
             </dependency>
             <dependency>
                 <groupId>org.apache.spark</groupId>
                 <artifactId>spark-sql_2.11</artifactId>
                 <version>1.4.0</version>
                 <scope>provided</scope>
             </dependency>
             <dependency>
                 <groupId>org.mongodb</groupId>
                 <artifactId>mongo-hadoop-core</artifactId>
                 <version>1.3.0</version>
                 <exclusions>
                     <exclusion>
                         <groupId>org.slf4j</groupId>
                         <artifactId>log4j-over-slf4j</artifactId>
                     </exclusion>
                 </exclusions>
             </dependency>

             <!-- DB Drivers -->
             <dependency>
                 <groupId>com.oracle</groupId>
                 <artifactId>ojdbc14</artifactId>
                 <version>10.2.0.4.0</version>
             </dependency>
         </dependencies>

         <build>
             <plugins>
                 <plugin>
                     <groupId>org.apache.maven.plugins</groupId>
                     <artifactId>maven-shade-plugin</artifactId>
                     <configuration>
                         <createDependencyReducedPom>false</createDependencyReducedPom>
                         <keepDependenciesWithProvidedScope>true</keepDependenciesWithProvidedScope>
                         <artifactSet>
                             <excludes>
                                 <exclude>org.slf4j</exclude>
                             </excludes>
                         </artifactSet>
                     </configuration>
                     <executions>
                         <execution>
                             <phase>package</phase>
                             <goals>
                                 <goal>shade</goal>
                             </goals>
                         </execution>
                     </executions>
                 </plugin>
             </plugins>
         </build>
     </project>

Is there a way to submit a Spring Boot application to the cluster? Should I use a different type of project, given that I need to expose a RESTful API? Is there a way to connect to a Spark cluster without submitting a jar?

Thanks in advance for your help.

+8
java spring-boot jar apache-spark
3 answers

I had a similar problem. To solve it, try removing Spring Boot's logging with the following exclusion:

     <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-web</artifactId>
         <exclusions>
             <exclusion>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-starter-logging</artifactId>
             </exclusion>
         </exclusions>
     </dependency>

If you still get an error while initializing the servlet,

    java.lang.NoSuchMethodError: javax.servlet.ServletContext.getVirtualServerName()Ljava/lang/String;

then try using version 1.2.1.RELEASE of the spring-boot-starter-parent, as this error is caused by the servlet-api version used by the Spark cluster.
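Since the pom in the question imports spring-boot-dependencies rather than using the starter parent, a sketch of the equivalent downgrade in that style (assuming the same dependencyManagement block as above, with only the version changed):

     <dependencyManagement>
         <dependencies>
             <dependency>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-dependencies</artifactId>
                 <version>1.2.1.RELEASE</version>
                 <type>pom</type>
                 <scope>import</scope>
             </dependency>
         </dependencies>
     </dependencyManagement>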

+4

At build time, Spring Boot checks whether you have included a specific logging implementation in your build, and if you have not, it uses Logback by default. Apparently, Spark adds Log4J to the classpath when the application starts, which in turn causes a runtime error because Spring Boot now finds two logging implementations on the classpath: the one it included at build time (Logback) and the one Spark adds at runtime (Log4J).

If Spark provides a way to suppress the inclusion of Log4J at runtime, you could do that and just let Spring Boot fall back to Logback by default.

If Spark forces Log4J on you, then the solution would be to explicitly include Log4J (not Logback) in your build so that Spring Boot "sees" it at build time and thus does not include Logback.

EDIT: I should have tested my assumption by checking the Spring Boot docs. You must also explicitly exclude Logback. See the Spring Boot logging documentation.
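Putting both halves of that advice together, a sketch of what the pom changes could look like on Boot 1.2.x: exclude the default logging starter and bring in the Log4J starter instead (spring-boot-starter-log4j existed in the 1.2.x line; treat the exact artifact as something to verify against the docs for your Boot version):

     <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-web</artifactId>
         <exclusions>
             <exclusion>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-starter-logging</artifactId>
             </exclusion>
         </exclusions>
     </dependency>
     <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-log4j</artifactId>
     </dependency>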

+1

Spark only supports Log4J. To force Spring Boot to use Log4J instead of Logback by default, follow the procedure from the Spring Boot reference documentation, but be sure to change log4j2 to log4j and give it a version, e.g. 1.2.17. You will also need to put a log4j.properties file in src/main/resources. You can copy the log4j.properties.template file from Spark's conf directory and rename it to log4j.properties.
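For illustration, a minimal log4j.properties along the lines of what Spark's log4j.properties.template starts with (console appender on stderr; adjust levels and pattern to taste):

     # Log everything at INFO to the console
     log4j.rootCategory=INFO, console
     log4j.appender.console=org.apache.log4j.ConsoleAppender
     log4j.appender.console.target=System.err
     log4j.appender.console.layout=org.apache.log4j.PatternLayout
     log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n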

0
