I developed a web application with Spring Boot that uses Apache Spark to query data from different data sources (e.g. Oracle). Initially I planned to run the application without submitting it via the spark-submit script, but it looks like I cannot connect to the master of the cluster without submitting a jar. I have successfully created the uber jar, which includes all the dependencies and sub-projects that I use, but it seems that Spark does not like Spring Boot applications. When I try to submit the application, Spark shows the following error:
Exception in thread "main" java.lang.IllegalArgumentException: LoggerFactory is not a Logback LoggerContext but Logback is on the classpath. Either remove Logback or the competing implementation (class org.slf4j.impl.Log4jLoggerFactory loaded from file:/home/rojasmi1/spark/spark-1.4.0/assembly/target/scala-2.10/spark-assembly-1.4.0-hadoop2.2.0.jar). If you are using Weblogic you will need to add 'org.slf4j' to prefer-application-packages in WEB-INF/weblogic.xml: Object of class [org.slf4j.impl.Log4jLoggerFactory] must be an instance of class ch.qos.logback.classic.LoggerContext
    at org.springframework.util.Assert.isInstanceOf(Assert.java:339)
    at org.springframework.boot.logging.logback.LogbackLoggingSystem.getLoggerContext(LogbackLoggingSystem.java:151)
    at org.springframework.boot.logging.logback.LogbackLoggingSystem.getLogger(LogbackLoggingSystem.java:143)
    at org.springframework.boot.logging.logback.LogbackLoggingSystem.beforeInitialize(LogbackLoggingSystem.java:89)
    at org.springframework.boot.logging.LoggingApplicationListener.onApplicationStartedEvent(LoggingApplicationListener.java:152)
    at org.springframework.boot.logging.LoggingApplicationListener.onApplicationEvent(LoggingApplicationListener.java:139)
    at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:151)
    at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:128)
    at org.springframework.boot.context.event.EventPublishingRunListener.publishEvent(EventPublishingRunListener.java:100)
    at org.springframework.boot.context.event.EventPublishingRunListener.started(EventPublishingRunListener.java:54)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:277)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:957)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:946)
    at ch.dlx.QubidaOracleConnectorApplication.main(QubidaOracleConnectorApplication.java:12)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
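For context, the entry point named in the trace is just the standard Spring Boot bootstrap. A minimal reconstruction is below; only the package and class name come from the stack trace, the rest is assumed boilerplate:

package ch.dlx;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Minimal reconstruction of the main class named in the stack trace.
// Spring Boot initializes its logging system during SpringApplication.run(),
// and that is where the IllegalArgumentException above is thrown: Logback is
// on the classpath, but the SLF4J binding that actually got loaded is the
// Log4j one bundled in Spark's assembly jar.
@SpringBootApplication
public class QubidaOracleConnectorApplication {

    public static void main(String[] args) {
        SpringApplication.run(QubidaOracleConnectorApplication.class, args);
    }
}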
I tried to exclude the slf4j-log4j12 dependency in the POM file, but I still get the same error, presumably because the Log4j binding is also bundled inside Spark's own assembly jar (as the path in the error message suggests).

The POM file contains the following configuration:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>ch.dlx</groupId>
    <artifactId>qubida-oracle-connector</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>qubida-oracle-connector</name>
    <description></description>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <java.version>1.8</java.version>
    </properties>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-dependencies</artifactId>
                <version>1.2.5.RELEASE</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-tomcat</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

        <!-- Spark -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>1.4.0</version>
            <scope>provided</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>1.4.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongo-hadoop-core</artifactId>
            <version>1.3.0</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <!-- DB Drivers -->
        <dependency>
            <groupId>com.oracle</groupId>
            <artifactId>ojdbc14</artifactId>
            <version>10.2.0.4.0</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <configuration>
                    <createDependencyReducedPom>false</createDependencyReducedPom>
                    <keepDependenciesWithProvidedScope>true</keepDependenciesWithProvidedScope>
                    <artifactSet>
                        <excludes>
                            <exclude>org.slf4j</exclude>
                        </excludes>
                    </artifactSet>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
Is there a way to submit a Spring Boot application to the cluster? Should I use a different type of project, given that I need to expose a RESTful API? Is there a way to connect to a Spark cluster without submitting a .jar?
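Regarding the last question, what I have in mind is roughly the following sketch: creating the context directly from the running web application and shipping the jar myself via SparkConf.setJars. The master URL and jar path are placeholders, not values from my actual setup:

package ch.dlx;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch: connect to a standalone Spark master directly from the running
// web application instead of going through the spark-submit script.
public class DirectSparkConnection {

    public static JavaSparkContext connect() {
        SparkConf conf = new SparkConf()
                .setAppName("qubida-oracle-connector")
                .setMaster("spark://master-host:7077")         // placeholder master URL
                .setJars(new String[] { "/path/to/app.jar" }); // ships the application classes to the executors
        return new JavaSparkContext(conf);
    }
}

Would something like this work from inside a Spring Boot application, or is going through spark-submit unavoidable?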
Thanks in advance for your help.
Tags: java spring-boot jar apache-spark
Stefan s