How do I add the Spark framework to a Maven project in Eclipse?

I would like to start a Spark project in Eclipse using Maven. I installed m2eclipse and I have a working HelloWorld Java application in my Maven project.

I would like to use the Spark framework, and I followed the instructions on the official site. I added the Spark repository to my pom.xml:

    <repository>
        <id>Spark repository</id>
        <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
    </repository>

And then the dependency:

    <dependency>
        <groupId>spark</groupId>
        <artifactId>spark</artifactId>
        <version>0.9.9.4-SNAPSHOT</version>
    </dependency>

But I get an error in Eclipse:

 Missing artifact spark:spark:jar:0.9.9.4-SNAPSHOT 

How can I solve this problem? I do not want to download the Spark jar file and put it in a local repository.

This is my pom.xml file:

    <project xmlns="http://maven.apache.org/POM/4.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.myproject</groupId>
        <artifactId>Spark1</artifactId>
        <version>1.0-SNAPSHOT</version>
        <packaging>jar</packaging>
        <name>Spark1</name>
        <url>http://maven.apache.org</url>
        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        </properties>
        <repository>
            <id>Spark repository</id>
            <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
        <dependencies>
            <!-- (...) -->
            <dependency>
                <groupId>spark</groupId>
                <artifactId>spark</artifactId>
                <version>0.9.9.4-SNAPSHOT</version>
            </dependency>
        </dependencies>
    </project>
8 answers

The repository block must be enclosed in the repositories block:

    <repositories>
        <repository>
            <id>Spark repository</id>
            <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
    </repositories>
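For reference, here is a minimal sketch of how the asker's pom.xml would look with the repository correctly wrapped (all coordinates are taken from the question itself):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.myproject</groupId>
    <artifactId>Spark1</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <!-- repository is now nested inside repositories -->
    <repositories>
        <repository>
            <id>Spark repository</id>
            <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>spark</groupId>
            <artifactId>spark</artifactId>
            <version>0.9.9.4-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>
```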

These days, no Spark repository entry is required to download the Spark library.

You just need to add

    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.6.0</version>
    </dependency>

That's it.

There are useful tutorials to play with here.


The reason for the failure is that 0.9.9.4-SNAPSHOT is not available. Below is a list of the available snapshots; use one of them based on your requirements.

0.9.8-SNAPSHOT / Sat May 21 21:54:23 UTC 2011
0.9.9-SNAPSHOT / Mon May 23 10:57:38 UTC 2011
0.9.9.1-SNAPSHOT / Thu May 26 09:47:03 UTC 2011
0.9.9.3-SNAPSHOT / Thu Sep 01 07:53:59 UTC 2011

Thanks Sankara Reddy
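As a sketch, switching the question's pom.xml to one of the snapshots listed above (0.9.9.3-SNAPSHOT is used here purely as an example) would look like this:

```xml
<repositories>
    <repository>
        <id>Spark repository</id>
        <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>spark</groupId>
        <artifactId>spark</artifactId>
        <!-- pinned to a snapshot that actually exists in the repository -->
        <version>0.9.9.3-SNAPSHOT</version>
    </dependency>
</dependencies>
```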


I had the same problem, because initially I started with a different repository URL for Spark, and then changed the repository URL to use an earlier version. The change did not seem to take effect until I also changed the repository id, so try changing the repository id.
This may be a bug in Maven, because running Maven from the console also could not resolve the dependency until I updated the id.


Add the repository tag inside a repositories tag, as below:

    <repositories>
        <repository>
            <id>Spark repository</id>
            <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
    </repositories>

In the latest versions (2.1 and later), Spark only requires the dependency defined inside the pom.xml file:

    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.1</version>
    </dependency>

The repository definition is no longer required.


Use this latest artifact: http://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10/1.6.0

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.0</version>
    </dependency>

Use this, and also make sure you add the Spark library built for Scala 2.11.x to the Eclipse project's build path.

    <project xmlns="http://maven.apache.org/POM/4.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.spark-scala</groupId>
        <artifactId>spark-scala</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <name>${project.artifactId}</name>
        <description>Spark in Scala</description>
        <inceptionYear>2010</inceptionYear>

        <properties>
            <maven.compiler.source>1.8</maven.compiler.source>
            <maven.compiler.target>1.8</maven.compiler.target>
            <encoding>UTF-8</encoding>
            <scala.tools.version>2.10</scala.tools.version>
            <!-- Put the Scala version of the cluster -->
            <scala.version>2.10.4</scala.version>
        </properties>

        <!-- repository to add org.apache.spark -->
        <repositories>
            <repository>
                <id>cloudera-repo-releases</id>
                <url>https://repository.cloudera.com/artifactory/repo/</url>
            </repository>
        </repositories>

        <build>
            <sourceDirectory>src/main/scala</sourceDirectory>
            <testSourceDirectory>src/test/scala</testSourceDirectory>
            <plugins>
                <plugin>
                    <!-- see http://davidb.github.com/scala-maven-plugin -->
                    <groupId>net.alchim31.maven</groupId>
                    <artifactId>scala-maven-plugin</artifactId>
                    <version>3.2.1</version>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-surefire-plugin</artifactId>
                    <version>2.13</version>
                    <configuration>
                        <useFile>false</useFile>
                        <disableXmlReport>true</disableXmlReport>
                        <includes>
                            <include>**/*Test.*</include>
                            <include>**/*Suite.*</include>
                        </includes>
                    </configuration>
                </plugin>
                <!-- "package" command plugin -->
                <plugin>
                    <artifactId>maven-assembly-plugin</artifactId>
                    <version>2.4.1</version>
                    <configuration>
                        <descriptorRefs>
                            <descriptorRef>jar-with-dependencies</descriptorRef>
                        </descriptorRefs>
                    </configuration>
                    <executions>
                        <execution>
                            <id>make-assembly</id>
                            <phase>package</phase>
                            <goals>
                                <goal>single</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
                <plugin>
                    <groupId>org.scala-tools</groupId>
                    <artifactId>maven-scala-plugin</artifactId>
                </plugin>
            </plugins>
        </build>

        <dependencies>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>1.2.1</version>
            </dependency>
        </dependencies>
    </project>
