Spark Java Application Development on Eclipse

I am trying to develop a Spark application in Eclipse. I used to be able to set up Spark Scala projects in Eclipse, and now I would like to configure a Spark Java application there. I assume I will need to use Maven in some way, but I do not know exactly how. Any help would be greatly appreciated. Thanks!

+4
3 answers

I ended up following the instructions for creating a Java Maven project for Spark from here. After that, I converted the Maven project to an Eclipse project by running mvn eclipse:eclipse from the root directory of the project (the one that contains pom.xml).

+3

You can create your project in Eclipse as a Maven project, edit the pom and add the Spark dependencies you need (core, mllib, etc.), then start from the Spark Java examples and refactor from there, depending on the problem you are trying to solve. A minimal sketch of such an application follows. If you have more specific questions, I would be happy to help.
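
For instance, something like this (a sketch only: the class name and sample data are invented for illustration, it uses Java 8 lambda syntax, which Spark's Java API accepts, and local[*] as the master so it runs directly from Eclipse without a cluster):

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SimpleApp {
    public static void main(String[] args) {
        // local[*] runs Spark in-process, so the app can be launched from Eclipse
        SparkConf conf = new SparkConf()
                .setAppName("SimpleApp")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Tiny in-memory dataset instead of a real input file
        JavaRDD<String> lines = sc.parallelize(
                Arrays.asList("spark java eclipse", "spark maven", "hello world"));

        // Count the lines mentioning "spark"
        long matches = lines
                .filter(line -> line.contains("spark"))
                .count();

        System.out.println("Lines containing 'spark': " + matches);
        sc.stop();
    }
}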

0

These are the Spark dependencies from my pom; other modules, such as spark-streaming-kafka for Kafka input, can be added the same way.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
    <scope>provided</scope>
</dependency>
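
Note that the provided scope means these jars are not packaged into your application jar; they are expected at runtime, typically supplied by spark-submit (Eclipse's Maven integration generally keeps provided dependencies on the project classpath, so running from the IDE still works). With the spark-streaming dependency in place, a minimal streaming sketch might look like the following (the class name, host, and port are invented for illustration):

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamingApp {
    public static void main(String[] args) throws InterruptedException {
        // Streaming needs at least 2 local threads: one receiver, one for processing
        SparkConf conf = new SparkConf()
                .setAppName("StreamingApp")
                .setMaster("local[2]");
        JavaStreamingContext jssc =
                new JavaStreamingContext(conf, Durations.seconds(5));

        // Reads text lines from a socket; feed it with e.g. `nc -lk 9999`
        JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);
        lines.print();

        jssc.start();
        jssc.awaitTermination();
    }
}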
0
