An IDE for writing and running Hadoop jobs?

I recently started learning the basics of Java Hadoop MapReduce programming. So far the only way I have found to edit code is with vi or emacs, which feels primitive and painful.

Is there an IDE for writing, compiling, and running Hadoop programs?

4 answers

You should also check out IntelliJ IDEA; there are instructions for setting up a development environment on the Apache Hadoop wiki pages.

References:

http://www.jetbrains.com/idea/

http://wiki.apache.org/hadoop/HadoopUnderIDEA


Use Eclipse. Cloudera has a great screencast here on setting up Eclipse for Hadoop development. Debugging Hadoop jobs locally in Eclipse is also quite handy, albeit not trivial; to learn more about this, see here.


Here is a blog post on debugging and developing Hadoop jobs in Eclipse on Linux. Also check out Karmasphere Studio.


Yes, it is possible to run MapReduce jobs in Eclipse, but there are significant differences between how MapReduce runs on a cluster and how it runs in Eclipse. A few notes:

  • In Eclipse, the job runs in a special mode called LocalJobRunner mode, i.e. all of the Hadoop "daemons" run inside a single JVM.
  • In Eclipse, all paths refer to the local file system, not to HDFS.
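The behavior described above corresponds to Hadoop's local mode, which you can also select explicitly in a driver's configuration rather than relying on the IDE. Below is a minimal configuration sketch (the two property keys are the standard Hadoop 2.x names; `WordCountMapper` and `WordCountReducer` are hypothetical placeholders for your own classes, and the input/output paths are illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class LocalDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Run the whole job (mapper, reducer, framework bookkeeping) in this
        // single JVM -- the same LocalJobRunner mode Eclipse uses.
        conf.set("mapreduce.framework.name", "local");
        // Resolve unqualified paths against the local file system, not HDFS.
        conf.set("fs.defaultFS", "file:///");

        Job job = Job.getInstance(conf, "wordcount-local");
        job.setJarByClass(LocalDriver.class);
        // Placeholder classes -- substitute your own mapper and reducer.
        job.setMapperClass(WordCountMapper.class);
        job.setReducerClass(WordCountReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path("input"));    // local directory
        FileOutputFormat.setOutputPath(job, new Path("output")); // must not exist yet
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Because both properties are plain configuration entries, the same effect can be achieved without code changes by putting them in mapred-site.xml / core-site.xml on the classpath, which is convenient when you want one build that runs locally in the IDE and on a cluster unchanged.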
