How can I run a MapReduce job remotely?

I ran into some problems when I tried to submit a MapReduce job (the WordCount example) remotely. After searching Google I still can't achieve my goal, and I have found very few topics about submitting MapReduce jobs remotely. The problems are the following:

  • First I encountered a permission problem:

    SEVERE: PriviledgedActionException as:[user] cause:org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=[user], access=WRITE, inode="mapred":root:supergroup:rwxr-xr-x 

    It seems that permission checking forbids writing to that HDFS path. I turned the check off by setting dfs.permissions = false. Is there any other way to overcome this problem while still keeping permission checking enabled?

  • Then I got an exception saying that the MapReduce job jar cannot be accessed:

     SEVERE: PriviledgedActionException as:[User] cause:java.io.FileNotFoundException: File /home/hduser/WordCount-1.0.jar does not exist. 

    My code is:

     JobConf conf = new JobConf(WordCount.class);
     conf.setJobName("wordcount");
     conf.set("fs.default.name", "hdfs://10.97.1.91:54310");
     conf.set("mapred.job.tracker", "10.97.1.91:54311");
     conf.setJar("/home/hduser/WordCount-1.0.jar");
     ...
     JobClient.runJob(conf);

    It seems that the jar file is being looked up on the local file system and cannot be found. How can I solve this problem? I found a link on Stack Overflow saying that specifying the location of the jar file is not required. But how does the name node find the jar file without a specified location?

  • I have seen very few tutorials on the Internet that submit a job remotely like this. Does that suggest it is not the recommended way to do it?
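Regarding the second problem's final question: the name node never reads the jar from the client's disk. When `new JobConf(WordCount.class)` is used without `setJar`, the Hadoop client itself scans its classpath for the resource containing that class, and during submission copies the jar into HDFS's staging area. So the path given to `setJar` must exist on the machine running `JobClient`, not on the cluster. Below is a plain-JDK sketch of that lookup idea (the class name `JarLocator` is illustrative, not a Hadoop API):

```java
// Sketch of how a Hadoop-style client can locate the jar that contains a
// given class, without being told its path explicitly.
public class JarLocator {

    // Find the classpath entry (jar file or directory) holding clazz,
    // by asking the class loader for the .class resource.
    static String findContainingResource(Class<?> clazz) {
        String resource = clazz.getName().replace('.', '/') + ".class";
        java.net.URL url = clazz.getClassLoader().getResource(resource);
        return (url == null) ? null : url.toString();
    }

    public static void main(String[] args) {
        // For a class packaged in WordCount-1.0.jar this would print a URL
        // of the form jar:file:/.../WordCount-1.0.jar!/WordCount.class
        System.out.println(findContainingResource(JarLocator.class));
    }
}
```

In other words, the `FileNotFoundException` above means `/home/hduser/WordCount-1.0.jar` does not exist on the machine where the job is submitted from; placing the jar at that path on the client should resolve it.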

1 answer

To the first question: it seems that [user] does not have write permission on HDFS. By default, Hadoop runs the `whoami` command and uses its output as the identity of the user submitting the job. There are two ways to solve this:

a) Submit as a user who does have HDFS permissions, say root, by adding the following line before submitting the job:

    System.setProperty("HADOOP_USER_NAME", "root");

b) Use impersonation. Check out http://hadoop.apache.org/docs/stable/Secure_Impersonation.html
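Option (a) can be sketched as a minimal client-side snippet. Note the property must be set before the first Hadoop `FileSystem` or `JobClient` object is created, and "root" is only an example user assumed to have write access on HDFS:

```java
public class SubmitAsRoot {
    public static void main(String[] args) {
        // Without Kerberos security, Hadoop trusts this client-side property
        // when determining the submitting user, overriding `whoami`.
        // It must be set before any FileSystem/JobClient is instantiated.
        System.setProperty("HADOOP_USER_NAME", "root");

        // From here on, job submission code (JobConf, JobClient.runJob, ...)
        // would run against the cluster as "root".
        System.out.println("Submitting as: "
                + System.getProperty("HADOOP_USER_NAME"));
        // prints "Submitting as: root"
    }
}
```

A less drastic alternative that keeps permission checking on is to have the HDFS superuser create a writable home directory for the real user, e.g. `hadoop fs -mkdir /user/[user]` followed by `hadoop fs -chown [user] /user/[user]`.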

