I ran into some problems when I tried to submit a MapReduce job (the WordCount example) remotely. After searching Google I still can't achieve my goal, and I have found very few topics about submitting MapReduce jobs remotely. These are the problems:
First I ran into a permissions problem:
SEVERE: PriviledgedActionException as:[user] cause:org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=[user], access=WRITE, inode="mapred":root:supergroup:rwxr-xr-x
It seems that HDFS permissions are blocking writes to that path. I turned off permission checking by setting dfs.permissions = false. Is there another way to overcome this problem while still keeping permission checking enabled?
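For reference, this is the config fragment I mean; a minimal hdfs-site.xml sketch assuming the Hadoop 1.x property name (dfs.permissions), which disables permission checking cluster-wide:

```xml
<!-- hdfs-site.xml: turns off HDFS permission checking entirely (Hadoop 1.x) -->
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
```

An alternative that keeps checking enabled would presumably be to have the HDFS superuser grant the submitting user access to the directory instead, with something like `hadoop fs -chown [user] /path` or `hadoop fs -chmod 775 /path`.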
Then I got an exception saying that the MapReduce application jar cannot be found.
SEVERE: PriviledgedActionException as:[User] cause:java.io.FileNotFoundException: File /home/hduser/WordCount-1.0.jar does not exist.
My code is:
JobConf conf = new JobConf(WordCount.class);
conf.setJobName("wordcount");
conf.set("fs.default.name", "hdfs://10.97.1.91:54310");
conf.set("mapred.job.tracker", "10.97.1.91:54311");
conf.setJar("/home/hduser/WordCount-1.0.jar");
...
JobClient.runJob(conf);
It seems that the jar file is being looked up on the local file system of the remote node, and access is denied there. How can I solve this problem? I found a link on Stack Overflow which says that specifying the jar file's location is not required. But without a specified location, how does the framework find the jar file?
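For comparison, the pattern I have seen in tutorials uses setJarByClass instead of a hard-coded path, so the jar is resolved from the classpath of the submitting JVM and then copied to the cluster by the job client. This is only a sketch based on my code above, not verified against a cluster:

```java
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class RemoteWordCount {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");
        conf.set("fs.default.name", "hdfs://10.97.1.91:54310");
        conf.set("mapred.job.tracker", "10.97.1.91:54311");
        // Find the jar containing WordCount on the client's classpath;
        // the job client then ships it to the cluster, so no absolute
        // path on the remote node is needed.
        conf.setJarByClass(WordCount.class);
        JobClient.runJob(conf);
    }
}
```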
I have seen very few examples of remote job submission in the tutorials I found on the Internet. Does that suggest it is not the recommended way to do it?