Access files in HDFS using Java

I am trying to access a file in HDFS using the Java API, but every time I get a file-not-found error. The code I use for access is:

    Configuration conf = new Configuration();
    conf.addResource(FileUtilConstants.ENV_HADOOP_HOME + FileUtilConstants.REL_PATH_CORE_SITE);
    conf.addResource(FileUtilConstants.ENV_HADOOP_HOME + FileUtilConstants.REL_PATH_HDFS_SITE);

    try {
        FileSystem fs = FileSystem.get(conf);
        Path hdfsfilePath = new Path(hdfsPath);
        logger.info("Filesystem URI : " + fs.getUri());
        logger.info("Filesystem Home Directory : " + fs.getHomeDirectory());
        logger.info("Filesystem Working Directory : " + fs.getWorkingDirectory());
        logger.info("HDFS File Path : " + hdfsfilePath);
        if (!fs.exists(hdfsfilePath)) {
            logger.error("File does not exists : " + hdfsPath);
        }
    } catch (IOException e) {
        logger.error("Error accessing HDFS", e);
    }

And here is the output from the command line:

    [root@koversevms ~]# java -jar /tmp/thetus-incendiary-koverse-extension-fileutils-1.0-SNAPSHOT.jar
    13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem URI : file:///
    13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem Home Directory : file:/root
    13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: Filesystem Working Directory : file:/root
    13/07/10 02:47:18 INFO fileutils.HadoopFileChecksumUtils: HDFS File Path : /usr/hadoop/sample/sample.txt
    13/07/10 02:47:18 ERROR fileutils.HadoopFileChecksumUtils: File does not exists : /usr/hadoop/sample/sample.txt

I am new to Hadoop, so I don't know what is going wrong.

Thanks, Nayan

1 answer

Here is a code snippet originally posted as an answer to this question. It should solve your problem, even though the intent of the original question was different. The key issue in your code is the filesystem scheme (file:///): your FileSystem is resolving against the local filesystem instead of HDFS. Note that Configuration.addResource(String) looks the given name up on the classpath, so passing a concatenated file path as a String most likely means your core-site.xml and hdfs-site.xml are never loaded, and fs.defaultFS keeps its local default. Check the fs.defaultFS value in your configuration.
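For reference, here is a minimal sketch of loading the site files from disk instead of the classpath (the /etc/hadoop/conf locations below are an assumption; adjust them to your installation):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ConfLoadSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // addResource(String) searches the classpath; wrapping the location in a
            // Path tells Hadoop to read the file from the local filesystem instead.
            conf.addResource(new Path("/etc/hadoop/conf/core-site.xml")); // assumed location
            conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml")); // assumed location
            FileSystem fs = FileSystem.get(conf);
            System.out.println("Filesystem URI : " + fs.getUri()); // should now print hdfs://...
        }
    }

Alternatively, you can set fs.defaultFS explicitly in code, as the snippet below does.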

    package org.myorg;

    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.conf.*;
    import org.apache.hadoop.security.UserGroupInformation;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FileStatus;

    public class HdfsTest {
        public static void main(String args[]) {
            try {
                // Run the enclosed block as the "hbase" user.
                UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hbase");
                ugi.doAs(new PrivilegedExceptionAction<Void>() {
                    public Void run() throws Exception {
                        Configuration conf = new Configuration();
                        // Point the client at the NameNode explicitly.
                        conf.set("fs.defaultFS", "hdfs://1.2.3.4:8020/user/hbase");
                        conf.set("hadoop.job.ugi", "hbase");
                        FileSystem fs = FileSystem.get(conf);
                        // Create a file, then list the directory to verify access.
                        fs.createNewFile(new Path("/user/hbase/test"));
                        FileStatus[] status = fs.listStatus(new Path("/user/hbase"));
                        for (int i = 0; i < status.length; i++) {
                            System.out.println(status[i].getPath());
                        }
                        return null;
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
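To run this, put the Hadoop client jars matching your cluster version on the classpath (hadoop-core for 1.x; hadoop-common and hadoop-hdfs for 2.x and later), and replace hdfs://1.2.3.4:8020 and the hbase user with your own NameNode address and user (8020 is the default NameNode RPC port).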
