Host and port to use in an HDFS directory listing

First of all, I am using the Hortonworks Sandbox as my Hadoop distribution, without any custom configuration.

After connecting to the sandbox, I can list the files in an HDFS directory:

[root@sandbox ~]# hadoop fs -ls hdfs:///user/guest

but if I try to specify the host and port explicitly, I only get errors:

[root@sandbox ~]# hadoop fs -ls hdfs://localhost:8020/user/guest
ls: Call From sandbox.hortonworks.com/10.0.2.15 to localhost:8020 failed on connection exception: java.net.ConnectException: Connexion refusée; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

[root@sandbox ~]# hadoop fs -ls hdfs://localhost:9000/user/guest
ls: Call From sandbox.hortonworks.com/10.0.2.15 to localhost:9000 failed on connection exception: java.net.ConnectException: Connexion refusée; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

Once I know the correct host and port to use, I can use them in my Java call:

Path pt = new Path("hdfs://host:port/user/guest/test-text-file.txt");
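For context, here is a minimal sketch of how such a Path is typically read through the HDFS FileSystem API. The host sandbox.hortonworks.com, port 8020, and the file name are assumptions based on the default sandbox setup and the example above, not a verified configuration:

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadHdfsFile {
    public static void main(String[] args) throws Exception {
        // Assumed host and port; replace with the value of fs.defaultFS
        // from core-site.xml on your cluster.
        Path pt = new Path("hdfs://sandbox.hortonworks.com:8020/user/guest/test-text-file.txt");

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(pt.toUri(), conf);

        // Open the file and print it line by line.
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(fs.open(pt)))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}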

1 answer

Check the value of the property fs.defaultFS in core-site.xml; it contains the IP address / hostname and the port that the NameNode daemon binds to when it starts.

On the Hortonworks sandbox, core-site.xml is located at /etc/hadoop/conf/core-site.xml:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://sandbox.hortonworks.com:8020</value>
</property>

So, using that value, the command becomes:

hadoop fs -ls hdfs://sandbox.hortonworks.com:8020/user/guest 
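If you prefer not to hardcode the host and port in your Java code at all, here is a minimal sketch that reads fs.defaultFS from the Hadoop configuration and then uses scheme-less paths. It assumes /etc/hadoop/conf (and therefore core-site.xml) is on the application's classpath; the file name is a hypothetical example:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DefaultFsExample {
    public static void main(String[] args) throws Exception {
        // Loads core-site.xml (and hdfs-site.xml) from the classpath,
        // e.g. /etc/hadoop/conf on the sandbox.
        Configuration conf = new Configuration();

        // Prints something like hdfs://sandbox.hortonworks.com:8020
        System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS"));

        // With the default filesystem configured, paths no longer need
        // an explicit scheme, host, or port.
        FileSystem fs = FileSystem.get(conf);
        Path pt = new Path("/user/guest/test-text-file.txt");
        System.out.println("exists: " + fs.exists(pt));
    }
}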

You can also use the IP address of sandbox.hortonworks.com; it is mapped in /etc/hosts on the VM, something like:

127.0.0.1       localhost.localdomain localhost
192.168.1.3 sandbox.hortonworks.com sandbox

So the following works as well:

hadoop fs -ls hdfs://192.168.1.3:8020/user/guest