Hadoop dfs -ls returns a list of files in my home directory

I installed the Hadoop single-node configuration, launched through Cygwin under Windows 7. After starting Hadoop with

bin/start-all.sh 
I launched
 bin/hadoop dfs -ls 
which returns a list of files in my Hadoop directory. Then I ran
 bin/hadoop datanode -format 
bin/hadoop namenode -format
but -ls still returns the contents of my Hadoop directory. As far as I understand, it should not return anything (an empty folder). What am I doing wrong?
4 answers

Have you edited core-site.xml and mapred-site.xml in the conf folder? It seems your Hadoop cluster is in local mode. In local mode the dfs commands operate on the local filesystem, which is why -ls shows the contents of your Hadoop directory.
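
For reference, here is a minimal sketch of the classic single-node (pseudo-distributed) settings; the localhost host and port values are the conventional ones from the Hadoop setup guide and are an assumption about your install:

 <!-- conf/core-site.xml -->
 <configuration>
   <property>
     <name>fs.default.name</name>
     <value>hdfs://localhost:9000</value>
   </property>
 </configuration>

 <!-- conf/mapred-site.xml -->
 <configuration>
   <property>
     <name>mapred.job.tracker</name>
     <value>localhost:9001</value>
   </property>
 </configuration>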


I know this question is quite old, but the directory structure in Hadoop has changed a bit (as of version 2.5). The current version of Jeroen's command would be:

 hdfs dfs -ls hdfs://localhost:9000/users/smalldata 

Also, just for information: using start-all.sh and stop-all.sh is deprecated; use start-dfs.sh and start-yarn.sh instead.
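
Assuming a Hadoop 2.x layout, those scripts live under sbin/ rather than bin/, so the startup sequence would look roughly like:

 sbin/start-dfs.sh    # brings up the NameNode, DataNode(s) and SecondaryNameNode
 sbin/start-yarn.sh   # brings up the ResourceManager and NodeManager(s)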


I had the same problem and solved it by explicitly providing the NameNode URL.

To list all the directories in the root of your HDFS, run:

 ./bin/hadoop dfs -ls hdfs://<ip-of-your-server>:9000/ 

The documentation talks about a default HDFS endpoint in the configuration, but I cannot find it. If anyone knows what they mean, please enlighten us.

Here is where I got the information: http://hadoop.apache.org/common/docs/r0.20.0/hdfs_shell.html#Overview
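
For what it is worth, the default endpoint the documentation refers to is presumably the fs.default.name property (fs.defaultFS in newer releases) in conf/core-site.xml, as shown in the first answer; once it is set, you can drop the scheme and host and just run:

 ./bin/hadoop dfs -ls / 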


Or you can just do the following (see the shell sketch after this list):

  • Run stop-all.sh
  • Delete the dfs data and name directories
  • Run bin/hadoop namenode -format
  • Run start-all.sh
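
Put together as a shell sketch: the /tmp paths below are only the defaults derived from hadoop.tmp.dir and are an assumption; check dfs.name.dir and dfs.data.dir in your hdfs-site.xml before deleting anything.

 bin/stop-all.sh
 # Default single-node storage lives under /tmp/hadoop-<user>/dfs
 # (assumption -- your dfs.name.dir / dfs.data.dir may point elsewhere)
 rm -rf /tmp/hadoop-$USER/dfs/name /tmp/hadoop-$USER/dfs/data
 bin/hadoop namenode -format
 bin/start-all.sh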
