Hadoop HDFS: copying files using wildcards?

I want to copy a specific set of files from one HDFS location to another location in the same HDFS cluster. The dfs shell does not seem to be able to handle this:

hadoop dfs -cp /tables/weblog/server=jeckle/webapp.log.1* /tables/tinylog/server=jeckle/ 

No error is returned; all the same, the files are not copied.

5 answers

You need to put double quotes around a path that contains a wildcard, for example:

 hdfs dfs -cp "/path/to/foo*" /path/to/bar/ 
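Why the quotes help, as a sketch assuming a bash-like shell on the client: without quotes, the local shell tries to expand the wildcard against the local filesystem before the command ever reaches HDFS; quoting passes the pattern through so that HDFS performs the globbing itself.

 # Unquoted: the local shell expands foo* against the LOCAL filesystem first;
 # only if nothing matches locally is the literal pattern passed on to HDFS.
 hdfs dfs -cp /path/to/foo* /path/to/bar/

 # Quoted: the pattern reaches HDFS intact, and HDFS expands it itself.
 hdfs dfs -cp "/path/to/foo*" /path/to/bar/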

First, copying with wildcards is supported in HDFS. Second, using hadoop dfs is deprecated; you'd better use hadoop fs or hdfs dfs . If you suspect the operation did not succeed (even though it appears to have), you can check the namenode log files to find out what went wrong.
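To check quickly, one can rerun the copy with the non-deprecated form and then list the destination (paths taken from the question):

 hdfs dfs -cp "/tables/weblog/server=jeckle/webapp.log.1*" /tables/tinylog/server=jeckle/
 hdfs dfs -ls /tables/tinylog/server=jeckle/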


Interesting. This is what I get on my local virtual machine running Hadoop 0.18.0. Which version are you using? I can try 1.2.1 as well

 hadoop-user@hadoop-desk:~$ hadoop fs -ls /user/hadoop-user/testcopy
 hadoop-user@hadoop-desk:~$ hadoop dfs -cp /user/hadoop-user/input/*.txt /user/hadoop-user/testcopy/
 hadoop-user@hadoop-desk:~$ hadoop fs -ls /user/hadoop-user/testcopy
 Found 2 items
 -rw-r--r--   1 hadoop-user supergroup    79 2014-01-06 04:35 /user/hadoop-user/testcopy/HelloWorld.txt
 -rw-r--r--   1 hadoop-user supergroup   140 2014-01-06 04:35 /user/hadoop-user/testcopy/SampleData.txt

They both worked for me:

 ~]$ hadoop fs -cp -f /user/cloudera/Dec_17_2017/cric* /user/cloudera/Dec_17_2017/Dec_18
 ~]$ hadoop fs -cp -f "/user/cloudera/Dec_17_2017/cric*" /user/cloudera/Dec_17_2017/Dec_18

So I think the quotes, whether double or single, are not strictly required here.
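A likely reason both forms worked (an assumption based on default bash behavior, not something this answer states): when a glob matches nothing on the local filesystem, bash passes the pattern through unchanged, so HDFS still receives it and expands it itself.

 # With bash defaults (nullglob off), an unmatched glob is passed along literally:
 echo /user/cloudera/Dec_17_2017/cric*
 # This prints the pattern itself unless a matching path exists locally,
 # so the unquoted -cp received the same argument as the quoted one.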


If someone wants to copy files and folders from the current directory in the terminal into HDFS, note that -put also needs a destination path:

 hdfs dfs -put ./ /path/in/hdfs/ 
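A fuller sketch of that workflow (the directory /user/hadoop-user/incoming is illustrative, not from the original answer):

 # Create the target directory, then push everything from the current local directory:
 hdfs dfs -mkdir -p /user/hadoop-user/incoming
 hdfs dfs -put ./* /user/hadoop-user/incoming/
 # Verify the upload:
 hdfs dfs -ls /user/hadoop-user/incoming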
