Linux: move 1 million files to prefix-based folders

I have a directory called "images", filled with about a million images. Yes.

I want to write a shell command to rename all these images to the following format:

original: filename.jpg
new: /f/i/l/filename.jpg

Any suggestions?

Thanks,
Dan

+6
linux file shell rename folders
5 answers
 for i in *.*; do mkdir -p "${i:0:1}/${i:1:1}/${i:2:1}"; mv "$i" "${i:0:1}/${i:1:1}/${i:2:1}/"; done

The ${i:0:1}/${i:1:1}/${i:2:1} part may need to be shorter or different for your names, but the command above does the job. You'll probably run into performance issues with a million files; if so, limit *.* to narrower patterns (a*.*, b*.*, or whatever suits you), as sketched below.

edit: added $ to i for mv, as noted by Dan
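
For example, batching by leading character could look like this (a sketch, not tested at this scale; assumes bash and file names of at least three characters):

 # hypothetical sketch of the "narrower glob" idea: handle one leading
 # character at a time, so no single expansion holds a million names
 for c in {a..z} {A..Z} {0..9}; do
     for i in "$c"*.*; do
         [ -e "$i" ] || continue   # pattern matched nothing for this prefix
         mkdir -p "${i:0:1}/${i:1:1}/${i:2:1}"
         mv "$i" "${i:0:1}/${i:1:1}/${i:2:1}/"
     done
 done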

+6

You can generate a new file name using, for example, sed:

 $ echo "test.jpg" | sed -e 's/^\(\(.\)\(.\)\(.\).*\)$/\2\/\3\/\4\/\1/'
 t/e/s/test.jpg

So you can do something like this (assuming all directories are already created):

 for f in *; do
     mv -i "$f" "$(echo "$f" | sed -e 's/^\(\(.\)\(.\)\(.\).*\)$/\2\/\3\/\4\/\1/')"
 done

or, if you can't use the bash $( ) syntax:

 for f in *; do
     mv -i "$f" "`echo "$f" | sed -e 's/^\(\(.\)\(.\)\(.\).*\)$/\2\/\3\/\4\/\1/'`"
 done

However, given the number of files, you may want to use perl instead, since the loops above spawn a sed and an mv process for every file:

 #!/usr/bin/perl -w
 use strict;
 # warning: untested

 opendir DIR, "." or die "opendir: $!";
 my @files = readdir(DIR);   # can't change dir while reading: read in advance
 closedir DIR;

 foreach my $f (@files) {
     # build a/b/c/abcfile from abcfile; skip entries shorter than
     # three characters (which includes . and ..)
     (my $new_name = $f) =~ s!^((.)(.)(.).*)$!$2/$3/$4/$1! or next;
     -e $new_name and die "$new_name already exists";
     rename($f, $new_name) or die "rename $f: $!";   # target dirs must already exist
 }

This perl's rename is limited to a single file system, though you can use File::Copy::move to get around that.
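
If you'd rather stay in the shell but avoid forking a sed per file, bash's substring expansion can build the same path (a sketch; assumes bash and names of at least three characters):

 # same renaming as the sed version, but the path is built with bash
 # parameter expansion, so only mkdir and mv are forked per file
 for f in *; do
     d=${f:0:1}/${f:1:1}/${f:2:1}
     mkdir -p "$d"
     mv -i "$f" "$d/$f"
 done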

+2

You can do this as a bash script:

 #!/bin/bash
 base=base
 mkdir -p $base/shorts
 for n in *
 do
     if [ ${#n} -lt 3 ]
     then
         mv $n $base/shorts
     else
         dir=$base/${n:0:1}/${n:1:1}/${n:2:1}
         mkdir -p $dir
         mv $n $dir
     fi
 done

Needless to say, you may need to worry about spaces and files with short names.
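
For the spaces part, quoting every expansion is enough; a sketch of the same script hardened that way (behavior otherwise unchanged):

 #!/bin/bash
 # identical logic, with quotes so names containing spaces survive
 base=base
 mkdir -p "$base/shorts"
 for n in *
 do
     if [ "${#n}" -lt 3 ]
     then
         mv "$n" "$base/shorts"
     else
         dir=$base/${n:0:1}/${n:1:1}/${n:2:1}
         mkdir -p "$dir"
         mv "$n" "$dir"
     fi
 done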

+2

I suggest a short python script. Most shell tools will choke on that much input (although xargs can do the trick). Example below:

 #!/usr/bin/python
 import os, shutil

 src_dir = '/src/dir'
 dest_dir = '/dest/dir'

 for fn in os.listdir(src_dir):
     d = dest_dir + '/' + fn[0] + '/' + fn[1] + '/' + fn[2] + '/'
     if not os.path.isdir(d):   # makedirs raises OSError if the path already exists
         os.makedirs(d)
     shutil.copyfile(src_dir + '/' + fn, d + fn)
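
For comparison, the xargs route mentioned above might look like this in the shell (a sketch; slow, since it forks one bash per file, but no single command line ever has to carry a million arguments):

 # NUL-terminated names on stdin; printf is a shell builtin, so the
 # glob expansion never hits the exec argument-size limit
 printf '%s\0' * | xargs -0 -n1 bash -c '
     d=${1:0:1}/${1:1:1}/${1:2:1}
     mkdir -p "$d" && mv "$1" "$d/"
 ' _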
+1

Any of the proposed solutions that use shell wildcard syntax will most likely fail because of the sheer number of files you have. Of the solutions offered so far, the perl one is probably the best.

However, you can easily adapt any of the shell-script methods to handle any number of files, like this:

 ls -1 | \
 while read filename
 do
     # insert the loop body of your preference here, operating on "$filename"
 done

I would still use perl, but if you are limited to simple unix tools, then combining one of the shell solutions above with this loop should get you there. It will be slow, however.
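
One caveat: plain read mangles backslashes and strips leading whitespace. A slightly hardened version of the loop, with the question's prefix-move as the body (a sketch, assuming bash; names containing newlines would still break it):

 # -r stops backslash mangling; empty IFS keeps leading and
 # trailing whitespace intact
 ls -1 | while IFS= read -r filename
 do
     d=${filename:0:1}/${filename:1:1}/${filename:2:1}
     mkdir -p "$d"
     mv "$filename" "$d/"
 done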

0
