Recursively finding and replacing using Perl in cmd (Windows)

I use this command to find and replace a string with another on the command line:

perl -p -i.bak -e "s/Mohan/Sitaram/g" ab.txt 

This replaces Mohan with Sitaram in the ab.txt file in the current directory.

However, I want to replace all occurrences of Mohan with Sitaram in all .txt files in all subdirectories (recursively). Using *.txt instead of ab.txt does not work. Wildcards work with other commands, since I downloaded wildcard packages for Windows; it fails only for this command, saying

 E:\>perl -pi -e "s/Sitaram/Mohan/g" *.txt
 Can't open *.txt: Invalid argument.

Is there any way to fix this? Perhaps another command?

3 answers

find . -name "*.txt" | xargs perl -p -i -e "s/Sitaram/Mohan/g"

find is used to search recursively for all *.txt files.

xargs is used to build and execute command lines from standard input.


Windows solution

On Windows, a command can be executed for multiple files using forfiles . The /s option tells it to search directories recursively.

 forfiles /s /m *.txt /c "perl -pi -es/Sitaram/Mohan/g @path" 

If you need to start the search from another working directory, add /p path\to\start .

Unix Solution

On Unix, there is a more general command than forfiles called xargs , which passes the lines of its standard input as arguments to a given command. Directories are searched recursively for .txt files using find .

 find . -name '*.txt' | xargs perl -pi -e 's/Sitaram/Mohan/g' 

Platform Independent Solution

You can also code both the file search and the string replacement in Perl. The File::Find core module can help with this. (Core module = distributed with the interpreter.)

 perl -MFile::Find -e 'find(sub{…}, ".")' 

However, the Perl code will be longer, and I do not want to spend time writing it. Write the sub yourself using the information from the File::Find man page above. It must check that the file name ends in .txt and is not a directory, back the file up, and overwrite the original with a modified version of the backup.

Quoting will be different on Windows; perhaps writing the script to a file would be the only reasonable solution.

Problems with the OP's original approach

In a Unix shell, glob patterns (such as *.txt ) are expanded by the shell, while Windows cmd leaves them untouched and passes them directly to the called program, whose job it then is to cope with them. Perl obviously does not do this by default.

The second problem is that even on Unix, globbing will not do what is wanted: *.txt matches all .txt files in the current directory only, not those in subdirectories and their subdirectories.


If you are going to bother with Perl, why not go all the way and write a (short) Perl program to do this for you?

That way, you are not shuttling data between the shell and your program, and you have something more general that can work on several operating systems.

    #!/usr/bin/env perl
    # ^-- Not needed for Windows, but tradition rules

    use strict;
    use warnings;
    use feature qw(say);
    use autodie;       # Turns file operations into exception based programming
    use File::Find;    # Your friend
    use File::Copy;    # For the "move" command

    # You could use Getopt::Long, but let's go with this for now:
    # Usage: mungestrings.pl <from> <to> [<dir>]
    # Default dir is current
    #
    my $from_string = shift;
    my $to_string   = shift;
    my $directory   = shift;

    $from_string = quotemeta $from_string;    # If you don't want to use regular expressions
    $directory = "." if not defined $directory;

    #
    # Find the files you want to operate on
    #
    my @files;
    find(
        sub {
            return unless -f;          # Files only
            return unless /\.txt$/;    # Name must end in ".txt"
            push @files, $File::Find::name;
        },
        $directory
    );

    #
    # Now let's go through those files and replace the contents
    #
    for my $file ( @files ) {
        open my $input_fh,  "<", $file;
        open my $output_fh, ">", "$file.tmp";
        for my $line ( <$input_fh> ) {
            $line =~ s/$from_string/$to_string/g;
            print {$output_fh} $line;
        }

        #
        # Contents have been replaced: move the temp file over the original
        #
        close $input_fh;
        close $output_fh;
        move "$file.tmp", $file;
    }

I use File::Find to collect all the files that I want to modify in my @files array. I could put all this in the find routine:

    find(\&wanted, $directory);

    sub wanted {
        return unless -f;
        return unless /\.txt$/;
        #
        # Here: open the file for reading, open output and move the lines over
        #
        ...
    }

This way the entire program lives in the wanted routine. It is more efficient, because the replacement happens as each file is found; there is no need to first collect the files and then make a second pass to do the replacement. However, it strikes me as poor design.

You can also slurp your entire file into an array without looping through it first:

    open my $input_fh, "<", $file;
    my @input_file = <$input_fh>;

Now you can use grep to check if there is something that needs replacing:

    if ( grep { /$from_string/ } @input_file ) {
        # Open an output file, and do the loop to replace the text
    }
    else {
        # String not here. Just close up the input file
        # and don't bother with writing a new one and moving it over
    }

This is more efficient (no replacement pass is needed unless the file actually contains the string you are looking for). However, it takes more memory (the entire file must be in memory at once), and don't let that single line fool you: the entire file is still read into the array one line at a time, just as if you had written out the whole loop.

File::Find and File::Copy are standard Perl modules, so all Perl installations have them.

