I recently ran into a situation where I need to trim some fairly large log files once they grow beyond a certain size. Everything except the last 1000 lines in each file is discarded, and the job runs every half hour via cron. My solution was simply to loop over the file list, check each file's size, and truncate it if necessary.
    foreach my $file (@fileList) {
        # Only trim files larger than $CSize megabytes.
        if ( ((-s $file) / (1024 * 1024)) > $CSize ) {
            open FH, '<', $file or die "Cannot open ${file}: $!\n";
            my $lineNo = 0;
            my @tLines;
            while (<FH>) {
                push @tLines, $_;
                # Keep a sliding window of the last $CLLimit lines.
                shift @tLines if ++$lineNo > $CLLimit;
            }
            close FH;
            open FH, '>', $file or die "Cannot write to ${file}: $!\n";
            print FH @tLines;
            close FH;
        }
    }
This works in its current form, but there is a lot of overhead for large log files (especially those with 100,000+ lines), because every line has to be read in even though only the last $CLLimit lines are ever kept.
Is there a way to read in only part of the file? In this case, I only need access to the last $CLLimit lines. Since the script is deployed on a system that has seen better days (think Celeron 700 MHz with 64 MB of RAM), I am looking for a faster alternative using Perl.
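For reference, here is the kind of thing I have in mind: a minimal, untested sketch using the CPAN module File::ReadBackwards (assuming it can be installed on the target box) to walk the file from the end, so only the tail is ever read:

    use strict;
    use warnings;
    use File::ReadBackwards;

    my $CLLimit = 1000;    # keep this many trailing lines

    sub trim_tail {
        my ($file) = @_;
        my $bw = File::ReadBackwards->new($file)
            or die "Cannot open ${file}: $!\n";
        my @tail;
        # readline() yields the last line first, so build the tail
        # front-to-back and stop once we have $CLLimit lines.
        while (defined(my $line = $bw->readline)) {
            unshift @tail, $line;
            last if @tail >= $CLLimit;
        }
        $bw->close;
        open my $out, '>', $file or die "Cannot write to ${file}: $!\n";
        print {$out} @tail;
        close $out;
    }

I have no idea whether this actually beats the sliding-window loop on hardware that old, or whether there is a cheaper seek-based trick.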