How can I process many log files as a single virtual file in Perl?

I have several access logs in the log directory, following the naming convention below:

access.log.1284642120
access.log.1284687600
access.log.1284843260

Apache rotates these logs daily, so they can be sorted into chronological order.

I would like to read them one after another, so that they can be treated as a single log file.

my @logs = glob('logs/access.log.*');

The code above collects all the log filenames, but I'm not sure about two things:

  • In what order does glob return the files? Is it alphabetical?
  • If I want to find the most recent access time for each unique IP, how do I do that across all of these files?

I have a Perl script that can read a single access log and work this out easily (my algorithm builds a large hash keyed by IP address, with the access time as the value, and keeps updating each key/value pair as lines are read). But I would rather not merge all the access files into one temporary file just for this.
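A simplified sketch of what I do for a single file (it assumes the client IP is the first whitespace-separated field and the timestamp is inside the first pair of square brackets, as in the common Apache formats):

my %last_seen;   # IP address => most recent access time seen

open my $log, '<', 'logs/access.log.1284642120' or die "Cannot open log: $!";
while ( my $line = <$log> ) {
    my ($ip)   = split ' ', $line;            # first field is the client IP
    my ($time) = $line =~ /\[([^\]]+)\]/;     # bracketed timestamp
    $last_seen{$ip} = $time if defined $ip && defined $time;
}
close $log;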

Any suggestions? Thank you very much in advance.

2 answers

If you want to guarantee a particular order, sort the list yourself, even if it's only to make sure it comes out the way you expect:

 my @files = sort { ... } glob( ... );

In this case, where the filenames are identical except for the numeric suffix, and the suffixes all have the same number of digits, you may not even need a sort block:

 my @files = sort glob( ... );
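If you would rather not rely on the suffixes all having the same length, here is a sketch of an explicit numeric sort on the timestamp suffix:

 my @files = sort {
     my ($x) = $a =~ /\.(\d+)$/;
     my ($y) = $b =~ /\.(\d+)$/;
     $x <=> $y;
     } glob('logs/access.log.*');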

To be really slick, though, localize @ARGV and read from the ARGV filehandle (the empty diamond operator, <>). Set @ARGV to the sorted list of files, and <> will read through all of them in order as if they were one file:

 {
 local @ARGV = sort { ... } glob( ... );

 while( <> ) {
      ...;
      }
 }

The name of the file currently being read is in $ARGV.

That way you don't have to open and close each file yourself, or keep track of when one file ends and the next begins.
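Putting that together with the IP/time hash from the question, the whole thing could look roughly like this (a sketch that makes the same assumptions about the log format as the question: the first field is the IP, and the timestamp is in square brackets):

 my %last_seen;

 {
 local @ARGV = sort glob('logs/access.log.*');

 while( my $line = <> ) {
      my ($ip)   = split ' ', $line;
      my ($time) = $line =~ /\[([^\]]+)\]/;
      $last_seen{$ip} = $time if defined $ip && defined $time;
      }
 }

 # %last_seen now maps each unique IP to its most recent access time.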


A more Unix-y approach: pipe everything through cat and read it as one stream:

my @files = glob("$dir/access.log.*");
open my $one_big_logfile, "-|", "cat @files" or die ...;
while (<$one_big_logfile>) {
   ...
}
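One caveat: interpolating @files into a single string hands the command line to the shell, so unusual filenames (spaces, metacharacters) can break it. On platforms that support it, the list form of the pipe open skips the shell entirely; a minor variation on the same idea:

 open my $one_big_logfile, "-|", "cat", @files or die "Cannot start cat: $!";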
