I have a Solaris system that processes a large number of files and stores information about them in a database (yes, I know that asking the database would be the fastest way to find out how many files we have). I need a quick way to track files as they move through the system on their way into the database.
I am currently using a Perl script that reads a directory into an array, takes the size of that array, and sends it to the monitoring script. Unfortunately, as our system grows, this monitor is getting slower.
I am looking for a much faster method, rather than having the monitor pause for 15-20 seconds between updates while it performs the count across all the directories involved.
I am fairly confident that my bottleneck is the read-directory-into-array operation.
I don't need any information about the files themselves, no sizes and no file names, just the number of files in a directory.
In my code, I do not count hidden files or the text files I use to store configuration information. It would be great if that behavior were preserved, but it is, of course, not required.
I found some links about counting inodes with C code or something along those lines, but I'm not very experienced in that area.
I would like this monitor to be as close to real time as possible.
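One idea I've been toying with, though I haven't tested it and the caching scheme below (including the cached_count name) is just my own sketch: a directory's mtime changes whenever an entry is added or removed, so the monitor could stat() each directory and only re-count the ones that actually changed since the last pass:

    use strict;
    use warnings;

    # Last-seen mtime and count for each directory we monitor.
    my (%last_mtime, %last_count);

    # Return the (possibly cached) file count for $dir, re-counting only
    # when the directory's mtime differs from what we saw last time.
    sub cached_count {
        my ($dir) = @_;
        my $mtime = (stat $dir)[9];
        die "Cannot stat $dir: $!" unless defined $mtime;
        if (defined $last_mtime{$dir} && $mtime == $last_mtime{$dir}) {
            return $last_count{$dir};    # nothing added or removed
        }
        opendir(my $dh, $dir) or die "Cannot open directory: $!";
        # grep in scalar context returns just the number of matches.
        my $count = grep { !/^\./ && !/config_file/ } readdir($dh);
        closedir($dh);
        ($last_mtime{$dir}, $last_count{$dir}) = ($mtime, $count);
        return $count;
    }

The obvious caveat is that Perl's stat() reports whole seconds, so two changes landing in the same second could leave a stale count until the directory changes again; for a monitor that is probably acceptable.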
The Perl code I use looks like this:
    opendir(DIR, $currentDir) or die "Cannot open directory: $!";
    @files = grep { !/^\./ && !/config_file/ } readdir(DIR);  # skip hidden and config files
    closedir(DIR);
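For what it's worth, the smallest change I can see to my own snippet (untested on our Solaris box, and reading the directory from @ARGV here is just for the sake of a self-contained example) is to stop building @files entirely and count the entries as readdir() hands them back, since I only ever use the array's size:

    use strict;
    use warnings;

    my $currentDir = $ARGV[0];    # same directory variable as in my script

    opendir(my $dh, $currentDir) or die "Cannot open directory: $!";
    my $count = 0;
    while (defined(my $entry = readdir($dh))) {
        next if $entry =~ /^\./;            # hidden files, plus . and ..
        next if $entry =~ /config_file/;    # our configuration files
        $count++;
    }
    closedir($dh);
    print "$count\n";

That at least keeps memory flat as directories grow, though each pass still has to read every directory entry.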
Andrew