How about iterating over the files and running each one in a background job? As Mark noted, this may not be acceptable if you have a very large number of log files, since it starts one job per file. It also assumes nothing else is using a directory named results.
mkdir results
for f in $(find /mylogs/ -type f); do
    zgrep -i -f ips.txt "$f" > results/"$(basename "$f")".result &
done
wait
cat results/* > ip.results.txt
rm -rf results
Note that the unquoted $(find …) word-splits on whitespace, so this assumes filenames without spaces; basename keeps the per-file result names flat, and a plain & (rather than a ( … & ) subshell) is what lets wait actually wait for the jobs.
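If one job per file is too much, a common alternative (not part of the answer above) is to cap concurrency with xargs -P. A minimal sketch, using throwaway sample logs under mktemp so it is self-contained; all file names and the pattern file are made up for the demo:

```shell
# Build a tiny fake log tree to search (hypothetical demo data).
work=$(mktemp -d)
printf '10.0.0.1 connected\n'  > "$work/a.log"
printf '10.0.0.2 connected\n'  > "$work/b.log"
printf '10.0.0\n' > "$work/ips.txt"

# -P 4 runs at most 4 greps at a time; -print0/-0 handle odd
# filenames. Output from the parallel jobs is collected in one file.
find "$work" -name '*.log' -print0 |
    xargs -0 -P 4 -I{} grep -i -f "$work/ips.txt" {} > "$work/ip.results.txt"
```

Both sample lines match the 10.0.0 pattern, so ip.results.txt ends up with two lines (in whatever order the jobs finish).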
You can limit the number of files to search using head and/or tail. For example, to search only the first 50 files:
for f in $(find /mylogs/ -type f | head -50); do...
Then the following 50:
for f in $(find /mylogs/ -type f | head -100 | tail -50); do...
Etc.
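The head/tail slicing can be seen end to end in a small runnable sketch. It builds three throwaway sample logs (all paths and contents here are invented for the demo), takes the first batch of two files, greps each in a background job, and combines the results as the answer describes:

```shell
# Throwaway workspace with fake logs and a pattern file.
work=$(mktemp -d)
mkdir "$work/mylogs" "$work/results"
printf '10.0.0.1 ok\n'   > "$work/mylogs/a.log"
printf '10.0.0.2 ok\n'   > "$work/mylogs/b.log"
printf '192.168.0.9 no\n' > "$work/mylogs/c.log"
printf '10.0.0\n' > "$work/ips.txt"

# Batch 1: the first 2 files (sorted for a stable order), each
# searched in a background job; basename flattens the result names.
for f in $(find "$work/mylogs" -type f | sort | head -2); do
    grep -i -f "$work/ips.txt" "$f" \
        > "$work/results/$(basename "$f").result" &
done
wait
cat "$work/results/"* > "$work/ip.results.txt"
```

The next batch would swap head -2 for head -4 | tail -2, and so on; only the slice changes, the loop body stays the same.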