I am new to fail2ban and am having difficulty figuring out the performance considerations for the different configurations I'm thinking of. This runs on a Raspberry Pi, so performance is a concern.
The obvious optimizations I can think of are using efficient regular expressions and running only the minimum number of jails required. My specific questions are:
- How does resource utilization scale with the findtime value? I assume that very small and very large values can affect the server differently in terms of RAM and CPU.
- Similarly, how do the size of the log files and the number of different log files monitored by fail2ban affect total resource usage?
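One way I could put rough numbers on this is to sample the daemon's memory and CPU myself. Here is a minimal sketch of that idea (my own, not from fail2ban; it assumes the third-party psutil package is installed and that the daemon process is named fail2ban-server):

    # Hypothetical helper: sample RAM/CPU of the fail2ban-server process.
    # Assumes `psutil` is installed and the daemon shows up as "fail2ban-server".
    import time
    import psutil

    def find_fail2ban():
        for proc in psutil.process_iter(["name"]):
            if "fail2ban-server" in (proc.info["name"] or ""):
                return proc
        return None

    def sample(interval=5.0, samples=12):
        proc = find_fail2ban()
        if proc is None:
            print("fail2ban-server not found")
            return
        proc.cpu_percent(None)  # prime the CPU counter
        for _ in range(samples):
            time.sleep(interval)
            rss_mb = proc.memory_info().rss / (1024 * 1024)
            cpu = proc.cpu_percent(None)  # percent since the previous call
            print(f"RSS: {rss_mb:6.1f} MB  CPU: {cpu:5.1f} %")

    if __name__ == "__main__":
        sample()

That would at least show whether changing findtime or adding log files moves the needle on this hardware.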
As an example, this jail would let someone try 3,600 SSH login passwords per day if they figured out the fail2ban configuration and timed their attempts around it:

    [ssh]
    enabled  = true
    action   = iptables-allports[name=ssh]
    filter   = sshd
    logpath  = /var/log/auth.log
    maxretry = 6
    findtime = 120
If we changed findtime to the other extreme, 86400 (1 day), it would only allow 5 attempts per day, but now fail2ban is watching most of the log file. How does this affect resource utilization?
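For reference, the arithmetic behind those two figures as I understand it (a sketch of my own, not part of fail2ban): with maxretry = 6, up to 5 failed attempts fit into each findtime window without triggering a ban.

    # Rough upper bound on unbanned attempts per day for a given jail,
    # assuming the attacker paces attempts to stay just under the threshold.
    def max_unbanned_attempts_per_day(maxretry, findtime):
        windows_per_day = 86400 // findtime
        return windows_per_day * (maxretry - 1)

    print(max_unbanned_attempts_per_day(6, 120))    # 3600
    print(max_unbanned_attempts_per_day(6, 86400))  # 5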
Another example: a jail for POST flood attacks:

    [apache-post-flood]
    enabled  = true
    action   = iptables-allports[name=apache-post-flood]
    filter   = apache-post-flood
    logpath  = /var/log/apache2
    findtime = 10
Here we have the opposite: the findtime counter resets every 10 seconds. It also monitors all of the access logs (I guess; again, I'm new to this). That could mean watching access.log, other_vhosts_access.log, and possibly https_access.log for the HTTPS parts of the site. What if it were a busy day and these files were each 10-20 MB?
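To get a feel for how much log text the jail's filter regex would have to chew through, I could do something like this quick sketch (the glob pattern and paths are just my guesses, not anything from fail2ban):

    # Hypothetical check: which Apache access logs would a pattern cover,
    # and how big are they right now?
    import glob
    import os

    pattern = "/var/log/apache2/*access.log"  # assumed pattern, adjust as needed
    files = sorted(glob.glob(pattern))
    total_mb = 0.0
    for path in files:
        size_mb = os.path.getsize(path) / (1024 * 1024)
        total_mb += size_mb
        print(f"{path}: {size_mb:.1f} MB")
    print(f"{len(files)} files, {total_mb:.1f} MB total")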
Hopefully that explains what's on my mind. Thanks in advance for your help.