I have a requirement that I have not yet found a solution to, or help with, on the forums.
Problem: I support real-time systems that generate some very large log files, rolled on a daily basis; 1 GB is not unusual. I monitor them by looking at the tail end of the logs for known error conditions, or other lines I want to track, which may require me to take action.
Since such work is time-consuming, tedious, and prone to human error (it is easy to miss problems in the logs), I have automated the monitoring of these log files. I use a product called Servers Alive to perform scheduled checks, and I write scripts that watch the log files for entries of interest, which may indicate problems with services; these can then call other scripts to restart a service if necessary to fix the problem.
I previously wrote these log-monitoring scripts in Perl. They are very fast, but not necessarily the cleanest way to do this; I'm more of an administrator than a programmer, so I don't have development methodologies or experience to rely on.
Below is a Perl snippet showing how I open the log file $logfile, seek back a given amount of data from the end of the file, and then scan from that point to the end of the file for the log entry I'm interested in, in this example "No packet received from EISEC Client".
In this example the entry we are tracking indicates a problem with the EISEC service, and a simple restart of the service usually fixes it. All of this is automated, with Servers Alive as the check and notification mechanism.
Perl script function:
sub checkEisecSrvloggedon {
    print "$logfile\n";
    $eisecsrvloggedon_ok = 0;
    if (open(EISECSRV_LOGGEDON, $logfile)) {
        # Jump to 40,000 bytes before the end of the file (whence 2 = SEEK_END)
        seek(EISECSRV_LOGGEDON, -40000, 2);
        # Discard the first line read, which is probably partial
        $line = <EISECSRV_LOGGEDON>;
        while ($line = <EISECSRV_LOGGEDON>) {
            if ($line =~ /No packet received from EISEC Client/) {
                $eisecsrvloggedon_ok = 1;
            }
        }
        close(EISECSRV_LOGGEDON);
    }
}
I would like to implement a solution for this using PowerShell, if possible, now that we have moved to Windows Server 2008 R2 and Windows 7 clients, but I can't find details on how to do this efficiently: quickly, and without a large memory overhead.
I tried Get-Content based solutions, but having to read the entire log file makes them unusable: it simply takes too long to query the file. I need to be able to check a number of these large log files on a regular basis, every few minutes in some cases. I have seen tail-style solutions that are great for following log files, and those scripts use the System.IO.File methods. That would give me the performance/speed I'm after, but I'm not familiar enough with PowerShell to know how to use this approach to jump straight to the end of a large log file, then read back a given amount of data, and then search for the relevant lines in that section of the log.
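To make clear the approach I'm after (independent of PowerShell), here is the seek-to-end-then-scan logic as a language-neutral sketch in Python; the 40,000-byte window and the search string just mirror the Perl example above, and the function name is my own:

```python
import os
import re


def tail_contains(logfile, pattern, tail_bytes=40000):
    """Scan only the last `tail_bytes` of `logfile` for `pattern`.

    Mirrors the Perl seek(FH, -40000, 2): the file is opened,
    the position is moved to near the end, and only that tail
    is read, so the cost does not grow with the file size.
    """
    rx = re.compile(pattern)
    with open(logfile, "rb") as fh:
        fh.seek(0, os.SEEK_END)
        size = fh.tell()
        fh.seek(max(0, size - tail_bytes))
        if size > tail_bytes:
            fh.readline()  # discard the first, probably partial, line
        for raw in fh:
            if rx.search(raw.decode("utf-8", errors="replace")):
                return True
    return False
```

My understanding is that the equivalent building blocks in PowerShell would be the .NET ones: [System.IO.File]::OpenRead, a Seek relative to SeekOrigin.End on the resulting FileStream, and a StreamReader over the remaining bytes, but I don't know how to put that together idiomatically.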
Does anyone have any ideas?