Log reading performance

I have a Perl script that is used to monitor databases, and I'm trying to rewrite it as a PowerShell script.

There is a function in the Perl script that reads the error log, filters out what matters, and returns it. It also saves the current position in the log file, so the next time it needs to read the log it can pick up where it left off instead of reading the entire log again. This is done using the tell function.

My idea is to use the Get-Content cmdlet, start reading at the last saved position, process each line to the end of the file, and then save the position.
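Roughly what I have in mind, tracking a line count instead of a byte offset ($statePath and $logfile are just placeholders):

# read the saved line count from the previous run, if any
$lastLine = if (Test-Path $statePath) { Import-CliXML $statePath } else { 0 }

# skip the lines that were already processed last time
$new = Get-Content $logfile | Select-Object -Skip $lastLine
foreach ($line in $new) {
    # filter out what matters here
}

# remember how far we got, for the next run
$lastLine + @($new).Count | Export-CliXML $statePath

The catch is that Get-Content still reads the whole file from the top every time, which is what I'd like to avoid.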

Do you know any tricks for getting the position in the log file after reading, so that I can start reading from a specific position next time?

Or is there a better and/or easier way to achieve this?

Gis

EDIT: this needs to be done with a script, not with any other tool.

EDIT: So I got somewhere with the .NET API, but it doesn't quite work for me yet. I found useful links here and here.

Here is what I have so far:

function check_logs{
    param($logs, $logpos)
    $path = $logs.file
    $br = 0                                   # bytes read so far
    $reader = New-Object System.IO.StreamReader("$path")
    # hard-coded position from an earlier run, just for testing; should eventually use $logpos
    # note: Seek's return value (the new position) leaks into the output because it isn't captured
    $reader.BaseStream.Seek(5270, [System.IO.SeekOrigin]::Begin)
    # the discard must come after the seek, otherwise stale buffered data is returned
    $reader.DiscardBufferedData()
    while(($line = $reader.ReadLine()) -ne $null){
        # count the line's bytes; note this misses the line-terminator bytes
        $br = $br + [System.Text.Encoding]::UTF8.GetByteCount($line)
        if($line.Contains('Error:')){
            Write-Host "$line  $br"
        }
    }
    $reader.Close()
}

I haven't found the right way to use the Seek function. Can someone point me in the right direction?

If I run this, it just outputs 5270 (the return value of the Seek call), but if I run it without the Seek line on the base stream, I get:

2011-08-12 08:49:36.51 Logon       Error: 18456, Severity: 14, State: 38.  5029
2011-08-12 08:49:37.30 Logon       Error: 18456, Severity: 14, State: 38.  5270
2011-08-12 16:11:46.58 spid18s     Error: 1474, Severity: 16, State: 1.  7342
2011-08-12 16:11:46.68 spid18s     Error: 17054, Severity: 16, State: 1.  7634
2011-08-12 16:11:46.69 spid29s     Error: 1474, Severity: 16, State: 1.  7894


If you don't want to work with the .NET APIs directly, take a look at LogParser. It can do this! It exposes a COM API that you can drive from PowerShell, and it can remember how far it has read between runs by saving a checkpoint to an .lpc file, which is exactly the tell-style behaviour you are after.
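A rough sketch of what that might look like (untested; the log path and checkpoint path are placeholders):

$lq = New-Object -ComObject MSUtil.LogQuery
$in = New-Object -ComObject MSUtil.LogQuery.TextLineInputFormat
$in.iCheckpoint = 'C:\temp\errorlog.lpc'    # LogParser saves its position here between runs
$rs = $lq.Execute("SELECT Text FROM 'C:\logs\ERRORLOG' WHERE Text LIKE '%Error:%'", $in)
while (-not $rs.atEnd()) {
    $rs.getRecord().getValue('Text')        # only lines added since the last run
    $rs.moveNext()
}
$rs.close()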

EDIT:

Alternatively, you could read the log with Import-Csv and save the number of lines processed between runs (you may need a -1 adjustment somewhere; I haven't tested this).

# use Import-CliXML to get the $last_count
$last_count = Import-CliXML $path_to_last_count_xml
$file = Import-Csv filename -Delimiter "`t"
for ($i = $last_count; $i -lt $file.Count; $i++) {
    $line = $file[$i]
    # do something with $line
    ...
}
$file.Count | Export-CliXML $path_to_last_count_xml

# use this to free the memory held by the parsed log
Remove-Variable file
[GC]::Collect()

Since this is a SQL Server error log, you could also use sp_readerrorlog; it accepts a search string, so the filtering can happen on the server instead of in your script.
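For example (untested; the server name and connection string are placeholders) — sp_readerrorlog takes the log number, the log type (1 = error log), and an optional search string:

$conn = New-Object System.Data.SqlClient.SqlConnection('Server=ServerName;Integrated Security=SSPI')
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "EXEC sp_readerrorlog 0, 1, 'Error:'"   # log 0 = current, type 1 = error log
$rdr = $cmd.ExecuteReader()
while ($rdr.Read()) {
    '{0}  {1}' -f $rdr['LogDate'], $rdr['Text']
}
$conn.Close()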


Here is another approach that avoids tracking a file position at all: read the error log through SMO and filter it by the time of the last check.


[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$server = 'ServerName'
$chkDate = Get-Date -Date '8/16/2011 15:00'  # time of last check
$srvObj = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $server
$srvObj.ReadErrorLog(0) | foreach { if ($_.LogDate -notlike '' `
   -and $_.LogDate -ge $chkDate `
   -and $_.Text -like 'Error: *') {$_}} | ft -AutoSize

Each run, save the time of the check somewhere and feed it back in as $chkDate the next time.

(The backticks (`) are line continuations; they are only there so the $srvObj.ReadErrorLog(0) pipeline doesn't run off the page in HTML.)
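One simple way to keep that timestamp between runs (the path is a placeholder):

Get-Date | Export-CliXML 'C:\temp\lastcheck.xml'     # save after a successful check
$chkDate = Import-CliXML 'C:\temp\lastcheck.xml'     # restore on the next run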


There are two options that I would suggest if you really want to go with PowerShell.

  • Use the .NET APIs (see the sketch after this list).

  • Read the entire contents of the log and then clear it. Store the parsed content in a database.
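A minimal sketch of the .NET approach (the state-file and log paths are placeholders). The trick is that StreamReader buffers its reads, so the underlying stream's position is only trustworthy once you have read all the way to end-of-file:

$state = 'C:\temp\logpos.xml'
$pos = if (Test-Path $state) { Import-CliXML $state } else { 0 }

$fs = New-Object System.IO.FileStream('C:\logs\ERRORLOG.log', 'Open', 'Read', 'ReadWrite')
$null = $fs.Seek($pos, [System.IO.SeekOrigin]::Begin)
$reader = New-Object System.IO.StreamReader($fs)
while (($line = $reader.ReadLine()) -ne $null) {
    if ($line.Contains('Error:')) { $line }
}
# at end-of-file the reader's buffer is drained, so the stream
# position is exactly where the next run should resume
$fs.Position | Export-CliXML $state
$reader.Close()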
