Using Log Parser to analyze a batch of logs in different folders

I recently started using Log Parser with a visual interface.

The logs I want to analyze come from IIS, and they relate to SharePoint. For example, I want to know how many people visited certain web pages, and so on.

It seems that IIS creates its logs in different folders (I don't know why), and every day a new log file appears in a different folder.

So my question is: is it possible to query all of these files across the different folders?

I know that you can list several folders in the FROM clause, but that is too cumbersome, especially if new folders are added in the future. The goal is a single script that can be executed as-is.
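(Something like the following illustrates what I mean; the paths are made up, and every new folder forces an edit:)

 SELECT * FROM C:\LogFiles\folder1\*.log, C:\LogFiles\folder2\*.log, C:\LogFiles\folder3\*.log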

So, for example, in a log folder named LogFiles, I have folders folder1, folder2, folder3, folder4, etc., and each folder contains log files log1, log2, log3, ..., logN.

My query should therefore look something like SELECT * FROM path/LogFiles/*/*.log, but Log Parser does not accept that, so how can this be implemented?

+7
5 answers

You can use the -recurse option when calling logparser.

For example:

 logparser file:"query.sql" -i:IISW3C -o:CSV -recurse 

where query.sql contains:

 select * from .\Logs\*.log 

and in my current directory there is a directory called "Logs" which contains several subdirectories, each of which contains log files. For example:

 \Logs\server1\W3SVC1
 \Logs\server1\W3SVC2
 \Logs\server2\W3SVC1
 \Logs\server2\W3SVC2
 etc.
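As an aside, -recurse also takes an optional depth; if I remember correctly, -recurse:-1 descends through all subdirectory levels (this assumes Log Parser 2.2; check logparser -h to confirm):

 logparser file:"query.sql" -i:IISW3C -o:CSV -recurse:-1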
+17

You can merge the logs, then query the merged log.

Here is what I do:

 LogParser.exe -i:w3c "select * into E:\logs\merged.log from E:\logs\WEB35\*, E:\logs\WEB36\*, E:\logs\WEB37\*" -o:w3c 
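Once merged, the actual analysis runs against the single file. For example, counting hits per page could look something like this (a sketch; cs-uri-stem is the IIS W3C field holding the requested page):

 LogParser.exe -i:w3c "SELECT cs-uri-stem, COUNT(*) AS Hits FROM E:\logs\merged.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -o:csv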
+5

I prefer PowerShell, along these lines:

 Select-String C:\Logs\diag\*.log -pattern "/sites/Very" | ?{$_.Line -match "Important"}

or whatever.
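Note that the wildcard above only matches files directly in the diag folder. To reach logs in subfolders, as the question asks, a recursive variant might look like this (a sketch; the root path is assumed):

 Get-ChildItem C:\Logs -Recurse -Filter *.log |
     Select-String -Pattern "/sites/Very" |
     Where-Object { $_.Line -match "Important" }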

+1

The LogParser help does not list the -recurse option, so I'm not sure whether it is still supported. However, here is what I did to work around it:

Let's say you use the following command to run logparser:

 logparser "SELECT * INTO test.csv FROM 'D:\samplelog\test.log'" -i:COM -iProgID:Sample.LogParser.Scriptlet -o:CSV 

Then just create a batch script to "recurse" through the folder structure and parse all the files in it. The batch script looks like this:

 echo off
 for /r %%a in (*) do (
     for %%F in ("%%a") do (
         logparser "SELECT * INTO '%%~nxF'.csv FROM '%%a'" -i:COM -iProgID:Sample.LogParser.Scriptlet
         REM echo %%~nxF
     )
 )

Run it from the path where the log files that you want to analyze are located. It can be further adjusted to dump all the parsed logs into a single file using the append (>>) operator.
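For instance, the logparser line inside the loop could be replaced with something along these lines (a sketch: merged.csv is a made-up name, and -headers:OFF is assumed to suppress the repeated CSV header for each file):

 logparser "SELECT * FROM '%%a'" -i:COM -iProgID:Sample.LogParser.Scriptlet -o:CSV -headers:OFF >> merged.csv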

Hope this helps.

0

Check this out: https://stackoverflow.com/a/166168/2126. It uses PowerShell to recursively retrieve files in subdirectories and parse them.
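In the same spirit, a minimal sketch (folder name and query are assumptions) that hands each discovered file to logparser in turn:

 Get-ChildItem C:\LogFiles -Recurse -Filter *.log | ForEach-Object {
     & logparser "SELECT COUNT(*) AS Requests FROM '$($_.FullName)'" -i:IISW3C -q:ON
 }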

0
