Thursday 19 November 2009

PowerShell vs LogParser

The scenario:

We have an application server (IIS) that is running slowly. The database shows no signs of stress, so I want to review the web server's application event log for any entries that give clues to the issue. Easy: grab the data from the event log and quickly analyse the results. What sort of events, how many of them, and when do they occur?

Two options immediately come to mind: PowerShell and LogParser.

PowerShell I feel I ought to learn: it's embedded in the SSMS UI, so I am guessing Microsoft are going to make it the de facto standard across their server products. There should be good support resources, and it could well become a 'required skill' for certain tasks and even job descriptions.

LogParser is something I have used on and off for years now. There seems to be very little support material for it, but what there is is good quality, starting with the help document that ships with it as a CHM file.

Both tools are command-line based. Sadly, each is its own command line rather than the DOS command window you may be used to, but a command line nonetheless.

So, abandon your mouse and start typing ...

PowerShell:
In PowerShell 1 you can't access remote event logs directly, so a somewhat funky-looking function needs to be created first:
Function Get-RemoteEventLog {
    param (
        [string]$ComputerName = $env:ComputerName,
        [string]$LogName = ''
    )
    # use the .NET EventLog class to get the logs from the target machine
    $evt = [System.Diagnostics.EventLog]::GetEventLogs($ComputerName)

    # if no log name was asked for, return the list of available logs
    if ($LogName -eq '') {
        return $evt
    }
    else {
        # find the index of the requested log and return its entries
        $val = 0
        $logindex = 0
        $evt | ForEach-Object {
            if ($_.Log -eq $LogName) { $logindex = $val }
            $val++
        }
        return $evt[$logindex].Entries
    }
}

and then you can run:
Get-RemoteEventLog -ComputerName <servername> -LogName application | Where-Object {$_.Source -eq 'ASP.NET 2.0.50727.0'} | Out-File -FilePath C:\Server_App_Log.csv

This gets the data into a file of your choosing, ready for importing into your favourite analysis tool.
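As it turns out (see the conclusion below), Out-File just dumps the formatted console view, so the columns are not really delimited. A sketch of an alternative using the Export-Csv cmdlet, selecting a few of the standard EventLogEntry properties (untested here, same <servername> placeholder):

Get-RemoteEventLog -ComputerName <servername> -LogName application | Where-Object {$_.Source -eq 'ASP.NET 2.0.50727.0'} | Select-Object TimeGenerated, EntryType, EventID, Message | Export-Csv -Path C:\Server_App_Log.csv -NoTypeInformation

Export-Csv writes one properly quoted, comma-delimited row per object, which Excel can open directly.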

LogParser
With LogParser we get to start work right away, simply by specifying the target machine in the FROM part of the query:
logparser "SELECT * INTO MyServer_App_Log FROM \\<servername>\application WHERE SourceName = 'ASP.NET 2.0.50727.0'" -o:SQL -database:TargetDBName -server:<servername> -createTable:ON -resolveSIDs:ON -formatMsg:ON -iCheckPoint:ON
Here I have got the data straight into a SQL database, ready for a grilling from T-SQL. Alternatively, I could have sent it to a CSV file for Excel and so on. It has even resolved the SIDs from AD for me!
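For the CSV route, only the output side of the command changes. Something like this should do it (a sketch; the file name is my own choice, and the field list comes from LogParser's standard EVT input format):

logparser "SELECT TimeGenerated, EventID, EventTypeName, SourceName, Message INTO Server_App_Log.csv FROM \\<servername>\application WHERE SourceName = 'ASP.NET 2.0.50727.0'" -i:EVT -o:CSV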


Conclusion/Opinion

PowerShell took over two minutes to get the data into the file, and I still have to get that into something I can analyse it with. It's not formatted for Excel to analyse from the get-go: the columns are not delimited and the messages are truncated. It isn't fit for purpose.

LogParser took 13 seconds to get the data into a table where I can get T-SQL working on it instantly. It would be no slower into a CSV file, which I could double-click to open and interrogate in Excel or Calc (OpenOffice).
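To illustrate the kind of grilling T-SQL makes instant, a query counting events per hour by type might look like this (the column names assume LogParser's standard EVT fields landed in the table as-is):

SELECT EventID, EventTypeName,
       DATEADD(hour, DATEDIFF(hour, 0, TimeGenerated), 0) AS HourBucket,
       COUNT(*) AS Occurrences
FROM MyServer_App_Log
GROUP BY EventID, EventTypeName, DATEADD(hour, DATEDIFF(hour, 0, TimeGenerated), 0)
ORDER BY Occurrences DESC;

The DATEADD/DATEDIFF pair truncates each timestamp to the start of its hour, which answers the "when do they occur?" question from the scenario.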

It seems to me that I still don't have a reason to go digging deep into PowerShell; to get my job done I have other tools that are faster and simpler.

This opinion may change after I have attended the PowerShell session at SQLBits this coming weekend. If it does, I'll let you know.

Resources:
If you want to try one or the other then here are some good starting points.
LogParser
Download - http://www.microsoft.com/downloads/details.asp
Support - LogParser
Background - Wikipedia entry with lots of links

PowerShell
Download - PowerShell Blog
Support - PowerShell Blog

1 comment:

/\/\o\/\/ said...

For the formatting of the CSV file, try using the Export-Csv cmdlet instead of Out-File.

Greetings /\/\o\/\/
