Sunday 22 November 2009

SQLBits round-up (The free bit)

OK, so I couldn't get to any of SQLBits other than the free Saturday - yup, they do a whole day for free. All you need do is register in time at www.sqlbits.com and then turn up. This was the fifth SQLBits and it was held at the Celtic Manor Hotel, Newport, Wales. A very impressive hotel, with plenty of staff keeping on top of all the debris that 300+ DBAs generate while they eat bacon/egg/sausage baps, tea/coffee, cookies, croissants, fruit cake and assorted packed lunches (I did mention this was free, didn't I?).

Putting the excellent facilities to one side for a while, there were five rooms where sessions were taking place and each room had 5 sessions through the day, so my primary school maths tells me that there were 25 sessions to select from. Make that 30, because the lunch break had sponsored sessions in each of the rooms too. The full listing is here and I had a lot of awkward decisions to make on what I wanted to see.

In the end I elected to go for a performance/tuning flavoured day so, in order of attendance, I went to:
>"When a query plan goes wrong" - Simon Sabin
>"Let's Make SQL fly - a session for developers and administrators who believe in magic" - R Meyyappan
>"Using Perfmon and Profiler" - Brent Ozar
>Quest sponsored lunch showing what to monitor and how their products Foglight and Spotlight can help - Brent Ozar
>"Vital Statistics" - Christian Bolton
>"SQL Server Optimisation stuff you won't find on Google (yet)" - Andre Kamman

Simon's session got things off to a great start, looking at how query plans affect the performance of queries and how the plan available isn't necessarily the one that should be used, so SQL Server sometimes needs a hand to make a different choice. Simon carried the topic very well, demonstrating his points clearly and interacting with the audience consistently. I was handicapped in this session as the goodie bag I received didn't have a pen in it, so I couldn't take any notes.

After a short break, the second session demonstrated a curious effect of how the number of rows requested in a query affects its duration. Well, obviously it does, but the example had a difference of 1 row and a difference in duration from 25ms to 900ms! It turns out it's all to do with the data type and the SQL engine using tempdb unnecessarily. Applying the fix Ramesh showed, there was no reason to accept the slower performance. As he stressed, find out what the problem is and then the solution is easy! Contact Ramesh at www.sqlworkshops.com.

Far and away the most enthusiastic presenter of the day, Brent Ozar took us through how to use free tools that are already on every server to create baselines of current performance, how to compare these over time to measure the effects of system changes and, when something spikes, what the causes may be. It's all on his blog at www.brentozar.com.

Brent jumped out of this session and straight into the sponsored lunch session, telling us how Quest tools can take the hard work out of monitoring server performance and go some way to pinpointing the actual query that is causing a problem and what to do to solve it. Foglight and Spotlight are the tools they produce to help the DBA make sure their servers are performing as well as they can.

Christian Bolton took on the topic of SQL Server table statistics in the first session after lunch and explained what part statistics play in the efficiency of SQL queries, how to keep statistics up to date and how different versions of SQL Server handle statistics differently. We also got an insight into how the "cost" value in execution plans is calculated and what it means.
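
As an aside, and purely my own notes rather than anything from Christian's slides (the table and index names below are made up), the sort of T-SQL involved in checking and refreshing statistics looks like this:

-- When were the statistics on a table last updated? ('dbo.Orders' is a hypothetical table)
SELECT [name] AS [Statistic] ,
       STATS_DATE([object_id], [stats_id]) AS [Last updated]
FROM [sys].[stats]
WHERE [object_id] = OBJECT_ID('dbo.Orders');

-- Look at the histogram behind one statistic (hypothetical index name)
DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_CustomerRef');

-- Refresh all statistics on the table with a full scan
UPDATE STATISTICS [dbo].[Orders] WITH FULLSCAN;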

The final session I attended was by Andre Kamman and Henk van der Valk (Unisys), showing how SQL Server 2008 scales up when pushed. On a system with 96 cores and 512GB of RAM they achieved some truly staggering data transfer speeds for ETL processes. SSIS outperformed bcp, BULK INSERT and all the other alternatives. There were some nice tweaks that they shared with us; setting the packet size in the BIOS, for example, gives a huge boost to transfer speed. Sadly this session was marred by the absence of a working internet connection and the live examples were replaced with videos from previous events.

A lot of attendees gathered for the Red Gate Software-sponsored drinks reception and games in the bar afterwards. There was a quick prize-giving for the various competitions that had been taking place at the vendors' stands through the day. The lucky winners got prizes ranging from software licenses to webcams, and one person won an Xbox. The driving and football games were popular, as was the Rock Band session in one of the meeting rooms, while others simply had a drink and a chat until dinner. Sadly I had to leave to drive home, but I will certainly be making the effort to attend next time and hopefully for more than just one day.

I would like to thank all of those who gave up their time and talent to make this conference happen; you are a credit to the industry.

Did I mention this was all free?

Thursday 19 November 2009

PowerShell vs LogParser

The scenario:


We have an application server (IIS) that is running slowly. The database shows no signs of stress, so I want to review the web server's application log to see if there are any entries that give any clues as to the issue. Easy: grab the data from the event log and quickly analyse the results - what sort of events, how many of them, when do they occur?

I have two options immediately in my mind - PowerShell and LogParser.

PowerShell is something I feel I ought to learn: it's embedded in the SSMS UI, so I am guessing that Microsoft are going to make it the de facto standard across their server products, there should be good support resources, and it could well become a requirement for certain tasks and even job descriptions.

LogParser is something I have used on and off for years now. There seems to be very little support for it, but what there is is good quality, starting with the help document that ships with it as a CHM file.

Both tools are command line - sadly, each has its own command line rather than the DOS command window you may be used to, but a command line nonetheless.

So, abandon your mouse and start typing ...

PowerShell:
In PowerShell 1 you can't access remote logs without some funky-looking function, so this needs to be created:
Function Get-RemoteEventLog {
    param (
        [string]$ComputerName = $env:ComputerName,
        [string]$LogName = ''
    )
    # Use the .NET EventLog class to list the event logs on the target machine
    $evt = [System.Diagnostics.EventLog]::GetEventLogs($ComputerName)

    # If no log name was asked for, return the list of logs
    if ($LogName -eq '') {
        return $evt
    }
    else {
        # Otherwise find the index of the requested log and return its entries
        $val = 0
        $evt | ForEach-Object {
            if ($_.Log -eq $LogName) {
                $logindex = $val
            }
            $val = $val + 1
        }
        return $evt[$logindex].Entries
    }
}

and then you can run:
Get-RemoteEventLog -ComputerName <servername> -LogName application | Where-Object {$_.Source -eq 'ASP.NET 2.0.50727.0'} | Out-File -FilePath C:\Server_App_Log.csv

This gets the data into a csv file of your choosing, ready for importing into your favourite analysis tool.

LogParser
With LogParser we get to start work right away, simply by specifying the target machine in the FROM part of the query:
logparser "SELECT * into MyServer_App_Log from \\<servername>\application where sourcename = 'ASP.NET 2.0.50727.0'" -o:sql -database:TargetDBName -server: -createtable:on -resolveSIDs:on -formatmsg:on -icheckpoint:on
Here I have got the data straight into a SQL database, ready for a grilling from T-SQL; alternatively I could have it in a CSV file for Excel and so on... It has even resolved the SIDs from AD for me!
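
To give an idea of the grilling I mean, a first pass at the questions in the scenario (what sort of events, how many and when) might look like the query below. This is a sketch only - the table name comes from the command above and the column names are the usual Log Parser event log fields, so check them against what -createtable actually generated.

-- Summarise the imported log: event type, source and hour, busiest first
SELECT EventTypeName ,
       SourceName ,
       CONVERT(VARCHAR(13), TimeGenerated, 120) AS [Hour] , -- yyyy-mm-dd hh
       COUNT(*) AS [Events]
FROM MyServer_App_Log
GROUP BY EventTypeName ,
         SourceName ,
         CONVERT(VARCHAR(13), TimeGenerated, 120)
ORDER BY [Events] DESC;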


Conclusion/Opinion



PowerShell took 2 minutes+ to get the data into the CSV file, and I still have to get that into something I can analyse it with. It's not formatted for Excel to analyse from the get-go: the columns are not delimited and the messages are truncated. It isn't fit for purpose.

LogParser took 13 seconds to get it into a table where I can get T-SQL working on it instantly, and it would be no slower into a CSV file where I could double-click to get Excel or Calc (OpenOffice) to open it and interrogate it.

It seems to me that I still don't have a reason to go digging deep into PowerShell - to get my job done I have other tools that are faster and simpler.

This opinion may change after I have attended the PowerShell session at SQLBits this coming weekend. If it does I'll let you know.

Resources:
If you want to try one or the other then here are some good starting points.
LogParser
Download - http://www.microsoft.com/downloads/details.asp
Support - LogParser
Background - Wikipedia entry with lots of links

PowerShell
Download - PowerShell Blog
Support - PowerShell Blog

Monday 9 November 2009

Give SQL Server 2008 R2 CTP a Quick Try for Free

Extracted from http://bradmcgehee.com/2009/11/give-sql-server-2008-r2-ctp-a-quick-try-for-free/


You have probably heard that the SQL Server 2008 R2 CTP is available, but you may not have given it a try yet because it is such a pain to download the huge install file, find a test server to run it on, and then install it. Under the best of conditions, this is at least a 3 hour job.


Now, if you want to check out the new features of SQL Server 2008 R2 (the August 2009 CTP), you can without going through all this hassle. Instead, you can go to www.sqlserverbeta.com, create a virtual instance of the software, and try it out, all for free.


All you have to do is fill out a short registration form, and shortly thereafter, you will have your own personal instance of SQL Server 2008 R2 to play and experiment with.  So now, you don’t have any excuse to learn about, and try out for yourself, all the cool new features in R2.


This offer is sponsored by PASS, Microsoft, Dell, and MaximumASP.




"

Wednesday 4 November 2009

A relevant choice of Index options?

I have recently had reason to review the indexing of a table as a query was taking a lot longer to run than I would have expected.

My first move was to check out the estimated execution plan for the SQL that the view was based on. It turns out that a large chunk of effort in the query was an index scan on a table that had 3 indexes. The indexes' details were: a primary key (unique, clustered), a single-column index (based on one of the most common client references in our system) and a third index on some key columns that describe our clients. The primary key column was an INT IDENTITY column - basically just a unique ID in the table for absolute reference to a single row if needed. It plays no part in our business.

Have you spotted what's wrong here?

Exactly - having the ID column as the CLUSTERED index means the table is sorted on disk in the order of that column. This is like having a library sort its books by the height of the respective authors - useless when you need to find anything. You would have to run up and down the shelves to find anything you need.

I changed the PRIMARY KEY to be non-clustered and then changed the single-column index to be the CLUSTERED index for the table, and the performance of the query went from 7s down to 1s (YMMV). WIN!!
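
For anyone wanting to make the same swap, the shape of the change is below. This is a sketch only, with hypothetical table, column and index names, and it assumes nothing else (foreign keys, for instance) depends on the existing primary key.

-- 1. Drop the clustered primary key (hypothetical names throughout)
ALTER TABLE [dbo].[Clients] DROP CONSTRAINT [PK_Clients];

-- 2. Put the primary key back as non-clustered
ALTER TABLE [dbo].[Clients]
    ADD CONSTRAINT [PK_Clients] PRIMARY KEY NONCLUSTERED ([ID]);

-- 3. Drop the old single-column non-clustered index and recreate it as the clustered index
DROP INDEX [IX_Clients_ClientRef] ON [dbo].[Clients];
CREATE CLUSTERED INDEX [IX_Clients_ClientRef] ON [dbo].[Clients] ([ClientRef]);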

This got me thinking whether this may be a scenario that is repeated in other tables or other databases on our servers, so I have written the query below to track down indexes that are clustered primary keys and return the table/column name and the data type of that column. Simply reviewing the index name (columns called ID are suspicious!) and whether it's an INT IDENTITY column means I can check out anything that looks awry and improve our systems' performance.

SELECT
    OBJECT_NAME([i].[object_id]) AS [Table] ,
    [i].[name] AS [Index name] ,
    [c].[name] AS [Column name] ,
    [t].[name] AS [Column type] ,
    [c].[is_identity]
FROM
    [sys].[indexes] AS [i]
INNER JOIN [sys].[index_columns] AS [ic]
    ON [ic].[object_id] = [i].[object_id]
    AND [ic].[index_id] = [i].[index_id]
INNER JOIN [sys].[columns] AS [c]
    ON [c].[object_id] = [ic].[object_id]
    AND [c].[column_id] = [ic].[column_id]
INNER JOIN [sys].[types] AS [t]
    ON [t].[user_type_id] = [c].[user_type_id] -- user_type_id avoids duplicate rows for aliased types such as sysname
WHERE
    [i].[is_primary_key] = 1
    AND [i].[type] = 1 -- 1 = clustered

Personally, where the column type is an INT and is_identity is true, I am off to look into making similar changes across our servers...

Monday 2 November 2009

Opera v10 is out. Go get it and enjoy the web at least 35% more. http://www.opera.com/


(Please note: your experience may not be exactly 35% better, but it will be better than with any other browser)
