Sunday, 28 February 2010

Snapshot Isolation level

OK, I've only got a minute to put this up, but I have used PowerShell to help someone with a problem. It wasn't the only way, and may not have been the simplest way, but it certainly worked. A question was posted asking "How can I tell if Snapshot Isolation is turned on?", and my reaction was to see if PowerShell could tell me what the Snapshot Isolation setting was on my SQL databases.

The easiest way to get to PowerShell from SSMS is to right-click in the Object Explorer and then choose the PowerShell link in the menu. This opens a PowerShell window at the level that you clicked. Simply typing in
Get-ChildItem | Select-Object Name, SnapshotIsolationState
returns the information that you want for the server you are connected to. Simple, quick and useful. Maybe this PowerShell thing has its uses for a SQL database administrator after all?!
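If you'd rather stay in T-SQL, the same information is exposed through the sys.databases catalog view (the columns below are available from SQL Server 2005 onwards):

```sql
-- Check the snapshot isolation setting for every database on the instance
SELECT name,
       snapshot_isolation_state_desc,   -- OFF / ON / IN_TRANSITION_TO_ON / IN_TRANSITION_TO_OFF
       is_read_committed_snapshot_on    -- the related READ_COMMITTED_SNAPSHOT setting
FROM sys.databases;
```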

Please be sure to read the code disclaimer page before you use any code in this blog.

Friday, 26 February 2010

PowerShell - my first steps

So I asked this question and the consensus of opinion seems to be that PowerShell should be in my future. I have a few books that I am deciding between, but have downloaded some eBooks and installed some new applications. It was a real hassle getting PS v2 installed as it seems to conflict with PS v1. I have downloaded PowerShell Analyzer from Shell Tools and it works very well. Rather than just the PS command line, you get a full IDE that gives a lot of support to someone new to PS.

As a small nugget of what I have covered so far, have a look at this little section of PowerShell code:

Get-Process *sql* | Select-Object Name, CPU, VirtualMemorySize

This will return a list of processes on the local machine that have "sql" in their name, along with their CPU usage and the virtual memory they are using.
Which for me tells me this:

Name                 CPU        VirtualMemorySize
----                 ---        -----------------
SQL Response Client  413.03125          223125504
SQLAGENT             48.1875            138895360

neat huh?
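To take that one step further: the raw VirtualMemorySize figure is in bytes, which is hard to read at a glance. A calculated property can convert it on the fly (this is just a sketch of the idea, using the same property names as above):

```powershell
# List SQL-related processes with virtual memory shown in MB rather than bytes
Get-Process *sql* |
    Select-Object Name, CPU,
        @{Name = 'VirtualMemoryMB'; Expression = { [math]::Round($_.VirtualMemorySize / 1MB, 1) }}
```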

Monday, 22 February 2010

People You May Know (Linked In version)

That's the heading of a little box on my LinkedIn page (if you have never heard of LinkedIn before, it's like Facebook for business people, but with fewer photos).

Now, lots of 'social networking' sites have a way of finding other people you know that you haven't yet connected with. It makes sense: the more contacts you have on their site, the more traffic will increase, as there is more chance of interaction. Some use your email address book for their search; others look at your history - education, work etc. - and even the history of some of your current contacts, so that people you have something in common with are suggested as potential friends.

LinkedIn is different.

How many people do you know? Let me clarify that: how many people have you met, spent some time with and remember their name? 50? 100? 200? 500? There are lots. When you stop to think about family, school friends, neighbours, work colleagues, those people you met on holiday in Tenerife and so on... Out of those people, how many had your name or something similar (Dave Gorman need not answer this)? In my case - none. In your case, I'm willing to bet it's fewer than two in the majority of cases.

Well, LinkedIn seem to think that despite having the education and career history of thousands(#) of people on their records, the best people that they can match to me all have the same name as me. Today my suggestions are Jon Allen, Jonathan Allen, Jon Allen, Jonathan Allen, Jonathan Allen, Jon Allen and Jonathan Allen, all from America or Mexico, and none of them have any career connections in common with me - I program computers for a living and most of them seem to be involved with banking in some way.

LinkedIn, I say this about your friends/contacts suggestion method: it's rubbish.

Now, rant over, let's see what rubbish this Monday brings me.

(#) - Edit - an email from LinkedIn says they have 60 million members.

Monday, 8 February 2010

Keeping your database files tidy

Behind every database there are database files; traditionally they are denoted by one of 3 file extensions - mdf is the main data file, ldf is the log file and ndf is used for any additional data files. One mdf and one ldf are the minimum requirement for a SQL Server database.

These files are locked by SQL Server when it is running, thus ensuring that updates only go through the database engine and are therefore managed correctly. It is possible to separate SQL Server and the database files in a number of ways, one of which is to detach the database using sp_detach_db. This removes the database from the SQL Server and releases the database files to be accessed by any other process that can reach them. Perhaps you were moving the database from one server to another, but whatever the circumstance, it is possible to leave the database files on your server and forget they are there.

This means a number of things:
1 - the files are taking up HDD space and potentially being backed up by your nightly processes (you are backing up regularly, aren't you?)
2 - the files can be accessed by any editor that can get to the folder location where they are stored. Notepad can open an mdf file. Don't believe me?

Open up your favourite T-SQL editor and run this:
CREATE DATABASE DETACHTEST
GO

USE DETACHTEST
GO

CREATE TABLE TestTable
(MyName varchar(30))

INSERT INTO TestTable
SELECT 'Fatherjack'

SELECT * FROM TestTable

USE master
GO

EXEC sp_detach_db 'DETACHTEST'

That will have created a database, created a table in it, inserted data and then detached the database.

Now open up Windows Explorer and go to your default SQL data folder; in there you will find a file called DETACHTEST.mdf. Open that in Notepad or any other text editor and search for the string "Fatherjack". There it is, in plain text. I agree it's not readily usable, and a database of hundreds of tables and millions of rows will make data extraction difficult without a SQL Server, but it is wholly possible and therefore wholly insecure to have these files lying around on your server.

To find any of these files that may have got left behind we need a process to locate and remove them. My preferred method to do this involves a small script to be run in LogParser and then a nifty join to some live SQL data and a look at the date the files were last used.

Here is the LogParser script to get the file name, location, last access time and last write time from any folder up to 10 levels down from the root on the C:\ and D:\ drives where the file extension is either mdf or ldf:
logparser "SELECT path, name, lastaccesstime, lastwritetime into datafiles from '\\London\d$\*.mdf', '\\London\c$\*.mdf','\\London\c$\*.ldf','\\London\d$\*.ldf'" -i:fs -recurse:10 -o:sql -database:AdminDB -server:SQLLondon -createtable:on -e:10

(Where London is the Windows server and SQLLondon is the SQL Server instance installed.)
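If you don't have LogParser to hand, PowerShell can gather much the same file information - this is a sketch, so adjust the UNC paths to suit your own server:

```powershell
# Find mdf/ldf files on the server's C: and D: drives and list when they were last touched
Get-ChildItem -Path '\\London\c$\', '\\London\d$\' -Recurse -Include *.mdf, *.ldf -ErrorAction SilentlyContinue |
    Select-Object FullName, LastAccessTime, LastWriteTime
```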

Here is the SQL to work with the data that LogParser retrieves and give us a hit list of files to track down and deal with:
USE [AdminDB]
GO

SELECT [d].[Path],
       [d].[Name],
       [d].[LastAccessTime],
       COALESCE(DB_NAME([mf].[database_id]), 'Not attached to a database. ***') AS [Database Name]
FROM [dbo].[datafiles] AS d
LEFT JOIN [master].[sys].[master_files] AS mf
    ON REPLACE(REPLACE([d].[Path], '\\London\d$\', 'd:\'), '\\London\c$\', 'c:\') = [mf].[physical_name]
UNION ALL
SELECT '-----',
       '-----',
       DATEADD(d, -1, GETDATE()), -- 24 hours ago
       '--- Items above here not accessed in the last day ---'
ORDER BY [LastAccessTime],
         [Database Name]
The resulting data will be a list of database files that have not been accessed recently and whether they are attached to a database on the server. You may wish to adjust the value I have used in the lower part of the query to suit your environment - 24 hours may not be appropriate for you.
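Once you have the hit list, a file that should still be in use can be put back under SQL Server's control with CREATE DATABASE ... FOR ATTACH (the file name here is hypothetical); anything else can be archived or deleted:

```sql
-- Reattach a stray data file; supply the matching ldf in a second FILENAME clause
-- if it still exists, or use FOR ATTACH_REBUILD_LOG if the log file is gone
CREATE DATABASE [DETACHTEST]
ON (FILENAME = 'D:\DETACHTEST.mdf')
FOR ATTACH;
```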

