Thursday, 10 December 2009

Rememba Da Schema

There are loads of great scripts out there on other blogs and elsewhere that help you administer your database and run a tight ship, but a lot don't show how to account for the database schema. It's not uncommon to have everything in the dbo schema, but plenty of the databases I manage also have objects in other schemas, and there are a couple of extra steps to take to make a funky new admin script work in all cases.

Equipment you will need.
We will be using sys.schemas and the "[" and "]" brackets to make our scripts properly useful.

So, if you wanted to see all the columns in your database that are nullable you might run a script something like this:
USE [adventureworks]
GO

SELECT
    [t2].[name] AS [Table Name] ,
    [c].[name] AS [Column Name] ,
    [t].[name] AS [Data Type] ,
    [c].[is_nullable]
FROM
    [sys].[columns] AS c
    INNER JOIN [sys].[types] AS t
        ON [c].[user_type_id] = [t].[user_type_id] -- join on user_type_id to avoid duplicate rows for alias types
    INNER JOIN [sys].[tables] AS t2
        ON [c].[object_id] = [t2].[object_id]
WHERE
    [c].[is_nullable] = 1

Now, if you use this list to feed an automated process with object names that it is going to work on, it could fail when it tries to use the table and column names in dynamic SQL - especially if, as I do, you have to manage databases that got slung together with table and column names that have spaces in them.
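To illustrate the sort of failure I mean, here is a contrived sketch (the table name is hypothetical) of dynamic SQL tripping over a space in a name until the name is bracketed:

DECLARE @sql NVARCHAR(MAX)

-- fails with a syntax error because of the spaces in the name
SET @sql = N'SELECT COUNT(*) FROM ' + N'Sales Order Detail'
--EXEC (@sql)

-- works once the name is wrapped in square brackets
SET @sql = N'SELECT COUNT(*) FROM ' + N'[Sales Order Detail]'
--EXEC (@sql)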

To get around this we need to alter the above script a little:
USE [adventureworks]
GO

SELECT
    [s].[name] AS [Schema Name] ,
    [t2].[name] AS [Table Name] ,
    [c].[name] AS [Column Name] ,
    [t].[name] AS [Data Type] ,
    [c].[is_nullable]
FROM
    [sys].[columns] AS c
    INNER JOIN [sys].[types] AS t
        ON [c].[user_type_id] = [t].[user_type_id]
    INNER JOIN [sys].[tables] AS t2
        ON [c].[object_id] = [t2].[object_id]
    INNER JOIN [sys].[schemas] AS s
        ON [t2].[schema_id] = [s].[schema_id]
WHERE
    [c].[is_nullable] = 1

Right, we can now see that all these objects are in different schemas, but to refer to a particular column we need to format the data slightly differently. We simply wrap each of the first three columns in square brackets and concatenate them together:

USE [adventureworks]
GO

SELECT
    '[' + [s].[name] + '].[' + [t2].[name] + '].[' + [c].[name] + ']' AS [explicitly referenced column] ,
    [t].[name] AS [Data Type] ,
    [c].[is_nullable]
FROM
    [sys].[columns] AS c
    INNER JOIN [sys].[types] AS t
        ON [c].[user_type_id] = [t].[user_type_id]
    INNER JOIN [sys].[tables] AS t2
        ON [c].[object_id] = [t2].[object_id]
    INNER JOIN [sys].[schemas] AS s
        ON [t2].[schema_id] = [s].[schema_id]
WHERE
    [c].[is_nullable] = 1

Now you have a list of all your database objects that can be passed into the functions, procedures and other tools that you use to look after your databases.
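As an aside, the built-in QUOTENAME function will do the bracket-wrapping for you, and it also copes with the rare case of an object name that contains a "]". A minimal sketch of the same idea using it:

SELECT
    QUOTENAME([s].[name]) + '.' + QUOTENAME([t2].[name]) + '.' + QUOTENAME([c].[name]) AS [explicitly referenced column]
FROM
    [sys].[columns] AS c
    INNER JOIN [sys].[tables] AS t2
        ON [c].[object_id] = [t2].[object_id]
    INNER JOIN [sys].[schemas] AS s
        ON [t2].[schema_id] = [s].[schema_id]
WHERE
    [c].[is_nullable] = 1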

Sunday, 22 November 2009

SQLBits round-up (The free bit)

OK, so I couldn't get to any of SQLBits other than the free Saturday - yup, they do a whole day for free. All you need do is register in time at www.sqlbits.com and then turn up. This was the fifth event and was held at the Celtic Manor Hotel, Newport, Wales. A very impressive hotel, with plenty of staff keeping on top of all the debris that 300+ DBAs generate while they eat bacon/egg/sausage baps, tea/coffee, cookies, croissants, fruit cake and assorted packed lunches (I did mention this was free, didn't I?).

Putting the excellent facilities to one side for a while, there were five rooms where sessions were taking place and each room had five sessions through the day, so my primary school maths tells me there were 30 sessions to select from. Yes, 30 - because the lunch break had sponsored sessions in each of the rooms too. The full listing is here and I had a lot of awkward decisions to make on what I wanted to see.

In the end I elected to go for a performance/tuning flavoured day so, in order of attendance, I went to
>"When a query plan goes wrong" - Simon Sabin
>"Let's Make SQL fly - a session for developers and administrators who believe in magic" - R Meyyappan
>"Using Perfmon and Profiler" - Brent Ozar
>Quest sponsored lunch showing what to monitor and how their products Foglight and Spotlight can help - Brent Ozar
>"Vital Statistics" - Christian Bolton
>"SQL Server Optimisation stuff you won't find on Google (yet)" - Andre Kamman

Simon's session got things off to a great start, looking at how query plans affect the performance of queries, and how sometimes the plan chosen isn't necessarily the one that should be used and SQL Server needs a hand to make a different choice. Simon carried the topic very well, demonstrating his points clearly and interacting with the audience throughout. I was handicapped in this session as the goodie bag I received didn't have a pen in it, so I couldn't take any notes.

After a short break, the second session demonstrated a curious effect of how the number of rows requested in a query affects its duration. Well, obviously it does, but the example had a difference of one row and a difference in duration from 25ms to 900ms! It turns out it's all down to the data type and the SQL engine using tempdb unnecessarily. Applying the fix Ramesh showed, there was no reason to accept the slower performance. As he stressed, find out what the problem is and then the solution is easy! Contact Ramesh at www.sqlworkshops.com.

Far and away the most enthusiastic presenter of the day, Brent Ozar took us through how to use free tools that are already on every server to create baselines of current performance, how to compare these over time to measure the effects of system changes, and, when something spikes, what the causes may be. It's all on his blog at www.brentozar.com.

Brent jumped out of this session and straight into the sponsored lunch session, telling us how Quest tools can help take the hard work out of monitoring server performance and go some way to pinpointing the actual query that is causing a problem and what to do to solve it. Foglight and Spotlight are the tools they produce to help the DBA make sure their servers are performing as well as they can.

Christian Bolton took on the topic of SQL Server table statistics in the first session after lunch and explained what part statistics play in the efficiency of SQL queries, how to maintain up-to-date statistics, and how different versions of SQL Server handle statistics differently. We also got an insight into how the "cost" value in execution plans is calculated and what it means.

The final session I attended was by Andre Kamman and Henk van der Valk (Unisys), showing how SQL Server 2008 scales up when pushed. On a system with 96 cores and 512GB of RAM they achieved some truly staggering data transfer speeds for the ETL process. SSIS outperformed bcp, BULK INSERT and all the other alternatives. There were some nice tweaks that they shared with us - setting the packet size in the BIOS, for example, gives a huge boost to transfer speed. Sadly the session was marred by the absence of a working internet connection and the live examples were replaced with videos from previous events.

A lot of attendees gathered for the Red Gate-sponsored drinks reception and games in the bar afterwards. There was a quick prize-giving for the various competitions that had been taking place at the vendors' stands through the day. The lucky winners got prizes ranging from software licenses to web cams, and one person won an Xbox. The driving and football games were popular, as was the Rock Band session in one of the meeting rooms, while others simply had a drink and a chat until dinner. Sadly I had to leave to drive home, but I will certainly be making the effort to attend next time, and hopefully for more than just one day.

I would like to thank all of those who gave up their time and talent to make this conference happen - you are a credit to the industry.

Did I mention this was all free?

Thursday, 19 November 2009

PowerShell vs LogParser

The scenario:


We have an application server (IIS) that is running slowly. The database shows no signs of stress, so I want to review the web server's application log to see if there are any entries that give clues as to the issue. Easy - grab the data from the event log and quickly analyse the results: what sort of events, how many of them, and when do they occur?

I have two options immediately in my mind - PowerShell and LogParser.

PowerShell I feel I ought to learn; it's embedded in the SSMS UI, so I am guessing that Microsoft are going to make it the de facto standard across their server products. There should be good support resources, and it could well become a requirement for certain tasks and even job descriptions.

LogParser is something I have used on and off for years now. There seems to be very little support for it, but what there is is good quality, starting with the help document that ships with it as a .chm file.

Both tools are command line - sadly, each has its own command line rather than the DOS command window you may be used to, but a command line nonetheless.

So, abandon your mouse and start typing ...

PowerShell:
In PowerShell 1.0 you can't access remote logs without a funky-looking function, so this needs to be created first:
Function Get-RemoteEventLog {
    param ([string]$Computername = $env:ComputerName, [string]$Logname = 0)

    # connect .NET class to get eventlogs
    $evt = [System.Diagnostics.EventLog]::GetEventLogs("$Computername")

    # check if logname is asked
    if ($logname -eq 0) {
        return $evt
    }
    Else {
        $val = 0
        $evt | foreach {
            if ($_.log -contains $logname) {
                $logindex = $val
            }
            $val = $val + 1
        }
        return $evt[$logindex].entries
    }
}

and then you can run:
get-RemoteEventLog -ComputerName -LogName application | where-object {$_.source -eq 'ASP.NET 2.0.50727.0'}| out-file -filepath C:\Server_App_Log.csv

This gets the data into a csv file of your choosing, ready for importing into your favourite analysis tool.

LogParser
With LogParser we get to start work right away, simply by specifying the target machine in the FROM part of the query:
logparser "SELECT * into MyServer_App_Log from \\<servername>\application where sourcename = 'ASP.NET 2.0.50727.0'" -o:sql -database:TargetDBName -server: -createtable:on -resolveSIDs:on -formatmsg:on -icheckpoint:on
Here I have got the data straight into a SQL database, ready for a grilling from T-SQL; alternatively I could have sent it to a csv file for Excel, and so on... It has even resolved the SIDs from AD for me!
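To give an idea of the grilling, here is a minimal sketch of the kind of query you might run against the imported table (the table name comes from the INTO clause above; the columns are LogParser's standard event log fields, so treat the whole thing as illustrative):

-- count the imported events by source and hour to see when the trouble happens
SELECT
    [SourceName] ,
    [EventTypeName] ,
    DATEADD(HOUR, DATEDIFF(HOUR, 0, [TimeGenerated]), 0) AS [Hour] ,
    COUNT(*) AS [Events]
FROM
    [dbo].[MyServer_App_Log]
GROUP BY
    [SourceName] ,
    [EventTypeName] ,
    DATEADD(HOUR, DATEDIFF(HOUR, 0, [TimeGenerated]), 0)
ORDER BY
    [Hour] ,
    [Events] DESC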


Conclusion/Opinion



PowerShell took 2 minutes+ to get the data into the csv file, and I still have to get that into something I can analyse it with. It's not formatted for Excel to analyse from the get-go: the columns are not delimited and the messages are truncated. It isn't fit for purpose.

LogParser took 13 seconds to get it into a table where I can get T-SQL working on it instantly; it would be no slower into a csv file, where I could double-click to get Excel or Calc (OpenOffice) to open it and interrogate it.

It seems to me that I still don't have a reason to go digging deep into PowerShell - to get my job done I have other tools that are faster and simpler.

This opinion may change after I have attended the PowerShell session at SQLBits this coming weekend. If it does I'll let you know.

Resources:
If you want to try one or the other then here are some good starting points.
LogParser
Download - http://www.microsoft.com/downloads/details.asp
Support - LogParser
Background - Wikipedia entry with lots of links

PowerShell
Download - PowerShell Blog
Support - PowerShell Blog

Monday, 9 November 2009

Give SQL Server 2008 R2 CTP a Quick Try for Free

Extracted from http://bradmcgehee.com/2009/11/give-sql-server-2008-r2-ctp-a-quick-try-for-free/


You have probably heard that the SQL Server 2008 R2 CTP is available, but you may not have given it a try yet because it is such a pain to download the huge install file, find a test server to run it on, and then install it. Under the best of conditions, this is at least a 3 hour job.


Now, if you want to check out the new features of SQL Server 2008 R2 (the August 2009 CTP), you can without going through all this hassle. Instead, you can go to www.sqlserverbeta.com, create a virtual instance of the software, and try it out, all for free.


All you have to do is fill out a short registration form, and shortly thereafter you will have your own personal instance of SQL Server 2008 R2 to play and experiment with. So now you don't have any excuse not to learn about, and try out for yourself, all the cool new features in R2.


This offer is sponsored by PASS, Microsoft, Dell, and MaximumASP.

Wednesday, 4 November 2009

A relevant choice of Index options?

I have recently had reason to review the indexing of a table as a query was taking a lot longer to run than I would have expected.

My first move was to check out the estimated execution plan for the SQL that the view was based on. It turned out that a large chunk of effort in the query was an index scan on a table that had three indexes. The indexes were: a primary key (unique, clustered), a single-column index (based on one of the most common client references in our system) and a third index on some key columns that describe our clients. The primary key column was an INT IDENTITY column - basically just a unique ID in the table for absolute reference to a single row if needed. It plays no part in our business.

Have you spotted what's wrong here?

Exactly - having the ID column as a CLUSTERED index means the table is sorted on disk in the order of that column. This is like having a library sort its books by the height of the respective authors: useless when you need to find anything. You would have to run up and down the shelves to find anything you need.

I changed the PRIMARY KEY to be non-clustered and then changed the single-column index to be the CLUSTERED index for the table, and the performance of the query went from 7s down to 1s (YMMV). WIN!!

This got me thinking about whether this may be a scenario that is repeated in other tables or other databases on our servers, so I have written the query below to track down indexes that are clustered primary keys; it returns the table/column name and the data type of that column. Simply reviewing the index name (columns called ID are suspicious!) and whether it's an INT IDENTITY column means I can check out anything that looks awry and improve our systems' performance.

SELECT
    OBJECT_NAME([i].[object_id]) AS [Table] ,
    [i].[name] AS [Index name] ,
    [c].[name] AS [Column name] ,
    [t].[name] AS [Column type] ,
    [c].[is_identity]
FROM
    [sys].[indexes] AS i
    INNER JOIN [sys].[index_columns] AS ic
        ON [i].[index_id] = [ic].[index_id]
        AND [i].[object_id] = [ic].[object_id]
    INNER JOIN [sys].[columns] AS c
        ON [ic].[column_id] = [c].[column_id]
        AND [ic].[object_id] = [c].[object_id]
    INNER JOIN [sys].[types] AS t
        ON [c].[user_type_id] = [t].[user_type_id] -- user_type_id avoids duplicate rows for alias types
WHERE
    [i].[is_primary_key] = 1
    AND [i].[type] = 1 -- clustered

Personally, where the column type is an INT and is_identity is true, I am off to look into making similar changes across our servers...
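For reference, the change itself is only a couple of statements, although it does rebuild the table, and any foreign keys referencing the primary key would need to be dropped and recreated first. A minimal sketch, with hypothetical table, constraint and column names:

-- 1. drop the existing clustered primary key and recreate it as non-clustered
ALTER TABLE [dbo].[Clients] DROP CONSTRAINT [PK_Clients] ;
ALTER TABLE [dbo].[Clients] ADD CONSTRAINT [PK_Clients]
    PRIMARY KEY NONCLUSTERED ( [ID] ) ;

-- 2. recreate the useful single-column index as the clustered index
DROP INDEX [IX_Clients_ClientRef] ON [dbo].[Clients] ;
CREATE CLUSTERED INDEX [IX_Clients_ClientRef] ON [dbo].[Clients] ( [ClientRef] ) ;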

Monday, 2 November 2009

Opera v10 is out. Go get it and enjoy the web at least 35% more. http://www.opera.com/


(Please note: your experience may not be exactly 35% better, but it will be better than with any other browser)

Saturday, 31 October 2009

I live here


Halloween, originally uploaded by jonathanallen69.

Sunday, 18 October 2009

Pretty things


DSC_2477_cropped, originally uploaded by jonathanallen69.

Wednesday, 14 October 2009

Who pays the piper, calls the tune...

I have just watched an episode of Robert Llewellyn's Car Pool - here. It was the show where he takes Sharon Corr to Heathrow Airport, and along the way they discussed the topic of music sales and the effect of the internet. This has been something that has rattled around in my head for a while now and I must admit to a certain amount of wonderment and confusion.

I may be wrong, but the music industry (and entertainment in general, I guess) must have its roots in the distant past as some sort of hybrid between a court jester and a wandering minstrel. These people travelled from place to place doing what they did and getting a few coins in payment for it. Their show was new to each town and they paid their way from the money they made. If the show was no good they ended up being a blacksmith/archer/maid/etc. as a way of living. Presumably at some point the good ones got a name, and when they came to town people travelled to see the show and they made more money. It was still live performance, new to each location. They learned/wrote more songs and played those to each location on the second visit, and so on...

Now, again - I presume - someone must have spotted the role of Mr 10% and got into the "Stick with me, I'll make you a millionaire" groove of promoting these popular acts. They ran ahead to the next location, did some marketing/promo work, created some excitement and built the interest for the act arriving on the horse/carriage next week. Then, once the act arrived, Mr 10% moved on to the next location. Everyone is happy: the act is getting more popular (and more $£$£$), Mr 10% is getting £$£$ for little effort and no creative talent, and the crowds are seeing a good act with new material.

Fast forward and Mr 10% takes on a few more acts, maybe some not so good ones too. He's getting a cut from all of them but not doubling or tripling his work. Still everyone is happy, especially Mr 10%.

Then some clever boffin invents "recording" - whatever the medium: wax, tape, vinyl. Now the act and Mr 10% don't need to travel. They record the music and flog it. Much money is made by all. Maybe the audience is getting short-changed? That's opinion. Anyway, it's not really practical to copy the recordings and distribute them, so the act is still fresh when it tours. That boosts media sales. Acts and Mr 10%s get ridiculous incomes.

Then some genius invents computers and the internet, and now Billy Jones can copy CDs and DVDs and share them with anyone else who has an internet connection, from the comfort of his own home. Suddenly the sales start to drop. Live music is then the best way to gain income, as people still want new live music, well played, and the thrill of a concert atmosphere.

Don't get me wrong, I want as many people to succeed in as many ways as they can, but it strikes me that music (and the entertainment industry as a whole) is not a service. As such, there is no 'right' those people have to an income, whether it is at the subsistence level of hundreds of years ago, the phenomenal levels of the 60s, 70s and 80s, or the lower but still comfortable amount of today. If they provided a service - medical aid, farmed produce etc. - then they would be justified in arguing whether it was paid at a reasonable level or not. I agree that copying an entertainer's work is wrong and in many cases illegal, but that isn't my point. The industry needs to accept that it's possible and react accordingly. If they are not happy that their sales are down and their incomes are lower then they are still at liberty to take a different career route - there is a shortage of nurses and teachers in many parts of the world. Their ability to do something creative and recreational can be passed on to people they encounter and may in time earn them an income. If that's not enough to satisfy them then they need to accept that 'fame' will not be knocking on their door.

Confused by slashes

Anon: "I always get confused over slashes, you know back slashes and forward slashes"

Me: "Oh?"

Anon: "Well, you'd think a forward slash would be on the right of a keyboard and a back slash on the left of a keyboard"

Me: "They are"

Anon: "..."

Anon: "..."

Anon: "..."


Additional:
Anon is a web developer.

Tuesday, 13 October 2009

Just trying out ping.fm

Friday, 9 October 2009

Shaving Review

"... I shave my legs with all of the grace and finesse of a raccoon pawing at a garbage pail ... "

Monday, 5 October 2009

How to group time data by time periods

This solution builds on using a 'numbers' or 'tally' table as championed by Jeff Moden here http://www.sqlservercentral.com/articles/T-SQL/62867/. Read that article for the full description of how to use set based solutions for efficient SQL queries.

Here I am considering a large log table with time-based entries, and how to aggregate the data into various time bandings. For example, if you want to see how many visitors your website had in 10-minute bands through the day, or how many application errors you get in the server log per hour, and so on...

-- First things first, we will create the Numbers table
------------------------------------------------------
-- Things to note:
-- We are using a temp table in this instance as it is a small one, but it is often the case that DBAs add a numbers table to a production database or keep one in a metadata database that they can reuse.
-- Select a sensible number of rows to insert - relevant to the task in hand. If you are dealing with date-based data that you want to analyse over a year then 1000 rows will be more than ample; it is excessive to use 1,000,000! Especially in this instance where it's a temp table - remember how temp tables work and how this might affect your system.

--DROP TABLE #Numbers
IF OBJECT_ID('tempdb..#Numbers') IS NOT NULL -- temp objects can have negative object ids, so test for NULL rather than > 0
    BEGIN
        RAISERROR('There is already a table called #Numbers, please use that or select a different name.', 0, 0, 1) WITH NOWAIT
    END
ELSE
    BEGIN
        CREATE TABLE #Numbers
            (
              Num INT NOT NULL ,
              PRIMARY KEY CLUSTERED ( Num )
            )
        RAISERROR('Numbers table created.', 0, 0, 1) WITH NOWAIT

        -- Insert values up to a given (sic relevant) value
        -------------------------------------------------
        INSERT INTO #Numbers
                SELECT TOP ( 1500 )
                        ROW_NUMBER() OVER ( ORDER BY c1.column_id )
                FROM
                        master.sys.all_columns c1
                        CROSS JOIN master.sys.all_columns c2

        RAISERROR('Numbers table filled.', 0, 0, 1) WITH NOWAIT
    END



-- Right, the numbers table is ready for us; let's set some local values
DECLARE @Start DATETIME -- when we want to start the analysis
DECLARE @interval INT   -- the grouping period, in minutes, that we want to use

SET @interval = 60
-- where 10 = 10 minutes; 30 = half-hourly; 60 = hourly; 240 = 4-hourly; 720 = half-daily; 1440 = daily
SET @Start = '05-Oct-2009 00:00' ;
-- any date you want as the start date

-- Using the numbers table we can create a CTE that has the start and end times over which we want to analyse the data
WITH    times
          AS ( SELECT   n.Num ,
                        DATEADD(n, num - @interval, @Start) AS [LowerTime] ,
                        DATEADD(n, num, @Start) AS [UpperTime]
               FROM     [#Numbers] AS n
               WHERE    [n].[Num] % @interval = 0 -- use modulo to get the required intervals
             )
    -- simply join our source data to the CTE using the date column in the join
    SELECT  [times].[LowerTime] AS [Interval Start] ,
            COUNT(*) AS [Sessions]
    FROM    [dbo].[Log] AS dl
            INNER JOIN [times] ON [StartTime] BETWEEN [times].[LowerTime]
                                              AND     [times].[UpperTime]
    GROUP BY [LowerTime]
    ORDER BY [LowerTime] DESC


Your results will look something like this:

Interval Start              Sessions
2009-10-05 11:00:00.000     320
2009-10-05 10:00:00.000     2892
2009-10-05 09:00:00.000     2490
2009-10-05 08:00:00.000     2264
2009-10-05 07:00:00.000     1249
2009-10-05 06:00:00.000     106
2009-10-05 03:00:00.000     1
2009-10-05 02:00:00.000     7


Simply adjusting the value for @interval will group your day in smaller or larger periods.

It is simple from here to adjust the script to calculate days or months rather than minutes and it could all be added into a stored procedure to provide a way for users to query their own data via some interface that allows them to supply a parameter for the interval they need - this may be Reporting Services, Excel or a bespoke interface.
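As a rough sketch of that idea - the procedure name, parameters, log table and the permanent Numbers table are all assumptions carried over from the example above:

CREATE PROCEDURE [dbo].[GetSessionCounts]
    @Start DATETIME ,
    @interval INT -- grouping period in minutes
AS
    BEGIN
        SET NOCOUNT ON ;

        WITH    times
                  AS ( SELECT   DATEADD(n, num - @interval, @Start) AS [LowerTime] ,
                                DATEADD(n, num, @Start) AS [UpperTime]
                       FROM     [dbo].[Numbers] AS n -- assumes a permanent numbers table
                       WHERE    [n].[Num] % @interval = 0
                     )
            SELECT  [times].[LowerTime] AS [Interval Start] ,
                    COUNT(*) AS [Sessions]
            FROM    [dbo].[Log] AS dl
                    INNER JOIN [times] ON [dl].[StartTime] BETWEEN [times].[LowerTime]
                                                           AND     [times].[UpperTime]
            GROUP BY [times].[LowerTime]
            ORDER BY [times].[LowerTime] DESC ;
    END
GO

-- e.g. hourly counts for the day, callable from Reporting Services, Excel or a bespoke interface:
-- EXEC [dbo].[GetSessionCounts] @Start = '05-Oct-2009 00:00', @interval = 60 ;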

Sunday, 13 September 2009

Knitting Sunday

 
Well, knitting of sorts
Posted by Picasa

Just chill

 
Everything is nice and peaceful in North Cornwall, just in case you were interested ...
Posted by Picasa

Thursday, 10 September 2009

Buzz buzz buzz



Are you making honey for us?

Thank you

Wednesday, 9 September 2009

 
I do wish these things would stay still while I try to take their picture.
Posted by Picasa

Friday, 4 September 2009

The Moon vs new lens

 
 
Posted by Picasa

Wednesday, 26 August 2009

Genius

http://www.boycottscotland.co.uk/

Please read through the site; it raises some very good points.

Sunday, 23 August 2009

U2 - 360 tour

Yeah, the U2 stage is pretty impressive, that'll be more than half a football pitch impressive.
Posted by Picasa

Monday, 17 August 2009

Unblocking your SQL Server

Finding which process is locking up your SQL Server can cause some headaches for people new to DBA activities. Digging deeper into the information that is available, and refining the data to inform how you proceed, is something that can still trip up a well-established DBA.


sp_who is a stored procedure that ships with SQL Server. If you run EXEC sp_who from a Management Studio query window, you get back a grid of data detailing all connection activity on your server. This is useful as it shows who is connected to which database and what they are up to. This can be a small amount of data on your test server, but on a production server it could return hundreds of rows. It is also only a snapshot: the next time you run it, it could all have changed.


I find it useful to get these results into a temporary table so that I can shake them up a little and refer to the same set of results more than once while I am trying to resolve blocking or some such. To do this you need to get the output of sp_who into the aforementioned temporary table with a section of code like this:


IF OBJECT_ID('tempdb.dbo.#who') IS NOT NULL -- temp objects can have negative object ids, so test for NULL rather than > 0
    BEGIN
        TRUNCATE TABLE #who
    END
ELSE
    BEGIN
        CREATE TABLE #who
            (
              SPID INT ,
              ECID INT ,
              [Status] NVARCHAR(255) ,
              [LOGINNAME] NVARCHAR(255) ,
              [HostName] NVARCHAR(255) ,
              [Blk] NVARCHAR(255) ,
              [DBName] NVARCHAR(255) ,
              [Cmd] NVARCHAR(255) ,
              [REQUESTID] INT
            )
    END
GO
INSERT INTO #who EXEC sp_who ;

You can then recall this information at any time by running
SELECT * FROM [#who]

If your headache is blocking flavoured then running this will show you which SPID is the cause:

SELECT
    CAST([w].[Status] AS NVARCHAR(30)) AS [status] ,
    [w].[LOGINNAME] ,
    [w].[SPID] ,
    [w].[ECID] ,
    [w].[HostName] ,
    [w].[Cmd] ,
    [w].[REQUESTID] ,
    [w].[Blk] AS [Blocked by] ,
    [w].[DBName]
FROM
    [#who] AS w
ORDER BY
    [DBName] ,
    [Blocked by]

The Blocked by column will have a number in it if that process is being blocked. The value in the column is the SPID of the problem process, or, if that process itself has a number in its Blocked by column, then it too is being blocked. Keep following the trail until you reach a process that has a zero in its Blocked by column but is shown as blocking a different process.
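If you would rather not follow the trail by eye, a quick query over the same temporary table will surface the lead blockers. A minimal sketch, assuming the #who table built above:

-- processes that are not blocked themselves but are blocking at least one other process
SELECT  [w].*
FROM    [#who] AS w
WHERE   CAST([w].[Blk] AS INT) = 0
        AND [w].[SPID] IN ( SELECT  CAST([w2].[Blk] AS INT)
                            FROM    [#who] AS w2
                            WHERE   CAST([w2].[Blk] AS INT) <> 0 )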

Using a temporary table means that the information gathered is only available to your session and will be dropped when you close the query window.

I have this script set up as a SQL Snippet in SQL Prompt so that it is readily available in 3 keystrokes. SQL Prompt is an application from Red Gate that adds shortcuts to SQL editors, making SQL development simpler and faster. Other editors are available, and it is possible to create a script template in SSMS if you are not fortunate enough to have any third-party tools working for you.

sp_who is documented in Books Online here, so please read more about it there. There is also an undocumented version called sp_who2 that is very similar but returns even more information about each process connected to your databases.

Thursday, 13 August 2009

The Red Gate Incubator

Free food, office space, training and guest speakers, expenses, accommodation and mentoring for 10 weeks ...

Sounds too good to be true?

Not if you know Red Gate. Not only do they make awesome applications to help your average DBA become an awesome DBA, they are now running the Springboard project for the burgeoning software business.

Take a look here: http://springboard.red-gate.com/

Don't thank me, thank Red Gate.

Wednesday, 12 August 2009

Log Parser - the best value 3MB on your hard drive

I was wondering how best to describe LogParser and couldn't come up with anything better than the description in its help file:

  • Log parser is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows® operating system such as the Event Log, the Registry, the file system, and Active Directory®.
They are right. One thing they didn't mention is that it's free - the price we all like and can all afford.

To my mind this application works much more intuitively than PowerShell and I use it regularly for grabbing data from all sorts of sources and bringing it into a SQL table for further dissection.

Got problems in a web application and need to review the application log? Then use something like "SELECT * INTO AppLog FROM \\Server\Application".

Want to analyse the security log of a server for failure events? Then use "select top 100 * from \\boa\security where eventtypename = 'Failure Audit Event'"

These pseudo-scripts all make immediate sense to a SQL DBA. OK, it's not exactly like this, but the differences won't stop you understanding the script syntax and getting to use more data about your environment.

Let's start off by downloading and installing the application. You can get it from Microsoft here: Logparser download.
OK, you should have the application all installed now, along with its comprehensive help documentation.

Start the app and you will get a command-prompt-like window open, titled Log Parser 2.2, with a stream of BNF description. Now, all Log Parser commands begin with the keyword LOGPARSER, followed by the query in double quotes, with any switches and parameters following on behind that. Hitting enter runs the query.

Are you ready for your first Log Parser query? OK, we'll get the names, sizes and last accessed date of files in the root of C:\. Type this into the Log Parser window.


  • LOGPARSER "SELECT TOP 10 name, size, lastaccesstime from C:\*.txt" -i:FS
Simple, isn't it? We asked for the name, size and lastaccesstime of the top ten records from C:\ that have a .txt extension. Just as a little hint to Logparser that we are asking for data from the file system, we set the input format with -i:FS. If you want Log Parser to describe this to you itself, type in:

  • LOGPARSER -h -i:FS

To get a chart illustrating the types of event in your application log from a given date then use

  • Logparser "select eventtypename, count(*) into c:\application.jpg from \\machinename\application where timegenerated > '2009-08-12 00:00:00' group by eventtypename" -charttype:column3d

To review the massive list of options on what sort of chart you can have then use this:

  • LOGPARSER -h -o:charttype
That's it for the minute; next time we'll look at getting Log Parser to put data into our database so that we can analyse it further.

LogParser resources:

Download details
http://www.microsoft.com/DownLoads/details.aspx?FamilyID=890cd06b-abf8-4c25-91b2-f8d975cf8c07&displaylang=en

Support forum
http://forums.iis.net/default.aspx?GroupID=51

Wednesday, 5 August 2009

Keeping orphans away from your servers

When restoring a database backup to a different server it is wholly possible that there will be security issues when you take the next step and make the system live to the user interface.

Most often in our environment this happens when a live database backup is used to 'refresh' the data in the test system. To do this we take a backup file made from the database on Server A and restore it onto Server B. This also applies however if you have to restore to a new server in the event of a disaster - imagine Server A is on fire and you are under an SLA to get the system live again. First, make sure someone deals with the fire, preferably a trained professional such as someone from the local fire service.
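For completeness, the restore onto Server B is the standard RESTORE DATABASE ... WITH MOVE pattern; the share, paths and logical file names below are purely illustrative:

RESTORE DATABASE [MyDatabase]
FROM DISK = N'\\BackupShare\MyDatabase.bak'
WITH
    MOVE N'MyDatabase_Data' TO N'D:\SQLData\MyDatabase.mdf' ,
    MOVE N'MyDatabase_Log' TO N'E:\SQLLogs\MyDatabase.ldf' ,
    STATS = 10 ;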

On the live server there may be a mixture of users - Windows domain user and group accounts, SQL Server logins and so on - and some of these may have been given permissions in the database you are in the process of transferring to a new location. This may not match the new server's configuration exactly. In order to make sure that, once the database is live, you can move on to dealing with all the security issues that may arise from the above scenario, you need to compare the database security with the (new) server security details.

In SQL 2008, database user details are stored in sys.database_principals and server login details are stored in sys.server_principals.

In SQL 2005, database user details are stored in sysusers and server login details are stored in sys.syslogins.

The script below can be switched between the two versions simply by swapping the table names accordingly, as the columns used exist in both versions.

In order to view the users that have come with the database and don't exist on the new server, we need to find them with:

SELECT
[dp].name AS [Orphan DB User]
FROM
[sys].[database_principals] AS dp
LEFT JOIN sys.[server_principals] AS sp
ON dp.[name] = [sp].name
WHERE
( [dp].[sid] IS NOT NULL
AND [sp].[name] IS NULL
)
AND CAST([dp].[sid] AS BIGINT) < 0


These users are in the database but don't have a server security principal to link to. That's the "LEFT JOIN" and "[sp].[name] IS NULL" working it out for us.

Next are the database users that might have similarly named principals on the server; if their SIDs don't match, SQL Server will consider them different accounts and block access to the database. This is a great GOTCHA - using SSMS you see the server login and then in the database you see the user, apparently identical. Only when you compare the SIDs are the differences there to be seen. These occurrences are found with:
SELECT
    [sp].[name] AS [SQLServerLogIn] ,
    [sp].[SID] AS [SQLServerSID] ,
    [dp].[sid] AS [DatabaseSid]
FROM
    sys.server_principals AS sp
    INNER JOIN [sys].[database_principals] AS dp
        ON sp.[name] = dp.[name]
WHERE
    [dp].[sid] <> sp.sid
    AND CAST([sp].[sid] AS BIGINT) < 0 -- the same filter used and explained below


Now this is OK but we can combine the two queries to have a concise list of security changes that we need to review:
SELECT
NULL AS ['Split' login] ,
NULL AS [Server Principal SID] ,
[dp].[sid] AS [DB Principal SID] ,
[dp].name AS [Orphan DB User]
FROM
[sys].[database_principals] AS dp
LEFT JOIN sys.[server_principals] AS sp
ON dp.[name] = [sp].name
WHERE
( [dp].[sid] IS NOT NULL
AND [sp].[name] IS NULL
)
AND CAST([dp].[sid] AS BIGINT) < 0
UNION ALL
SELECT
[sp].[name] ,
[sp].[SID] ,
[dp].[sid] ,
NULL
FROM
sys.server_principals AS sp
INNER JOIN [sys].[database_principals] AS dp
ON
sp.[name] = dp.[name]
WHERE
[dp].[sid] <> sp.sid
AND CAST([sp].[sid] AS BIGINT) < 0


We filter out SQL Server roles and system principals such as guest, sa and public by CASTing the SIDs to a BIGINT and checking for a negative value.


Next time we will review what to do with these accounts and how to get them enabled for the database in its new location.
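In the meantime, as a quick pointer, the usual remedy for a 'split' login is to re-point the database user at the server login of the same name; the user and login names here are hypothetical:

-- re-link the orphaned or mismatched database user to the server login
ALTER USER [AppUser] WITH LOGIN = [AppUser] ;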

Monday, 6 July 2009

 

Well, I've not been here for a while. Crap times are everywhere I look: friends suffering in different ways that I can't really help or ease, other than to feebly say "Let me know if I can help." It seems such a flippant remark that I wonder if it's worth saying. I mean it, whole-heartedly, but don't in all honesty expect to get the call that I am suggesting they make. I wish they would in some ways, if only to validate my offer! In other ways I hope they know it is meant without reservation and that they just don't actually need me. If that is true then that is good.

Hopefully things will improve in all aspects as the year progresses. We need to enjoy the month or two of summer that the calendar says we have left and then get ready for winter. Please don't be as wet as last year. Cold I don't mind; wet gets on my nerves.
Posted by Picasa

Wednesday, 6 May 2009

How to find if and where your table is replicated

Another in the sporadic series of 'I need to do something but I can't find it on the internet'.

I have a table that I think might be involved in replication, but I don't know which publication it's in...

We know the table name - 'MyTable'.
We have replication running on our server and it's replicating our database, or part of it - 'MyDatabase'.
We need to know if the table is replicated and, if so, which publication will need to be reviewed if we make changes to the table.

How?

USE MyDatabase
GO

/* Lots of info about our table but not much that's relevant to our current requirements */
SELECT
*
FROM
sysobjects
WHERE
NAME = 'MyTable'

-- mmmm, getting there
/* To quote BOL - "Contains one row for each merge article defined in the local database. This table is stored in the publication database."

The interesting column is [pubid]
*/
SELECT
*
FROM
dbo.sysmergearticles AS s
WHERE
NAME = 'MyTable'

-- really close now
/*
The sysmergepublications table - "Contains one row for each merge publication defined in the database. This table is stored in the publication and subscription databases."

So this would be where we get the publication details
*/
SELECT
*
FROM
dbo.sysmergepublications AS s
WHERE
s.pubid = '2876BBD8-3D4E-4ED8-88F3-581A659E8144'

-- DONE IT.
/*
Combine the two tables above and we get the information we need
*/
SELECT
s.[name] AS [Publication name]
FROM
dbo.sysmergepublications AS s
INNER JOIN dbo.sysmergearticles AS s2 ON s.pubid = s2.pubid
WHERE
s2.NAME = 'MyTable'
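
The tables above only cover merge replication. If the database is published for transactional or snapshot replication, the equivalent lookup should be against sysarticles and syspublications (again stored in the publication database) - something along these lines:

SELECT
    p.[name] AS [Publication name]
FROM
    dbo.syspublications AS p
    INNER JOIN dbo.sysarticles AS a ON p.pubid = a.pubid
WHERE
    a.[name] = 'MyTable'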

So I now know which publication I need to review before making any changes to 'MyTable'.

Wednesday, 1 April 2009

My life in cartoon form

Dilbert.com

Tuesday, 24 February 2009

Fancy a burger?

Friday, 23 January 2009

End of the week

Well, I seem to have taken to adding more to other people's blogs than anything here. Long may it continue, others deserve the attention. I'll sit here in the corner. Before I go though, there are a few more photos over on my Picasa space - http://picasaweb.google.com/jonathanallen69. Some of the ones worth filling the internet with from about 40 taken this week.

Hoping to get out and about with my new camera this weekend, expect to see that in a few weeks!

J

Wednesday, 21 January 2009

 
Posted by Picasa

confused by my new camera

 
I have NEF files, it's like RAW but Nikon flavoured so all a bit odd.
Posted by Picasa

Tuesday, 13 January 2009

Seven Things

So, I have no idea where it started, but Twitter has been full of people linking to their "Seven Things You (Probably) Don't Know About Me" blog posts and then 'tagging' seven other people to do the same thing. Well, just like the plague, when you see your friends (more on that later) going down with it you know it's only a matter of time before you get it too. So, without further ado, here's mine:

Won) Friends.
I don't have many. I never have really. I am friends with lots of people who I work with but I don't have many friends outside of that. I know lots of people and I think I get on well with them, they don't seem to recoil when I am around and I can sometimes make them laugh which I think is a good thing. There are a growing number of people that I know via the internet and I would like to be able to call them friends despite never having met them or knowing what they look like, sound like or smell like.

Too) School
I hated it, partly because I was made to do homework by my mother and it stopped me doing what I would have preferred, but mostly because of the bullying. I suffered mostly in my last two years but it made me hate, a lot. I guess that's going to happen when 30 kids pick on you at once - oh, and the tampon in the water bottle was plain nasty.

iii)
Fore) Work
I have only had two employers. I left college before completing my A levels and got an apprenticeship at a local council. When that ended I moved to another council and worked there for 13 years in different jobs, and then 9 years ago I moved to my current employer. I like my job, I enjoy the surroundings and my colleagues' company. I work with my wife and that goes just fine.

Phyve) Family
I love my family, in the broken state that it is in following my divorce 4 years ago. I miss my father - he died shortly before my 3rd birthday. I say miss but I am not sure that's possible. I didn't know him well enough to miss his voice or miss him do things for me, I just have a gap in my life where other people have a man.

Sicks) Camera
I have just bought - yesterday - a new camera. I am looking forward to enjoying photography again.

7) Who next
Apart from the fact that I had to do this list, the other thing on my mind as I saw others doing theirs was that I would have to tag seven other people. As you will have read in number 1, you will appreciate that a limited number of friends makes it tricky to get together a list of seven who haven't done their list already. Well, here is my best shot:
@melodik; @chibialfa; @Hello_Nurse; @bobbyllew; @Tony_D; @Glinner; @Wossy
a star studded mix, maybe it will pay off, maybe I'll be ignored by everyone...

I missed out iii): due to lack of planning in my life there isn't anything much to tell about me. I (with my awesome wife Annette) own 8 horses. Have that as an extra thing.

Sunday, 11 January 2009

AddThis
