Showing posts with label SQL Server. Show all posts

Monday, 26 April 2010

How to understand SQL Server execution plans

SQL Server Execution Plans by Grant Fritchey

Query execution plans are something that a DBA or TSQL developer will encounter pretty soon after starting to work with SQL Server. They come in a number of guises - graphical, text or even XML. In any format they present a daunting array of information that seems really important but also pretty complex and technical. With Grant's book by your side you will soon be reading execution plans, relating them to the query you have written, and then altering your script or adjusting parts of the database to improve the efficiency of your queries.

The book starts off in a basic way, with worked examples that you can follow on any version of SQL Server that has an AdventureWorks database. You will work your way up from the simplest of queries to examples using JOINs and GROUP BY, and then to examining which indexes are being used (or, possibly more importantly, why they are not being used).

The book also covers how to view execution plans, and how to control the way a plan is created with hints so that, if you have a preference for a certain method, the query is compiled the way you want it.
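As a taste of what the book walks through, you can ask SQL Server for a plan yourself. This is a minimal sketch - the table name assumes an AdventureWorks database, as the book's examples do:

-- Ask for the estimated plan as XML instead of executing the query
SET SHOWPLAN_XML ON;
GO
SELECT LastName
FROM Person.Contact
WHERE LastName = 'Smith';
GO
SET SHOWPLAN_XML OFF;
GO

Run that in a query window and, instead of rows, you get the XML plan the optimizer would use - the raw material the book teaches you to read.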

This book is something that I will be referring to regularly as I build up my understanding of SQL Server Execution Plans. An excellent cover-to-cover read and a great reference book.


There are various versions of the book:
electronic version available from Simple Talk Publishing - SQL Server Execution Plans - eVersion
in print from Amazon.com - SQL Server Execution Plans- in print
and it should be available soon via Amazon.co.uk - I will put a link here when it is.

Please be sure to read the code disclaimer page before you use any code in this blog.


Friday, 9 April 2010

Testing a database connection or Who the hell (UDL) are you?

If you work with software and databases then it won't be long before you encounter problems connecting to SQL Server (or any other data source), and there are many reasons why you might be presented with them. A good way to start analysing such problems, with no need for any specific software on the PC, is to use a UDL file.

Create a new text file on the desktop of the computer you are using. Rename this file with a .udl extension - something like test.udl will be ideal. All you need do is double-click it and a Data Link Properties dialog opens. For some reason this always opens on the wrong tab, so make sure you switch to the Provider tab and choose the OLE DB provider applicable to your problem. Next move to the Connection tab and work through the required steps, entering a data source name, whether you want to use Windows authentication or specify SQL login details, and finally the name of the database you wish to connect to. Once this is complete press the Test Connection button and see what results you get.

You may get a 'test succeeded' message, in which case there is no reason why any other means of connecting from this computer won't work - if another application still fails, then something is wrong with the settings in that application. Alternatively you may get a message detailing why the connection failed. This is the key: you get a message detailing the issue, and this lets you investigate and resolve your problem.

IMPORTANT:
Remember to delete the .udl file once you have used it: if you entered a username and password then they are saved in plain text with no security and pose a security risk. All that is required is for the .udl file to be renamed to .txt and the username and password can be read in Notepad or any other text editor.
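For illustration, here is roughly what a udl file looks like on disk after a test - the provider, server, database and credentials below are made-up examples:

[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Password=NotVerySecret;User ID=testuser;Initial Catalog=MyDatabase;Data Source=MYSERVER

As you can see, the password sits right there in the initstring - hence the warning above.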


Please be sure to read the code disclaimer page before you use any code in this blog.

Wednesday, 3 March 2010

How to slipstream SQL Server 2008 RTM and SQL Server 2008 SP1

I have a number of SQL Servers to upgrade from 2005 to 2008 this year, and as Service Pack 1 is already out it makes sense to install it at the same time as the main software. In Microsoft parlance this is known as 'slipstreaming'.

In theory it involves getting the setup files for SQL Server 2008 and applying the service pack to them rather than to the installed software. It's covered on an MSDN blog here http://blogs.msdn.com/petersad/archive/2009/02/25/sql-server-2008-creating-a-merged-slisptream-drop.aspx and these are the instructions that I followed. However there were a couple of pointers that I had to work through, so this could be considered a supplemental blog for the slipstream creation process. The step numbers used relate to those in Peter Saddow's blog.

Step 1: I downloaded iso files. These need to be mounted (I used Virtual Clone Drive) or extracted to disc, and then the contents copied to the directory as described. Note: I could not get PowerShell to do the copy from my VCD drive - it errored, saying the drive was not the right format. Windows drag and drop does it anyway.
Step 2: No problems (apart from the time taken to download!)
Step 3: Execute each statement listed one at a time and leave it a few minutes before you start the next one. You will see an extraction progress bar for each package.
Step 4: This is simply moving the files up a directory level and overwriting the files of the same name already there. Robocopy is a tool installed as part of a server admin resource kit; you will need to install that in order to copy the command given, otherwise follow the instructions exactly.
Step 5: Robocopy again, but simple.
Step 6: Easy enough - edit 3 .ini files and add a line.
Step 7: Install SQL Server 2008 and note the slipstream identifiers as you go through the installation process.
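To give an idea of the step 6 edit: the line added to each of the three setup .ini files points the installer at the extracted service pack. A sketch, assuming the folder layout from Peter's blog (check his post for the exact ini names and paths in your drop):

; added to DefaultSetup.ini (and the x86/ia64 equivalents)
PCUSOURCE=".\PCU"

With that in place, running setup.exe from the merged drop installs RTM and SP1 in one pass.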

Hope this helps.

Other useful links
SQL Server 2008 Slipstream FAQ : http://blogs.msdn.com/petersad/archive/2009/04/24/sql-server-2008-slipstream-frequently-asked-questions.aspx
CSS SQL Server Engineers How to fix your SQL Server 2008 Setup before you run setup... Part 1
CSS SQL Server Engineers How to fix your SQL Server 2008 Setup before you run setup... Part 2

Please be sure to read the code disclaimer page before you use any code in this blog.

Tuesday, 2 March 2010

Brad's Sure Guide to SQL Server Maintenance Plans

That's the title of the latest book from Brad McGehee and Simple Talk Publishing. It's available from Amazon here.
ISBN 978-906434-34-2
Price: GBP 19.99 / USD 29.99 / CAD 35.99

This book will enable any person responsible for data stored in a SQL Server database to ensure it is being managed to a required minimum standard with a small amount of effort and possibly limited experience. This is not a book that long-time DBAs, who have scripts already in place to maintain their databases, will get a lot out of; it's for the people who are becoming known as 'Accidental DBAs'. Maybe you were closest to the server when the last DBA quit, or you are a system administrator with a new system based on SQL Server and you don't have a DBA in-house. Whichever way you have happened to find yourself looking after SQL Server databases, this book is for you.

It explains very clearly what a maintenance plan is designed to do, what it can and can’t do and in some cases why you don't want it to do some things. Once the intent of each feature is established Brad then walks you through configuring the task options to your requirements and how it can be linked to or affects other tasks you need carried out on your databases. A clear format of what the task does, why you want it done (or not) and how to make it happen is followed in each chapter. Pitfalls and limitations of the various tasks are explained clearly and where there are options to avoid or work around these they are explained so that they can be implemented. The tasks are described with enough background information that someone unsure about SQL Server will clearly understand what the task is attempting to achieve and will then be able to decide whether that is something they want to apply to their systems.

While the majority of the book refers to the process followed by the Maintenance Plan Wizard Brad also explains how to make your own custom Maintenance Plan or alter existing plans with the Maintenance Plan Designer. Implementing schedules and operator notifications into your plan is explained so that a user can keep up to date with the status of any executions without having to read through the detailed reports along with using logic to control flow from one step to another within a plan and adding multiple copies of tasks in order to get the exact results needed.

If you have any databases in your remit and you are wondering how to look after them, or want a guide on how to make your existing maintenance plans more effective, then this book will be a valuable asset to you.

It could be as simple as reading this book for a couple of hours on Monday, Tuesday and Wednesday and then over a similar time on Thursday and Friday you could have your SQL databases fully covered by a set of maintenance plans to ensure your data is well maintained and backed up.

Brad's Sure Guide to SQL Server Maintenance Plans

Sunday, 28 February 2010

Snapshot Isolation level

OK, I've only got a minute to put this up, but I have used PowerShell to help someone with a problem. It wasn't the only way, and may not have been the simplest way, but it certainly worked. A question was posted on www.ServerFault.com asking "How can I tell if Snapshot Isolation is turned on?", and my reaction was to see if PowerShell could tell me what the snapshot isolation setting was on my SQL databases.

The easiest way to get to PowerShell from SSMS is to right-click in the Object Explorer and then choose the PowerShell link in the menu. This opens a PowerShell window at the level that you clicked. Simply typing in
get-childitem | select name, snapshotisolationstate
returns the information that you want for the server you are connected to. Simple, quick and useful. Maybe this PowerShell thing has its uses for a SQL database administrator after all?!
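For comparison, the same information is available in straight T-SQL, via the sys.databases catalog view:

SELECT name, snapshot_isolation_state_desc
FROM sys.databases;

So there is more than one route to the answer - the PowerShell version just happens to be two clicks and one line away from Object Explorer.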

Please be sure to read the code disclaimer page before you use any code in this blog.

Friday, 26 February 2010

PowerShell - my first steps

So I asked this question on ask.sqlservercentral.com/questions and the consensus seems to be that PowerShell should be in my future. I have a few books that I am deciding between, but have downloaded some eBooks and installed some new applications. It was a real hassle getting PS v2 installed as it seems to conflict with PS v1. I have downloaded PowerShell Analyzer from Shell Tools and it works very well. Rather than just the PS command line you get a full IDE that gives a lot of support to a person new to PS.

As a small nugget of what I have covered so far, have this little section of PowerShell code:

Get-Process *sql* | Select-Object Name, CPU, VirtualMemorySize

This will bring back a list of processes on the local machine that have "sql" in their name, along with their CPU usage and the virtual memory they are using.
Which for me tells me this:

Name                        CPU         VirtualMemorySize
----                        ---         -----------------
RedGate.SQLBackup.UI        184.125     272957440
RedGate.SQLPrompt.TrayApp   3.6875      187170816
SQL Response Client         413.03125   223125504
SQLAGENT                    48.1875     138895360
sqlbrowser                  0.046875    14163968
sqlservr                    2198.96875  1570562048

neat huh?
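And once the data is coming back, it only takes one more pipeline step to play with it - for example, sorting the heaviest CPU users to the top with the same cmdlets as above:

Get-Process *sql* | Sort-Object CPU -Descending | Select-Object Name, CPU, VirtualMemorySize

The pipeline is where PowerShell starts to earn its keep: each stage reshapes the objects rather than just the text.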

Monday, 8 February 2010

Keeping your database files tidy

Behind every database there are database files. Traditionally they are denoted by one of three file extensions: mdf is the main data file, ldf is the main log file and ndf is used for any additional data files. One mdf and one ldf are the minimum requirement for a SQL Server database.

These files are locked by SQL Server while it is running, ensuring that updates only go through the database engine and are therefore managed correctly. It is possible to separate SQL Server and the database files in a number of ways, one of which is to detach the database using sp_detach_db. This removes the database from the SQL Server and releases the database files to be accessed by any other process that can reach them. Whatever the circumstance - perhaps you were moving the database from one server to another - it is possible to leave the database files on your server and forget they are there.

This means a number of things:
1 - the files are taking up HDD space and potentially being backed up by your nightly processes (you are backing up regularly, aren't you?)
2 - the files can be accessed by any editor that can get to the folder location where they are stored. Notepad can open an mdf file. Don't believe me?

open up your favourite T-SQL editor and run this:
CREATE DATABASE DETACHTEST
GO
USE DETACHTEST
GO
CREATE TABLE TestTable
(MyName VARCHAR(30))
GO

INSERT INTO TestTable
SELECT 'Fatherjack'
GO

SELECT * FROM TestTable
GO

USE master
GO

EXEC sp_detach_db 'DETACHTEST'


That will have created a database, created a table in it, inserted data and then detached the database.

Now open up Windows Explorer and go to your default SQL data folder; in there you will find a file called DETACHTEST.mdf. Open that in Notepad or any other text editor and search for the string "fatherjack". There it is, in plain text. I agree it's not readily usable, and a database of hundreds of tables and millions of rows will make data extraction difficult without a SQL Server, but it is wholly possible and therefore wholly insecure to have these files lying around on your server.

To find any of these files that may have got left behind we need a process to locate and remove them. My preferred method to do this involves a small script to be run in LogParser and then a nifty join to some live SQL data and a look at the date the files were last used.

Here is the LogParser script to get the file name, location, last access time and last write time from any folder up to 10 levels down from the root on the C:\ and D:\ drives where the file extension is either mdf or ldf:
logparser "SELECT path, name, lastaccesstime, lastwritetime into datafiles from '\\London\d$\*.mdf', '\\London\c$\*.mdf','\\London\c$\*.ldf','\\London\d$\*.ldf'" -i:fs -recurse:10 -o:sql -database:AdminDB -server:SQLLondon -createtable:on -e:10

(Where London is the Windows server and SQLLondon is the SQL Server instance installed.)

Here is the SQL to work with the data that LogParser retrieves and give us a hit list of files to track down and deal with (the REPLACE calls turn the UNC paths captured by LogParser back into local drive paths so that they match sys.master_files):
USE [AdminDB]
GO
SELECT DISTINCT
    COALESCE(mf.name, 'Not attached to a database. ***') AS [Database Name],
    [d].[Path],
    [d].[Name],
    [d].[LastAccessTime],
    [d].[LastWriteTime]
FROM [dbo].[datafiles] AS d
    LEFT JOIN [master].[sys].[master_files] AS mf
        ON REPLACE(REPLACE([d].[Path], '\\London\d$\', 'd:\'), '\\London\c$\', 'c:\') = [mf].[physical_name]
UNION
SELECT '-----',
    '--- Items above here not accessed in the last day --',
    '-----',
    DATEADD(d, -1, GETDATE()), -- 24 hours ago
    ''
ORDER BY [LastAccessTime],
    [Database Name]

The resulting data will be a list of database files that have not been accessed recently, and whether each is attached to a database on the server. You may wish to adjust the value I have used in the lower part of the query to suit your environment - 24 hours may not be appropriate for you.

Monday, 18 January 2010

Writing TSQL faster with templates

SSMS Templates

The management tools that come with SQL Server are the way that most DBAs access and administer their database environment. SSMS has plenty of shortcuts and time-saving features, one of which is the use of T-SQL templates. SSMS templates are accessed via their own menu and are stored in folders, just as you would access files in a Windows Explorer window. SSMS comes with a whole load of templates ready to run, but you can add to and alter these very easily.

Firstly, let's look at getting a template onto the query editor and running it successfully.
Right, open up an SSMS session and hit Alt + N (or Ctrl + N in SSMS for SQL 2005) to open a new TSQL editor window. Now it's possible that you will have the templates toolbar ready to use already, but hit Ctrl + Alt + T to make it active. Let's pick something simple to start with: open up the Database folder and drag the Create Database template over to your query editor page. Right, we have the text on the page, but we have to do a little work before we can hit F5 or click Execute.

You'll see that there are sections of code inside < > brackets. These are SSMS template parameters; we need to take this code out and replace it with something specific to our environment in order to make the script parse and then execute successfully. We could swipe each parameter with the mouse, but that could get tedious if we pop out a template with loads of them - it's easier to just press Alt + Q, Alt + S (Ctrl + Shift + M for SQL 2005 and earlier). This brings up the parameter completion dialog for you.

Simply type the database name in the highlighted area and press Enter. Job done - it is now ready to run. The text that you entered in the parameters dialog has been put in three places, and the script will now successfully create a database (with all the default options, such as file locations, that are set at server level), providing you have the security on the server to execute this command.
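For reference, those < > sections follow the pattern <parameter_name, data_type, default_value>. A stripped-down sketch of what the Create Database template text looks like before the parameters are completed (the real template has a few more lines than this):

-- Create database template (simplified sketch)
CREATE DATABASE <Database_Name, sysname, Database_Name>
GO

The parameter completion dialog reads those three-part placeholders, offers the default value, and substitutes your entry everywhere the parameter name appears.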

Monday, 17 August 2009

Unblocking your SQL Server

Finding which process is locking up your SQL Server can cause some headaches for people new to DBA activities. Digging deeper into the information that is available, and refining the data to inform how you proceed, is something that can still trip up a well-established DBA.


sp_who is a stored procedure that ships with SQL Server. If you run EXEC sp_who from a Management Studio query window then you get back a grid of data that details the connection details of all activity on your server. This is useful as it shows who is connected to which database and what they are up to. This can be a small amount of data on your test server, but on a production server it could return hundreds of rows. It is also instantaneous, in so far as when you run it next it could all have changed.


I find it useful to get these results into a temporary table so that I can shake them up a little and refer to the same set of results more than once while I am trying to resolve blocking or some such problem. To do this you need to get the output of sp_who into the aforementioned temporary table with a section of code like this:


IF OBJECT_ID('tempdb.dbo.#who') IS NOT NULL
BEGIN
    TRUNCATE TABLE #who
END
ELSE
BEGIN
    CREATE TABLE #who
    (SPID INT ,
     ECID INT ,
     [Status] NVARCHAR(255) ,
     [LOGINNAME] NVARCHAR(255) ,
     [HostName] NVARCHAR(255) ,
     [Blk] NVARCHAR(255) ,
     [DBName] NVARCHAR(255) ,
     [Cmd] NVARCHAR(255) ,
     [REQUESTID] INT)
END
GO
INSERT INTO #who EXEC sp_who ;

You can then recall this information at any time by running
SELECT * FROM [#who]

If your headache is blocking flavoured then running this will show you which SPID is the cause:

SELECT
    CAST([w].[Status] AS NVARCHAR(30)) AS [status],
    [w].[LOGINNAME] ,
    [w].[SPID] ,
    [w].[ECID] ,
    [w].[HostName] ,
    [w].[Cmd] ,
    [w].[REQUESTID] ,
    [w].[Blk] AS [Blocked by],
    [w].[DBName]
FROM [#who] AS w
ORDER BY [DBName], [Blocked by]

The Blocked by column will have a number in it if that process is being blocked. The value in the column is the SPID of the problem process or, if that process itself has a number in its Blocked By column then it too is being blocked. Keep following the trail until you reach a process that has a zero in its Blocked by column but is being shown as blocking a different process.

Using a temporary table means that the information gathered is only available in the single process and will be dropped when you close the query.

I have this script set up as a SQL Snippet in SQL Prompt so that it is readily available in 3 keystrokes. SQL Prompt, from Red Gate, is an application that provides shortcuts in SQL editors, making SQL development simpler and faster. Other editors are available, and it is possible to create a script template in SSMS if you are not fortunate enough to have any third party tools working for you.

sp_who is documented in Books Online here, so please read more about it there. There is also an undocumented version called sp_who2 that is very similar but returns even more information about each process connected to your databases.
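If you want to capture sp_who2 the same way, the temporary table needs more columns. This is a sketch based on the output I have observed - sp_who2 is undocumented, so treat the column list as an assumption and adjust it to match your version:

CREATE TABLE #who2
(SPID INT ,
 [Status] NVARCHAR(255) ,
 [Login] NVARCHAR(255) ,
 [HostName] NVARCHAR(255) ,
 [BlkBy] NVARCHAR(255) ,
 [DBName] NVARCHAR(255) ,
 [Command] NVARCHAR(255) ,
 [CPUTime] BIGINT ,
 [DiskIO] BIGINT ,
 [LastBatch] NVARCHAR(255) ,
 [ProgramName] NVARCHAR(255) ,
 [SPID2] INT ,
 [REQUESTID] INT)
GO
INSERT INTO #who2 EXEC sp_who2 ;

The same querying and re-querying tricks then apply, with the bonus of CPU, IO and program name columns to filter on.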

Wednesday, 5 August 2009

Keeping orphans away from your servers

When restoring a database backup to a different server it is wholly possible that there will be security issues when you take the next step and make the system live to the user interface.

Most often in our environment this happens when a live database backup is used to 'refresh' the data in the test system. To do this we take a backup file made from the database on Server A and restore it onto Server B. This also applies however if you have to restore to a new server in the event of a disaster - imagine Server A is on fire and you are under an SLA to get the system live again. First, make sure someone deals with the fire, preferably a trained professional such as someone from the local fire service.

On the live server there may be a mixture of users - Windows domain user and group accounts, SQL Server logins etc. - and some of these may have been given permissions on the database you are in the process of transferring to a new location. This may not compare exactly with the new server's configuration. To be sure that, once the database is live, you can move on to dealing with all the security issues that may arise from the above scenario, you need to reference the database security details against the (new) server security details.

In SQL Server 2005 and later, database user details are stored in sys.database_principals; server login details are stored in sys.server_principals.

In SQL Server 2000 (and via the backward-compatibility views in later versions), database user details are stored in sysusers; server login details are stored in syslogins.

The script below can be switched between the two versions simply by swapping the table names accordingly as the columns used are in both versions.

In order to view the users that have come with the database and don't exist on the new server we need to find them with:

SELECT
[dp].name AS [Orphan DB User]
FROM
[sys].[database_principals] AS dp
LEFT JOIN sys.[server_principals] AS sp
ON dp.[name] = [sp].name
WHERE
( [dp].[sid] IS NOT NULL
AND [sp].[name] IS NULL
)
AND CAST([dp].[sid] AS BIGINT) < 0


These users are in the database but don't have a server security principal to link to. That's the "LEFT JOIN" and "[sp].[name] IS NULL" working it out for us.

Next are the database users that might have similarly named principals on the server, but if the SIDs for them don't match, SQL Server will consider them different accounts and stop access to the database. This is a great GOTCHA - using SSMS you see the server login and then in the database you see the user, apparently identical. Only when you compare SIDs are the differences there to be seen. These occurrences are found with:
SELECT
    [sp].[name] AS [SQLServerLogIn] ,
    [sp].[SID] AS [SQLServerSID] ,
    [dp].[sid] AS [DatabaseSid]
FROM
    sys.server_principals AS sp
    INNER JOIN [sys].[database_principals] AS dp
        ON sp.[name] = dp.[name]
WHERE
    [dp].[sid] <> sp.sid
    AND CAST([sp].[sid] AS BIGINT) < 0


Now this is OK, but we can combine the two queries to get a concise list of security changes that we need to review:
SELECT
NULL AS ['Split' login] ,
NULL AS [Server Principal SID] ,
[dp].[sid] AS [DB Principal SID] ,
[dp].name AS [Orphan DB User]
FROM
[sys].[database_principals] AS dp
LEFT JOIN sys.[server_principals] AS sp
ON dp.[name] = [sp].name
WHERE
( [dp].[sid] IS NOT NULL
AND [sp].[name] IS NULL
)
AND CAST([dp].[sid] AS BIGINT) < 0
UNION ALL
SELECT
[sp].[name] ,
[sp].[SID] ,
[dp].[sid] ,
NULL
FROM
sys.server_principals AS sp
INNER JOIN [sys].[database_principals] AS dp
ON
sp.[name] = dp.[name]
WHERE
[dp].[sid] <> sp.sid
AND CAST([sp].[sid] AS BIGINT) < 0


We filter out SQL Server roles and system principals such as guest, sa and public by CASTing the SIDs to a BIGINT and checking for a negative value.
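If you want to see that sign trick in action, this quick sketch lists each database principal alongside its SID cast to BIGINT (the cast keeps only the trailing bytes of longer Windows SIDs, so treat the exact numbers as illustrative):

SELECT name ,
    [sid] ,
    CAST([sid] AS BIGINT) AS [sid as bigint]
FROM sys.database_principals ;

Roles and the built-in principals come back with small positive values, which is what the WHERE clauses above rely on to exclude them.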


Next time we will review what to do with these accounts and how to get them enabled for the database in its new location.
