Sunday 11 July 2010

When GMail goes bad

Well, someone from somewhere in Egypt (according to Google's assessment of the IP location) got into my GMail account. Thanks for that; I didn't have anything else planned for today, so I have spent hours changing passwords and contacting people to apologise for sending them links to Canadian Pharmacy websites.

Just in case you have a similar unfortunate experience, here are some basic instructions/recommendations from http://www.google.com/support/forum/p/gmail/thread?tid=44ef855684a3f46e&hl=en

If your account has been compromised/hacked/stolen you will need to check and fix at least all of the following settings.
But first you need to check the bottom of the Inbox and make sure your account is not open at any other locations.  If it shows additional locations, open the Details window and "Sign out all other sessions". 
Account Security:
Settings -> Accounts and Import -> Google Account Settings -> Change Password [pick a new secure password]
Settings -> Accounts and Import -> Google Account Settings -> Change Password Recovery Options [verify secret question, SMS and recovery e-mail address]
Potential Spam:
Settings -> General -> Signature [make sure nothing has been added]
Settings -> General -> Vacation Responder [make sure it's disabled and empty]

E-mail Theft

Settings -> Accounts and Import -> Send Mail As [make sure it is using your correct e-mail address]
Settings -> Filters [no filters that forward or delete e-mail]
Settings -> Forwarding and POP/IMAP -> Forwarding [disabled or correct address]
Settings -> Forwarding and POP/IMAP -> POP Download [disabled]
Settings -> Forwarding and POP/IMAP -> IMAP Access [disabled]
Additional Information
Keeping account secure:  https://mail.google.com/support/bin/answer.py?hl=en&answer=46526
Protecting your account:  https://mail.google.com/support/bin/answer.py?hl=en&answer=29407
More account security info:  http://www.google.com/help/security/
If your account is compromised:  http://mail.google.com/support/bin/answer.py?hl=en&answer=50270
Someone using your address:  http://mail.google.com/support/bin/answer.py?hl=en&answer=50200
Google Employee comments:  http://www.google.com/support/forum/p/gmail/thread?tid=560d53dee40be5e6&hl=en&start=70

Sunday 23 May 2010

Where's my dinner?

We feed the birds from a bird feeder in our garden, just outside the kitchen window. We get Sparrows, Chaffinches, Blue Tits, Great Tits, Starlings, Blackbirds and occasionally Woodpeckers. For some time now we have wondered at the way that, when stocks run low, we get Sparrows and Blue Tits on the window sill, tapping at the window. We are not sure whether they are just finding amusement in their own reflection or actually trying to get our attention – having made some sort of connection between the food appearing in the feeder and the people that live in the house.

Well, today I was sitting here writing a different blog article when there was a knocking at the window, and this is what I saw.


Yep, just to make sure, here is a close-up. The Woodpecker wants his dinner!!!

 

Apologies for the poor exposure/composition, but this guy is so timid that I had to get the camera and take the picture without disturbing him. This was all over in maybe 5-10 seconds; he'd gone almost before the camera was away from my face.

 

Today I feel lucky to be here, now. :)

Friday 21 May 2010

How do you use blog content?

Do you write a blog, have you ever thought about it? I think people fall into one of a few categories when it comes to blogs, especially blogs with technical content.

  1. Writing articles furiously – daily, twice daily and reading dozens of others.
  2. Writing the odd piece of content and reading plenty of others’ output.
  3. Started a blog once and it’s fizzled out, but still reading lots.
  4. Thought about starting a blog someday but never got around to it, hopping into the occasional blog when a link or a Tweet takes them there.
  5. Never thought about writing one but often catching content from them when Google (or other preferred search engine) finds content related to their search.

Now I am not saying that any of these is right or wrong, nor am I saying that anyone should feel any compulsion to be in any particular category. What I would say is that you, as a blog reader, have the power to move blog writers from one category to another.

How, you might ask? How do I have any power over a blog writer? It is very simple – feedback. If you give feedback then the blog writer knows that they are reaching an audience; if there is no response then we are simply writing down our thoughts for what could amount to nothing more than a feeble amount of exercise and a few more keystrokes towards the onset of RSI.

Most blogs have a mechanism to alert the writer when there are comments and, personally speaking, if an email arrives saying there has been a response to a blog article then there is a rush of enthusiasm, a moment of excitement that someone is actually reading and considering the text that was submitted and made available for the whole world to read. I am relatively new to this blog game and could be in some extended honeymoon period, as I have also recently been incorporated into the Simple Talk ‘stable’. I can understand that once you get to the "Dizzy Heights of Ozar" (www.brentozar.com) then getting comments and feedback might not be such a pleasure and may even be rather more of a chore, but that, I guess, is the price of fame. For us mere mortals starting out blogging, getting feedback (or, even at the moment for me, simply the hope of getting feedback) is what keeps it going. The hope that you will pick a topic that hasn’t been done recently by Brad McGehee, Grant Fritchey, Paul Randal, Thomas LaRock or any one of the dozens of rock star bloggers listed here or on SQLServerPedia and so on, and then do it well enough to be found, reviewed, or <shudder> (re)tweeted to bring in more visitors, is what we are striving for – along with the hope that the content we produce will be of benefit to others.

There is only so much point to typing content that no one is reading and putting it on a blog; you may as well just write it in a diary. A technical blog is not like, say, a blog covering photography techniques, where the way to frame and take a picture stands true whether it was written last week, last year or last century - technical content goes sour quite quickly. There isn't much call for articles about yesterday's technology unless it still applies to current versions too, so some content written no more than two years ago isn't worth having now. The combination of knowing a piece of content will not last long and the fact that no one reads it is a strong force against writing anything else. Getting feedback counters that despair and gives a value to writing something new.

I would say that any feedback is good, although there are obviously comments that are so negative or so badly phrased that they would hasten the demise of a blog. In general, though, most feedback will encourage a writer. It may not be a comment that supports or agrees with the main theme of a post, but if it generates discussion or opens up a previously unexplored viewpoint then it is contributing to the blog and is therefore encouraging to the writer.

Even if you only say "thank you" before you leave a blog – having taken a section of script to use for yourself, or having been given a few links to content that has widened your knowledge – it will be very welcome to the blog owner.

Isn't it also the decent thing to do, acknowledging that you have benefited from another's efforts?

Saturday 15 May 2010

Godrevy

Well, it was really windy and a little rainy at times, but I had the camera out for a good while. Sadly the wind put an end to many close-ups of the birds, as it was blowing me off my feet, but here is one that I am pleased with. I know the horizon wasn't really that sloping, but it's a symptom of trying to pan with the bird in the middle of a blowy day on the top of a cliff. No apologies :).

Thursday 13 May 2010

Scary update on the future of this blog

I have the great pleasure (and no small amount of anxiety, if I am completely open about it) to announce that I have been invited to syndicate this blog with Simple-Talk (www.simple-talk.com). This is a brilliant opportunity for me to connect with many more people in the world of SQL Server than via this site, so I truly hope I am good enough to produce suitable quantity and quality to match the other creators at Simple Talk. It's all going on at http://www.simple-talk.com/community/blogs/jonathanallen/default.aspx

This blog will continue, but there may be some items that don't get double-posted.

I would like to thank the staff at Red Gate (www.red-gate.com) for their offer and assistance in getting this all in place; hopefully I won't let you down.

Jonathan



Please be sure to read the code disclaimer page before you use any code in this blog.

Monday 26 April 2010

How to understand SQL Server execution plans

SQL Server Execution Plans by Grant Fritchey

Query execution plans are something that a DBA or T-SQL developer will encounter pretty soon after starting to work with SQL Server. They come in a number of guises - graphical, text or even XML. In any of these formats they are a daunting array of information that seems really important but also pretty complex and technical. With Grant's book by your side you will soon be reading execution plans, relating them to the query you have written, and then be able to alter your script or adjust parts of the database to improve the efficiency of your queries.

The book starts off in a basic way, with worked examples that you can follow on any version of SQL Server that has an AdventureWorks database. You will work your way from the simplest of queries up in complexity to examples using JOINs and GROUP BY, and then to examining which indexes are being used (or, possibly more importantly, why they are not being used).
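
Just as a taster - and this is only a sketch of my own, assuming you have an AdventureWorks database to hand - you can see a plan for yourself by turning on a text version of the actual plan and running a simple aggregate query:

USE AdventureWorks
GO
SET STATISTICS PROFILE ON   -- returns a text version of the actual execution plan alongside the results
GO
SELECT  p.Name ,
        SUM(sod.OrderQty) AS TotalQty
FROM    Sales.SalesOrderDetail AS sod
        JOIN Production.Product AS p ON p.ProductID = sod.ProductID
GROUP BY p.Name
GO
SET STATISTICS PROFILE OFF
GO

The graphical equivalent is the 'Include Actual Execution Plan' option (Ctrl + M) in SSMS.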

The book also covers how to capture execution plans and how to control the way a plan is created with hints, so that it is compiled the way you want it if you have a preference for a certain method.

This book is something that I will be referring to regularly as I build up my understanding of SQL Server Execution Plans. An excellent cover-to-cover read and a great reference book.


There are various versions of the book:
electronic version available from Simple Talk Publishing - SQL Server Execution Plans - eVersion
in print from Amazon.com - SQL Server Execution Plans- in print
and it should be available soon via Amazon.co.uk; I will put a link here when it is.

Please be sure to read the code disclaimer page before you use any code in this blog.


Sunday 18 April 2010

SQLBits VI

Date: Friday 16th April 2010
Location: Church House Westminster
Event: SQLBits VI

Having been to one SQLBits previously I was very keen to get registered and attend this event. This time I was accompanied by my wife (she's also a DBA/developer, so it wasn't a case of having to fund a shopping trip too!) and travelling from Devon meant that we stayed over in London the previous night. The accommodation recommended by SQLBits, The Grange Rochester, was excellent. I had hoped it was a spelling mistake and we were at The Dorchester, but sadly that wasn't the case!

We had a relaxed start to the day and walked for little more than 10 minutes to get to the venue. We easily found our way to the registration area, got our goody bags (more on them later) and then went on towards the exhibition hall. We had plenty of time to grab a coffee and a sausage sandwich, meet some people from the last SQLBits and check out a couple of sponsors' stands before the first session of the day.

Our session plan for the day was:
Monitoring and Tuning Parallel Query Execution - R Meyyappan - link
Inside the SQL Server Query Optimizer - C Cunningham - link
Performance and Scalability through normalisation and set theory - T Rogerson - link
Lunch (inc sponsored session)
Denormalisation - Having your cake and eating it - M Whitehorn and Y Ahmed - link
Optimising Tempdb and temporary object use - C Bolton - link
High Performance Functions - S Sabin - link


Monitoring and Tuning Parallel Query Execution - R Meyyappan - link
Ramesh was on fine form again and we really enjoyed his presentation, picking up some neat tips and techniques to handle situations where the query optimiser recommends (or uses) parallelism in a non-optimal way. Sadly the session was a bit rushed, as the first few minutes were taken up by announcements about the day ahead and changes to sessions following flights being cancelled after the Icelandic volcanic ash closed British airspace. The DVD handed out at the end of the session should fill in the gaps though.

Inside the SQL Server Query Optimizer - C Cunningham - link
Conor was a real treat to listen to, with his accounts of how the Query Optimizer was built to work and how the team managed to account for so many different possible ways of executing a query and then pick the best one. Well, not actually the very best, but it picks a pretty good one pretty much every time - it's a case of getting as close as possible in a reasonable amount of time, so that actually executing the query isn't held up too long. Oh, he also mentioned that he worked on the dynamic management views framework, so make sure you take a minute to thank Conor when you next use one.

Performance and Scalability through normalisation and set theory - T Rogerson - link
Tony started out showing us what he had in mind for his talk - it filled the screen with about 30 different headings. We covered a good number of them, based around normalising data and how it can benefit the integrity and performance of the data stored.

Lunch (inc sponsored session - Improving database development with source control)
We attended the Red Gate-sponsored lunch session, where Stephanie Herr and David Simner took us through the way that all of Red Gate's SQL Toolbelt applications work together to ease the work of a DBA and provide continuous integration. We had a preview of the new SQL Source Control application, which maintains database versioning details while you work in SSMS. All the scripting is done in the background and it supports SVN or TFS source control. A really neat product, and they are looking for EAP volunteers - get a copy of SQL Source Control here and help them develop the application the way you want it to work.

Denormalisation - Having your cake and eating it - M Whitehorn and Y Ahmed - link
We attended this session hoping to get some tips and ideas on what a good policy for denormalising data would be and how to go about it in a structured and well-ordered way. Sadly it was a session that dealt almost wholly in theory and, apart from hearing an interesting discussion about what qualifies as 1NF, 2NF, 3NF and so forth, we didn't get as much as we had hoped from it.

Optimising Tempdb and temporary object use - C Bolton - link
Christian Bolton gave us a peek into the mind of a SQL Server master with his talk about optimising your server with various tempdb settings, when to take action like this and also when it is not advisable. We had details about latches, what trace flag 1118 used to do and why it's no longer needed, the sizing of tempdb, how many files to have and so on.

High Performance Functions - S Sabin - link
Simon's session on using functions was the last of the day and it started with creating an SSMS keyboard shortcut (Tools | Options | Environment | Keyboard, and adding sp_helptext in the Query Shortcuts grid). Next time you highlight an object name in your script and press Ctrl + 3, hey presto, the results window opens with the results of sp_helptext for that object. A nice time saver. We then heard about how scalar functions suck and how it's better to replace them with table-valued functions wherever possible, with the usual selection of clear examples from Simon.
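
To give a flavour of the point (this is my own sketch rather than Simon's code, and it assumes the AdventureWorks Production.Product table), here is the same calculation written as a scalar function and as an inline table-valued function that the optimiser can fold into the calling query's plan:

-- scalar UDF: evaluated once per row
CREATE FUNCTION dbo.DiscountedPrice_Scalar ( @Price money, @Discount decimal(4, 2) )
RETURNS money
AS
BEGIN
    RETURN @Price * (1 - @Discount)
END
GO

-- inline table-valued function: expanded into the calling query's plan
CREATE FUNCTION dbo.DiscountedPrice_TVF ( @Price money, @Discount decimal(4, 2) )
RETURNS TABLE
AS
RETURN ( SELECT @Price * (1 - @Discount) AS DiscountedPrice )
GO

-- row-by-row scalar call
SELECT  ProductID ,
        dbo.DiscountedPrice_Scalar(ListPrice, 0.10) AS DiscountedPrice
FROM    Production.Product
GO

-- set-based equivalent via CROSS APPLY
SELECT  p.ProductID ,
        d.DiscountedPrice
FROM    Production.Product AS p
        CROSS APPLY dbo.DiscountedPrice_TVF(p.ListPrice, 0.10) AS d
GO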

When we got back to the main conference lobby the exhibitors' stands had been dismantled and there was beer and pizza to keep us going while we played air hockey, giant Jenga, Xbox, giant Connect 4 or just chatted. A short prize-giving was held and then people started to drift off home.

Another great SQLBits, I would like to thank all of the organisers, speakers and sponsors for their not inconsiderable contributions to make this event happen and run as smoothly as it did.



Please be sure to read the code disclaimer page before you use any code in this blog.

Friday 9 April 2010

Testing a database connection or Who the hell (UDL) are you?

If you work with software and databases then it won't be long before you encounter problems connecting to SQL Server (or any other data source), and there are many possible reasons for those problems. A good way to start analysing them, with no need for any specific software on the PC, is to use a UDL file.

Create a new text file on the desktop of the computer you are using and rename it with a .udl extension - something like test.udl will be ideal. All you need do is double-click this and a Data Link Properties dialog opens. For some reason this always opens on the wrong tab, so make sure you select the Provider tab and choose the OLE DB provider applicable to your problem. Next move to the Connection tab and work through the required steps, entering a data source name, whether you want to use Windows authentication or specify SQL login details, and finally the database name you wish to connect to. Once this is complete, press the Test Connection button and see what results you get.

You may get a 'test succeeded' message, in which case there is no reason why any other means of connecting from this computer won't work; if another application still fails then it's something wrong with the settings in that application. Alternatively you may get a message detailing why the connection failed. This is the key: you get a message detailing the issue, and this lets you investigate and resolve your problem.

IMPORTANT:
Remember to delete the .udl file once you have used it: if you have entered a username and password then these are saved in plain text with no security and could pose a security risk. All that is required is for the .udl file to be renamed to .txt and the username and password can be read in Notepad or any other text editor.
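
For illustration, a saved .udl file is just a text file containing an OLE DB connection string, something along these lines (all the values here are made up):

[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Password=MyPassword;Persist Security Info=True;User ID=MyLogin;Initial Catalog=MyDatabase;Data Source=MyServer

which is exactly why it needs deleting once you are finished with it.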


Please be sure to read the code disclaimer page before you use any code in this blog.

Saturday 27 March 2010

Configuring a new SQL Reporting Services server

I am now getting into my stride on upgrading to SQL 2008. We have a reasonably compact environment here with only a handful of production servers. We use mirroring and replication in places and we have a couple of report servers to keep the data flowing to the users. I have just moved a database that has a gang of reports pointing to it and have been going through the directory settings for security and so forth, and got to the point where I was repeatedly selecting an object (data sources or folders mostly), going to its properties, selecting "hidden" and applying the change. That got tedious really quickly, so I moved over to SSMS and started reviewing the Catalog table in the ReportServer database.

Now let's get things clear from the start: mucking about in system-generated objects on a working server can easily result in a big mess around your ankles very quickly, so only go here if you are totally happy that you can undo the changes you have made if you get something wrong.

Right, still with me? Great. First of all we need to get to the database and find some information about the objects we want to work with.

USE [ReportServer]
go

SELECT top 20 
    [c].[Path] ,
    [c].[Name] ,
    [c].[Type] ,
    [c].[Description] ,
    [c].[Hidden]
FROM    [dbo].[Catalog] AS c

This shows that we have a column called Hidden, which we can switch to a value of 1 to hide the objects we want, and that each object has a Type value. Some research on my 2005 server tells me that there are 6 types I need to deal with, and they match up to objects as:

--    1-directory
--    2-report
--    3-other file
--    4-linkedreport
--    5-data source
--    6-reportbuilder object
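
Before running the updates below it is worth previewing what you are about to change; a quick count by type, something like this, will do:

-- preview: how many visible items of each type are there?
SELECT  [Type] ,
        COUNT(*) AS ItemCount
FROM    [dbo].[Catalog]
WHERE   [Hidden] != 1
GROUP BY [Type]
ORDER BY [Type]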


So, in my case, I want to hide all images and text files, as they only appear within reports - no one goes to a report server to view a picture or a text file - and I can use:

UPDATE  [dbo].[Catalog]
SET     [Hidden] = 1
WHERE   [Hidden] != 1
        AND [Type] = 3 -- images


Now, I also want to hide any directories called "Data Sources", as I don't want these in the way of users navigating the report directories, and I can do that with:

UPDATE  [dbo].[Catalog]
SET     [Hidden] = 1
WHERE   [Hidden] != 1
        AND ( [Type] = 1 -- directories
              AND [dbo].[Catalog].[Name] LIKE 'Data%Sources' -- directories with specified name(s)
            ) 


I had to use a LIKE and add a % in the name because some of our projects got created with a typo in the deployment properties, so the names are either "DataSources" or "Data Sources"; you may not need this, or you may need to add other code to include all the variations you have.

So there you have it, the whole report server deployment gets configured pretty much instantly. Just remember to run this once in a while as other projects come online so that nothing sneaks past you. Maybe even slip the code into your scheduled weekly maintenance job(s).


[Edit - 27/Mar/2010]
I am adding the following section after receiving some messages (via http://twitter.com/fatherjack) that this technique is inappropriate.


For clarity that may be missing from the above, this is not a method to apply any form of security over a report server and its contents; it is simply to hide items that otherwise clog up the view of a user browsing the report server. There is, to my knowledge, no option when setting permissions on a report server object that allows the user to still access the item but not be able to see it. I apply the hidden property to items that the user inherently HAS permission to consume - images may be used in reports they have full access to, and the data sources are obviously needed for the reports to run properly. There is no need for the user to have to navigate past these items, so I hide them.
Permissions are used to control access to data and strongly tie in with Active Directory users and groups and SQL Server Logins and Users, this ensures that no person has access to a report that contains data they are not authorised to view.


Please be sure to read the code disclaimer page before you use any code in this blog.

Tuesday 9 March 2010

PowerShell in use - moving SQL Mail Profiles

I am in the middle of a SQL Server 2008 installation and today I needed to transfer the Mail accounts and profiles to the new instance in order to be ready for use by the incoming system. I have never been a fan of the GUI for configuring mail on SQL 2005, as it involves a lot of back-and-forth clicking to get everything done.

I decided to take a look at what PowerShell would let me do. After getting connected to the outgoing server it was simply a case of changing to the mail directory and then the accounts directory, and using the script() method of the objects to get the scripts into a file on my local hard drive.

set-Location SQLServer:\mail\accounts
get-ChildItem | foreach {$_.script()}| out-File C:\SQLServer_MailAccounts.sql

then I just moved to the profiles directory and repeated the process

set-Location SQLServer:\mail\profiles
get-ChildItem | foreach {$_.script()}| out-File C:\SQLServer_MailProfiles.sql

Once that was done, I opened the files in SSMS, connected to the new server and ran them to create the mail accounts first and then the profiles.
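
As a quick check on the new instance (just a sketch, not part of the original process), the Database Mail system tables in msdb will confirm that everything arrived:

-- confirm the accounts and profiles exist on the new instance
SELECT name FROM msdb.dbo.sysmail_account
SELECT name FROM msdb.dbo.sysmail_profile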

Really simple, really quick. I am beginning to see ways that PowerShell will continue to help me.

Please be sure to read the code disclaimer page before you use any code in this blog.

Monday 8 March 2010

The varying cost of free stuff

That doesn't sound right does it? If something is free it doesn't cost anything and if you have lots of those things they all cost the same - right?

Well, not entirely. Let me explain in a little more detail. I am learning PowerShell, as you may well know if you have seen any of my other recent blog items, and so am looking around the internet to find code samples and try out different methods of working with PS scripts. I have previously mentioned PowerShell Analyzer - so far the best PS script editor I have found - and the fact that it's free.

In contrast, I also found a site that supports SQL DBAs with various applications/tools to make their job easier by automating all the important tasks. They have a link to a set of PowerShell scripts that they have accumulated, presumably verified and declared worthy of distribution to the SQL community. The route to them from their own (paid) PS editor details page is via a "Free Scripts" link, where you get another option to buy their paid editor and click on another link to "Download Now". You then reach a personal details page where 12 pieces of data are required (note: only 1 is optional) and there are 2 checkboxes to discourage/encourage follow-up sales calls/emails. Once this data has been verified - yes, there is intense validation of the data formats you input - you get to a page that says
"Thank you! You will soon receive a confirmation email with the download link to the XX Company name product(s) you selected.".
I went through this process a while ago and thought 'great, when I get a chance I'll download the scripts and see what they do'. When I got the chance I opened the email, clicked the link and the browser opened a page that said
"Downloads.Your download link has expired. Please email support to request reactivation of your link. For the fastest response, please ensure that you send from the email address you used when registering for the download. Re-activation requests will be handled in 1 business day or less."
Gah, this is starting to hurt now; I'm building a dislike for a product I haven't even seen. I had to submit a stupid amount of detail to get nothing a week ago, and now that I have come to take a proper look at the product it's been taken from me. I email support. Support reply: `just download the product again`. OK, building a proper dislike of the whole company now. In all honesty, I can't be bothered. My time is not worth a lot in the grand scheme of things, but it's worth more than dancing through their hoops. Whatever paid products they have, they are unlikely to see me on their future customer list, and if I get any emails from them following this then the domain name is going on the spam list. Just as my curiosity was piqued, I compared the URL from my first download attempt with the one from my trial today. There have been just under 2000 people getting downloads in this fashion since my first attempt, and 330 in the last 48 hours. How do I know? The link is simply their domain with a querystring parameter called CustomerID that has an incremental integer value. I wonder what their conversion rate from enquiries to sales is? Perhaps their paid-up users are simply more patient people than I am.

By contrast, the PowerShell Analyzer download actually made its way to my PC with two clicks from the main page.

I understand there are different business models, but Shell Tools have endeared themselves to me by making their free stuff freely available. They have no idea who I am or what I have done on their site, but I would go to them if I had a need that one of their future products could fix, simply for the help they gave me in using PSA for free without any strings. I don't feel stalked or tracked, or like I have somehow given away a load of information for no good reason. There are other companies that have paid-for products but also make some tools available for free with little or no data capture attempts. They are the sort of companies that make me a friend, and I want to support their commitment to giving something to the community.

It's possible either product would have been useless or brilliant. PSA is brilliant AND free; the other tool would have had to be brilliant in order to balance out what I have given that company already.

While I am at it, another tool that I use regularly, that is free but does a great job, is Paint.NET - an image manipulation tool that punches way above its weight. This product has a PayPal link in its menus and on its site, as it is a project that isn't associated with any business endeavour. I wholeheartedly recommend Paint.NET too.

Please be sure to read the code disclaimer page before you use any code in this blog.

Thursday 4 March 2010

Getting to SQL Server from PowerShell script interface

Right, I'm progressing with PowerShell a little at a time. I still don't have any books, but whenever I need to reference SQL information (not data - I am only getting info on server settings, not accessing customer database data) I try the PS route first.

In order to connect to a SQL Server from the command line you need to add two PS snapins:

add-PSSnapin sqlservercmdletsnapin100
add-PSSnapin sqlserverprovidersnapin100

Today I am looking at collation on my server, so I first need to 'map a drive':

new-PSDrive -name ~Drivename~ -psprovider SQLServer -root sqlserver:\SQL\~servername~\~instancename~

then change my current location to that server's databases:

set-location ~Drivename~:\databases

and then simply use

get-ChildItem | Select-Object name, collation

to get a list of database names on that SQL instance and their collation setting.
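
For comparison - and as a fallback when PowerShell isn't to hand - the same information is available from plain T-SQL:

SELECT  name ,
        collation_name
FROM    sys.databases
ORDER BY name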

As always, if you want to see the extra switches or other information about PS commands, simply type get-help ~pscommand~ for the command you are interested in, e.g. get-help new-PSDrive will return the help for the new-PSDrive command.

Please be sure to read the code disclaimer page before you use any code in this blog.

Wednesday 3 March 2010

How to slipstream SQL Server 2008 RTM and SQL Server 2008 SP1

I have a number of SQL Servers to upgrade from 2005 to 2008 this year and, as Service Pack 1 is already out, it makes sense to install it at the same time as the main software. In Microsoft parlance this is known as 'slipstreaming'.

In theory it involves getting the setup files for SQL Server 2008 and applying the service pack to them rather than to the installed software. It's covered on an MSDN blog here http://blogs.msdn.com/petersad/archive/2009/02/25/sql-server-2008-creating-a-merged-slisptream-drop.aspx and those are the instructions that I followed. However, there are a couple of pointers that I had to work through, so this could be considered a supplemental blog for the slipstream creation process. The step numbers used relate to those in Peter Saddow's blog.

Step 1: I downloaded the ISO files. These need to be mounted (Virtual Clone Drive) or extracted to disc and then the contents copied to the directory as described. Note: I could not get PowerShell to do the copy from my VCD drive - it errored, saying the drive was not the right format. Windows drag and drop does it anyway.
Step 2: No problems (apart from time taken to download!)
Step 3: Execute each statement listed one at a time and leave it a few minutes before you start the next one. You will see an extraction progress bar for each package.
Step 4: This is simply moving the files up a directory level and overwriting the files of the same name already there. Robocopy is a tool installed as part of a server admin resource kit; you will need to install that in order to use the command as given, otherwise follow the instructions exactly.
Step 5: Robocopy again but simple
Step 6: Easy enough - edit the 3 .ini files and add a line (see the example after this list).
Step 7: Install SQL Server 2008 and note the slipstream identifiers as you go through the installation process.
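
For reference, the line added to each of the three .ini files in Step 6 is the PCUSOURCE entry, which points setup at the folder containing the extracted service pack files. Check Peter's post for the exact wording, but it is along the lines of:

PCUSOURCE=".\PCU"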

Hope this helps.

Other useful links
SQL Server 2008 Slipstream FAQ : http://blogs.msdn.com/petersad/archive/2009/04/24/sql-server-2008-slipstream-frequently-asked-questions.aspx
CSS SQL Server Engineers How to fix your SQL Server 2008 Setup before you run setup... Part 1
CSS SQL Server Engineers How to fix your SQL Server 2008 Setup before you run setup... Part 2

Please be sure to read the code disclaimer page before you use any code in this blog.

Tuesday 2 March 2010

Brad's Sure Guide to SQL Server Maintenance Plans

That's the title of the latest book from Brad McGehee and Simple Talk Publishing. It's available from Amazon here.
ISBN 978-906434-34-2
Price: GBP 19.99 / USD 29.99 / CAD 35.99

This book will enable any person responsible for data stored in a SQL Server database to ensure it is being managed to a required minimum standard with a small amount of effort and possibly limited experience. This is not a book that long-time DBAs, who already have scripts in place to maintain their databases, will get a lot out of; it's for the people who are becoming known as 'Accidental DBAs'. Maybe you were closest to the server when the last DBA quit, or you are a system administrator with a new system based on SQL Server and you don't have a DBA in-house. Whichever way you have found yourself looking after SQL Server databases, this book is for you.

It explains very clearly what a maintenance plan is designed to do, what it can and can’t do and in some cases why you don't want it to do some things. Once the intent of each feature is established Brad then walks you through configuring the task options to your requirements and how it can be linked to or affects other tasks you need carried out on your databases. A clear format of what the task does, why you want it done (or not) and how to make it happen is followed in each chapter. Pitfalls and limitations of the various tasks are explained clearly and where there are options to avoid or work around these they are explained so that they can be implemented. The tasks are described with enough background information that someone unsure about SQL Server will clearly understand what the task is attempting to achieve and will then be able to decide whether that is something they want to apply to their systems.

While the majority of the book refers to the process followed by the Maintenance Plan Wizard Brad also explains how to make your own custom Maintenance Plan or alter existing plans with the Maintenance Plan Designer. Implementing schedules and operator notifications into your plan is explained so that a user can keep up to date with the status of any executions without having to read through the detailed reports along with using logic to control flow from one step to another within a plan and adding multiple copies of tasks in order to get the exact results needed.

If you have any databases in your remit and you are wondering how to look after them, or want a guide on how to make your existing maintenance plans more effective, then this book will be a valuable asset to you.

It could be as simple as reading this book for a couple of hours each on Monday, Tuesday and Wednesday; then, over a similar time on Thursday and Friday, you could get your SQL databases fully covered by a set of maintenance plans that ensure your data is well maintained and backed up.

Brad's Sure Guide to SQL Server Maintenance Plans

Sunday 28 February 2010

Snapshot Isolation level

OK, I've just a minute to put this up, but I have used PowerShell to help someone with a problem. It wasn't the only way, and may not have been the simplest way, but it certainly worked. A question was posted on www.ServerFault.com asking "How can I tell if Snapshot Isolation is turned on?", and my reaction was to see if PowerShell could tell me what the snapshot isolation setting was on my SQL databases.

The easiest way to get to PowerShell from SSMS is to right-click in the Object Explorer and then choose the PowerShell link in the menu. This opens a PowerShell window at the level that you clicked. Simply typing in
get-childitem | select name, snapshotisolationstate
returns the information that you want for the server you are connected to. Simple, quick and useful. Maybe this PowerShell thing has its uses for a SQL database administrator after all?!
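
For completeness, the same answer is available from T-SQL via sys.databases:

SELECT  name ,
        snapshot_isolation_state_desc ,
        is_read_committed_snapshot_on
FROM    sys.databases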

Please be sure to read the code disclaimer page before you use any code in this blog.

Friday 26 February 2010

PowerShell - my first steps

So I asked this question on ask.sqlservercentral.com/questions and the consensus of opinion seems to be that PowerShell should be in my future. I have a few books that I am deciding between, but I have downloaded some eBooks and installed some new applications. It was a real hassle getting PSv2 installed, as it seems to conflict with PSv1. I have downloaded PowerShell Analyzer from Shell Tools and it works very well. Rather than just the PS command line you get a full IDE that gives a lot of support to a person new to PS.

As a small nugget of what I have covered so far, have this little section of PowerShell code:

Get-Process *sql* | Select-Object name, CPU, virtualmemorysize

This will bring back a list of processes on the local machine that have "sql" in their name, their CPU usage and the virtual memory they are using.
Which for me tells me this:

Name                          CPU          VirtualMemorySize
----                          ---          -----------------
RedGate.SQLBackup.UI          184.125              272957440
RedGate.SQLPrompt.TrayApp     3.6875               187170816
SQL Response Client           413.03125            223125504
SQLAGENT                      48.1875              138895360
sqlbrowser                    0.046875              14163968
sqlservr                      2198.96875          1570562048

Neat, huh?

Monday 22 February 2010

People You May Know (Linked In version)

That's the heading of a little box on my LinkedIn page (www.linkedin.com). If you have never heard of LinkedIn before, it's like Facebook for business people, but with fewer photos.

Now lots of 'social networking' sites have a way of finding other people you know that you haven't yet connected with. It makes sense: the more contacts you have on their site, the more traffic will increase, as there is more chance of interaction. Some use your email address book for their search; others look at your history - education, work etc. - and even the history of some of your current contacts, so that people you have something in common with are suggested as potential friends.

LinkedIn is different.

How many people do you know - let me clarify that, how many people have you met, spent some time with and can remember the name of? 50? 100? 200? 500? There's lots. When you stop to think about family, school friends, neighbours, work colleagues, those people you met on holiday in Tenerife and so on... Out of those people, how many had your name or something similar (Dave Gorman need not answer this)? In my case - none. In your case, I'm willing to bet it's fewer than 2 in the majority of cases.

Well, LinkedIn seem to think that, despite the education and career history of the thousands (#) of people on their records, the best people they can match to me all have the same name as me. Today my suggestions are Jon Allen, Jonathan Allen, Jon Allen, Jonathan Allen, Jonathan Allen, Jon Allen and Jonathan Allen, all from America or Mexico and with no career connections in common with me - I program computers for a living and most of them seem to be involved with banking in some way.

LinkedIn, I say this about your friends/contacts suggestion method: it's rubbish.

Now, rant over, let's see what rubbish this Monday brings me.


# - Edit - an email from LinkedIn says they have 60 million members.

Monday 8 February 2010

Keeping your database files tidy

Behind every database there are database files. Traditionally they are denoted by one of three file extensions: mdf is the main data file, ldf is the log file and ndf is used for any additional data files. One mdf and one ldf are the minimum requirement for a SQL Server database.
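
If you want to see which physical files sit behind the databases on an instance, sys.master_files will list them - a quick sketch:

SELECT  DB_NAME(database_id) AS DatabaseName ,
        type_desc ,        -- ROWS (data) or LOG
        physical_name
FROM    sys.master_files
ORDER BY DatabaseName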

These files are locked by SQL Server when it is running, ensuring that updates only go through the database engine and are therefore managed correctly. It is possible to separate SQL Server and the database files in a number of ways, one of which is to detach the database using sp_detach_db. This removes the database from the SQL Server instance and releases the database files to be accessed by any other process that can reach them. Perhaps you were moving the database from one server to another, or whatever the circumstance, but it is possible to leave the database files on your server and forget they are there.

This means a number of things:
1 - the files are taking up HDD space and potentially being backed up by your nightly processes (you are backing up regularly, aren't you?)
2 - the files can be accessed by any editor that can get to the folder location where they are stored. Notepad can open an mdf file. Don't believe me?

Open up your favourite T-SQL editor and run this:
CREATE DATABASE DETACHTEST
GO
USE DETACHTEST
GO
Create table TestTable
(Myname varchar(30))
go

insert into testtable
select 'Fatherjack'
go

select * from TestTable
go

use master
go

sp_detach_db 'DETACHTEST'


That will have created a database, created a table in it, inserted data and then detached the database.

Now open up Windows Explorer and go to your default SQL data folder; in there you will find a file called DETACHTEST.mdf. Open that in Notepad or any other text editor and search for the string "fatherjack". There it is, in plain text. I agree it's not readily usable, and a database of hundreds of tables and millions of rows will make data extraction difficult without a SQL Server, but it is wholly possible and therefore wholly insecure to have these files lying around on your server.
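
Incidentally, don't leave the demo database lying around as exactly the kind of orphaned file this post is about. When you are done, reattach it and drop it properly (a sketch - adjust the file paths to wherever your default data folder is):

CREATE DATABASE DETACHTEST
    ON ( FILENAME = 'C:\SQLData\DETACHTEST.mdf' ),
       ( FILENAME = 'C:\SQLData\DETACHTEST_log.ldf' )
    FOR ATTACH
GO
DROP DATABASE DETACHTEST
GO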

To find any of these files that may have got left behind we need a process to locate and remove them. My preferred method to do this involves a small script to be run in LogParser and then a nifty join to some live SQL data and a look at the date the files were last used.

Here is the LogParser script to get the file name, location, last access time and last write time from any folder up to 10 levels down from the root on the C:\ and D:\ drives where the file extension is either mdf or ldf:
logparser "SELECT path, name, lastaccesstime, lastwritetime into datafiles from '\\London\d$\*.mdf', '\\London\c$\*.mdf','\\London\c$\*.ldf','\\London\d$\*.ldf'" -i:fs -recurse:10 -o:sql -database:AdminDB -server:SQLLondon -createtable:on -e:10

(Where London is the Windows server and SQLLondon is the SQL Server instance installed on it.)

Here is the SQL to work with the data that LogParser retrieves and give us a hit list of files to track down and deal with (swap in your own server, share and database names):
USE [CentralDB]
GO
SELECT DISTINCT
COALESCE(mf.name, 'Not attached to a database. ***') AS [Database Name],
[d].[Path],
[d].[Name],
[d].[LastAccessTime],
[d].[LastWriteTime]
FROM [dbo].[datafiles] AS d
LEFT JOIN [BOA\PARIS].[master].[sys].[master_files] AS mf ON REPLACE(REPLACE([d].[Path], '\\boa\d$\', 'd:\'), '\\boa\c$\', 'c:\') = [mf].physical_name
UNION
SELECT '-----',
'--- Items above here not accessed in the last day --',
'-----',
DATEADD(d, -1, GETDATE()), -- 24 hours ago
''
ORDER BY [LastAccessTime],
[Database Name]

The resulting data will be a list of database files that have not been accessed recently and whether they are attached to a database on the server. You may wish to adjust the value I have used in the lower part of the query to suit your environment - 24 hours may not be appropriate to you.

Thursday 21 January 2010

Blog challenge

This post is a slight departure from the normal content; it's part of what might be considered a challenge from the blog at http://thesqlagentman.com/2010/01/whose-blog-is-it-anyway/

I woke on Sunday morning in a panic - you know the way, it's like you accelerate from 0 to 100 mph in 2 seconds - and slowly the bedroom materialised around me. I made a mental note not to read my Twitter feed just before going to bed; a dream featuring Live Nude Cats - that's the only place on the internet where a phrase like that would not be considered wholly out of place. I also know that I'm going to have flashbacks of the horses all through the day; there is no natural way that they should have been in those positions. I guess, as it was a dream, I can rest assured that none of them got hurt, but I feel that there wouldn't have been a complete rescue. You know, like in the disaster movies where one of the most vulnerable of the victims gets left behind as the helicopter leaves the disaster site. Yup, I had a feeling that the pilot would be pleased with his work on the whole, but as soon as he looks down he's going to be very sad about the pony.

I can't claim to be at all creative, but this dream was so vivid I wonder whether I could remember enough of it to pitch it as a TV series? After all, as we get more and more TV channels, the quality of the content is dropping off a cliff. Who could I ask? It would need to be someone with enough clout to carry the whole project through - Steven "Of Course A Fourth Indiana Jones Is A Good Idea" Spielberg, or I could just ask George “Let’s Have Padme Die Of A Broken Heart Instead Of Anakin Crushing Her To Death” Lucas; he hasn't been too busy recently.

In reality I understand this is unlikely, so how can we do this? Maybe by simply using some magical tool, library, or bong? No, we need to... put it in a blog.

I guess there isn't so much to tell about this dream; maybe it's a pity I've started, so I'll finish!

Monday 18 January 2010

Writing TSQL faster with templates

SSMS Templates

The management tools that come with SQL Server are the way that most DBAs access and administer their database environment. SSMS has plenty of shortcuts and time-saving features, one of which is the use of T-SQL templates. SSMS templates are accessed via their own menu and are stored in folders, just as you would access files in a Windows Explorer window. SSMS comes with a whole load of templates ready to run, but you can add to and alter these very easily.

Firstly, let's look at getting a template onto the query editor and running it successfully.
Right, open up an SSMS session and hit Alt + N (or Ctrl + N in SSMS for SQL 2005) to open a new T-SQL editor window. Now it's possible that you will have the Templates toolbar ready to use already, but hit Ctrl + Alt + T to make it active. Let's pick something simple to start with: open up the Database folder and drag the Create Database template over to your query editor page. Right, we have the text on the page, but we have to do a little work before we can hit F5 or click Execute.

You'll see that there are sections of code inside angle brackets (< >). These are SSMS template parameters; we need to take this code out and replace it with something specific to our environment in order to make the script parse and then execute successfully. We could swipe each parameter with the mouse, but that could get tedious if we pop out a template with loads of them - it's easier to just press Alt + Q, Alt + S (Ctrl + Shift + M for SQL 2005 and earlier). This brings up the parameter completion dialog for you.
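
Each parameter takes the form <label, data type, default value>. The Create Database template body, for example, is roughly along these lines (a sketch rather than the exact shipped template text):

USE master
GO

-- Drop the database if it already exists
IF EXISTS ( SELECT name FROM sys.databases WHERE name = N'<Database_Name, sysname, Database_Name>' )
    DROP DATABASE <Database_Name, sysname, Database_Name>
GO

CREATE DATABASE <Database_Name, sysname, Database_Name>
GO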

Simply type the database name in the highlighted area and press Enter. Job done - it is now ready to run. The text that you entered in the parameters dialog has been put in all three places and the script will now successfully create a database (with all the default options, such as file locations, that are set at server level), provided you have permission on the server to execute this command.
