Saturday, 27 March 2010

Configuring a new SQL Reporting Services server

I am now getting into my stride on upgrading to SQL 2008. We have a reasonably compact environment here, with only a handful of production servers; we use mirroring and replication in places and have a couple of report servers to keep the data flowing to the users. I have just moved a database that has a gang of reports pointing to it and, while going through the directory settings for security and so forth, I got to the point where I was repeatedly selecting an object (data sources or folders mostly), opening its properties, ticking "Hidden" and applying the change. That got tedious really quickly, so I moved over to SSMS and started reviewing the Catalog table in the ReportServer database.

Now let's get things clear from the start: mucking about in system-generated objects on a working server can easily result in a big mess around your ankles very quickly, so only go here if you are totally happy that you can undo any changes you make if you get something wrong (taking a backup of the ReportServer database before you start would be a sensible precaution).

Right, still with me? Great. First of all we need to get into the database and find some information about the objects we want to work with:

USE [ReportServer]
go

SELECT TOP 20
    [c].[Path] ,
    [c].[Name] ,
    [c].[Type] ,
    [c].[Description] ,
    [c].[Hidden]
FROM    [dbo].[Catalog] AS c

This shows we have a column called [Hidden] that we can set to 1 to hide the objects we want, and that each object has a [Type] value. Some research on my 2005 server tells me there are 6 types I need to deal with, and they match up to objects as:

--    1 - directory
--    2 - report
--    3 - other file (image, text file etc.)
--    4 - linked report
--    5 - data source
--    6 - report builder object

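Before hiding anything it is worth seeing how many objects of each type you actually have; a quick sketch against the same table:

SELECT  [Type] ,
        COUNT(*) AS [ObjectCount]
FROM    [dbo].[Catalog]
GROUP BY [Type]
ORDER BY [Type]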

So, in my case, I want to hide all images and text files, as they are only used within reports (no one goes to a report server to view a picture or a text file), and I can use:

UPDATE  [dbo].[Catalog]
SET     [Hidden] = 1
WHERE   [Hidden] != 1
        AND [Type] = 3 -- other files (images, text files)


Now, I also want to hide any directories called "Data Sources", as I don't want these in the way of users navigating the report directories, and I can do that with:

UPDATE  [dbo].[Catalog]
SET     [Hidden] = 1
WHERE   [Hidden] != 1
        AND ( [Type] = 1 -- directories
              AND [dbo].[Catalog].[Name] LIKE 'Data%Sources' -- directories with specified name(s)
            ) 


I had to use a LIKE with a % in the name because some of our projects were created with a typo in the deployment properties, so the names are either "DataSources" or "Data Sources"; you may not need this, or you may need to adjust the pattern to cover whatever variations you have.
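If you are not sure what a pattern will catch, run the same predicate as a SELECT first and review the rows before letting the UPDATE loose on them:

SELECT  [Path] ,
        [Name] ,
        [Hidden]
FROM    [dbo].[Catalog]
WHERE   [Hidden] != 1
        AND [Type] = 1 -- directories
        AND [Name] LIKE 'Data%Sources'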


So there you have it, the whole report server deployment gets configured pretty much instantly. Just remember to run this once in a while as other projects come online so that nothing sneaks past you. Maybe even slip the code into your scheduled weekly maintenance job(s).
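If you do put it in a job, a single batch covering both cases (the same statements as above, just gathered together) would look something like this:

USE [ReportServer]
go

-- hide type 3 objects (images, text files) that are only used within reports
UPDATE  [dbo].[Catalog]
SET     [Hidden] = 1
WHERE   [Hidden] != 1
        AND [Type] = 3

-- hide the data source directories so users don't have to navigate past them
UPDATE  [dbo].[Catalog]
SET     [Hidden] = 1
WHERE   [Hidden] != 1
        AND [Type] = 1
        AND [Name] LIKE 'Data%Sources'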


[Edit - 27/Mar/2010]
I am adding the following section after receiving some messages (via http://twitter.com/fatherjack) suggesting that this technique is inappropriate.


For clarity that may be missing from the above: this is not a method of applying any form of security over a report server and its contents; it is simply a way to hide items that otherwise clog up the view of a user browsing the report server. There is, to my knowledge, no option when setting permissions on a report server object that allows a user to access the item but not be able to see it. I apply the hidden property to items that the user inherently HAS permission to consume - images may be used in reports they have full access to, and the data sources are obviously needed for the reports to run properly. There is no need for the user to have to navigate past these items, so I hide them.
Permissions are used to control access to data and tie in strongly with Active Directory users and groups and SQL Server Logins and Users; this ensures that no person has access to a report that contains data they are not authorised to view.


Please be sure to read the code disclaimer page before you use any code in this blog.

Tuesday, 9 March 2010

PowerShell in use - moving SQL Mail Profiles

I am in the middle of a SQL Server 2008 installation and today I needed to transfer the Mail Accounts and Profiles to the new instance, ready for use by the incoming system. I have never been a fan of the GUI for configuring Mail on SQL 2005, as it involves a lot of back-and-forth clicking to get everything done.

I decided to take a look at what PowerShell would let me do. After connecting to the outgoing server it was simply a case of changing to the mail directory, then to the accounts directory, and using the Script() method of the objects to write the scripts out to a file on my local hard drive.

# script out the Database Mail accounts on the outgoing server
set-Location SQLServer:\mail\accounts
get-ChildItem | ForEach-Object { $_.Script() } | out-File C:\SQLServer_MailAccounts.sql

Then I just moved to the profiles directory and repeated the process:

# script out the mail profiles in the same way
set-Location SQLServer:\mail\profiles
get-ChildItem | ForEach-Object { $_.Script() } | out-File C:\SQLServer_MailProfiles.sql

Once that was done, I opened the files in SSMS, connected to the new server and ran them to create the mail accounts first and then the profiles.
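If you would rather stay in PowerShell for that last step, the Invoke-Sqlcmd cmdlet (from the sqlservercmdletsnapin100 snapin) can run the files for you; a sketch, where NewServer is a placeholder for your target instance name:

# run the generated scripts against the new instance - accounts first, then profiles
Invoke-Sqlcmd -ServerInstance NewServer -InputFile C:\SQLServer_MailAccounts.sql
Invoke-Sqlcmd -ServerInstance NewServer -InputFile C:\SQLServer_MailProfiles.sql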

Really simple, really quick. I am beginning to see the ways that PowerShell will continue to help me.

Please be sure to read the code disclaimer page before you use any code in this blog.

Monday, 8 March 2010

The varying cost of free stuff

That doesn't sound right, does it? If something is free it doesn't cost anything, and if you have lots of free things they all cost the same - right?

Well, not entirely. Let me explain in a little more detail. As you may well know if you have seen any of my other recent blog items, I am learning PowerShell, and so I am looking around the internet to find code samples and try out different methods of working with PS scripts. I have previously mentioned PowerShellAnalyser, so far the best PS script editor I have found, and the fact that it's free.

In contrast, I also found a site that supports SQL DBAs with various applications/tools to make their job easier by automating all the important tasks. They have a link to a set of PowerShell scripts that they have accumulated, presumably verified, and declared worthy of distribution to the SQL community. The route to them from the details page of their own (paid) PS editor is via a "Free Scripts" link, where you get another offer of the paid editor and have to click yet another link to "Download Now". You then reach a personal details page where 12 pieces of data are required (note: only 1 is optional) and there are 2 checkboxes to discourage/encourage follow-up sales calls/emails. Once this data has been verified - yes, there is intense validation of the data formats you input - you get to a page that says:
"Thank you! You will soon receive a confirmation email with the download link to the XX Company name product(s) you selected.".
I went through this process a while ago and thought 'great, when I get a chance I'll download the scripts and see what they do'. When I got the chance I opened the email, clicked the link and the browser opened a page that said
"Downloads.Your download link has expired. Please email support to request reactivation of your link. For the fastest response, please ensure that you send from the email address you used when registering for the download. Re-activation requests will be handled in 1 business day or less."
Gah, this is starting to hurt now; I'm building a dislike for a product I haven't even seen. I had to submit a stupid amount of detail to get nothing a week ago, and now that I have come back to take a proper look at the product it's been taken from me. I emailed support. Support replied: "just download the product again". OK, building a proper dislike of the whole company now. In all honesty, I can't be bothered. My time is not worth a lot in the grand scheme of things, but it's worth more than dancing through their hoops. Whatever paid products they have, they are unlikely to see me on their future customer list. If I get any emails from them following this then the domain name is going on the spam list.

Just as my curiosity was piqued, I compared the URLs from my first download attempt and my trial today. There have been just under 2,000 people getting downloads in this fashion since my first attempt, and 330 in the last 48 hours. How do I know? The link is simply their domain with a querystring parameter called CustomerID that has an incremental integer value. I wonder what their conversion rate from enquiries to sales is? Perhaps their paid-up users are simply more patient people than I am.

By contrast, the PowerShellAnalyser download is actually making its way to my PC with two clicks from the main page.

I understand there are different business models, but Shell Tools have endeared themselves to me by making their free stuff freely available. They have no idea who I am or what I have done on their site, but I would go to them if I had a need that one of their future products could fix, simply because of the help they gave me in using PSA for free without any strings. I don't feel stalked or tracked, or like I have somehow given away a load of information for no good reason. There are other companies that have paid-for products but also make some tools available for free with little or no data capture. Those are the sort of companies that make a friend of me, and I want to support their commitment to giving something to the community.

It's possible either product would have been useless or brilliant. PSA is brilliant AND free; the other tool would have had to be brilliant just to balance out what I have given that company already.

While I am at it, another free tool that I use regularly and that does a great job is Paint.NET, an image manipulation tool that punches way above its weight. This product has a PayPal donation link in its menus and on its site, as it is a project that isn't associated with any business endeavour. I wholeheartedly recommend Paint.NET too.

Please be sure to read the code disclaimer page before you use any code in this blog.

Thursday, 4 March 2010

Getting to SQL Server from PowerShell script interface

Right, I'm progressing with PowerShell a little at a time. I still don't have any books, but whenever I need to reference SQL Server information (not data: I am only getting information on server settings, not accessing customer database data) I try the PS route first.

In order to connect to a SQL Server from the command line you need to add two PS snapins. To do that, issue the commands:

add-PSSnapin sqlservercmdletsnapin100
add-PSSnapin sqlserverprovidersnapin100

Today I am looking at collation on my server, so I first need to 'map a drive':

new-PSDrive -name ~Drivename~ -psprovider SQLServer -root sqlserver:\SQL\~servername~\~instancename~

then change my current location to the databases folder on that drive:

set-location ~Drivename~:\databases

and then simply run:

get-ChildItem | Select-Object name, collation

to get a list of the database names on that SQL instance and their collation settings.
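Put together, the whole check looks something like this - a sketch in which MYBOX, MyServer and DEFAULT are placeholder names I have used for the drive, the server and the instance, so swap in your own:

# load the two SQL Server snapins (SQL 2008 era)
add-PSSnapin sqlservercmdletsnapin100
add-PSSnapin sqlserverprovidersnapin100

# map a drive to the instance and move to its databases folder
new-PSDrive -Name MYBOX -PSProvider SqlServer -Root SQLSERVER:\SQL\MyServer\DEFAULT
set-Location MYBOX:\databases

# list each database with its collation setting
get-ChildItem | Select-Object Name, Collation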

As always, if you want to see the extra switches or other information about a PS command, simply type get-help ~pscommand~ for the command you are interested in, e.g. get-help new-PSDrive will return the help for the new-PSDrive command.

Please be sure to read the code disclaimer page before you use any code in this blog.

Wednesday, 3 March 2010

How to slipstream SQL Server 2008 RTM and SQL Server 2008 SP1

I have a number of SQL Servers to upgrade from 2005 to 2008 this year, and as Service Pack 1 is already out it makes sense to install it at the same time as the main software. In Microsoft parlance this is known as 'slipstreaming'.

In theory it involves getting the setup files for SQL Server 2008 and applying the service pack to them, rather than to the installed software. It's covered on an MSDN blog here: http://blogs.msdn.com/petersad/archive/2009/02/25/sql-server-2008-creating-a-merged-slisptream-drop.aspx and these are the instructions that I followed. However, there were a couple of pointers I had to work through, so this could be considered a supplemental blog for the slipstream creation process. The step numbers used relate to those in Peter Saddow's blog.

Step 1: I downloaded the ISO files. These need to be mounted (Virtual Clone Drive) or extracted to disc, and the contents then copied to the directory as described. Note: I could not get PowerShell to do the copy from my VCD drive - it errored, saying the drive was not the right format. Windows drag and drop does it anyway.
Step 2: No problems (apart from time taken to download!)
Step 3: Execute each statement listed one at a time and leave a few minutes before you start the next one. You will see an extraction progress bar for each package.
Step 4: This is simply moving the files up a directory level and overwriting the files of the same name already there. Robocopy is a tool installed as part of a server admin resource kit; you will need to install that in order to run the command given, otherwise follow the instructions exactly.
Step 5: Robocopy again, but simple.
Step 6: Easy enough - edit the 3 .ini files and add a line (see the sketch after this list).
Step 7: Install SQL Server 2008 and note the slipstream identifiers as you go through the installation process.
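For reference, the commands behind steps 3 to 6 look something like the sketch below (x64 shown; the KB number, folder names and excluded file are as I recall them from Peter Saddow's post, so verify against the original before running anything):

REM step 3 - extract the service pack package into a PCU folder under the setup files
SQLServer2008SP1-KB968369-x64-ENU.exe /x:C:\SQLServer2008_FullSP1\PCU

REM step 4 - copy the updated Setup.exe and Setup.rll up over the RTM copies
robocopy C:\SQLServer2008_FullSP1\PCU C:\SQLServer2008_FullSP1 Setup.exe Setup.rll

REM step 5 - copy the architecture folder across, keeping the original chainer package data file
robocopy C:\SQLServer2008_FullSP1\PCU\x64 C:\SQLServer2008_FullSP1\x64 /XF Microsoft.SQL.Chainer.PackageData.dll

REM step 6 - add this line to defaultsetup.ini in each of the x86, x64 and ia64 folders:
REM PCUSOURCE=".\PCU"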

Hope this helps.

Other useful links
SQL Server 2008 Slipstream FAQ : http://blogs.msdn.com/petersad/archive/2009/04/24/sql-server-2008-slipstream-frequently-asked-questions.aspx
CSS SQL Server Engineers How to fix your SQL Server 2008 Setup before you run setup... Part 1
CSS SQL Server Engineers How to fix your SQL Server 2008 Setup before you run setup... Part 2

Please be sure to read the code disclaimer page before you use any code in this blog.

Tuesday, 2 March 2010

Brad's Sure Guide to SQL Server Maintenance Plans

That's the title of the latest book from Brad McGehee and Simple Talk Publishing. It's available from Amazon here.
ISBN 978-1-906434-34-2
Price: GBP 19.99 / USD 29.99 / CAD 35.99

This book will enable anyone responsible for data stored in a SQL Server database to ensure it is being maintained to a required minimum standard with a small amount of effort and possibly limited experience. This is not a book that long-time DBAs, who already have scripts in place to maintain their databases, will get a lot out of; it's for the people who are becoming known as 'Accidental DBAs'. Maybe you were closest to the server when the last DBA quit, or you are a system administrator with a new system based on SQL Server and you don't have a DBA in-house. Whichever way you have found yourself looking after SQL Server databases, this book is for you.

It explains very clearly what a maintenance plan is designed to do, what it can and can't do, and in some cases why you don't want it to do some things. Once the intent of each feature is established, Brad walks you through configuring the task options to your requirements and shows how each task links to, or affects, the other tasks you need carried out on your databases. Each chapter follows a clear format: what the task does, why you want it done (or not), and how to make it happen. Pitfalls and limitations of the various tasks are explained clearly, and where there are options to avoid or work around them, those are explained so they can be implemented. The tasks are described with enough background information that someone unsure about SQL Server will clearly understand what the task is attempting to achieve and can then decide whether it is something they want to apply to their systems.

While the majority of the book refers to the process followed by the Maintenance Plan Wizard, Brad also explains how to make your own custom maintenance plan, or alter existing plans, with the Maintenance Plan Designer. Implementing schedules and operator notifications in your plan is explained so that a user can keep up to date with the status of any executions without having to read through the detailed reports, along with using logic to control the flow from one step to another within a plan and adding multiple copies of tasks in order to get the exact results needed.

If you have any databases in your remit and you are wondering how to look after them, or want a guide on how to make your existing maintenance plans more effective, then this book will be a valuable asset to you.

It could be as simple as this: read the book for a couple of hours on Monday, Tuesday and Wednesday, and then over a similar time on Thursday and Friday you could get your SQL databases fully covered by a set of maintenance plans that ensure your data is well maintained and backed up.
