Tuesday, July 27, 2010

Creating your own default trace

Are you aware of the default trace that SQL Server runs when it starts? Did you know it’s configurable?

No, you can’t edit it. But you can still run a trace on startup that captures only the events that you want.

The first thing to do is check whether the default trace is enabled. Simple enough: run sp_configure with advanced options and look for 'default trace enabled', or query sys.configurations for configuration_id 1568.

SELECT name, value_in_use
FROM sys.configurations
WHERE configuration_id = 1568

On my laptop it is.


There are a few ways to see what the trace is doing. First, find the trace ID. It should be 1, but you can make sure…

SELECT * FROM sys.traces

SELECT * FROM ::fn_trace_getinfo(0)

…then query fn_trace_geteventinfo(1) to see which events the default trace is capturing. You can see the descriptions of the events by joining to some system views.

SELECT tc.name AS [Category],
dt.eventid, te.name AS [Event Name],
dt.columnid, tcol.name AS [Column Name]
FROM ::fn_trace_geteventinfo(1) AS dt
INNER JOIN sys.trace_events AS te ON dt.eventid = te.trace_event_id
INNER JOIN sys.trace_columns AS tcol ON dt.columnid = tcol.trace_column_id
INNER JOIN sys.trace_categories AS tc ON te.category_id = tc.category_id


But what if, after looking at the events in the default trace, you decide to add to the list? Or maybe you want to skip some events? Remember, you can’t alter the default trace.

First, disable the default trace using sp_configure. The change doesn’t require a restart, and it will also close the trace.

sp_configure 'show advanced options', 1;
RECONFIGURE;
sp_configure 'default trace enabled', 0;
RECONFIGURE;


Next create a stored procedure in the master database that will create and start your trace. Use sp_trace_create to generate a trace ID, use sp_trace_setevent to select the events you want to capture, and finally start the trace with sp_trace_setstatus. Books Online has more detail on creating traces. If you aren’t familiar with creating traces via T-SQL, use SQL Profiler to capture the events and set whatever properties you need, then export the trace definition to a script file.

Finally, mark your new procedure to start when SQL Server does by using sp_procoption. This procedure takes 3 parameters: the name of your procedure, the option name, and a true or false value. ‘Startup’ is the only option available; true sets the procedure to autostart, false sets it not to.

In my demo script below I’ve created a trace that will track what a certain user does in the database as well as when they log in or out. 

USE [master]
GO

CREATE PROCEDURE [dbo].[MyDefaultTrace]
AS

declare @rc int
declare @TraceID int
declare @maxfilesize bigint
declare @filecount int
set @maxfilesize = 100
set @filecount = 5

-- 2 = TRACE_FILE_ROLLOVER; the file path is a placeholder, point it
-- somewhere that exists on your server (the .trc extension is appended for you)
exec @rc = sp_trace_create @TraceID output, 2,
N'C:\Traces\MyDefaultTrace', @maxfilesize, NULL, @filecount

declare @on bit
set @on = 1
exec sp_trace_setevent @TraceID, 14, 7, @on
exec sp_trace_setevent @TraceID, 14, 23, @on
exec sp_trace_setevent @TraceID, 14, 8, @on
exec sp_trace_setevent @TraceID, 14, 64, @on
exec sp_trace_setevent @TraceID, 14, 1, @on
exec sp_trace_setevent @TraceID, 14, 9, @on
exec sp_trace_setevent @TraceID, 14, 21, @on
exec sp_trace_setevent @TraceID, 14, 25, @on
exec sp_trace_setevent @TraceID, 14, 41, @on
exec sp_trace_setevent @TraceID, 14, 49, @on
exec sp_trace_setevent @TraceID, 14, 57, @on
exec sp_trace_setevent @TraceID, 14, 2, @on
exec sp_trace_setevent @TraceID, 14, 6, @on
exec sp_trace_setevent @TraceID, 14, 10, @on
exec sp_trace_setevent @TraceID, 14, 14, @on
exec sp_trace_setevent @TraceID, 14, 26, @on
exec sp_trace_setevent @TraceID, 14, 66, @on
exec sp_trace_setevent @TraceID, 14, 3, @on
exec sp_trace_setevent @TraceID, 14, 11, @on
exec sp_trace_setevent @TraceID, 14, 35, @on
exec sp_trace_setevent @TraceID, 14, 51, @on
exec sp_trace_setevent @TraceID, 14, 12, @on
exec sp_trace_setevent @TraceID, 14, 60, @on
exec sp_trace_setevent @TraceID, 15, 7, @on
exec sp_trace_setevent @TraceID, 15, 15, @on
exec sp_trace_setevent @TraceID, 15, 23, @on
exec sp_trace_setevent @TraceID, 15, 8, @on
exec sp_trace_setevent @TraceID, 15, 16, @on
exec sp_trace_setevent @TraceID, 15, 64, @on
exec sp_trace_setevent @TraceID, 15, 9, @on
exec sp_trace_setevent @TraceID, 15, 17, @on
exec sp_trace_setevent @TraceID, 15, 41, @on
exec sp_trace_setevent @TraceID, 15, 49, @on
exec sp_trace_setevent @TraceID, 15, 57, @on
exec sp_trace_setevent @TraceID, 15, 6, @on
exec sp_trace_setevent @TraceID, 15, 10, @on
exec sp_trace_setevent @TraceID, 15, 14, @on
exec sp_trace_setevent @TraceID, 15, 18, @on
exec sp_trace_setevent @TraceID, 15, 26, @on
exec sp_trace_setevent @TraceID, 15, 66, @on
exec sp_trace_setevent @TraceID, 15, 3, @on
exec sp_trace_setevent @TraceID, 15, 11, @on
exec sp_trace_setevent @TraceID, 15, 35, @on
exec sp_trace_setevent @TraceID, 15, 51, @on
exec sp_trace_setevent @TraceID, 15, 12, @on
exec sp_trace_setevent @TraceID, 15, 60, @on
exec sp_trace_setevent @TraceID, 15, 13, @on
exec sp_trace_setevent @TraceID, 15, 21, @on
exec sp_trace_setevent @TraceID, 43, 7, @on
exec sp_trace_setevent @TraceID, 43, 15, @on
exec sp_trace_setevent @TraceID, 43, 8, @on
exec sp_trace_setevent @TraceID, 43, 64, @on
exec sp_trace_setevent @TraceID, 43, 1, @on
exec sp_trace_setevent @TraceID, 43, 9, @on
exec sp_trace_setevent @TraceID, 43, 41, @on
exec sp_trace_setevent @TraceID, 43, 2, @on
exec sp_trace_setevent @TraceID, 43, 10, @on
exec sp_trace_setevent @TraceID, 43, 3, @on
exec sp_trace_setevent @TraceID, 43, 11, @on
exec sp_trace_setevent @TraceID, 43, 35, @on
exec sp_trace_setevent @TraceID, 43, 51, @on
exec sp_trace_setevent @TraceID, 43, 12, @on
exec sp_trace_setevent @TraceID, 43, 5, @on
exec sp_trace_setevent @TraceID, 43, 13, @on
exec sp_trace_setevent @TraceID, 43, 6, @on
exec sp_trace_setevent @TraceID, 43, 14, @on
exec sp_trace_setevent @TraceID, 43, 62, @on
exec sp_trace_setevent @TraceID, 45, 7, @on
exec sp_trace_setevent @TraceID, 45, 8, @on
exec sp_trace_setevent @TraceID, 45, 64, @on
exec sp_trace_setevent @TraceID, 45, 1, @on
exec sp_trace_setevent @TraceID, 45, 41, @on
exec sp_trace_setevent @TraceID, 45, 10, @on
exec sp_trace_setevent @TraceID, 45, 3, @on
exec sp_trace_setevent @TraceID, 45, 11, @on
exec sp_trace_setevent @TraceID, 45, 35, @on
exec sp_trace_setevent @TraceID, 45, 51, @on
exec sp_trace_setevent @TraceID, 45, 12, @on
exec sp_trace_setevent @TraceID, 45, 5, @on
exec sp_trace_setevent @TraceID, 45, 13, @on
exec sp_trace_setevent @TraceID, 45, 6, @on
exec sp_trace_setevent @TraceID, 45, 14, @on
exec sp_trace_setevent @TraceID, 45, 62, @on
exec sp_trace_setevent @TraceID, 45, 15, @on
exec sp_trace_setevent @TraceID, 41, 7, @on
exec sp_trace_setevent @TraceID, 41, 15, @on
exec sp_trace_setevent @TraceID, 41, 8, @on
exec sp_trace_setevent @TraceID, 41, 64, @on
exec sp_trace_setevent @TraceID, 41, 1, @on
exec sp_trace_setevent @TraceID, 41, 41, @on
exec sp_trace_setevent @TraceID, 41, 10, @on
exec sp_trace_setevent @TraceID, 41, 3, @on
exec sp_trace_setevent @TraceID, 41, 11, @on
exec sp_trace_setevent @TraceID, 41, 35, @on
exec sp_trace_setevent @TraceID, 41, 51, @on
exec sp_trace_setevent @TraceID, 41, 12, @on
exec sp_trace_setevent @TraceID, 41, 5, @on
exec sp_trace_setevent @TraceID, 41, 13, @on
exec sp_trace_setevent @TraceID, 41, 6, @on
exec sp_trace_setevent @TraceID, 41, 14, @on

-- Set the Filters
declare @intfilter int
declare @bigintfilter bigint

exec sp_trace_setfilter @TraceID, 11, 0, 6, N'BadUser'

exec sp_trace_setstatus @TraceID, 1
GO


EXEC sp_procoption N'MyDefaultTrace', 'startup', 'true'


Now when I check to see what traces are running trace id #1 is the trace created in my procedure. The trace start is also written to the SQL error log when the master database starts.
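To see what the trace has been capturing, you can read the rollover files back with fn_trace_gettable. The path here is a placeholder; use whatever file name you passed to sp_trace_create:

```sql
-- The path is hypothetical; DEFAULT reads all rollover files in the set.
SELECT StartTime, LoginName, TextData
FROM fn_trace_gettable(N'C:\Traces\MyDefaultTrace.trc', DEFAULT)
ORDER BY StartTime;
```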


As always, please test this on a server you can play with before putting into production.

Wednesday, July 21, 2010

Winning the DBA Open

Last weekend I, like millions of other golf fans around the world, watched as Louis Oosthuizen, an unknown (at least to me) golfer out of South Africa, won The Open Championship at St Andrews by an astounding 7 strokes. He showed confidence in his abilities throughout the tournament, never losing his composure playing in some of the toughest conditions on any course in the world. Congratulations, Louis!


As I watched, I began to think that being a DBA and being a champion golfer aren’t that different. Both need to be confident in their abilities, both need to have a solid game plan, and both have to be able to execute their craft under pressure, prepared for any eventuality.

Preparing for the tournament

A good golfer gets to the course early, days before the tournament begins. He plays the course, talks to locals to learn any particular quirks the course may have, and practices hundreds of shots of different lengths and difficulties.


Good DBAs do the same. They find out about the host server by talking to the systems team or SAN administrator: what kind of disk subsystem is there? What’s the memory and CPU like? Does the server meet the requirements for SQL?

You drive for show…

The golfer starts his round on the tee. He’ll determine the club to use based on experience and conditions: where’s the wind coming from? How long is the hole? Where are the danger spots? A well-hit drive will end up far down the fairway, perfectly placed for his next shot.


The DBA does the same when he sets up the server. His experience comes into play knowing what has to be done, and server conditions will help him to correctly place the system databases and data and log files for other databases. He’ll know what options to set and what to set them to. His server will be set up to optimal conditions, perfectly placed for hosting the company data.

…the approach shot…

Now the golfer will try to get his ball onto the green. He’ll again choose the right club, line up the shot, take a practice swing or two, then hit his best shot. He’ll know how his ball will fly because he has hit the same shot hundreds of times before, so he can compare against past performance.


The DBAs approach shot is to benchmark his system. He should know how the server will perform during peak and non-peak times.

…getting out of a hazard…

Sometimes, though, the golfer’s approach shot is mishit. Or maybe a gust of wind holds up the shot and it lands in a greenside bunker. A good golfer knows how to get out of those situations and save par.


Similarly, a DBA must know how to get out of disaster scenarios. He knows he has set up and is maintaining a proper backup strategy, and he has practiced recovering databases until he knows the procedure inside and out. He knows the different options available for DBCC CHECKDB.

…and you putt for dough

This is where the good golfers shine. They don’t fade when standing over a three-foot putt with the tournament on the line. They know they’ve made the same putt over and over before, and they’ve practiced for just this occasion. They read the putt’s line, judge the speed, and with millions of eyes watching they calmly knock it in, center cup.


The DBA is also good under pressure. They’ve written and checked over their scripts, and they’re confident they’ll perform as written. They don’t wilt when the CIO is standing over their shoulder asking why performance is so bad on the server.

The caddy…

All good golfers know the value of their caddy. A caddy’s job is to help his golfer: pointing out the course yardage, offering the right club, taking some of the pressure off the golfer. Sometimes the caddy is also a teacher, pointing out flaws in the golfer’s swing.


A DBA also needs a caddy. Their caddy can be a mentor or coworker at work. They can use social networks for advice on specific problems or as a sounding board.      

Preparing for the next tournament

The golfer is always practicing. He’s at the driving range on his day off, constantly working on correcting swing flaws and working on new shots.

Likewise the DBA is constantly learning. They attend their local PASS chapter meetings or go to national conventions. They’re constantly reading blogs and whitepapers, learning new skills to help them manage their servers or help their company.

The moral? Like in golf, or any endeavor, the secret of being a good DBA is to be prepared. Learn your craft. Ask your caddy when you have questions. Practice when you’re able. Keep learning. And you too can win the DBA Open.

Monday, July 19, 2010

What I learned today – importing mixed data types from Excel

Recently a coworker came up with an issue I’d never run into before. He’s importing a large number of Excel spreadsheets into a SQL 2005 instance. One of the columns contains mixed data: 1, 2, 3, x, y, z, for example. What he found is that if the first 20 or so rows are strictly numeric, then the connection treats the whole column as numeric, and alphanumeric values are returned as NULLs. He’s querying using OPENROWSET, but it looks like this happens in SSIS as well.

What his research found was that there are a few registry keys that come into play, plus another property he needed to set in his connection string. Adding “IMEX=1” allows importing of mixed data types based on the registry setting “ImportMixedTypes”. The default value is “Text”. One drawback to Text is that you’re limited to 255 characters. The only other available value is “Majority Type”; with that, some of the values are scanned and the data type with the most values is used.

The number of rows scanned is based on the “TypeGuessRows” registry setting. By default it’s set to 8, but valid values are 0 to 16. If you set it to 0 then all rows are sampled; a value between 1 and 16 samples that number of rows.

Both values can be found under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel\.
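For reference, here’s roughly what his OPENROWSET call would look like with the extended property added. The file name and sheet name are placeholders, not his actual ones:

```sql
-- File and sheet names are hypothetical; IMEX=1 is the relevant addition.
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\Data\Import.xls;IMEX=1',
    'SELECT * FROM [Sheet1$]');
```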


UPDATED 7/20/2010: You can read more about the actual issue here.

Wednesday, July 14, 2010

Changing the name of your SQL server

My company recently changed their standard naming conventions for computers, so yesterday I had to rename my workstation. Usually this isn’t a big deal, except that I’m running a default instance of SQL 2005 and a named instance of SQL 2008 locally. Again, not a big deal since this is just my local playground, but I wanted to sync up the names.

Let’s say that my laptop was named “CHICAGO”. That makes the default instance also “CHICAGO”, and my named instance “CHICAGO\KATMAI”. Now my laptop name changes to “NEWCHICAGO”, but my SQL instances stay “CHICAGO” and “CHICAGO\KATMAI”. How do you change them to match the new computer name?

Couldn’t be simpler: just execute two procedures. For the default instance:

USE master;

EXEC sp_dropserver 'CHICAGO';

EXEC sp_addserver 'NEWCHICAGO', local;

It’s the same for a named instance. Just add the instance name:

USE master;

EXEC sp_dropserver 'CHICAGO\KATMAI';

EXEC sp_addserver 'NEWCHICAGO\KATMAI', local;

Then just restart the SQL service and you should see the name change.

Note, though, that this only changes the server name, not the instance name.  I haven’t had an opportunity to try that yet.
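One way to check that the change took after the restart is to compare what SQL Server reports as its name against the actual machine and instance names:

```sql
-- @@SERVERNAME reflects what sp_addserver registered; the SERVERPROPERTY
-- values come from the operating system and setup.
SELECT @@SERVERNAME AS ServerName,
    SERVERPROPERTY('MachineName') AS MachineName,
    SERVERPROPERTY('InstanceName') AS InstanceName;
```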

Tuesday, July 13, 2010

Another use for PowerShell and SQL

A few months ago I wrote a PowerShell script that looped through servers and databases to save their properties in an Excel spreadsheet (PowerShell and SQL). I thought it was time to add another one.

One of the daily tasks I set up for myself as a SQL DBA is to go through the error logs for anything that may indicate a problem. This is especially true on Sunday afternoons or Monday mornings, after my weekly DBCC CHECKDB job runs. I’ve configured the jobs to email me the results, but since I’m monitoring 10 instances they can be time-consuming to read through. I usually just cut to the bottom, where DBCC reports the number of errors it found. If the number of errors is 0, all is well and good. Otherwise, of course, further investigation is needed.

This seemed like a natural fit for a new PowerShell script. One of the great strengths of PowerShell is its ability to connect to as many remote servers as needed. And since the DBCC CHECKDB summary is written to the error log, all I need to do is read the logs. I had read this post by Buck Woody (blog | twitter) about doing almost exactly what I wanted to do. I merged his script with my earlier script and made a few other changes. My final code looks like this:

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null

#Get SQL account information
$credential = Get-Credential
$loginName = $credential.UserName -replace("\\","")
$password = $credential.Password

#Get variables (the wildcards let -like match the phrase anywhere in the message)
$logDate = read-host("Enter log date")
$searchPhrase = "*"
$searchPhrase += read-host("Enter search phrase")
$searchPhrase += "*"
#The server list path is a placeholder; point it at your own file of instance names
$srvlist = @(get-content "C:\Scripts\sqlservers.txt")

#Loop through all instances
foreach ($instance in $srvlist)
{
    $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance
    $srv.ConnectionContext.LoginSecure = $false
    $srv.ConnectionContext.Login = $loginName
    $srv.ConnectionContext.SecurePassword = $password
    $srv.ReadErrorLog(0) | where {$_.LogDate -gt $logDate -and $_.Text -like $searchPhrase}
}

Since I don’t always access the servers from a computer on the domain, I added login credentials. You can remove these if you want to use Windows authentication. After that the script prompts for a date and a search phrase. I found that this can come in handy when I want to change what I’m searching for. For instance, I can enter the phrase “DBCC CHECKDB” and the script will return the summaries based on the date I entered. Or I can search for “BACKUP” or “ERROR”, whatever my heart desires.

Tuesday, July 6, 2010

Stop SQL Agent when enabling Service Broker on msdb

I forget this every time I set up a new server.

When I set up Database Mail on a new server I usually get the message “Service Broker is not enabled on msdb. Do you want to enable it?” Clicking OK gives me an unresponsive SQL Server. Running the command “ALTER DATABASE msdb SET ENABLE_BROKER” runs forever with no message returned.

What I forget each and every time is that the SQL Agent is running. Checking the locks or running sp_who2 shows that my ALTER DATABASE is blocked by the process running the Agent. Simply stopping the Agent allows the statement to finish, whether from the query window or clicking OK in the dialog box mentioned earlier. Then I just restart the Agent and everything’s hunky-dory again.
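For reference, the two routes look like this. The WITH ROLLBACK IMMEDIATE variant is an alternative I’m noting rather than the one described above; instead of you stopping the Agent service, it disconnects the blocking sessions (including the Agent’s) for you:

```sql
-- Route 1: stop the SQL Agent service first, then:
ALTER DATABASE msdb SET ENABLE_BROKER;

-- Route 2: roll back and disconnect blockers instead of stopping the Agent:
ALTER DATABASE msdb SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;
```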

Maybe now I’ll remember it.

Monday, July 5, 2010

My favorite SQL features

Recently I ran across a contest sponsored by Microsoft SQL Server MVP Ken Simmons (Blog | Twitter). One of the ways to enter the contest was to create a blog post listing favorite SQL features. But how do you select a favorite? That’s like asking someone to choose a favorite song. Like most things in the SQL world, the correct answer is “it depends”, usually on what I’m trying to do at the time.

So here’s my top 10 favorite SQL features and tips, in the order that they come to mind. I’m ignoring features I’m not currently using, like backup compression and data encryption, and limiting myself to those I use the most often.

  1. Integration with PowerShell. This is a great feature to administer multiple SQL instances easily. For those who haven’t been on the internet in the last few years, PowerShell is Microsoft’s new scripting language based on the .NET framework. And starting with SQL 2008, PowerShell became integrated with SQL. What does this mean to you as a DBA? Well, imagine yourself as the new head DBA for your company, and you’re now in charge of 100 SQL instances. Your boss tells you that your first task is to standardize all instances. How can you quickly view the properties of each server? With PowerShell and remoting, you can easily query each instance and save the results to an Excel spreadsheet for easy analysis. I gave an example of how I do it in an earlier blog post here.      
  2. Dynamic Management Views and Functions (DMV). Introduced with SQL 2005, DMVs are a great way to see how your server is performing. I’m not going to go into them in depth here since they’ve been written about by many others much better than I can. If you’re not familiar with them, do yourself a favor and start reading about them. They’ll make your life much easier.
  3. Storing queries in the toolbox. Like all DBAs, I’ve got a number of queries that I run over and over again. There are numerous ways to store these for easy access, but my favorite for short one-line queries is to store them in the toolbox. Simply open the toolbox (Ctrl + Alt + X) and drag your query over. To recall it, open the toolbox, copy the desired query, and paste it into a new query window.
  4. Policies. Introduced in SQL 2008, Policy Based Management allows DBAs to manage multiple servers from a central server. Policies can standardize server configurations and check them for compliance.   
  5. Report Services. Built-in reporting capabilities for the same price as the database engine. ‘nuff said.
  6. IntelliSense. This feature gives the same statement completion that you see in Visual Studio, not surprising since SQL Server Management Studio uses the same IDE as Visual Studio. Unfortunately it doesn’t work against 2005 servers or earlier. It did in the early betas but was removed in the final release. Hopefully Microsoft will put it back.
  7. Object Explorer Details. You can now see details of different database objects using the details view (F7). It’s also configurable; just right click on the screen and choose what columns you want to see.
  8. Running queries against multiple instances. If you set up registered server groups and register SQL instances by group, you can execute a query against all servers in the group. You can select all groups, but you can’t choose to ignore instances inside a group; it’s all or nothing. Still, a significant timesaver when you need to run multiple statements against multiple instances.
  9. Setting variable values. One of the nice things in the .NET world is the ability to easily change the value of a variable; i += 1, for instance. Now you can do that in SQL too. Instead of

     SET @i = @i + 1

     you can now do it like this

     SET @i += 1

  10. SQL Community. Not technically part of SQL itself, but I feel it’s important nevertheless. I first began learning about SQL on my own and quickly found myself overwhelmed. There was no time to learn at work, and I needed to find the knowledge to do my job. I found the Chicago SQL User Group and found what I needed: experienced DBAs volunteering to talk about a subject they love. ChiSUG led me to PASS, an international group with tons of free resources. I began following blogs written by the same people authoring the articles on PASS and presenting at the group meetings. Reading these blogs led me to twitter, where I can follow SQL experts in real-time and ask for help with the #SQLhelp tag. Finally there’s SQL Saturdays, day-long events with SQL experts speaking on various topics. Notice the recurring theme about these resources: they are all free. I’m constantly amazed by the amount of content that SQL DBAs and developers make available for no cost, and the time they put in as volunteers at in-person events. To the entire SQL community, thank you. I wouldn’t be where I am today without your help.