Tuesday, February 16, 2010

How I save table sizes for all databases

Back when I first started monitoring our production servers, I wanted a way to track the size and growth of all tables in our databases. At the time, we were really only worried about 1 legacy database, and all the objects in that database were in the dbo schema. So tracking size growth was pretty easy; I ran the following script against that database weekly and compared results.

declare @TableName sysname

-- Cursor over every user table in the current database
declare mySpace cursor for
    select name
    from sys.sysobjects
    where xtype = 'U'
    order by name

open mySpace
fetch next from mySpace into @TableName

while @@fetch_status = 0
begin
    -- sp_spaceused returns one row per table: name, rows, reserved, data, index_size, unused
    insert into PerfDB.dbo.TableSpace (TableName, NumRows, Reserved, DataUsed, IndexUsed, Unused)
        exec sp_spaceused @TableName

    fetch next from mySpace into @TableName
end

close mySpace
deallocate mySpace
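
The script inserts into PerfDB.dbo.TableSpace, which I haven't shown; a minimal sketch of what such a table could look like is below. The column types are guesses on my part, sized as character columns because sp_spaceused returns values like '1024 KB' as text.

-- Sketch of the destination table; column types are assumptions, not the original definition
CREATE TABLE PerfDB.dbo.TableSpace (
    RunDate   datetime      NOT NULL DEFAULT (GETDATE()),  -- handy for week-to-week comparisons
    TableName nvarchar(128) NOT NULL,
    NumRows   varchar(20)   NOT NULL,
    Reserved  varchar(20)   NOT NULL,   -- sp_spaceused returns strings such as '1024 KB'
    DataUsed  varchar(20)   NOT NULL,
    IndexUsed varchar(20)   NOT NULL,
    Unused    varchar(20)   NOT NULL)

A defaulted RunDate column doesn't get in the way of the INSERT above, because that INSERT names its columns explicitly.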

After a while, we began to create new projects that required new databases, and these databases had objects created in separate schemas. So now, if I wanted to track changes to all tables in all databases, I’d have to run the above code against every database. Doable, certainly, but not exactly what I wanted to maintain; I’d have to add the code to each new database we added to the server. And sp_spaceused only returns the name of the table, not the schema name. I wanted to separate the tables by database as well as by schema.

So I played around with two well-known undocumented procedures, sp_MSForEachDB and sp_MSForEachTable. I’ve used them before, but I had never nested sp_MSForEachTable inside sp_MSForEachDB. I still needed a way to separate the schema from the table name, and I wanted to get away from using a cursor.

I found the following script on a SQLServerCentral forum that gave me an idea. Running it showed that the value of ‘?’ inside sp_MSForEachTable includes the schema name as well as the table name.

declare @command1 varchar(8000)

-- '@' is the replacement character passed to sp_MSForEachDB as its second argument;
-- it expands to each database name, just as '?' expands to each table name in sp_MSForEachTable
select @command1 =
'
IF ''@'' <> ''master'' AND ''@'' <> ''model'' AND ''@'' <> ''msdb'' AND ''@'' <> ''tempdb''
begin
    use [@] execute sp_MSForEachTable ''print ''''?'''' ''
end
'
exec sp_MSforeachdb @command1, '@'

So now I can just run sp_spaceused inside the sp_MSForEachTable block and strip the schema name out into a separate column.

Here’s the code I use now:

CREATE TABLE #TableSpace (
    DatabaseName nvarchar(128) NULL,
    SchemaName   nvarchar(128) NULL,
    TableName    nvarchar(128) NOT NULL,
    rows         int           NOT NULL,
    reserved     varchar(15)   NOT NULL,
    data         varchar(15)   NOT NULL,
    index_size   varchar(15)   NOT NULL,
    unused       varchar(15)   NOT NULL)
DECLARE @SQL varchar(8000)
SELECT @SQL = 
'
IF ''@'' <> ''master'' AND ''@'' <> ''model'' AND ''@'' <> ''msdb'' AND ''@'' <> ''tempdb'' 
BEGIN
USE [@] EXECUTE sp_MSForEachTable ''INSERT INTO #TableSpace (TableName, rows, reserved, data, index_size, unused) EXEC sp_spaceused ''''?''''; 
UPDATE #TableSpace SET SchemaName = LEFT(''''?'''', CHARINDEX(''''.'''', ''''?'''', 1) - 2) WHERE SchemaName IS NULL;
UPDATE #TableSpace SET DatabaseName = ''''@'''' WHERE DatabaseName IS NULL; ''
END    
'
EXEC sp_MSforeachdb @SQL, '@'

In my example I’m creating a temporary table, but in practice I use a permanent table. The IF statement is obvious: it skips the system databases. The EXECUTE sp_MSForEachTable call runs sp_spaceused for each table inside each database.

The first UPDATE statement is where I strip out the schema name. I use the CHARINDEX function to find the position of the dot separator, then the LEFT function to return the characters before it. Since ‘?’ returns the table name as [Schema].[TableName], I also want to skip the brackets, so I move the position one more place to the left to drop the closing ] character. I could have done more string manipulation to handle the leading [ as well, but it was just as easy to use the REPLACE function on the whole column afterwards.
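
To see what that string logic produces, here's a small standalone example; [dbo].[Orders] is just a made-up name standing in for whatever ‘?’ expands to, and the second column shows the REPLACE cleanup I run over the whole column afterwards.

-- Standalone illustration only; [dbo].[Orders] is a made-up stand-in for the value of '?'
DECLARE @FullName nvarchar(260)
SET @FullName = N'[dbo].[Orders]'

SELECT LEFT(@FullName, CHARINDEX('.', @FullName, 1) - 2) AS SchemaRaw,                      -- returns [dbo
       REPLACE(LEFT(@FullName, CHARINDEX('.', @FullName, 1) - 2), '[', '') AS SchemaName    -- returns dbo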

And the second UPDATE stores the database name. Running the script on my laptop under SQL 2008 gives me one row per table, with the database name, schema name, and table name alongside the sp_spaceused size columns.


I’ve run this against SQL 2005 and SQL 2008 servers. I haven’t tested it against SQL 2000. As with any code you find on the internet, please test it and become familiar with it before you put it into production.  

Tuesday, February 9, 2010

My Review of Windows PowerShell™ 2.0 Best Practices

Originally submitted at O'Reilly

Learn field-tested solutions, best practices, and proven techniques for Windows PowerShell 2.0—including expert tips and lessons learned from the insiders on the Windows PowerShell Team at Microsoft.


First impressions

By jaype from Chicago IL on 2/9/2010

 

5 out of 5

Pros: Easy to understand, Accurate, Well-written, Concise, Helpful examples

Best Uses: Expert, Intermediate

Describe Yourself: Developer, Sys Admin

I'm just beginning to read this book and I thought I'd add my first impressions. I'm a SQL Server DBA and I'm looking to learn PowerShell as a means of remotely administering my servers. I've only used PowerShell briefly, so I'm far from an expert. I'm hopeful this book will help my understanding.

First off, this is not a book for beginners. A background in PowerShell is helpful, and I'm having to look up some of the concepts. But that's not the fault of the book, and the authors say as much in the Introduction. My only suggestion for other PowerShell beginners is to read Chapter 6, on configuring the environment, first.

I've finished the first two chapters so far and I find the book is well written and informative. Chapter 1 is about Version 2, what's new, and how to deploy it on different OS's. It also shows how to use a version tag for backwards compatibility with version 1. Chapter 2 shows how to use WMI functionality and how remoting works. Remoting is also new to Version 2.

There are two things that make this book invaluable as a reference. First, the appendices are full of great information. For example, Appendix A lists all the cmdlets in Version 2 with a short description. Second, on the companion CD there's a Quick Reference Guide PDF that summarizes the points of each chapter.

I'll update my review after I finish the book.


Monday, January 18, 2010

Using the RELOG utility – Part 2

In my previous post (What I Learned Today - the RELOG utility) I discussed what I discovered the first time I learned about the RELOG utility. In this post I’ll continue with my education. Again, I’ve used Ask The Performance Team, Microsoft TechNet, and this blog post at My Torn Data Pages as my references.

I found that I could output a list of counters from a perfmon counter log using RELOG (RELOG counterlogfile -q -o counterlistfile). But you can take that a step further. Simply edit the counter list file and delete the counters you don’t want. Then rerun RELOG and use the -cf switch to filter by the counters that remain. My examples use the same counter log file as before:

RELOG SQL1_20100106.csv -q -o counters.txt -- to get my list of counters

RELOG SQL1_20100106.csv -cf counters.txt -f csv -o filtered1.csv -- to filter by those counters after editing the counters.txt file.

And of course I can still filter by time with the –b and –e switches:

RELOG SQL1_20100106.csv -cf counters.txt -b "1/5/2010 9:00:00 AM" -e "1/5/2010 10:00:00 AM" -f csv -o filtered2.csv

You can save your settings in a config file…

[cf]
counters.txt
[b]
1/5/2010 9:00:00AM
[e]
1/5/2010 10:00:00AM
[f]
csv
[o]
configresults.csv

…and use just the -config switch:

RELOG SQL1_20100106.csv -config relogconfig.txt


Now there are plenty of ways to filter your perfmon files. The last one I’ll cover is saving the results to a SQL database.

Inserting into a SQL database

The -o switch specifies where to output the results of a perfmon counter log file. You can export to a flat file, as I’ve been doing so far, or directly to a database. And that’s simple. First, create a DSN pointing to the server and database of your choice. Stan Segers, whose My Torn Data Pages blog I’ve been referencing, is correct in noting that you need to use the SQL Server driver; like Stan, I wasn’t able to connect when I chose the SQL Native Client. Then, when you use the -o switch, specify SQL: and the name of your DSN. In my example the DSN is named Counters.

RELOG SQL1_20100106.csv -cf counters.txt -o:SQL:Counters

RELOG will create three tables: CounterData, CounterDetails, and DisplayToID.

The tables are created the first time you run RELOG against that database. The utility generates a GUID for each run. Each time you run it, it will insert records into the CounterDetails and CounterData tables, but it won’t insert into or update the DisplayToID table. So to keep a record of each time you run the utility, specify a different DisplayString; everything after the exclamation point (!) is the string.

RELOG SQL1_20100106.csv -cf counters.txt -o:SQL:Counters!DCICHISQL1

Now that all my performance counters are in the database, I can easily query against them.
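
For example, a query along these lines pulls the values back out, joined to the counter names and the DisplayString for each run. It's a sketch only; the column names below are the ones the standard perfmon SQL schema uses, so check your own tables if they differ.

-- Sketch: assumes the standard perfmon schema columns in CounterData, CounterDetails, and DisplayToID
SELECT d.DisplayString,
       cd.CounterDateTime,
       det.ObjectName,
       det.CounterName,
       det.InstanceName,
       cd.CounterValue
FROM dbo.CounterData cd
INNER JOIN dbo.CounterDetails det ON det.CounterID = cd.CounterID
INNER JOIN dbo.DisplayToID d ON d.GUID = cd.GUID
WHERE det.CounterName = 'Transactions/sec'
ORDER BY cd.CounterDateTime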

I hope I’ve shown you ways to use the RELOG utility. As you can see, it’s pretty powerful.                            

What I learned today – the RELOG utility

At the Chicago SQL User Group meeting on 1/14/2010, John Jones of NetApp gave a talk on storage performance. It was an informative meeting. For instance, did you know that the latency of a 15K Fibre Channel drive is the same as that of a 15K SAS drive? But SAS drives are smaller; NetApp is able to put 24 drives in a 4U rack.

During the presentation, John mentioned the RELOG utility and how it can convert a perfmon counter log from the default .blg extension to a .csv file. I’d never heard of RELOG before so I did a little reading up on it. I googled RELOG and found lots of information. The sources I used for this article were Ask The Performance Team, Microsoft TechNet, and this blog post at My Torn Data Pages. And I found that RELOG can do much more than just reformat a file. It can also filter the original counter log to show data from a sample time slice or only pull out certain counters.

RELOG is a command-line utility. It’s available on Windows XP and later, and it lives in the \system32 directory. If you don’t have it you can get it from Microsoft downloads. The utility isn’t just for SQL counters; you can run it against any perfmon counter file.

RELOG has a handful of switches; the ones I’ll be using are -q (list the counters in a log file), -o (where to send the output), -cf (filter by a list of counters read from a file), -c (filter by counters given on the command line), -b and -e (begin and end times), -f (the output format), and -config (read settings from a config file). Running RELOG /? shows the full list.

In my examples I’m using a benchmark counter log I captured from one of our production servers. It’s already saved as a .csv file, so I don’t need to convert it, but I certainly could if I needed to. Also, I’m typing RELOG in caps, though that’s not necessary.

Using RELOG with just the filename of the counter log will give you the start and end time of the capture plus the number of samples in the file…


…and using the -q switch will list the counters used in the file.

Adding the -o switch and specifying a file name will output the counter names to a separate file (RELOG SQL1_20100106.csv -q -o counters.txt).

So now that I know the times and the counters in my baseline, I can filter for the counters between 9 and 10 AM on 1/5 (RELOG SQL1_20100106.csv -b "1/5/2010 9:00:00 AM" -e "1/5/2010 10:00:00 AM" -f csv -o SQL1_filtered.csv). I can see that I’ve filtered out most of the sampling, and now I’ve got the 237 samples I took between 9 and 10.


And I can also filter by individual counters. Let’s say I want to see the transactions per second against a specific database between 9 and 10 AM: RELOG SQL1_20100106.csv -b "1/5/2010 9:00:00 AM" -e "1/5/2010 10:00:00 AM" -c "\MSSQL$MISDB:Databases(tempdb)\Transactions/sec" -f csv -o SQL1_filtered2.csv. Running this, I see the same 237 samples returned, but if I open the file, I’ll see just the counter I asked for:


I can also use a wildcard to see the Transactions/sec counters against all databases: RELOG SQL1_20100106.csv -b "1/5/2010 9:00:00 AM" -e "1/5/2010 10:00:00 AM" -c "\MSSQL$MISDB:Databases(*)\Transactions/sec" -f csv -o SQL1_filtered3.csv

I haven’t tried to save the results to a database yet. According to My Torn Data Pages, there’s a problem with the syntax cited by Microsoft TechNet. I want to test both scenarios, using a DSN and the Native Client.

Tuesday, January 12, 2010

Quick access to your common scripts

Everyone has their own library of SQL scripts they run often, and everyone has their own methods for storing them. These are a few ways you may not be aware of.

My first method is pretty well documented. Have you ever pressed Ctrl + F1 and seen the results of sp_who? Well, you can add your own procedure calls there. In Management Studio, simply open Tools > Options and select Keyboard. Here you’ll see 12 slots with shortcuts of Alt + F1, Ctrl + F1, and Ctrl + 1 through Ctrl + 0. Three of these are already defined: sp_who, sp_help, and sp_lock. You can leave these or replace them with your own, and you can fill any available slot. So now you have a way of calling 12 procedures with just a keyboard shortcut.


I ran across the second way by accident, and it works best for small queries. In Visual Studio you have the ability to store snippets of code in the Toolbox. One day I thought I’d try it in SSMS, since it’s similar to the VS IDE. And it worked. What you need to do is open the Toolbox (Ctrl + Alt + X). You’ll most likely only see the General tab with no controls; that’s fine, since there aren’t any. Right click in the Toolbox, select Add Tab, and give it a name; mine is simply Scripts. Next, open or write your script in a query window. Then highlight the script and drag it onto the Toolbox. You probably won’t see it yet; right click the Toolbox again and make sure both List All and Show All are selected. And voila! There’s your script! To run it, just right click the snippet, copy the code, and paste it into a new query window.

If you hover the mouse over the snippet you just added, you’ll see the entire code. Notice it’s added with Text: prepended to the name. You can change this by right clicking the snippet and choosing Rename, then just replace the word “Text” with something more meaningful.


The third way is even better. Most people know that there are templates with SQL scripts you can use. Again, you can add your own and have them available anytime you open SSMS.

If you open the Template Explorer (Ctrl + Alt + T) you’ll see the templates that Microsoft provides. There are two ways to add your own templates. Right click SQL Server Templates and choose New > Template. You can also choose Folder to categorize your scripts; otherwise the new template will be created in the root. Give your new template or folder a descriptive name. Then right click the template, choose Edit, write your code, and save it. Now just double click the template to open it in a new query window.

The other way to add your script to the Template Explorer is to add it to the folder where the other templates are stored. On XP or Windows Server 2003 that’s Documents and Settings\your account\Application Data\Microsoft\Microsoft SQL Server\100 (for SQL 2008) or 90 (for SQL 2005)\Tools\Shell\Templates\Sql. On a Vista or Windows 7 box you can find it in Users\your account\AppData\Roaming\Microsoft\Microsoft SQL Server\100 (for SQL 2008) or 90 (for SQL 2005)\Tools\Shell\Templates\Sql.

One word of warning. If you do save your scripts as templates, remember to back them up occasionally. It would be easy to overlook them if you ever rebuild your computer and need to restore them. But of course you already knew that.


Monday, January 11, 2010

How I track database size

One of my DBA tasks is to track the space used on our database servers. I want to know how big the database file sizes are, how big the tables are, and how much free space is available on each server. This is a common task for DBAs, and there’s probably a different method for each DBA. This is mine.

When I first started, I created a database on each production server to store all stored procedures, tables, and views that I use as a DBA. The tables are mostly repositories of server state at the time I run a query or procedure. In the case of database sizes I created the procedure GetDBFileStats to get the filesize, total space, free space, and space used for each file in each database: 

CREATE PROCEDURE [dbo].[GetDBFileStats]
AS

EXEC sp_msforeachdb 'USE [?]

CREATE TABLE #tmpspc (Fileid int, FileGroup int, TotalExtents int, UsedExtents int, Name sysname, FileName nchar(520))

INSERT #tmpspc EXEC (''DBCC SHOWFILESTATS'')

-- Data files (type 0): sizes are in 8 KB pages, used extents are 64 KB each, so everything lands in KB
INSERT INTO PerfDB.dbo.dbStats
SELECT GETDATE() AS RunDate,
    ''?'',
    s.name AS [Name],
    s.physical_name AS [FileName],
    s.size * CONVERT(float, 8) AS [Size],
    CAST(tspc.UsedExtents * CONVERT(float, 64) AS float) AS [UsedSpace],
    (s.size * CONVERT(float, 8) - CAST(tspc.UsedExtents * CONVERT(float, 64) AS float)) AS FreeSpace,
    s.file_id AS [ID]
FROM sys.filegroups AS g
INNER JOIN sys.master_files AS s ON (s.type = 0 AND s.database_id = db_id() AND (s.drop_lsn IS NULL)) AND (s.data_space_id = g.data_space_id)
LEFT OUTER JOIN #tmpspc tspc ON tspc.Fileid = s.file_id
ORDER BY [ID] ASC

DROP TABLE #tmpspc

-- Log files (type 1): FILEPROPERTY(..., SpaceUsed) returns pages, so multiply by 8 to get KB
INSERT INTO PerfDB.dbo.dbStats
SELECT GETDATE() AS RunDate,
    ''?'',
    s.name AS [Name],
    s.physical_name AS [FileName],
    s.size * CONVERT(float, 8) AS [Size],
    CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float) * CONVERT(float, 8) AS [UsedSpace],
    (s.size * CONVERT(float, 8) - CAST(FILEPROPERTY(s.name, ''SpaceUsed'') AS float) * CONVERT(float, 8)) AS FreeSpace,
    s.file_id AS [ID]
FROM sys.master_files AS s
WHERE (s.type = 1 AND s.database_id = db_id())
ORDER BY [ID] ASC'

GO
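
Running the collection is just a matter of executing the procedure. A quick ad-hoc check like the one below (a sketch that uses the dbStats columns shown further down) confirms the numbers are landing:

-- Collect the current file stats, then spot-check the newest rows
EXEC dbo.GetDBFileStats

SELECT TOP 10 *
FROM PerfDB.dbo.dbStats
ORDER BY RunDate DESC, DBName, FileID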


This was as far as I went until a few months ago. At that time I started thinking about using SSIS for some of my daily tasks, so I created a simple package that gets the results from my procedure and puts them into a central database repository. I also truncate the tables and change the default date to just the date.
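
The date-flooring step is nothing fancy. In T-SQL it amounts to something like this (a sketch of the idea rather than the actual SSIS task):

-- The DATEADD/DATEDIFF trick floors a datetime to midnight; it also works on SQL 2005, which has no DATE type
UPDATE PerfDB.dbo.dbStats
SET RunDate = DATEADD(dd, DATEDIFF(dd, 0, RunDate), 0)
WHERE RunDate <> DATEADD(dd, DATEDIFF(dd, 0, RunDate), 0)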




Finally, I created a report that gets the data back. I pass two parameters, a from date and a to date. Now when I run the report I can see what the database sizes were on those two dates and easily tell how much the files have grown. I also created a subreport that opens when I click on a database in the main report and shows that database's week-to-week growth history.


Here’s the table I use to store the data and the procedure I use to populate the summary report. I also run the procedure as an ad-hoc query:


CREATE TABLE [dbo].[dbStats](
[Servername] [sysname] NOT NULL,
[RunDate] [datetime] NOT NULL,
[DBName] [sysname] NOT NULL,
[Name] [sysname] NOT NULL,
[FileName] [nchar](520) NOT NULL,
[TotalSize] [float] NOT NULL,
[UsedSpace] [float] NOT NULL,
[FreeSpace] [float] NOT NULL,
[FileID] [int] NOT NULL
) ON [PRIMARY]



CREATE PROCEDURE [dbo].[prc_DBSizeCompare] (
    @minDate DATETIME,
    @maxDate DATETIME)
AS

SET NOCOUNT ON

-- Find the earliest and latest capture dates within the range for each file
;WITH DBCTE (ServerName, DBName, Name, MaxRunDate, MinRunDate) AS
(
    SELECT ServerName, DBName, Name,
        MAX(RunDate) AS MaxRunDate,
        MIN(RunDate) AS MinRunDate
    FROM dbo.dbStats
    WHERE RunDate BETWEEN @minDate AND @maxDate
        AND DBName <> 'tempdb'
    GROUP BY ServerName, DBName, Name
)
-- Return the rows captured on those two dates so the report can compare them
SELECT db.Servername, db.DBName, db.Name, db.RunDate, db.TotalSize, db.UsedSpace, db.FreeSpace
FROM dbo.dbStats db
INNER JOIN DBCTE CTE ON db.Servername = CTE.ServerName
    AND db.DBName = CTE.DBName
    AND db.Name = CTE.Name
WHERE db.RunDate = CTE.MinRunDate OR db.RunDate = CTE.MaxRunDate
ORDER BY db.Servername, db.DBName, db.FileID, db.RunDate

GO
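
Calling the procedure for a couple of dates, either from the report or ad hoc, looks like this (the dates are just examples):

-- Compare file sizes between two capture dates
EXEC dbo.prc_DBSizeCompare
    @minDate = '20100104',
    @maxDate = '20100111'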

Tuesday, December 29, 2009

Booting to a VHD in Windows 7

This is something I tried before but didn’t complete for some reason. Now that I’m on vacation this week and I’ve got some extra time to play, I went back to finish it.

As you probably know by now, Windows 7 has the ability to boot into a .vhd file. This is awesome, as you can create a virtual testing environment that runs directly against your hardware.

There are a few gotchas, though. On the virtual side you’re limited to Windows 7 and Windows 2008. I’ve seen posts of people getting other OS’s to run, but I haven’t tried. And I’ve seen warnings not to do this on a laptop, though I’ll try it once I install my new, bigger laptop hard drive next week.

Anyway, this is how I did it. I had created a Windows 7 virtual machine in Virtual PC 2007. I used the vhd from that VM instead of creating a new one, though you certainly could if you wanted to.

The first thing I did was run sysprep inside my VM. I’m not an expert on sysprep; I just followed instructions I found on the web. Briefly, sysprep is a GUI tool that prepares the image so it can be reconfigured for the hardware it boots on next. You’ll find it in C:\Windows\system32\sysprep. Run it as an admin, choose the default target of the Out-of-Box Experience, check the option to generalize, and choose to shut down. After sysprep is finished it will power down your VM.

I didn’t do this, but it’s probably a good idea to make a copy of the vhd at this point.

The next thing I did was set up the Windows boot loader to see the vhd file. I opened an admin command prompt and ran the following: bcdedit /copy {current} /d “Win 7 VHD”. This returns a GUID, which I saved to Notepad. “Win 7 VHD” is the description I wanted to see on the boot menu. After that I ran these three commands:

bcdedit /set {guid} device vhd=[C:]\VM\Win7\Win7.vhd

bcdedit /set {guid} osdevice vhd=[C:]\VM\Win7\Win7.vhd

bcdedit /set {guid} detecthal on

In my example I replaced guid with the GUID I saved earlier. VM\Win7\ is the path to my vhd file, and Win7.vhd is the file I’m using. Note that the drive letter is in square brackets: [C:].

And that’s just about it. Once I restarted my computer I could see both my original Windows 7 installation and my new vhd boot option. When I chose the vhd, Windows started and applied the hardware changes. After that I just logged in and ran Windows from the vhd. Once in the virtual environment, I can see all the drives on the computer, including those for my “real” Windows 7. Disk 1 shows up with a blue icon, which marks it as a vhd file, and I can also see the reserved system partition and the files on the other physical drives.


I don’t get this if I’m running my physical Windows 7. I can mount the vhd file (on the menu, go to Action > Attach VHD), but it doesn’t stay mounted between reboots. I haven’t tried mounting it with DISKPART yet; I’ll try that when I create my laptop VM.


The only drawback is that the vhd is no longer portable, and I can’t run it in Virtual PC 2007 anymore. I could probably run sysprep again to get it back, but I think I’ll keep it as it is for now.