Monday, September 28, 2009

Lessons learned – Report Server administration

It’s a long commute in the morning, so I thought I’d use the time to compile a few things I’ve learned and noticed about administering a server hosting SQL Server 2005 or 2008 Reporting Services.

  • Check the recovery model of the databases you create on your report server, including ReportServerTempDB (or whatever you named the SSRS temp database). By default they will be in the full recovery model unless you’ve changed your model database. If they stay in full, the log files will grow and may cause you space problems later. Consider using the simple model. If you stay with full, remember to schedule log backups.
  • Back up the SSRS temp database as well as the database holding your catalog. You’ll need both if you ever have to restore the server; unlike tempdb, they are not recreated when SQL Server starts up. I’ve seen a few maintenance plans that skip them.
  • Speaking of backups, remember to back up the SSRS encryption key. You’ll need that key if you ever restore the catalog database and don’t want to re-enter all the encrypted content, such as data source credentials. I’ve successfully moved catalogs from production to a testing server, using the key to restore the connections. You can back up the key from the Reporting Services Configuration Manager. I suggest you also copy the backup somewhere safe in case you can’t get to the server.
  • If you don’t want your reports exported in a specific format, disable that format. For instance, not many reports look good exported as comma-delimited files. Just remove or comment out the entry in the rsreportserver.config file, which you’ll find under <instance path>\Reporting Services\ReportServer. Be careful removing the IMAGE format – that can cause problems, especially if you open the report in the Report Viewer control available in Visual Studio.
    <Render>
      <Extension Name="XML" Type="Microsoft.ReportingServices.Rendering.DataRenderer.XmlDataReport,Microsoft.ReportingServices.DataRendering"/>
      <Extension Name="NULL" Type="Microsoft.ReportingServices.Rendering.NullRenderer.NullReport,Microsoft.ReportingServices.NullRendering" Visible="false"/>
      <!-- <Extension Name="CSV" Type="Microsoft.ReportingServices.Rendering.DataRenderer.CsvReport,Microsoft.ReportingServices.DataRendering"/> -->
      <Extension Name="PDF" Type="Microsoft.ReportingServices.Rendering.ImageRenderer.PDFRenderer,Microsoft.ReportingServices.ImageRendering"/>
      <Extension Name="RGDI" Type="Microsoft.ReportingServices.Rendering.ImageRenderer.RGDIRenderer,Microsoft.ReportingServices.ImageRendering" Visible="false"/>
      <Extension Name="HTML4.0" Type="Microsoft.ReportingServices.Rendering.HtmlRenderer.Html40RenderingExtension,Microsoft.ReportingServices.HtmlRendering" Visible="false"/>
      <Extension Name="MHTML" Type="Microsoft.ReportingServices.Rendering.HtmlRenderer.MHtmlRenderingExtension,Microsoft.ReportingServices.HtmlRendering"/>
      <Extension Name="EXCEL" Type="Microsoft.ReportingServices.Rendering.ExcelRenderer.ExcelRenderer,Microsoft.ReportingServices.ExcelRendering"/>
      <Extension Name="RPL" Type="Microsoft.ReportingServices.Rendering.RPLRendering.RPLRenderer,Microsoft.ReportingServices.RPLRendering" Visible="false" LogAllExecutionRequests="false"/>
      <Extension Name="IMAGE" Type="Microsoft.ReportingServices.Rendering.ImageRenderer.ImageRenderer,Microsoft.ReportingServices.ImageRendering"/>
      <Extension Name="WORD" Type="Microsoft.ReportingServices.Rendering.WordRenderer.WordDocumentRenderer,Microsoft.ReportingServices.WordRendering"/>
    </Render>



  • You’ll find some good information in the ExecutionLogStorage (SQL 2008) or ExecutionLog (SQL 2005) table. These tables can help you troubleshoot problems with a report, since they show who ran it, the time spent retrieving data and rendering, the rendering format, and any parameters used. They also show whether the report ran successfully or failed with a rendering issue.


  • Another place to look during troubleshooting is the log files, which you’ll find at <instance path>\Reporting Services\LogFiles. You may need to clean these up occasionally, since some report errors generate dump files.
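To give an idea of what those execution log tables offer, here’s a sketch of a query against the SQL 2008 schema. It assumes the default ReportServer catalog database name; the 2005 ExecutionLog table has a slightly different layout.

```sql
-- Longest-running report executions over the last week (SQL 2008 schema)
SELECT TOP 20
       c.Name               AS ReportName,
       e.UserName,
       e.Format,            -- rendering format requested (PDF, EXCEL, IMAGE, ...)
       e.TimeDataRetrieval, -- milliseconds spent fetching data
       e.TimeProcessing,
       e.TimeRendering,
       e.Status,            -- rsSuccess, or an error code on failure
       e.TimeStart
FROM ReportServer.dbo.ExecutionLogStorage AS e
JOIN ReportServer.dbo.Catalog AS c
  ON c.ItemID = e.ReportID
WHERE e.TimeStart > DATEADD(DAY, -7, GETDATE())
ORDER BY e.TimeDataRetrieval + e.TimeProcessing + e.TimeRendering DESC;
```

Joining ReportID back to the Catalog table resolves the report name; the Status column is where you’ll spot rendering failures.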

Thursday, September 24, 2009

What’s the best practice for starting SQL services?

An incident that occurred this morning caused me to rethink how I have the SQL services set to start when a server is started or rebooted.

Early this morning we had a power failure at one of our data centers. It rebooted one SQL server, left another online, and also rebooted the SAN controllers. Server 1, which didn’t reboot, began reporting error 823 when it couldn’t see its databases, even after the SAN came back online; restarting the SQL services cleared that up. Server 2, which did reboot, came up reporting no problems. Consistency checks on all databases came back clean, so luckily we were good to go.

I have the SQL services set to start automatically when the server starts, but now I’m wondering if that’s a good idea. What if the SAN isn’t available when SQL first restarts? What if there is data corruption? Would it be better to always start the services manually, so I could check the logs before restarting SQL?
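If I do switch to manual startup, the change itself is trivial – for example, from an elevated command prompt (MSSQLSERVER and SQLSERVERAGENT are the default instance’s service names; a named instance would be MSSQL$InstanceName):

```
sc config MSSQLSERVER start= demand
sc config SQLSERVERAGENT start= demand
```

On Windows Server 2008 and later there’s also start= delayed-auto, which might be a middle ground if the SAN just needs a little more time to come up before SQL does.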

I’m interested in hearing how you handle SQL service startups.  

Friday, September 18, 2009

SQL Saturday in Iowa Oct 17th

The East Iowa SQL User Group just listed the sessions for the SQL Saturday event in Iowa City on October 17th. There’s some good stuff scheduled. Go to SQLSaturday.com to register or email sqlsaturday#19@sqlsaturday.com for more information. They can always use more sponsors, too.

Track                  | Start Time | Session                                            | Speaker
-----------------------|------------|----------------------------------------------------|------------------
Admin / Platform       | 9:00 AM    | High-Availability Options for Microsoft SQL Server | Tim Plas
App Dev I              | 9:00 AM    | Tuna Helper - A Proven Process for Tuning SQL      | Janis Griffin
App Dev II             | 9:00 AM    | Defensive database programming                     | Alex Kuznetsov
Admin / Platform       | 10:00 AM   | The Benefits of SQL Azure                          | Aaron King
App Dev I              | 10:00 AM   | ABCs of CTEs                                       | Jason Strate
App Dev II             | 10:00 AM   | SQL Server 2008: The New and Cool...               | Zakir Durumeric
BI / Relational Theory | 10:00 AM   | Building Reports in SQL Server Reporting Services  | Jessica Moss
App Dev I              | 11:10 AM   | Extended Events, Work Smarter Not Harder           | Jason Strate
App Dev II             | 11:10 AM   | 2 Years of Database Unit Testing: Lessons Learned  | Alex Kuznetsov
BI / Relational Theory | 11:10 AM   | Extract, Transform, and Load your Data Warehouse   | Jessica Moss
Admin / Platform       | 1:00 PM    | Service Broker: A Brief Introduction               | Zakir Durumeric
App Dev I              | 1:00 PM    | Introduction to PowerShell                         | Keith Dahlby
App Dev II             | 1:00 PM    | SQL Server 2008 - Practical Uses for New Features  | Chuck Heinzelman
BI / Relational Theory | 1:00 PM    | Introduction to SSAS Dimensions                    | Jessica Moss
Admin / Platform       | 2:00 PM    | Tips Tricks For Administering SQL Server           | Timothy Ford
App Dev I              | 2:00 PM    | Drive Your Way to the DMV                          | Jason Strate
App Dev II             | 2:00 PM    | SQL Server 2008 R2 - Moving to the Next Level      | Chuck Heinzelman
BI / Relational Theory | 2:00 PM    | Database Design                                    | Louis Davidson
App Dev I              | 3:10 PM    | Improving Daily Imports with Partitioned Tables    | Jason Strate
App Dev II             | 3:10 PM    | An Introduction to ASP.NET MVC                     | Chris Sutton
BI / Relational Theory | 3:10 PM    | Database Design Patterns                           | Louis Davidson

Thursday, September 17, 2009

Report Service error

Yesterday I helped a developer troubleshoot an issue he was experiencing with a report. The report is hosted on a SQL 2008 instance used for testing and was being viewed in a WinForms application using the Report Viewer control. We configured the viewer control to open the report as an image, since that’s what the client sees when they print it. He couldn’t tell me the exact error message, but it referred to not being able to find the stream. He also said the report worked fine if he ran it with less data.

I first checked the execution log table in the report database and saw he was getting a rendering error. Next I had him rerun the report in the viewer. This time he received a generic error saying there was a problem. But if he clicked the page layout button to render the report in HTML, the report ran fine.

My next step was to check the logs on the report server instance. Sure enough I found the error. It was an IO issue; not enough space on the drive. Luckily this was on a testing server and not production.

Now, the testing server is not set up the same as a production server. This box is a jumble of programs and databases, running multiple SQL instances for 2000, 2005, and 2008. The report server temp database is on the same drive as the report server database, the error logs, and the dumps. When I looked, the drive had about 100 MB free. And while 100 MB isn’t a lot of space nowadays, I would have thought it was enough to run the report (it came to over 800 pages once we freed up space for it).

At this point I’m not sure where that image is being produced – in the report database or the report temp database. I’ve never dug into the report temp db before. I’d like to hear from anyone who has had similar problems, or who can show me where else I should be looking.
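One thing that is easy to check is how the space breaks down between the two databases. A quick sketch, assuming the default ReportServer and ReportServerTempDB database names:

```sql
-- File sizes for the SSRS databases (size is stored in 8 KB pages)
SELECT DB_NAME(database_id) AS database_name,
       name                 AS logical_file,
       physical_name,
       size * 8 / 1024      AS size_mb
FROM sys.master_files
WHERE DB_NAME(database_id) IN ('ReportServer', 'ReportServerTempDB');
```

Watching those numbers while the large report renders should at least show which database is doing the growing.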

Monday, September 14, 2009

Updates to previous posts

In my posts on virtualization I forgot to mention something. While I did say I was running with the multi-thread option instead of the replay-in-order option, I left the number of threads at the default of 4. I had run earlier replays with 64 threads, but with the replay-in-order option. Those replays took well over 5 hours, as opposed to 2 1/2 – 3 hours with multi-threading at 4. But if I compare multi-threading at 4 with multi-threading at 64, the times are much better – 3 1/2 hours compared to 2 3/4 hours on the same virtual server. I’ll summarize everything in a few weeks when I’ve completed all my tests.

The second post I need to update concerns repeating data in a page header in SSRS. My solution was to use a shared variable. This had a potential problem which I neglected to mention: if you use a shared variable, then people running the report at the same time can overwrite each other’s values. I knew this when I put the report into production, but I thought the odds of it happening were slim. Later I found I could produce exactly that behavior in testing; when I ran the report simultaneously with another tester, we did indeed cross values.

I changed the report to use a hashtable instead. The key for each entry is unique because it depends on the user id of whoever is running the report. As an example of repeating a customer name on each page, I:

  1. Added a hidden row to the top of the table control on the report. The table was already there and the grouping was already set on the customer name.
  2. Added the customer name field from the table’s dataset to the hidden row.
  3. In the report’s code section, added a shared hashtable – SHARED htCustomer AS System.Collections.Hashtable = NEW System.Collections.Hashtable. The hashtable itself still needs to be shared.
  4. Still in the code section, added a function that adds an entry to the hashtable if it doesn’t exist, or updates it if it does. The function takes three parameters: the value from the hidden field, a group name, and the user id. It returns the customer name for the current user.
FUNCTION SetCustomerName(group AS Object, groupName AS STRING, userID AS STRING) AS STRING
    ' The key includes the user id, so simultaneous users can't cross values
    DIM key AS STRING = groupName & userID
    IF NOT group IS NOTHING THEN
        DIM g AS STRING = CTYPE(group, STRING)
        IF NOT htCustomer.ContainsKey(key) THEN
            htCustomer.Add(key, g)
        ELSE
            IF NOT htCustomer(key).Equals(g) THEN
                htCustomer(key) = g
            END IF
        END IF
    END IF
    RETURN CTYPE(htCustomer(key), STRING)
END FUNCTION



  5. Added a page header to the report, with a new text box that uses an expression to call the function and pass in the parameters:



=Code.SetCustomerName(ReportItems!txtCustomer.Value, "Customer", User!UserID)



ReportItems!txtCustomer is the hidden field, Customer is my group name, and User!UserID is a global field available to the report.  

Friday, September 11, 2009

Job Scheduled for Sunday

It's scheduled to run at about 7:30 PM

DBCC GoBears

Documenting servers

I'm starting to create a Run Book (not my term, but I don't remember where I heard it) to store all the documents relating to the databases and servers I'm responsible for. That list is growing, and before it gets out of hand I'd like to create a template of what needs to be included. Drafting the template should keep me from having to go back and redo what I've already done.

At this point I'm only documenting the servers, not processes like database backups and restores. Those are important, and I've already done most of them, just not formally. I'll get to those as well.

I'll start with the server itself. What I'm adding is the server name, IP address, whether the server is physical or virtual (if virtual, the name/location of the host server), the OS and any SPs or patches installed, the number of CPUs, the amount of memory, the size/location of the page file, and the drives and their total space. If the drives are on a SAN, I should also include a mapping of the SAN.

For the SQL installation: the edition and any SPs or hotfixes, the instance name (we only run one instance per server), any non-default configuration options, the accounts used to run the services, configured alerts, jobs and their schedules, and whether components like SSRS are installed, with information about their configurations if they are.

For the databases, the db names (including the virtual name and physical path), compatibility level, recovery model, auto growth and size restrictions, and any db options.
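A lot of the SQL-side information above can be gathered with a couple of queries rather than typed by hand. A sketch of where I'd start (SERVERPROPERTY and sys.databases are standard on 2005 and up):

```sql
-- Installation details for the Run Book
SELECT SERVERPROPERTY('MachineName')    AS machine_name,
       SERVERPROPERTY('InstanceName')   AS instance_name,
       SERVERPROPERTY('Edition')        AS edition,
       SERVERPROPERTY('ProductVersion') AS product_version,
       SERVERPROPERTY('ProductLevel')   AS service_pack;

-- Per-database settings: compatibility level and recovery model
SELECT name, compatibility_level, recovery_model_desc
FROM sys.databases
ORDER BY name;
```

Scripting it also means the documentation can be refreshed on a schedule instead of going stale.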

For database backups, the type of backup, where it backs up to, and when it backs up. This info may be duplicated since our backups are handled by jobs.

Is there anything else I'm missing?

Wednesday, September 9, 2009

Alas, no booting to a .VHD for me

I’ve been reading quite a bit lately about attaching a virtual drive to a computer in Windows 7 and Windows 2008 Server R2. It would be nice to have the ability to boot directly into a virtual server. But I’m not seeing a way I can achieve it, at least right now.

My hardware would be my current laptop, an HP Pavilion TX 2000 (I wanted the tablet capabilities) running 64-bit Windows 7 Ultimate. It has 2 CPUs, 4 GB of RAM, and a 250 GB hard drive – certainly enough to run one virtual machine. I wanted to run a virtual 64-bit Windows Server 2008 R2 as my BI sandbox. Unfortunately, neither Virtual Server nor Virtual PC supports 64-bit guests, only 32-bit. So I built my VM using VMware Workstation. But those virtual disks can’t be mounted.

So I created another VM, this time a 32-bit version of Windows 2008. I created a fixed disk, installed the OS, and followed the directions on Charlie Calvert's blog. Mounting the .vhd file was simple, as was using bcdboot. When I rebooted, both servers showed in the boot menu. Everything good to go, right?
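For reference, the mount-and-bcdboot steps amount to something like the following from an elevated command prompt (the .vhd path and the V: drive letter are just examples – use whatever letter the attached disk receives):

```
diskpart
DISKPART> select vdisk file=C:\VHDs\Win2008.vhd
DISKPART> attach vdisk
DISKPART> exit

rem Copy the boot files from the mounted VHD and add it to the boot menu
bcdboot V:\Windows
```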

Wrong. When I tried to boot into the .vhd, I'd get an error message that the computer couldn't start because of hardware changes. And Computer Management no longer showed the .vhd as mounted. That’s when I went back and reread the fine print.

Windows 7 only supports booting a .vhd that contains Windows 7 or Windows Server 2008 R2. My VM was plain 2008. At this point my options are to restore my VMware VM (luckily backed up) or to try installing R2 directly into a .vhd. But I don’t think that will work either; it wouldn’t be compatible with Virtual PC.

Well, VMWare Workstation is still a great option. It’s just disappointing I haven’t figured out a way to do this yet.  

East Iowa SQL User Group meeting

This may be one of the nuttiest things I’ve ever done, at least from a technical standpoint. Yesterday I drove to Cedar Rapids to attend the monthly meeting of the I380 Corridor (East Iowa) SQL User Group. That’s a 9-hour round trip for a 2-hour meeting – not a very efficient use of my time. But I was glad I did it.

The East Iowa group is headed by Michelle Ufford, writer of the SQL Fool blog, and Ed Leighton-Dick. Last night’s meeting included a tip of the month from Michelle on using the latest version of Adam Machanic’s Who Is Active script. That was followed by a Live Meeting with Jessica Moss, who gave an introduction to data warehousing. That’s what I came for. Ed taped the meeting, and hopefully it will come out clean enough to post. I’d love to see it again.

Jessica discussed just what data warehousing is. She went into the design and life cycle of a dw, the definitions of fact and dimension tables, the types of dimensions, the differences between a star and a snowflake schema, and how SSIS, SSAS, and SSRS fit into all this. She also discussed the different methodologies, Kimball and Inmon. Finally, she gave a demo of a dw, from ETL to reporting, using the AdventureWorksDW2008 database. There was only one question, from Michelle, about using procedures to load data as opposed to the Slowly Changing Dimension component in SSIS. Jessica recommended the SSIS approach.

Jessica’s presentation was exactly what I was looking for, a beginning point for how to build a data warehouse. There’s a lot of information and discussions out there but most of it assumes some knowledge already. And classes usually focus on the tools, not the methods. I need to learn both if I’m to create a good design.

Everyone at the user group was friendly. It was a small gathering, maybe 15 people total. The only difference between them and the Chicago group is they cut their pizza in wedges rather than squares.  

There was one piece of bad news, however. The Iowa group is sponsoring a SQL Saturday on October 17th in Iowa City, but registrations are far behind where they need to be. They need to hit a certain level before funding from their sponsors kicks in. I think they’re running into an issue where another SQL Saturday is scheduled the same day in Orlando, with others throughout October. That’s a shame, because I really want to go to this one. Jessica is supposed to come and expand on last night’s presentation. They’ll decide Friday whether they need to cancel.    

Friday, September 4, 2009

Replaying Traces – What I’ve Learned

So I’ve been running these replays for a while now, with mixed results. There are a number of lessons I’ve learned along the way.

  • Check what’s running inside the workload you’re replaying; there may be something that affects the replay. In my case there was a collection trace started by the third-party software I use to monitor the production servers for performance. Since this trace had no corresponding stop, it kept running after the rest of my workload completed, making me think there was a different issue. The only thing I don’t understand is why it stopped on its own in half the replays.
  • After you load the workload, you can apply a filter and reload it. This is how I eliminated any calls to sp_trace_setstatus.
  • Restore the master database to eliminate any replay errors due to mismatched database IDs. This comes directly from BOL. I ignored it at first because I wasn’t seeing any errors. But later, especially after I started on the virtual servers, I saw more and more errors where SQL wasn’t running against the right database. So when I started the last round of replays, I restored the master db first.
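Before a replay, it’s easy to confirm the database IDs line up – run this on both the source and the replay server and compare the results:

```sql
-- Run on both servers; mismatched database_id values mean replay errors
SELECT database_id, name
FROM sys.databases
ORDER BY database_id;
```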

I’m in the process of replaying all the scenarios again so the data is consistent. So far I’ve completed the 4 physical tests. Coming next are the 3 32-bit virtual tests, the 2 64-bit virtual tests, and a final run with VMware’s 8-CPU server.