Wednesday, October 29, 2008

SQL Server 2005 SP3 Beta

Microsoft has released SQL Server 2005 Service Pack 3 Beta. SP3 contains all cumulative updates for SQL Server 2005 plus a handful of other fixes. If you've kept on top of the cumulative updates there really isn't much new in SP3, which makes me wonder why these fixes weren’t just rolled into cumulative update #11. Nevertheless, here’s what Microsoft’s “What’s New” page shows:

  • One update for DBCC commands in the DB Engine
  • Support for running Notification Services against SQL Server 2008
  • An additional parameter for sp_showpendingchanges to help identify pending changes of replication subscribers
  • Three updates to Reporting Services

This is a “for testing purposes only” release, so use it at your own risk.

Download: SQL Server 2005 Service Pack 3 - Beta

Tuesday, October 28, 2008

SQLSaturday #8 Presentations

For those who attended my sessions at SQLSaturday #8 last weekend, here are my presentations:

Download: XML Features in SQL 2005

Download: Transactional Replication Deep Dive

Difference Between “Long merge” and “Slow merge” Replication Warnings

During my presentation about Transactional Replication at SQLSaturday #8 last weekend I was asked what the difference is between “Long merge” and “Slow merge” replication warnings. I didn’t know the answer (it was a session about transactional replication, after all), but not being one to let a good question go unanswered, I had to find out.

Here’s what they mean:

  • Long merge refers to the amount of time the merge agent has run (or is currently running) during the most recent synchronization. If this value exceeds the set threshold, the alert will fire.
  • Slow merge refers to the number of rows processed per second by the merge agent while running. If this value falls below the set threshold, the alert will fire.

There are two threshold values for each alert – one for dialup connections and one for LAN connections. For dialup the default is 8 minutes for long merge and 20 rows/sec for slow merge. For LAN the default is 3 minutes for long merge and 100 rows/sec for slow merge.

Thresholds are enabled and set in Replication Monitor by selecting your merge publication in the left pane and the “Warnings” tab in the right pane. Alert actions are set by clicking the “Configure Alerts…” button in the same pane or in Management Studio’s Object Explorer under SQL Server Agent –> Alerts. A screenshot of the Warnings tab in Replication Monitor is below.

Replication Monitor: Warnings Tab
(Click Image to view full size)
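If you’d rather script the change than click through Replication Monitor, the same thresholds can be set with T-SQL. Here’s a minimal sketch using sp_replmonitorchangepublicationthreshold run at the distributor – the publisher, database, and publication names are placeholders, and the metric name is my reading of Books Online (the mergefastrun* metrics cover LAN connections, mergeslowrun* cover dialup), so verify before relying on it:

```sql
-- Raise the LAN "long merge" threshold to 5 minutes and enable the alert.
-- Run at the distributor, in the distribution database.
USE distribution;
EXEC sp_replmonitorchangepublicationthreshold
    @publisher    = 'MyPublisher',           -- placeholder server name
    @publisher_db = 'MyPublishedDB',         -- placeholder database name
    @publication  = 'MyMergePub',            -- placeholder publication name
    @metric_title = 'mergefastrunduration',  -- long merge over LAN, in minutes
    @value        = 5,
    @shouldalert  = 1;                       -- fire the alert when exceeded
```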

Monday, October 27, 2008

Notes From SQLSaturday #8

I attended and presented at SQLSaturday #8 last weekend and I thought the event went very well. Close to 275 people attended – the most ever for a SQLSaturday event to date. From what I heard a good number of sessions were standing room only. There were also close to 30 volunteers and several sponsors giving out some really good stuff at the end of the day (Xbox 360s, an iPod Touch, etc.). I enjoyed having the opportunity to see old friends and make some new connections.

Some random thoughts about the day:

  • Speaker eval cards were collected at the end of each session and used as raffle tickets for books. I liked this for several reasons: First, it gave everybody an equal chance at winning something and didn’t force people to come up with some kind of thought-provoking question just to get a ticket. Second, speakers got the chance to see how they did as soon as they were done. In the past we had to wait up to two weeks to get evals back.
  • My first presentation, XML Features in SQL 2005, was well attended. I didn’t count numbers but there were folks sitting on the floor and standing in the back of the room.
  • My second presentation, Transactional Replication Deep Dive, was at half capacity. Maybe there were more interesting sessions happening at the same time, or maybe not that many people have to worry about managing replication. Regardless, there were some great questions asked – including one I didn’t have the answer to. I’ve since found out and will post an entry about it this week.
  • I found out that I was voted “best speaker” 7 times and “favorite session” 5 times on the event eval cards. Very encouraging – if you were one of the people who gave me the nod and are reading this, thank you.

Andy Warren and co. put a lot of time and effort into this event and it showed! All in all a great day of free training and social networking. You just can’t beat good community events like this.

Friday, October 24, 2008

SQL Server 2005 SP2 Cumulative Update 10

Microsoft has released cumulative update 10 for SQL Server 2005 SP2. The details are published in KB article 956854.

To download the hotfix, simply click the "View and request hotfix downloads" link at the top of the article. Your platform is automatically detected, so if you need the x64 build make sure to follow the link from an x64 machine, and vice versa for x86.

Standard disclaimer applies here: Only apply if you're experiencing one of the issues specifically mentioned in the article.

Download: Cumulative update package 10 for SQL Server 2005 Service Pack 2

SQLH2 Updated for SQL 2008

The SQL Server Health & History project (SQLH2) has been added to CodePlex and updated for SQL 2008. SQLH2 collects and stores information about SQL installations and performance, which can then be viewed using Reporting Services. While it’s not as robust as products like SQL Sentry or SQL Diagnostic Manager, it’s free and can capture most of what you need to baseline server performance over time. If you can’t afford those other tools, or you’re not currently taking performance baselines, you may want to give this a try.

Download: SQL Server Health & History @ CodePlex

Thursday, October 23, 2008

It Works On My Machine

I came across this old post on Coding Horror today and nearly fell out of my seat laughing. I’ve spent the last 7 years working on production systems and I’ve lost count of the number of times I’ve heard that famous phrase when a deployment went awry. Totally pointless, but absolutely hilarious!

Visit: The “Works on My Machine” Certification Program

Wednesday, October 15, 2008

My 15 Minutes Of Fame On Jumpstart TV

Recently Andy Warren from End To End Training invited me to come onsite and create video lessons on XML in SQL 2005 for Jumpstart TV. I’ve done a few SQLSaturday presentations but I was green when it came to recording videos\screencasts. I like doing new things though, and I thought it would be good for me both personally and professionally, so I agreed.

My excitement turned into apprehension when I sat down to record my first video. I tend to be a perfectionist and I kept working to get a pristine take of each segment from beginning to end. That ended up being an exercise in frustration, and I walked away feeling like my first day was a disaster. I recorded enough material for a couple of segments and went home to retreat underneath a rock, convinced I wasn’t cut out for this type of thing.

After realizing how stupid it was for me to expect to get it perfect on my first try, I made another trip to E2E, and the 2nd try turned out much better. More time in front of the mic and coaching from Andy helped me loosen up and produce something that I feel good about. And now that I’ve seen how they turned out (kudos to Chris Rock’s editing job!) I’ve got the itch to do more, assuming they’re interested in having me back. :-)

I’m pleased to announce that on Oct. 13th my videos were officially posted! Please, go check them out, and while you’re there have a look around at the other videos that are available. There’s a lot of useful stuff there and the guys at Jumpstart are working hard to turn this into a quality learning site.

Visit: My SQL Server XML Videos on Jumpstart TV

Tuesday, October 14, 2008

SQL 2008 Books Online Updated

The August update for SQL 2008 Books Online has been published. This update includes corrections and content that was left out of the RTM version. It’s always a good idea to keep product documentation current, so I recommend taking a few minutes to download and install this update.

Download: Microsoft SQL Server 2008 Books Online (August 2008)

Monday, October 13, 2008

SQLSaturday #8 - Orlando

SQLSaturday is a one day free training event for SQL Server professionals organized by a local user group. SQLSaturday #8, hosted by the Orlando SQL Server Users Group (oPASS), is coming to Orlando Saturday, Oct. 25, at Seminole Community College. If you live in the central Florida area this is a can’t miss event.

This is the biggest SQLSaturday to date, with 40 presentations to choose from…two of which happen to be mine:

  • XML Features in SQL 2005: Take a look at the XML datatype in SQL 2005 and things that you can do with it using XML-DML
  • Transactional Replication Deep Dive: Looking under the covers at transactional replication and how it can be tweaked for optimal performance

If someone offered you a free day of SQL training from a variety of experts (including a few MVPs), with lunch and prizes thrown in, would you turn it down? If you haven’t signed up, what are you waiting for? Head over to the SQLSaturday site and register now!

Cumulative Update Package 1 For SQL Server 2008

I missed this one when it shipped – Cumulative Update 1 for SQL 2008 was released on Sept. 23 (KB article 956717). Like all cumulative updates, it’s a rollup of all available post-RTM hotfixes and patches. Most of the fixes appear to be related to Reporting Services and Analysis Services, but a few address core functionality as well.

To download the hotfix, simply click the "View and request hotfix downloads" link at the top of the article. Your platform is automatically detected, so if you need the x64 build make sure to follow the link from an x64 machine, and vice versa for x86.

Standard disclaimer applies here: Only apply if you're experiencing one of the issues specifically mentioned in the article.

Download: Cumulative update package 1 for SQL Server 2008

Congratulations Andy Warren: SQL Server MVP

Andy announced in today’s post that he’s been recognized as a SQL Server MVP. Andy got me started in public speaking at SQL Saturday and participation in the SQL community through oPASS and this blog. Andy’s done a lot for the SQL Community in the central Florida area and I can’t think of anyone who deserves it more. Congratulations Andy!

Monday, October 6, 2008

Reduce Transactional Replication Synchronization Time By Applying Snapshots Manually

I've had to take some big transactional snapshots before - 15 million rows\10 GB big. Applying the snapshot to a subscriber sitting in the same facility over a local network can take upwards of an hour...so imagine how long it takes to apply the snapshot to a subscriber sitting in another data center. Even at the best of speeds it can take 6-8 hours or more.

You can intervene and reduce the time it takes to apply the snapshot. Here's how:

  1. Create the subscription to the publication and start the snapshot agent. Before the snapshot agent completes, stop the distribution agent for the subscriber. When the snapshot agent finishes it will have written several files for each article to the snapshot folder – IDX, PRE, SCH, XPP, and BCP files – along with a copy of any pre- and post-snapshot scripts that were configured for the publication.
  2. Using your text editor of preference (Notepad works), open the pre-snapshot script in the snapshot folder, copy and paste the commands into Management Studio, and execute them against the subscriber. Do the same for the PRE, SCH, XPP, and IDX files (in that order) for each article.
  3. Using your compression tool of choice, compress each BCP file and transfer it to the subscriber. I use Info-Zip because I can compress all files individually on the command line using a FOR command like so:
    FOR %F in ("{snapshot folder path}\*.bcp") DO zip -j "%~nF.zip" "%F"
  4. At the subscriber, uncompress each BCP file and use the bcp utility to import the files into their respective tables. There are two options that must be provided for the file to be imported correctly: -n for "native type" and -k for "keep null values". Other options can be provided as necessary for authentication, server name, database name, and table. I created a batch script named "unzip and bcp.bat" to process each compressed file. Here's what the script looks like:
    pushd "%~dp1"
    unzip -j %1
    bcp "%2" in "%~dpn1.bcp" -n -T -k >> "%~dp1%2.bcplog.txt"
    del "%~dpn1.bcp"
    popd

    The script takes two arguments - the full path of the compressed file (%1 in the script) and the qualified name of the table (i.e. database.owner.table) to bulk insert into (%2 in the script). A log is written in the same directory as the compressed file in order to review for any errors that might have occurred during the bulk copy. The log file is the name of the table (second argument) with ".bcplog.txt" added on at the end. Since the script works with one file at a time and there are multiple files to process, I use the FOR command again like so:
    FOR %F in ("{compressed file path}\{table name}*.zip") DO "unzip and bcp.bat" "%F" {db}.{owner}.{table}
    I always make it a point to check the log files for any errors that occurred (search for the word "error") when the bcp in is done. I also select the top 500 rows from each table to ensure the data looks good and didn't get mangled up somewhere along the way.
  5. At the distributor, replace each file in the snapshot folder with an empty file of the same name. An easy way to do this is to create an empty file called "blank.txt" and use the FOR command:
    FOR %F in ("{snapshot folder path}\*.*") DO copy /Y "blank.txt" "%F"
  6. Start the distribution agent. When you view the agent history it will look like it's going through every article in the publication and applying the snapshot, but in reality it's not doing anything because all of the files are empty. After it "applies" the snapshot it will start delivering any transactions that have been read by the log reader agent. Voilà!
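Between steps 4 and 6, a quick check I'd suggest beyond eyeballing the top 500 rows is comparing row counts between publisher and subscriber, which catches a partially loaded table. A simple sketch, using the same {db}.{owner}.{table} placeholders as above (note the publisher count can drift by whatever transactions committed after the snapshot, so treat large differences as the red flag):

```sql
-- Run against the publisher, then against the subscriber, and compare.
-- COUNT_BIG avoids int overflow on very large tables.
SELECT '{db}.{owner}.{table}' AS table_name,
       COUNT_BIG(*)           AS row_count
FROM {db}.{owner}.{table};
```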


Note that you don't have to do this for every article in your publication - if only one of them is big you can work with just the files generated for that article and leave the others alone. When the distribution agent runs it will do what it normally does to the other articles and "apply" the blank files for the article you transferred manually.

I hope this saves you time the next time you need to transfer a large snapshot!