Tuesday, July 10, 2012

SQL 2012 Upgrade Part 8: The Upgrade

Wow, what a hectic couple of days. There isn't a lot to tell here, as things went pretty much according to plan. We tuned the plan on Friday by upgrading our Development and QA environments, then took it to Production on Saturday morning. If you're looking for some SQL Server drama, read on anyway!

Friday Morning

I started off the morning by sending out another communication to the BI developers to let them know that I was planning to start the QA upgrade at 8:30. It wasn't long before one of them came by wondering how he was going to get any work done while I was doing the upgrade. The schedule had been communicated to the team earlier in the week but this guy had missed the boat somehow.

The upgrade was planned for a weekend right before the Tuesday iteration break (iterations end on Tuesday morning and new ones start in the afternoon). Based on that I made an effort to let everyone know that they needed to have all their new stuff migrated to Production by the end of the day on Thursday before the upgrade. I guess I need to make sure people are reading or otherwise paying attention to what I have going on.

There was one thing that made the Dev/QA upgrade quicker and the Production upgrade a little riskier. The Analysis Services boxes in Development and QA were running on Windows Server 2003. Since SQL Server 2012 isn't supported on Server 2003, we decided to just replace those two servers. This meant that we didn't have a test case for the SSAS upgrade other than the experience with the cloned environment.

QA Upgrade

I did manage to work out a migration path for my one surly developer and get the upgrade started around 9:00 AM. This was also the last day of our contract with Sardys Avile-Martinez of Pragmatic Works. She had been helping me immensely as a Junior DBA. Without her help we would not have been on schedule, so I invited her to help me with the QA and Development environment upgrades. As usual, she was ecstatic about learning new things and very helpful to boot.

Sardys followed along with me, working remotely via Microsoft Lync, for the QA upgrade as I explained what I was doing. After I knocked out the QA SSIS server, I had her perform the upgrade on one of the two remaining servers as I upgraded the other. Everything went pretty well except for re-discovering that a SQL Server 2008 R2 server must be at Service Pack 1 before the upgrade would run. We ran into that on one of the QA servers but quickly rectified it.
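
For reference, a quick way to see where an instance stands before kicking off setup is a SERVERPROPERTY query like the one below. This is just a minimal sketch, not a line from my checklist; SQL Server 2008 R2 SP1 and later report a ProductVersion of 10.50.2500 or higher.

    -- Check the current version and service pack level of the instance
    SELECT SERVERPROPERTY('ProductVersion') AS product_version,  -- e.g. 10.50.2500.0 for 2008 R2 SP1
           SERVERPROPERTY('ProductLevel')   AS product_level;    -- e.g. 'RTM', 'SP1'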

Friday Afternoon

I had planned on continuing with the Development environment around 1:00. However, everyone went to lunch early, so I took advantage of it, grabbed a quick lunch myself, and started around noon.

Development Upgrade

The Development environment went just as smoothly as QA. This is when I started to get a little nervous; I'm not used to major IT projects going off without any significant problems. Logically, my confidence in the process was high, but something in the back of my mind still felt wrong.

Saturday Morning

I was planning on working from home. That didn't work out so well thanks to CenturyLink's unreliable "Extended Range" DSL which is all that is available to us rural folk. (Slight pause while I hold myself back from ranting about lack of decent Internet service for people that live outside town.)

Needless to say, it was time to trek into the office. I already thought it was cold in there on Fridays, when most everyone goes home early on the 4x9+4 schedule, but when I got in it was downright frigid, so I kicked on the heater and fired up the laptop.

Production Upgrade

Sometimes those bad feelings I mentioned earlier just make you extra cautious. That, combined with the previous days' upgrades, put me "in the zone." I know it sounds a bit dorky, but I was able to work on two or three servers at a time. My checklist was solid, and I made certain to keep it up to date after every step, never moving on until the tick-marks were applied.

So, while things were warming up in the cubicle and the laptop was booting up I connected to my virtual desktop via the neat little zero-client device on my desk. (I'll have to post more about those later.) I opened a couple remote desktop connections to take care of the necessary pre-upgrade work.
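
I won't reproduce the whole checklist here, but to give a flavor of that pre-upgrade work, one typical sanity check is confirming that every database has a recent full backup. The query below is just a sketch against msdb, not the exact script I ran:

    -- Most recent full backup per database ('D' = full database backup)
    SELECT d.name,
           MAX(b.backup_finish_date) AS last_full_backup
    FROM sys.databases AS d
    LEFT JOIN msdb.dbo.backupset AS b
           ON b.database_name = d.name
          AND b.type = 'D'
    GROUP BY d.name
    ORDER BY last_full_backup;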

After getting logged into the laptop, it was time to get the outstanding SQL Server service packs out of the way. Those took a little while but were uneventful, as was the upgrade process itself.

When I got to the post-upgrade steps I was starting to second-guess myself but I stuck with the plan. I didn't run into any showstoppers during this last part. In the end, I got the upgrade done in about three hours, not including travel time. All I can say is that planning and testing make all the difference in the world. If you don't have time to do the up-front work you probably won't have time to fix things when the untested and untried plan goes awry.
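
To give an idea of what those post-upgrade steps can include, two common items are bumping each database's compatibility level to 110 and refreshing statistics. This is a generic sketch rather than my exact checklist, and the database name is just a placeholder:

    -- See which databases are still on an older compatibility level
    SELECT name, compatibility_level FROM sys.databases;

    -- Move a database to the SQL Server 2012 level ([YourDatabase] is a placeholder)
    ALTER DATABASE [YourDatabase] SET COMPATIBILITY_LEVEL = 110;

    -- Refresh statistics; run this in each user database after the upgrade
    USE [YourDatabase];
    EXEC sp_updatestats;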

The biggest positive to this whole thing was the "DBA High" that hit me on the way home. If I hadn't been driving on the Interstate I probably would have broken out in dance and embarrassed the crap out of myself. Despite having burned a third of my Saturday I was completely stoked. I can't imagine a drug that could do better.

Moving Forward

Monday came along and no problems were reported by the end users. There were still some issues, though they were mostly things that made my life a little more complicated. The developers were all eager to get on the new platform but hadn't had enough time to load the software on their new laptops, much less get used to using it. On top of that, it took me another two weeks to document the process for updating their BIDS solutions to SSDT. (An article or two with lots of screenshots about how to do this is forthcoming.)

This was a big area where I dropped the ball. I made pretty certain that the end users wouldn't be affected by the upgrade. That's all well and good, but not preparing the developers was pretty bad form. Telling them to go forth, install the tools, and learn the new stuff isn't enough. Sure, our developers are really smart, hard-working folks, but they have other things to do and different goals that they are expected to achieve. In the end, it's the BI Admin's job to enable them to achieve those goals within the framework of best practices for performance, security, and so on. If the development teams do good work, it makes the BI Admin's work that much easier.
