Applying Team Foundation Server 2012 Update 1

I have been eagerly awaiting the November update for Team Foundation Server 2012 from Microsoft. My team has successfully adopted Kanban as a development methodology, and we’ve been wanting to get the new bits up and running in our on-premises TFS environment so we can take advantage of the new tooling surrounding the Kanban board. One thing that may not be apparent at first glance on the download site (Visual Studio Downloads) is that the update is not distributed as a standalone patch. You must use the full installer (Visual Studio Team Foundation Server 2012 with Update 1) to get the update.

Download the .iso image, extract it with your tool of choice, and run the tfs_server.exe installer (or use the web installer instead). Keep in mind that the installer may require you to reboot your server early in the process, so prepare for your system to be offline during the upgrade. In fact, after you run the installer, your server will be offline while the upgrade modifies your databases, so be prepared for a bit of downtime either way.

Once the installer has completed successfully, we need to run the upgrade wizard. Thankfully, the upgrade wizard launches automatically. On the first page of the wizard, click Next to get to the Databases page.

On the Databases page, click the List Available Databases link on the right to populate the Databases listbox with your configuration database. You will notice that the tree in the left pane populates with additional options once the wizard recognizes your TFS configuration database. At this point, select the option for AlwaysOn Availability if your database uses that feature, and confirm that you have a database backup before continuing (you DO have a backup, right?). Click Next.

On the next page of the wizard, choose the same service account credentials you used for the original installation. We use a domain service account for our TFS installation. Click Next.

The rest of the upgrade (Reporting Services and SharePoint) will largely depend on how your environment is configured. The wizard should recognize the locations of your existing services and honor them. One thing to remember is to click the List Available Databases link on the Database step when configuring Reporting Services. This will help the wizard locate the warehouse database required for Reporting Services.

For SharePoint, to keep things as they are, step through to the Settings page and make sure the Use current SharePoint settings radio button is selected (it is the default), and you should be fine. If you wish to point your installation at a different SharePoint environment, well, that’s a topic for another time.

Step through the remainder of the wizard until you reach the Configure button, and complete the upgrade from there. So far, I haven’t had any issues applying the update to a few different server environments. I hope your results are the same.

Fuck Cancer

An acquaintance of mine was the guest of honor on a local radio station’s morning show this past Thursday. The DJ’s described the shirt he was wearing as having nothing but the title of this post on the front: Fuck Cancer. I can’t say that I’ve ever seen him wear it, though I’ve only met him a couple times. My wife has run with him at local events, and it was she who first told me about the shirt.

You see, this gentleman is living with cancer that will eventually take his life. It will give him nothing in return. No love, no gifts, no happy memories, no pat on the back. It will only take. It will take him. Paraphrasing him, cancer doesn’t deserve our respect because it has none for us, so fuck cancer.

Yesterday, I received an email from a friend of mine letting me know that a childhood friend of ours had passed away. My friend Joe was diagnosed with cancer in October of 2011. He passed away April 1, 2012, just a couple months past his 43rd birthday.

I met Joe in 1st grade. We rode bikes together and played football together. I remember riding to football practice together in the back of my dad’s pickup (you could do that back then…). He was our star tailback. We got our first motocross bikes for Christmas in 7th grade; matching 1981 YZ 80’s. We rode every chance we could. We went to our first Supercross race together. It was Anaheim, 1982. Shortly after that race, my family moved to Utah, and we kept in touch as best we could. Our lives grew apart, but we kept crossing paths off and on through the years, as I moved back and forth between California and Utah. It was always good to see him and we could pick up like we had just seen each other days ago.

One of the last times I saw him in person was again in Anaheim, at a Supercross race in 2002 or 2003. I was standing in the pits with a group of friends, when I got the urge to turn around and scan the crowd. Not 10 feet away from me was Joe. I got the chance to meet his kids, and see his older brother, who was also there. We exchanged cards for years after, but since I moved back to Utah in 2005, I haven’t had the opportunity to see him in person.

Cancer took his life years too soon. It is a heartless, cruel, miserable beast.

Rest in peace, Joe.

Place a Null Value in a SqlParameter with an Extension Method

I had to write some straight-up ADO-type code today to talk to SQL Server. Some of the values I’m passing into a stored procedure can be null, and as we’ve all experienced, the .AddWithValue() method doesn’t coerce null to DBNull.Value on its own. And by “we’ve”, I mean all of us who program in .NET, of course.

So, I wrote a simple extension method that uses DBNull.Value in the case where the value passed to the method is null:

// Extension methods must be declared in a static class.
public static class SqlParameterCollectionExtensions
{
    public static void AddWithNullableValue(this SqlParameterCollection parameterCollection, string parameterName, object value)
    {
        if (null == value)
            parameterCollection.AddWithValue(parameterName, DBNull.Value);
        else
            parameterCollection.AddWithValue(parameterName, value);
    }
}

Nothing ground-breaking, by any means. But, it sure does clean up my code. Hope it helps someone else.
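To show it in context, here is a sketch of how I call it from command code. The stored procedure name, parameter name, and the `connectionString`/`middleName` variables are illustrative, not from an actual project:

```csharp
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.SaveCustomer", connection))
{
    command.CommandType = CommandType.StoredProcedure;

    // middleName may be null here; the extension method maps
    // null to DBNull.Value so SQL Server receives a proper NULL.
    command.Parameters.AddWithNullableValue("@MiddleName", middleName);

    connection.Open();
    command.ExecuteNonQuery();
}
```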

TFS 2010 Performance Issues with New Project Collection

We stumbled onto an interesting performance problem this week with our TFS 2010 server at work. The DefaultCollection on our corporate TFS server was getting a little crowded, so my team decided to spin up our own project collection to ease management and help us isolate our build operations. The new project collection was created and I was assigned the role of ProjectCollectionAdministrator.

The first order of business was to migrate the source code from our Team Project on the DefaultCollection to a new Team Project on the new collection. I used the TFS Integration Tool to accomplish the migration, but it seemed to be performing slowly while migrating the files. I chalked it up to heavy usage on the server and went about my migration business. Hell, it was only going to happen once anyway, so… Little did I know that this was merely a warning of things to come.

After the migration was complete, I mapped our new source control node to my file system and performed a Get Latest. Yikes!! Performance was pretty bad, and as others started using the system, we started to see Gets performed against large source control nodes fail outright. They just spun while querying the database for items to get, never attempting to pull the files from source control.

We went about our usual troubleshooting routines of checking logs, event logs, processes, memory usage, etc. We couldn’t help but feel that the issue was with indexing, or something similar, in the new project collection’s database (for those who aren’t familiar with TFS, each Project Collection gets its own database in SQL Server).

It took the efforts of our DBA’s to track down the problem. We/they couldn’t actually inspect the stored procedures in the collection’s database because the procedures are encrypted. The DBA’s noticed an unusually high number of reads on some tables, relative to the number of records they contained. In comparison to the collection whose performance was good, they were seeing substantially more reads against the smaller tables while people were trying to perform Gets.
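I didn’t see the exact queries the DBA’s ran, but a query along these lines against the index usage stats DMV is one way to spot that kind of lopsided read activity (the collection database name here is hypothetical):

```sql
-- Sketch: total logical read operations per table in the collection
-- database, highest first. Requires VIEW SERVER STATE permission.
SELECT OBJECT_NAME(s.object_id, s.database_id) AS table_name,
       SUM(s.user_seeks + s.user_scans + s.user_lookups) AS total_reads
FROM sys.dm_db_index_usage_stats AS s
WHERE s.database_id = DB_ID('Tfs_NewCollection')  -- hypothetical name
GROUP BY s.object_id, s.database_id
ORDER BY total_reads DESC;
```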

The DBA’s determined that the execution plans of one or more stored procedures were in a bad state. Recompiling the stored procedures directly was not an option because, due to their encryption, we couldn’t see into them to determine which ones to recompile. So, one of the DBA’s used a fancy trick (fancy to me, anyway). He ran sp_recompile against the affected tables, which caused every stored procedure that accessed those tables to be recompiled on its next execution. He described it as a “sledge-hammer” method of accomplishing the goal, but it worked a treat. As soon as the recompile completed, our new collection was flying.
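For reference, the sledge-hammer looks like this. The table and database names below are illustrative stand-ins, since TFS collection schemas vary; you would run it once per affected table:

```sql
USE Tfs_NewCollection;  -- hypothetical collection database name

-- sp_recompile doesn't rebuild anything eagerly; it marks every stored
-- procedure (and trigger) that references the table so its cached plan
-- is discarded and recompiled the next time it runs.
EXEC sp_recompile N'dbo.SomeAffectedTable';
```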