TFS 2010 Performance Issues with New Project Collection

We stumbled onto an interesting performance problem this week with our TFS 2010 server at work. The DefaultCollection on our corporate TFS server was getting a little crowded, so my team decided to spin up our own project collection to ease management and help us isolate our build operations. The new project collection was created and I was assigned the Project Collection Administrator role.

The first order of business was to migrate the source code from our Team Project on the DefaultCollection to a new Team Project on the new collection. I used the TFS Integration Tool to accomplish the migration, but it seemed to be performing slowly while migrating the files. I chalked it up to heavy usage on the server and went about my migration business. Hell, it was only going to happen once anyway, so… Little did I know that this was merely a warning of things to come.

After the migration was complete, I mapped our new source control node to my file system and performed a Get Latest. Yikes!! Performance was pretty bad, and as others started using the system, Gets performed against large source control nodes began to fail outright. They just spun while querying the database for items to get, never attempting to pull the files from source control.

We went about our usual troubleshooting routines of checking logs, event logs, processes, memory usage, etc. We couldn’t help but feel that the issue was with indexing, or something similar, in the new project collection’s database (for those who aren’t familiar with TFS, each Project Collection gets its own database in SQL Server).

It took the efforts of our DBAs to track down the problem. Neither we nor they could actually inspect the stored procedures in the collection’s database, because the procedures are encrypted. The DBAs did notice an unusually high number of reads on some tables relative to the number of records they contained. Compared to the collection whose performance was good, they were seeing substantially more reads against the smaller tables while people were trying to perform Gets.
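For anyone wanting to spot the same symptom, a rough sketch of the kind of query a DBA might use is below. It leans on the standard `sys.dm_db_index_usage_stats` DMV to compare read activity against row counts per table; the thresholds and interpretation are up to you, and nothing here is TFS-specific.

```sql
-- Sketch: surface tables whose read counts look out of proportion
-- to their size. Run inside the suspect collection database.
SELECT OBJECT_NAME(s.object_id)                        AS table_name,
       s.user_seeks + s.user_scans + s.user_lookups    AS total_reads,
       p.rows                                          AS row_count
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.partitions AS p
  ON p.object_id = s.object_id
 AND p.index_id  = s.index_id
WHERE s.database_id = DB_ID()
ORDER BY total_reads DESC;
```

A small table sitting at the top of this list while users are hammering Gets is the sort of smell our DBAs keyed in on.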

The DBAs determined that the execution plans of one or more stored procedures were in a bad state. Recompiling the stored procedures directly was not an option because, due to their encryption, we couldn’t see into them to determine which ones to recompile. So, one of the DBAs used a fancy trick (fancy to me, anyway). He ran sp_recompile on the affected tables, which marks every stored procedure that references those tables for recompilation on its next execution. He described it as a “sledge-hammer” method of accomplishing the goal, but it worked a treat. As soon as the recompile completed, our new collection was flying.
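The trick itself is a one-liner per table. The table names below are placeholders, not the actual TFS collection schema; the point is just that `sp_recompile` against a table invalidates the cached plans of everything that touches it, so the next caller gets a fresh plan without you ever needing to see inside the encrypted procedures.

```sql
-- Placeholder table names -- substitute the tables your DBA
-- identified as having disproportionate read activity.
EXEC sp_recompile N'dbo.SomeSmallTable';
EXEC sp_recompile N'dbo.AnotherHotTable';
```

It really is a sledge-hammer: every dependent procedure pays a one-time recompile cost on its next run, which is usually a fine trade when the alternative is plans this bad.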

About battenworks
I'm a software developer and craftsman living in Salt Lake City, UT. When I'm not working with computers, I enjoy billiards and motocross/supercross racing. I am a regular attendee at local geek events, where I occasionally give presentations.
