Just a quick blog post here. I've been working on some queries for a client, and they're performing pretty poorly. This is on a SQL Server 2005 instance, and I noticed that the developer who created these reporting queries is using table variables.
I love table variables: they're quick and easy, and they reduce logging and lock contention. They also have limited scope, so my .NET developer colleagues tend to reach for them because they don't want to muck about with the difference between global and local temp tables.
However, and let me be clear about this, table variables with large amounts of data perform... poorly.
Case in point: this report. It was taking about 90 seconds to render using table variables. Upon examining the query, I noticed it was processing well over 10,000 rows in these table variables and then doing some joining between them. A couple of things to note about table variables:
1. You can't add indexes to them (beyond the ones created implicitly by a PRIMARY KEY or UNIQUE constraint declared inline)
2. If you do data modifications (INSERT, UPDATE, DELETE) against them, SQL Server won't generate a parallel execution plan
3. They don't have any statistics, so the optimizer assumes they hold a single row no matter how many rows are actually in them
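To make point 3 concrete, here's a minimal sketch of the pattern I found (table and column names are hypothetical, not the client's actual schema). Because the table variable carries no statistics, the optimizer plans the join as if `@Orders` held one row, which tends to produce a nested-loops plan that falls apart at 10,000+ rows:

```sql
-- Hypothetical example: the optimizer assumes @Orders contains one row,
-- so the join below is typically planned as nested loops regardless of
-- how many rows were actually inserted.
DECLARE @Orders TABLE (OrderID int PRIMARY KEY, CustomerID int);

INSERT INTO @Orders (OrderID, CustomerID)
SELECT OrderID, CustomerID
FROM dbo.Orders;          -- imagine this loads 10,000+ rows

SELECT c.CustomerName, o.OrderID
FROM @Orders AS o
JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;
```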
Long story short: simply by converting the table variables to temporary tables, the processing time of the query went from 90 seconds to... 5 seconds.
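The conversion itself is mechanical. Here's the same sketch as a temp table (again, illustrative names, not the real schema): the temp table gets real statistics, you can add whatever indexes the joins need, and the optimizer is free to parallelize:

```sql
-- Same shape as a #temp table: it has statistics, accepts additional
-- indexes, and doesn't block parallel plans.
CREATE TABLE #Orders (OrderID int PRIMARY KEY, CustomerID int);
CREATE INDEX IX_Orders_CustomerID ON #Orders (CustomerID);

INSERT INTO #Orders (OrderID, CustomerID)
SELECT OrderID, CustomerID
FROM dbo.Orders;

SELECT c.CustomerName, o.OrderID
FROM #Orders AS o
JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;

DROP TABLE #Orders;       -- cleanup; also dropped automatically at session end
```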
I polish my Super DBA badge, and walk away happy.