Clean Up SQL Server Plan Cache
Written by Ian Stirk
Thursday, 09 January 2014
Improve your SQL performance by following the plan cache tips given in this month's SQL Server Pro. Here's our regular review of the magazine's contents. SQL Server Pro is an online, subscription-based monthly magazine with news and articles from across the SQL Server world. The items in bold below correspond to the names of the articles in the current issue.
This month's cover feature is all about the SQL Server plan cache, the area of memory that stores your query plans so they can be reused, saving compilation time and thus improving performance the next time a query runs. Unfortunately, the plan cache can contain plans that have only ever been used once; these are of little use but take up space. Code is provided to identify these single-use cached plans. The server setting "optimize for ad hoc workloads" is probably worth enabling on every instance (it is off by default): with it enabled, only a small plan stub is cached on first execution, and the full plan is cached only when a query runs a second time, thus reducing plan cache bloat.

The article proceeds by examining cached plans for clues that might allow you to optimize your SQL queries. In particular, code is given to identify cached plans that have missing indexes, implicit conversion warnings, Key Lookups, and Clustered Index Seek operators. Similarly, you might want to investigate my own more generic method of examining cached plans for performance tips in "Get Performance Tips Directly From SQL Server" - you will need a login to SQL Server Central to read this article, but if you're serious about SQL Server, you'll want one!

Simple queries that differ only in their literal values are easy to identify because they are given the same query hash value. The article provides code to find these related queries, and suggests it might be better to rewrite them as parameterized routines, again reducing plan cache bloat. The article closes with a link to Greg Low's excellent article on Plan Caching in SQL Server 2008, which is also applicable to SQL Server 2012.

The editorial highlights the increasing use of in-memory technologies to help improve SQL Server performance. While SQL Server 2012 introduced columnstore indexes (giving 10x to 100x improvements) for data warehouses, SQL Server 2014 takes this further with the In-Memory OLTP database engine, codenamed Hekaton.
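The article's own code isn't reproduced in this review, but queries of this kind are typically built on the standard plan-cache DMVs. As a sketch (the TOP value is illustrative, and the magazine's exact queries may differ), the following finds single-use ad hoc plans, groups statements that share a query hash, and shows how the server-level "optimize for ad hoc workloads" option is enabled:

```sql
-- Sketch: find single-use ad hoc plans that are taking up cache space.
SELECT TOP (20)
       cp.usecounts,
       cp.size_in_bytes,
       st.[text] AS sql_text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE cp.usecounts = 1
  AND cp.objtype = 'Adhoc'
ORDER BY cp.size_in_bytes DESC;

-- Sketch: statements that differ only in literal values share the same
-- query_hash, so many plans for one hash suggests a candidate for
-- rewriting as a parameterized routine.
SELECT qs.query_hash,
       COUNT(*)       AS plan_count,
       MIN(st.[text]) AS example_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
GROUP BY qs.query_hash
HAVING COUNT(*) > 1
ORDER BY plan_count DESC;

-- "optimize for ad hoc workloads" is a server-level option,
-- enabled via sp_configure (it is off by default):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;
```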
Many factors have driven this in-memory change, including cheaper memory, an increase in the number of cores/CPUs, and the fact that CPU clock speeds have remained relatively static in recent years. Hekaton uses a lock-free design: instead of locking, when a row in memory is modified the engine creates a new timestamped version of the row, which is very fast since everything is in memory. Another enhancement is that interpreted SQL code can be compiled to native code. Helpfully, SQL Server 2014 includes a wizard to help you determine which tables to convert to in-memory, and which routines to compile.

The corruption series continues with Regular Corruption Checks. SQL Server includes the DBCC CHECKDB command to perform corruption checks, and this article provides a step-by-step walkthrough of running it as a scheduled SQL Server Agent job. Some very good options are recommended, e.g. NO_INFOMSGS to suppress the informational messages that tend to clutter the output, and ALL_ERRORMSGS, since by default only the first 200 error messages are output. I was a little surprised that nothing was said about restoring a backup to a non-production server and running the check there (since the production server is often busy).

In Divide and Conquer Halloween, Itzik Ben-Gan offers a template solution to the Halloween problem. The problem occurs when you expand a graph of objects using an iterative pattern (e.g. finding the employees belonging to a manager): because temporary objects are used to handle each level, it is possible for the same data to be processed more than once. SQL Server guards against this via Halloween protection, but this comes at a cost. The article discusses how another pattern (called Divide and Conquer Halloween) alternates between two temporary tables so that the same table is never both source and target, meaning SQL Server doesn't need to provide the additional protection and incur its cost.
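To give a flavour of the recommended options (the database name here is a placeholder, not taken from the article), a SQL Server Agent job step along these lines runs the check with the suggested flags:

```sql
-- Sketch of a job step for regular corruption checks.
-- NO_INFOMSGS suppresses the informational messages that clutter output;
-- ALL_ERRORMSGS reports every error rather than only the first 200.
DBCC CHECKDB (N'YourDatabase') WITH NO_INFOMSGS, ALL_ERRORMSGS;
```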
As always, Ben-Gan provides lots of useful example SQL to illustrate both the problem and its solution.

Peter Heller discusses a very useful self-service BI solution he's created for small-to-midsize businesses (SMBs) in Power Pivot and Small Data. While large enterprises might be able to take advantage of SQL Server Analysis Services (SSAS) to create, maintain, and query an OLAP cube, this is not feasible for SMBs. The article presents an alternative: a client-side self-service BI solution built on Excel and the Power Pivot plug-in, using a variety of data sources as inputs. I'm sure this approach will prove a useful template for other SMB BI needs.

Jen Underwood discusses Building Real-World Microsoft BI Dashboards Today. The article walks through the discussions and decisions made along the way to creating a real-world BI dashboard for a large company, covering both the software available and concerns over security. In the example given, the data could not be stored in the cloud, and older versions of Microsoft Office had to be used. No code is present in the article, but it is insightful because it shows that, no matter how hard major software companies push their latest wares, companies often need to use what they already have available, and conform to the confines of their own security policies.

Michael Otey highlights the New Features in Visual Studio 2013.
How does Microsoft make such elementary mistakes, e.g. the flat, monochrome IDE of Visual Studio 2012? (Also the missing Start button in Windows 8, the Ribbon in Office, etc.) It's good that these problems get corrected eventually, but it would be much better if they weren't made in the first place. This version of Visual Studio comes less than a year after the previous edition; are there any enhancements so very important as to warrant a new release? I suspect not. Also, I do wonder whether a SQL Server magazine should be looking at non-database technology at all, and if it does, why not concentrate on the new database-related features?

There was quite a diverse range of detailed articles in this month's issue. I particularly liked the continuing corruption series, but my favourite article was the one on the plan cache: it is very well written, with good tips, useful code, and links for further/deeper information - sure to help improve your SQL Server.
More Information

Related Articles
SQL Server Pro's Best Products of 2013
Inside SQL Server Pro Magazine November 2013
Data Mining Explored In SQL Server Pro Magazine (October issue)
Looking Forward to SQL Server 2014 (September issue)
SQL Server Pro's Tips For Performance Tuning and T-SQL (August issue)
PowerShell The SQL Server Way (July issue)
Last Updated ( Monday, 10 February 2014 )