Thursday, 23 May 2013

Dedicated Administrator Connection (DAC)

We all dread the scenario where SQL Server is under so much load, and so starved of resources, that no further connections can be made.  Although extreme, and a situation I have never been in, there is a saving grace, one last chance before opting for measures that will induce outages: the Dedicated Administrator Connection, or DAC.

First introduced in SQL Server 2005, the DAC is a special connection that remains accessible when all other connections fail. The DAC is available via both SQLCMD and SSMS; however, SQLCMD is recommended as it uses fewer resources than the SSMS GUI.  The sole purpose of the DAC is diagnosing problems when no other connection can be made; it is not to be used as a regular connection.  It is also recommended that you connect to the master database when using the DAC and that you do not run any resource-intensive queries.

There are some limitations, which I will detail below:

1 - Only one DAC connection is allowed; if it is already in use, any further connections will be rejected

The below query will return the SPID for the DAC if it is in use:
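-- Return the SPID currently using the DAC endpoint, if any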
SELECT  s.session_id
FROM    sys.tcp_endpoints AS E
        INNER JOIN sys.dm_exec_sessions AS S ON E.endpoint_id = S.endpoint_id
WHERE   E.name = 'Dedicated Admin Connection';
2 - By default the DAC is only available locally; this can be changed by enabling remote admin connections with sp_configure, as shown in the example after this list
3 - Only users with membership in the sysadmin fixed server role can connect to the DAC
4 - Some SQL statements are unavailable over the DAC, for example BACKUP and RESTORE.
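
To enable remote admin connections, run the below; it is a dynamic option, so no restart is required:

EXEC sp_configure 'remote admin connections', 1;
RECONFIGURE;
GO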

Below are some examples of connecting to the DAC using both SQLCMD and SSMS.  If you have never connected to the DAC, or used SQLCMD for that matter, I recommend testing connectivity to the DAC and becoming familiar with SQLCMD.  This will save precious time in the event of a serious problem; you really don't want to be searching the web when facing a potential outage.

Example 1: Connect to the DAC using SQLCMD and integrated security

Open a command prompt and run:

sqlcmd -S ServerNameHere\InstanceNameHere -d master -A

Example 2: Connect to the DAC using SQLCMD and SQL authentication

Open a command prompt and run:

sqlcmd -S ServerNameHere\InstanceNameHere -U UserNameHere -P PasswordHere -d master -A

Example 3: Connect to the DAC using SSMS

Open SSMS and, at the connection window, prefix the ServerName or ServerName\InstanceName with ADMIN:.  Note that because the DAC allows only a single connection, you should connect via a Database Engine Query window rather than Object Explorer, which requires multiple connections:

ADMIN:SQL365

or

ADMIN:SQL365\INST01
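
Once connected, keep your queries lightweight.  As a starting point, a minimal sketch like the below will show what each session is currently doing and whether anything is blocked:

SELECT  session_id ,
        status ,
        command ,
        wait_type ,
        wait_time ,
        blocking_session_id
FROM    sys.dm_exec_requests
WHERE   session_id > 50;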

Enjoy!

Chris


Wednesday, 15 May 2013

Object Qualification

I came across an interesting issue recently with NHibernate.  Now, it is widely known that I despise ORMs; in my experience they do a pretty mediocre job at best and at times can be absolutely horrific.  The issue was that the statements being fired at an instance of SQL Server from an application using NHibernate were not schema qualified.  This is not a rant at ORMs, as the issue I will show below is experienced with stored procedures, ad-hoc SQL and any T-SQL you execute against SQL Server for that matter.  In fact I will be using a stored procedure in the example ;)

Now, for those of you that don't know, SQL Server has to do an awful lot of work before a statement is actually executed.  Here I want to show you the performance improvements that can be achieved by schema qualifying your objects.  The below quote is from Microsoft and will set the scene for the rest of the post.

"If user "dbo" owns object dbo.mystoredproc, and another user "Harry" runs this stored procedure with the command "exec mystoredproc," the initial cache lookup by object name fails because the object is not owner-qualified. (It is not yet known whether another stored procedure named Harry.mystoredproc exists, so SQL cannot be sure that the cached plan for dbo.mystoredproc is the right one to execute.) SQL Server then acquires an exclusive compile lock on the procedure and makes preparations to compile the procedure, including resolving the object name to an object ID. Before it compiles the plan, SQL Server uses this object ID to perform a more precise search of the procedure cache and is able to locate a previously compiled plan even without the owner qualification.
 If an existing plan is found, SQL Server reuses the cached plan and does not actually compile the stored procedure. However, the lack of owner-qualification forces SQL to perform a second cache lookup and acquire an exclusive compile lock before determining that the existing cached execution plan can be reused. Acquiring the lock and performing lookups and other work that is needed to get to this point can introduce a delay that is sufficient for the compile locks to lead to blocking. This is especially true if a large number of users who are not the stored procedure's owner simultaneously run it without supplying the owner name. Note that even if you do not see SPIDs waiting on compile locks, lack of owner-qualification can introduce delays in stored procedure execution and unnecessarily high CPU utilization."
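
If you suspect this is happening on your own instance, sessions waiting on a compile lock report a wait_resource containing COMPILE; a quick check (a minimal sketch) is:

-- Sessions currently waiting on a compile lock
SELECT  session_id ,
        wait_type ,
        wait_time ,
        wait_resource
FROM    sys.dm_exec_requests
WHERE   wait_resource LIKE '%COMPILE%';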

Script

To demonstrate this I used the below script.  I am running SQL Server 2008 R2 Developer Edition on my local instance and used the AdventureWorks2008R2 database, which is available here.

The script creates a schema called Chris in the AdventureWorks2008R2 database, and a user SQL365\Chris is created for the login SQL365\Chris with a default schema of Chris.  Finally, a procedure called dbo.spGetSalesOrderHeader is created that returns every record from AdventureWorks2008R2.Sales.SalesOrderHeader.

/*
      -----------------------------------------------------------------
      Object Qualification
      -----------------------------------------------------------------
   
      For more SQL resources, check out SQLServer365.blogspot.com

      -----------------------------------------------------------------

      You may alter this code for your own purposes.
      You may republish altered code as long as you give due credit.
      You must obtain prior permission before blogging this code.

      THIS CODE AND INFORMATION ARE PROVIDED "AS IS"
    
      -----------------------------------------------------------------
*/
-- Set Database Context
USE AdventureWorks2008R2;
GO
-- Declare variable
DECLARE @SQL VARCHAR(255)

-- Set variable
SET @SQL = 'CREATE SCHEMA Chris AUTHORIZATION dbo'

-- Create Schema
IF NOT EXISTS ( SELECT  1
                FROM    sys.schemas
                WHERE   name = 'Chris' )
    BEGIN
        EXEC (@SQL)
    END
GO

-- Create user mapped to login with default schema of the above created schema
IF NOT EXISTS ( SELECT  1
                FROM    sys.database_principals
                WHERE   name = 'SQL365\Chris' )
    BEGIN
        CREATE USER [SQL365\Chris] FOR LOGIN [SQL365\Chris] WITH DEFAULT_SCHEMA = Chris;
    END
GO

-- Create procedure in dbo schema to be executed by the above user
IF EXISTS ( SELECT  1
            FROM    sys.objects
            WHERE   [object_id] = OBJECT_ID(N'[dbo].[spGetSalesOrderHeader]')
                    AND type IN ( N'P', N'PC' ) )
    DROP PROCEDURE [dbo].[spGetSalesOrderHeader]
GO
CREATE PROCEDURE dbo.spGetSalesOrderHeader
AS
    BEGIN
        SELECT  SalesOrderID ,
                RevisionNumber ,
                OrderDate ,
                DueDate ,
                ShipDate ,
                [Status] ,
                OnlineOrderFlag ,
                SalesOrderNumber ,
                PurchaseOrderNumber ,
                AccountNumber ,
                CustomerID ,
                SalesPersonID ,
                TerritoryID ,
                BillToAddressID ,
                ShipToAddressID ,
                ShipMethodID ,
                CreditCardID ,
                CreditCardApprovalCode ,
                CurrencyRateID ,
                SubTotal ,
                TaxAmt ,
                Freight ,
                TotalDue ,
                Comment ,
                rowguid ,
                ModifiedDate
        FROM    sales.SalesOrderHeader
    END
GO
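
The test statements aren't shown above, but given the setup they were simply the unqualified and qualified calls, executed as SQL365\Chris (whose default schema is Chris, not dbo):

-- Unqualified: the initial cache lookup fails and a compile lock is taken
EXEC spGetSalesOrderHeader;

-- Schema qualified: the cached plan is found on the first lookup
EXEC dbo.spGetSalesOrderHeader;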

I quite frequently use a great tool, SQLQueryStress, developed by Adam Machanic (B - T), when testing the effects of changes under load; it is ingeniously simple to use and I love it.  I used SQLQueryStress to record the results of executing the procedure both without schema qualification and with schema qualification.  I used 4 threads (the number of cores in my laptop) and ran a thousand iterations to get a good average.  The results are included below:

Non Schema Qualified

[SQLQueryStress results screenshot]

Schema Qualified

[SQLQueryStress results screenshot]

As you can see, the results are pretty damn conclusive: every metric measured by SQLQueryStress saw a performance improvement from schema qualifying objects.

Run Time - 27.01% Improvement
ClientSeconds/Iteration (Avg) - 11.99% Improvement
CPU Seconds/Iteration (Avg) - 1.13% Improvement
Actual Seconds/Iteration (Avg) - 12.54% Improvement

There is no excuse for not schema qualifying your objects; performance improvements like this just cannot be ignored.

Enjoy!

Chris

Thursday, 9 May 2013

DTA and Hypothetical Indexes

For those of you that don't know, DTA stands for Database Engine Tuning Advisor and is available from the Tools menu in Management Studio.  This tool was first introduced in SQL Server 2005 and has been a much-used tool by DBAs and Developers alike in most of the companies I have worked for.

There are two main problems I have with DTA.  Firstly, the default object name prefixes are terrible, no really, I mean absolutely awful.  See the table below for examples.

Object type           Default object name prefix    Example
Indexes               _dta_index_                   _dta_index_dta_mv_1_7_1150627142_K2
Statistics            _dta_stat_                    _dta_stat_2041058307_2_5
Views                 _dta_mv_                      _dta_mv_3
Partition functions   _dta_pf_                      _dta_pf_1043
Partition schemes     _dta_ps_                      _dta_ps_1040

Now, there is no right or wrong way to standardise the names of your database objects, but for indexes, for example, I go with the below:

Single Index Key Column - IDX_TableName:ColumnName
Multi Index Key Column - IDX_TableName:CompositeX

Some people will agree, some won't, but in my experience this makes life easier for me and the team when maintaining our SQL estate.
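
If you do keep an index that DTA recommends, rename it to your convention straight away; sp_rename does the job (the table and index names below are just placeholders):

-- Rename a DTA-generated index to match the convention above
EXEC sp_rename 'dbo.SalesOrderHeader._dta_index_dta_mv_1_7_1150627142_K2', 'IDX_SalesOrderHeader:CustomerID', 'INDEX';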

The other problem is that while DTA is analysing a workload, it automatically creates the recommended indexes with the meaningless names mentioned above.  DTA will always clean up the indexes it creates; well, actually, that is a lie.  If the DTA process exits unexpectedly, the indexes it has created so far will persist!

We can identify these indexes by the is_hypothetical column of the sys.indexes catalog view; for hypothetical indexes this will be 1.
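
A quick check against a single database looks like this:

-- List any hypothetical indexes left behind by DTA
SELECT  OBJECT_NAME([object_id]) AS TableName ,
        name AS IndexName
FROM    sys.indexes
WHERE   is_hypothetical = 1;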

I have created the below script, which will email a list of hypothetical indexes and the script to drop them.  Simply schedule this as a SQL Server Agent job as you see fit, changing the @EmailProfile and @EmailRecipient variables accordingly:


/*
      -----------------------------------------------------------------
      Hypothetical Indexes
      -----------------------------------------------------------------
   
      For more SQL resources, check out SQLServer365.blogspot.com

      -----------------------------------------------------------------

      You may alter this code for your own purposes.
      You may republish altered code as long as you give due credit.
      You must obtain prior permission before blogging this code.

      THIS CODE AND INFORMATION ARE PROVIDED "AS IS"
    
      -----------------------------------------------------------------
*/
-- Set database context
USE master;
GO

-- Declare variables
DECLARE @EmailProfile VARCHAR(255)
DECLARE @EmailRecipient VARCHAR(255)
DECLARE @EmailSubject VARCHAR(255)

-- Set variables
SET @EmailProfile = 'DBA'
SET @EmailRecipient = 'Chris@SQLServer365.com'
SET @EmailSubject = 'ALERT - Hypothetical Indexes found on ' + @@SERVERNAME

-- Drop temporary table if exists
IF OBJECT_ID('tempDB.dbo.#HypotheticalIndexDropScript') IS NOT NULL
    DROP TABLE #HypotheticalIndexDropScript;
    
-- Create Temporary Table
CREATE TABLE #HypotheticalIndexDropScript
    (
      DatabaseName VARCHAR(255) ,
      HypotheticalIndexDropScript VARCHAR(4000)
    );

INSERT  INTO #HypotheticalIndexDropScript
        EXEC sp_msforeachdb 'USE [?]; SELECT  DB_NAME(DB_ID()), ''USE '' + ''['' + DB_NAME(DB_ID()) + ''];'' + '' IF EXISTS (SELECT 1 FROM sys.indexes AS i WHERE i.[object_id] = '' + ''object_id('' + '''''''' + ''['' + SCHEMA_NAME(o.[schema_id]) + ''].'' + ''['' + OBJECT_NAME(i.[object_id]) + '']'' + '''''''' + '')'' + '' AND name = '' + '''''''' + i.NAME + '''''''' + '') ''
       + '' DROP INDEX '' + ''['' + i.name + '']'' + '' ON '' + ''['' + SCHEMA_NAME(o.[schema_id]) + ''].'' + ''['' + OBJECT_NAME(o.[object_id]) + ''];'' AS HypotheticalIndexDropScript
FROM    sys.indexes i
        INNER JOIN sys.objects o ON o.[object_id] = i.[object_id]
WHERE is_hypothetical = 1'

-- Check for hypothetical indexes
IF EXISTS ( SELECT  1
            FROM    #HypotheticalIndexDropScript )
    BEGIN
        DECLARE @tableHTML NVARCHAR(MAX);
        SET @tableHTML = N'<style type="text/css">'
            + N'.h1 {font-family: Arial, verdana;font-size:16px;border:0px;background-color:white;} '
            + N'.h2 {font-family: Arial, verdana;font-size:12px;border:0px;background-color:white;} '
            + N'body {font-family: Arial, verdana;} '
            + N'table{font-size:12px; border-collapse:collapse;border:1px solid black; padding:3px;} '
            + N'td{background-color:#F1F1F1; border:1px solid black; padding:3px;} '
            + N'th{background-color:#99CCFF; border:1px solid black; padding:3px;}'
            + N'</style>' + N'<table border="1">' + N'<tr>'
            + N'<th>DatabaseName</th>'
            + N'<th>HypotheticalIndexDropScript</th>' + N'</tr>'
            + CAST(( SELECT td = DatabaseName ,
                            '' ,
                            td = HypotheticalIndexDropScript ,
                            ''
                     FROM   #HypotheticalIndexDropScript
                   FOR
                     XML PATH('tr') ,
                         TYPE
                   ) AS NVARCHAR(MAX)) + N'</table>';
          
            -- Email results
        EXEC msdb.dbo.sp_send_dbmail @profile_name = @EmailProfile,
            @recipients = @EmailRecipient, @subject = @EmailSubject,
            @body = @tableHTML, @body_format = 'HTML';
    END
GO

DTA is not coming out to play as he has been (and will continue to be) a very naughty boy.

Enjoy!

Chris

Tuesday, 7 May 2013

SQL Server Hardware and Acronyms

Over the weekend I finished reading yet another Red Gate book, this one being SQL Server Hardware by Glenn Berry (t - b).  Having earned my crust as a Windows Engineer before honing my SQL Server skills and becoming a DBA, I have had this book on my reading list since it was published, but thought others would give me more benefit and teach me more.  Oh, how wrong I was.  I honestly thought I would already know 99% of this book, but the reality is that I had overlooked quite a few of the ideas and practices in it.  There are some great nuggets in here, like Windows Power Plans, for example.  Windows Power Plans, to quote Glenn, are:

“Designed to let you manage how your server uses power, this setting can have a dramatic effect on processor performance, and is controlled through Windows as well as the firmware BIOS settings”

Our SQL Server estate was indeed using the Balanced plan.  Taking Glenn's advice and testing the change, we were able to achieve around a 15% improvement in CPU performance on a 2-CPU server.  Given this has no up-front cost and is a low-risk change with no downtime, surely this is a no-brainer?!  Yes, there is a small increase in power consumption, somewhere around 5%-10% between the Balanced and High performance plans, but the performance increase is surely worth it.
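
If you want to check your own servers, powercfg from an elevated command prompt will list the schemes and switch the active one; the GUID below is the standard identifier for the built-in High performance plan:

rem List power schemes; the active one is marked with an asterisk
powercfg /list

rem Switch to the built-in High performance plan
powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c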

Go on, have a look; I would be interested to see if anyone else has overlooked this setting.  Glenn's book can be found over at SQL Server Central, and the eBook format is also FREE!

Anyway, back to what I wanted to bring to your attention: the IT industry is full of acronyms, many of which are duplicated across technologies.  One of the cool features of Glenn's book is that Appendix C contains a list of the abbreviations used in the book.  Below is the complete list:

AMB - Advanced Memory Buffer
AMD - Advanced Micro Devices
API - Application Programming Interface
AWE - Address Windowing Extensions
BI - Business Intelligence
BIOS - Basic Input Output System
CAL - Client Access License
CDC - Change Data Capture
CIFS - Common Internet File System
CISC - Complex Instruction Set Computer
CPI - Cost per Instruction
CSS - Customer Service and Support
CU - Cumulative Update
DAC - Data Tier Application
DACPAC - Data Tier Application Package
DAS - Direct Attached Storage
DDL - Data Definition Language
DDR - Data Dependent Routing
DESC - in T-SQL, Descending
DIMM - Dual In-line Memory Module
DML - Data Manipulation Language
DMV - Dynamic Management View
DPV - Distributed Partitioned View
DRAM - Dynamic Random Access Memory
DSS - Decision Support System
DW - Data Warehouse
EE - Enterprise Edition
EFD - Enterprise Flash Disk
EIST - Enhanced Intel SpeedStep Technology
EP - Extended Protection
EPIC - Explicitly Parallel Instruction Computing
EPT - Extended Page Tables
FDR - Full Disclosure Report
HBA - Host Bus Adapter
HT - Hyper-Threading
IDE - Integrated Drive Electronics
ILP - Instruction Level Parallelism
IOP - Input/Output Operation
IOPS - Input/Output Operations per Second
ISM - Incremental Servicing Model
ISV - Independent Software Vendor
LPIM - Lock Pages in Memory
LUN - Logical Unit Number
MAXDOP - Max Degree of Parallelism
MCH - Memory Controller Hub
MDS - Master Data Services
MLC - Multi Level Cell
MMC - Microsoft Management Console
MPIO - Multi-Path Input Output
NIC - Network Interface Card
NTLM - NT LAN Manager
NUMA - Non-Uniform Memory Architecture
ODBC - Open Database Connectivity
OEM - Original Equipment Manufacturer
OLAP - Online Analytical Processing
OLTP - Online Transaction Processing
OMSA - OpenManage Server Administrator
PAE - Physical Address Extension
PATA - Parallel Advanced Technology Attachment
PCU - Public Cumulative Update
PK - Primary Key
POST - Power On Self-Test
QPI - QuickPath Interconnect
RAID - Redundant Array of Independent Disks
RAS - Reliability, Availability and Serviceability
RDIMM - Registered Dual In-line Memory Module
RISC - Reduced Instruction Set Computer
RPO - Recovery Point Objective
RTM - Release to Manufacturing
RTO - Recovery Time Objective
RVI - Rapid Virtualization Indexing
SAN - Storage Area Network
SAS - Serial Attached SCSI
SATA - Serial Advanced Technology Attachment
SCOM - System Center Operations Manager
SCSI - Small Computer System Interface
SKU - Stock Keeping Unit
SLAT - Second Level Address Translation
SLC - Single Level Cell
SMB - Server Message Block
SMP - Symmetric Multiprocessing
SPD - Serial Presence Detect
SPEC - Standard Performance Evaluation Corporation
SQLCAT - SQL Server Customer Advisory Team
SQLIO - SQL IO, a disk benchmarking utility
SQLOS - SQL Server Operating System
SSAS - SQL Server Analysis Services
SSD - Solid-State Drive
SSMS - SQL Server Management Studio
SSRS - SQL Server Reporting Services
SUT - System Under Test
TAP - Technology Adoption Program
TDE - Transparent Data Encryption
TPC - Transaction Processing Performance Council
TpsE - Transactions per second for the TPC-E benchmark
UCP - Utility Control Point
VAS - Virtual Address Space
VT - Virtualization Technology
WSUS - Windows Server Update Services

Enjoy!

Chris