This article describes a simple framework for measuring SQL Server database application performance from both CPU and IO perspectives. A ‘Synthetic Transaction’ is proposed that is:
- Precise – A mathematical workload is performed and its duration is measured. A 10 second test once every 10 minutes is sufficient for trend analysis.
- Easy to implement – There are no tools to install and no learning curve. Simply run the T-SQL setup script then schedule a SQL Agent Job.
- Flexible – The workload can be tuned for OLTP / OLAP bias and for the cost of execution.
- Unambiguous – The ‘duration’ measured is processing time for the same data, queries and plans.
The ‘cellular automation’ application generates new patterns according to a precise set of mathematical rules. It records the start and end time of each call, which reflects how SQL Server is handling client requests.
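The article does not specify which rule set the engine implements. Purely as an illustration of ‘a precise set of mathematical rules’, one generation of a well-known cellular automaton, Conway’s Game of Life, can be computed with a single set-based query (the table and column names here are hypothetical, not the ones in the linked setup script):

```sql
-- Illustration only: one Game of Life generation as a set-based query.
-- Live cells are rows in dbo.Cells (x, y); names are hypothetical.
WITH NeighbourCounts AS (
    SELECT c.x + dx.d AS x, c.y + dy.d AS y, COUNT(*) AS n
    FROM dbo.Cells c
    CROSS JOIN (VALUES (-1), (0), (1)) dx(d)
    CROSS JOIN (VALUES (-1), (0), (1)) dy(d)
    WHERE NOT (dx.d = 0 AND dy.d = 0)       -- a cell is not its own neighbour
    GROUP BY c.x + dx.d, c.y + dy.d
)
SELECT nc.x, nc.y                           -- cells alive in the next generation
FROM NeighbourCounts nc
LEFT JOIN dbo.Cells c ON c.x = nc.x AND c.y = nc.y
WHERE nc.n = 3                              -- birth, or survival, with 3 neighbours
   OR (nc.n = 2 AND c.x IS NOT NULL);       -- survival with 2 neighbours
```

Because every input row and rule is fixed, repeated runs do exactly the same work, which is what makes the duration a clean measurement.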
Example Baseline Test Process
Run the ‘CA Setup 2008’ script (https://paulbrewer.wordpress.com/ca_setup_sql2008/ ) to create the cellular automation database application, then schedule a SQL Agent job for the steps below.
CPU Check Step
A call is made to the Cellular Automation engine requesting the generation of 200 ‘New Patterns’ at ‘Stress Level’ = 1. This calls the geometry intersects method, which is CPU-intensive; it runs for between 5 and 10 seconds.
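The authoritative T-SQL is in the linked setup script. As a rough sketch of the kind of CPU-bound work involved, repeated geometry intersection tests look something like this (the shapes and loop count are invented for illustration):

```sql
-- Sketch of a CPU-bound workload: repeated geometry intersection tests.
DECLARE @a geometry = geometry::STGeomFromText('POLYGON((0 0, 10 0, 10 10, 0 10, 0 0))', 0);
DECLARE @b geometry = geometry::STGeomFromText('POLYGON((5 5, 15 5, 15 15, 5 15, 5 5))', 0);
DECLARE @i INT = 0, @hits INT = 0;
WHILE @i < 200
BEGIN
    SET @hits = @hits + @a.STIntersects(@b);  -- returns 1 when the polygons overlap
    SET @i = @i + 1;
END
SELECT @hits AS IntersectionCount;
```

The CLR geometry methods burn CPU per call rather than touching data pages, which is why this step isolates processor performance.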
IO Check Step
A call is made to the Cellular Automation engine requesting the generation of 100 ‘New Patterns’ at ‘Stress Level’ = 3. This calls set-based queries which are IO-bound; it runs for between 5 and 10 seconds.
Schedule a SQL Agent job to run both the CPU and IO check steps once every 10 minutes; an example of the T-SQL used for the job step is here ( https://paulbrewer.wordpress.com/ca_sla_compliancecheck/ )
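The linked compliance-check script is the authoritative version; in essence, each job step times the engine call and logs the duration. A minimal sketch, assuming a stored procedure named dbo.usp_GeneratePatterns and a results table dbo.BaselineResults (both names hypothetical):

```sql
-- CPU check step: time the call and log its duration in milliseconds.
DECLARE @start DATETIME2 = SYSUTCDATETIME();

EXEC dbo.usp_GeneratePatterns          -- hypothetical procedure name
     @NewPatterns = 200,
     @StressLevel = 1;                 -- 1 = CPU-bound geometry workload

INSERT INTO dbo.BaselineResults (CheckType, RunAt, DurationMs)
VALUES ('CPU', @start, DATEDIFF(MILLISECOND, @start, SYSUTCDATETIME()));
```

The IO check step would be identical apart from @NewPatterns = 100, @StressLevel = 3 and CheckType = 'IO'.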
Example Baseline Test Results
The results below are from 2 Hyper-V virtual servers and a physical server described in the appendix. There was no concurrent client activity in any of the databases while these baselines were captured.
All Servers – IO Performance
All Servers – CPU Performance
Physical Server – CPU Performance
The ‘Total Duration’ of calls to the CPU pattern generator shows a clear trend on the physical server. Cross-referencing the trend below against a baseline set of Windows Server and SQL Server performance counters would identify the cause; some sort of cache flushing looks likely.
Synthetic ‘Cellular Automation’ transactions offer a mathematically precise way to measure SQL Server database application CPU and IO performance at any given moment. A simple 5 second CPU and IO check once every 10 minutes could capture the performance information needed. Aggregated, and potentially summarized across multiple servers for a given period, it might provide clear and unambiguous insight into database application performance.
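Summarizing the captured durations across servers is ordinary aggregation. Assuming each check logs its duration to a results table (dbo.BaselineResults and its columns are hypothetical names), an hourly trend per server might be produced like this:

```sql
-- Hourly average duration per server and check type, for trend analysis.
-- Table and column names are hypothetical.
SELECT ServerName,
       CheckType,                                    -- 'CPU' or 'IO'
       DATEADD(HOUR, DATEDIFF(HOUR, 0, RunAt), 0) AS HourBucket,
       AVG(DurationMs) AS AvgDurationMs
FROM dbo.BaselineResults
GROUP BY ServerName, CheckType,
         DATEADD(HOUR, DATEDIFF(HOUR, 0, RunAt), 0)
ORDER BY HourBucket, ServerName, CheckType;
```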
Appendix – The Example Baseline Environment
These example results were from a simple environment consisting of 3 SQL Server 2014 SP1 instances:
One Physical Server (PB01)
- OS Name: Microsoft Windows Server 2012 Standard
- OS Version: 6.2.9200 N/A Build 9200
- System Manufacturer: HP
- System Model: ProLiant ML110 G6
- System Type: x64-based PC
- Processor(s): 1 Processor(s) Installed. : Intel64 Family 6 Model 30 Stepping 5 Genuine Intel ~2394 Mhz
- Total Physical Memory: 8,192 MB
Two Virtual Servers – (VM2 & VM4)
- OS Name: Microsoft Windows Server 2012 R2 Standard
- OS Version: 6.3.9600 N/A Build 9600
- OS Manufacturer: Microsoft Corporation
- System Model: Virtual Machine
- System Type: x64-based PC
- Processor(s): 1 Processor(s) Installed.
- Total Physical Memory: 4,093 MB
VMs 2 and 4 are configured differently in Hyper-V; VM4 uses best practice settings for the following:
- Use Dynamic Memory
- Use fixed virtual hard disks for database files
- Generation Type 2 Virtual Machines
At such low levels of client activity, there were no discernible differences in performance between the two virtual servers.