SQL Azure (CA) Performance Benchmarking

The ‘Azure SQL Database DTU Calculator’ service – http://dtucalculator.azurewebsites.net/ – recommends a ‘Performance Level’ for a given SQL Server workload. A representative workload is run on an existing server while performance counters are captured; these counters are then supplied to the DTU Calculator, which makes a recommendation. The table below lists the current SQL Azure Performance Levels; the DTU Calculator will recommend one of these.

Basic & Standard Performance Levels    DTUs      Premium Performance Levels    DTUs
B                                      5         P1                            125
S0                                     10        P2                            250
S1                                     20        P4                            500
S2                                     50        P6                            1000
S3                                     100       P11                           1750

A ‘DTU’ is a ‘Database Throughput Unit’, a measure of the CPU and IO resource requirements for a given workload on a given server. The exact criteria the DTU Calculator uses to formulate a suggested Performance Level aren’t explained; the recommendation is calculated from the performance counters below:

  • Processor – % Processor Time
  • Logical Disk – Disk Reads/sec
  • Logical Disk – Disk Writes/sec
  • Database – Log Bytes Flushed/sec

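Once a database is running in SQL Azure, the sys.dm_db_resource_stats view shows how close the workload comes to the limits of its current Performance Level; it samples CPU, data IO and log write utilisation roughly every 15 seconds and keeps around an hour of history. A minimal sketch (run against the Azure database itself, not master):

SELECT
     end_time
    ,avg_cpu_percent        -- percentage of the Performance Level's CPU limit
    ,avg_data_io_percent    -- percentage of the data IO limit
    ,avg_log_write_percent  -- percentage of the log write limit
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
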
I have two personal-use SQL Server instances at home that are candidates for migration to Azure, one hosted on a physical server and one on a virtual server. I used the ‘SQL Heart Beat’ framework described later in this article to benchmark them against different recommended SQL Azure database Performance Levels:

  • Step 1 – SQL Heart Beats were run against both on-premises servers once every 2 minutes; after 20 minutes an additional stress test was run.
  • Step 2 – Performance counters collected on the physical and virtual servers were passed as .CSV files to the DTU Calculator.
  • Step 3 – SQL Azure databases were created at different Performance Levels.
  • Step 4 – SQL Heart Beats were run again against the SQL Azure databases once every 2 minutes; after 20 minutes an additional stress test was run.
  • Step 5 – SQL Heart Beat runtimes were compared: the virtual and physical servers against SQL Azure Performance Level S3.

This benchmarking process is illustrated in Figure 1 below.

SQL_Azure_Fig1_v3

 

Steps 1 & 2 – Azure SQL Database DTU Calculator

Physical Server

Figure 2 below shows % Processor Time on the physical server. The activity is entirely the result of SQL Heart Beats running at 2-minute intervals, and a stress test after 20 minutes. There was no other concurrent activity on the server.

SQL_Azure_Fig2

Figure 3 below shows Disk Writes/sec on the physical server; read activity was low.

SQL_Azure_Fig3

Figure 4 below shows the DTU Calculator’s suggested SQL Azure Performance Level – P1. The stress test requires over 250 DTUs; even a SQL Heart Beat requires over 100 DTUs.

SQL_Azure_Fig4

 

Virtual Server

Figure 5 below shows % Processor Time on the virtual server. The activity is entirely the result of SQL Heart Beats running at 2-minute intervals, and a stress test after 20 minutes. There was no other concurrent activity on the server.

SQL_Azure_Fig5

 

Figure 6 below shows Disk Writes/sec on the virtual server; read activity was low. The stress test workload took longer to complete than on the physical server.

SQL_Azure_Fig6

Figure 7 below shows the DTU Calculator’s suggested SQL Azure Performance Level – S3. The stress test requires over 250 DTUs; even a SQL Heart Beat requires over 100 DTUs.

SQL_Azure_Fig7

 

Step 3 – Create database ‘SQL_HeartBeat’ in SQL Azure and change its ‘Performance Level’ on the dashboard ‘Scale’ tab.

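The same change can also be made in T-SQL – a minimal sketch, assuming a connection to the master database of the Azure SQL logical server (the editions and service objectives available may vary):

-- Create the database directly at Standard S3
CREATE DATABASE SQL_HeartBeat ( EDITION = 'Standard', SERVICE_OBJECTIVE = 'S3' );
GO

-- Scale the same database up or down later by changing its service objective
ALTER DATABASE SQL_HeartBeat MODIFY ( EDITION = 'Premium', SERVICE_OBJECTIVE = 'P1' );
GO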
 

Step 4 – Run the SQL Heart Beat setup script, then call the SQL Heart Beat stored procedure to recreate the workload.

 

Step 5 – SQL Azure DTU Recommendation and SQL Heart Beat Calibration

‘SQL Heart Beat Seconds’ indicate runtime and latency; the lower the value, the better.

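The runtimes behind these comparisons are stored in the dbo.TimingResults table that the framework populates. A small sketch for summarising average Heart Beat seconds per run description, using the same StartTime/EndTime columns as the pivot query later in this article:

SELECT
     Description1
    ,COUNT(*) AS HeartBeats
    ,AVG(1.0 * DATEDIFF(millisecond, StartTime, EndTime)) / 1000 AS Avg_HeartBeat_Seconds
FROM dbo.TimingResults
GROUP BY Description1
ORDER BY Description1;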
 

Virtual Server – SQL Azure Performance Level S3

Figure 13 below shows SQL Azure outperforming the virtual server from a CPU perspective at Performance Level S3; the IO performance of the virtual server and Azure were similar.

SQL_Azure_Fig13

 

Physical Server – SQL Azure Performance Level P1

The DTU Calculator recommended SQL Azure Performance Level P1 for the physical server, but upgrading to a Premium Performance Level was not in the plan.

Figure 11 below shows a comparison of SQL Heart Beat runtimes on the physical server against SQL Azure Performance Level S3 again, a level below the recommendation. CPU performance on the physical server and SQL Azure S3 were similar; IO performance was better on the physical server.

SQL_Azure_Fig11

 

Conclusions

The DTU Calculator recommended SQL Azure Performance Levels for:

  • A fixed workload
  • Two servers: a slow virtual server and a faster physical server.

Exactly the same workload produced two different SQL Azure Performance Level recommendations, and both seem reasonable.

At the Performance Level where IO matched, CPU performance in SQL Azure was much better than on the on-premises servers; the on-premises servers are 5 years old.

 

References

 

SQL Heart Beat
A ‘SQL Heart Beat’ is the total duration (in milliseconds) of a call to a stored procedure that repeats precise mathematical calculations using the same fixed starting point / initial dataset. The runtime of a call is very consistent in stable environments but sensitive to other workloads in contended environments.

To use the framework, first create a database called ‘SQL_HeartBeat’, then run the setup script here – https://paulbrewer.wordpress.com/sql_azure_heartbeat/ (double-click the T-SQL code to select it, then copy and paste).

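After the setup script has run, a quick sanity check that the framework objects were created (object names as referenced in this article):

SELECT name, type_desc
FROM sys.objects
WHERE name IN (N'sp_HeartBeat', N'TimingResults');
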
The T-SQL script below was run once every 2 minutes for an hour, with different descriptions for different servers. It runs a set workload that typically takes between 3 and 10 seconds, depending on the server and concurrent activity; a simple way to schedule the repeated calls is sketched after the block.


EXECUTE dbo.SP_HeartBeat
   @Cycles_IO = 10
  ,@Cycles_CPU = 1
  ,@CycleDepth = 1
  ,@TestCase = 'SQL Server Central - Edit 1.3'
  ,@Description1 = 'Gentle Heart Beat'
  ,@StressLevel = 2;
GO 

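The 2-minute schedule can be driven by SQL Server Agent on the on-premises servers; Azure SQL Database has no SQL Agent, so a plain T-SQL loop is one alternative. A minimal sketch, assuming roughly 2-minute spacing is close enough (the delay does not allow for the few seconds each Heart Beat itself takes):

-- Repeat the gentle Heart Beat roughly every 2 minutes for about an hour
DECLARE @Runs INT = 0;
WHILE @Runs < 30
BEGIN
    EXECUTE dbo.SP_HeartBeat
       @Cycles_IO = 10
      ,@Cycles_CPU = 1
      ,@CycleDepth = 1
      ,@TestCase = 'SQL Server Central - Edit 1.3'
      ,@Description1 = 'Gentle Heart Beat'
      ,@StressLevel = 2;

    SET @Runs = @Runs + 1;
    WAITFOR DELAY '00:02:00';
END;
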
The T-SQL script below was run once on each server, with different descriptions for different servers. It runs a CPU-intensive workload that can take between 7 and 40 minutes, depending on the server and concurrent activity.


EXECUTE [dbo].[sp_HeartBeat]
   @Cycles_IO = 10
  ,@Cycles_CPU = 1
  ,@CycleDepth = 50
  ,@TestCase = 'SQL Server Central - Edit 1.3'
  ,@Description1 = 'Stress Test'
  ,@StressLevel = 3;
GO

The query below returns results (SQL Heart Beat run times) in a format that pivots well in Excel.


-- CTE1: total Heart Beat duration (ms) per test case, description, generator and 10-minute interval
;WITH CTE1 AS
(
    SELECT
         TestCase
        ,Description1
        ,CONVERT(VARCHAR(15),StartTime,120) + '0' AS StartTime
        ,Generator
        ,SUM(DATEDIFF(millisecond,StartTime, EndTime)) AS Duration_MS
        ,COUNT(*) AS ResultRows
    FROM dbo.TimingResults
    GROUP BY
         TestCase
        ,Description1
        ,Generator
        ,CONVERT(VARCHAR(15),StartTime,120) + '0'
),
-- CTE2: add hour and date roundings, a run sequence number and the server name, ready for pivoting
CTE2 AS
(
    SELECT
        Generator, Duration_MS, ResultRows, StartTime,
        CONVERT(VARCHAR(15),StartTime,120) + '0' AS StartTime_10MinuteRounded,
        CONVERT(VARCHAR(14),StartTime,120) + '00' AS StartTime_1HourRounded,
        CAST(StartTime AS DATE) AS StartTime_DateRounded,
        ROW_NUMBER() OVER(PARTITION BY TestCase, Description1 ORDER BY TestCase, Description1, StartTime) AS Time_Sequence,
        TestCase, Description1, @@ServerName AS ServerName
    FROM CTE1
)
SELECT *
FROM CTE2
ORDER BY StartTime

Figure 12 below shows the Pivot Table fields used to produce the graphs in this article. If Heart Beats were collected at irregular intervals, use Average of Duration rather than Sum of Duration for the pivot table ‘VALUES’ column.

SQL_Azure_Fig12

 



Categories: SQL Azure, Performance Benchmarking
