SQL Azure (CA_SQLAzure) Performance Benchmarking

The ‘Azure SQL Database DTU Calculator’ service (see references) recommends a ‘Performance Level’ for a given SQL Server workload. A representative workload is run on an existing server while performance counters are captured; those counters are then supplied to the DTU Calculator, which makes a recommendation. A ‘DTU’ (Database Throughput Unit) is an estimate of the CPU and IO resource requirements, and therefore the service level, for a given workload on a given server. It is calculated from the performance counters below, captured while the server is under load:

  • Processor – % Processor Time
  • Logical Disk – Disk Reads/sec
  • Logical Disk – Disk Writes/sec
  • Database – Log Bytes Flushed/sec

The CA benchmarking process (see references) was run on a local Hyper-V virtual server running Windows Server 2016 and SQL Server 2016, with 2 vCPUs and 8 GB of RAM. The performance counters captured during the benchmark were passed to the DTU Calculator, which suggested Premium – P2.


The same CA benchmark workload was then run on the VM and on Azure (P2) using the parameters below; the results from both were saved as a single CSV file and imported into RStudio.

Concurrent Requests  @Batches  @IO_Benchmark  @CPU_Benchmark  @NewPatternsInBatch  @StressLevel  @Description1
2                    10        1              0               50                   2             VM
2                    10        0              1               50                   2             VM
2                    10        1              0               50                   2             Azure
2                    10        0              1               50                   2             Azure

CPU Benchmark

SQL Azure CPU performance, at the recommended P2 level, was roughly 1.6 times faster than the VM (median batch duration of 14,570 ms vs 22,890 ms).

Benchmark         Min.    1st Qu.  Median   Mean    3rd Qu.  Max.
CPU – VM          11,733  21,004   22,890   22,581  24,703   30,307
CPU – SQL Azure    7,673  12,214   14,570   14,317  16,514   20,126

IO Benchmark

SQL Azure IO performance, at the recommended P2 level, was roughly 2.5 times faster than the VM (median batch duration of 10,562 ms vs 26,315 ms).

Benchmark         Min.    1st Qu.  Median   Mean    3rd Qu.  Max.
IO – VM           10,546  25,009   26,315   26,290  28,998   32,850
IO – SQL Azure     7,234   9,044   10,562   10,299  11,254   14,610
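The headline speed-ups can be sanity-checked directly from the median batch durations in the two tables above; a quick sketch in R:

```r
# Median batch durations (ms), copied from the summary tables above
cpu_vm    <- 22890
cpu_azure <- 14570
io_vm     <- 26315
io_azure  <- 10562

round(cpu_vm / cpu_azure, 2)  # CPU speed-up on Azure: roughly 1.57x
round(io_vm / io_azure, 2)    # IO speed-up on Azure: roughly 2.49x
```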

To produce the CA benchmark workload, the T-SQL script below was run in two concurrent query analyser sessions, first on the VM and then on SQL Azure.


EXECUTE dbo.CA_Benchmark 
   @Batches = 10
  ,@CPU_Benchmark = 1
  ,@IO_Benchmark = 1
  ,@NewPatternsInBatch = 50
  ,@DisplayPatterns = 0
  ,@Initialize = 1
  ,@StressLevel = 2
  ,@Description1 = 'xx'
GO

The results from the CA benchmarks on Azure and the VM were combined, saved as a CSV file, imported into R, and graphed using the script below:


# Clear the workspace and load the combined benchmark results
rm(list = ls())
ca <- read.csv(file = "c:/DataScience/CA_Azure_20170930.csv", header = TRUE, sep = ",", stringsAsFactors = FALSE)
str(ca)

# Convert character columns to factors; Description1 holds the server name (VM / Azure)
ca$StartTime <- as.factor(ca$StartTime)
ca$BenchmarkPerspective <- as.factor(ca$BenchmarkPerspective)
ca$ServerName <- as.factor(ca$Description1)

#install.packages("ggplot2")
library("ggplot2")

# Split by benchmark perspective; note the trailing space in "IO ",
# which matches the value as it appears in the imported data
ca_CPU <- subset(ca, ca$BenchmarkPerspective == "CPU")
ca_IO  <- subset(ca, ca$BenchmarkPerspective == "IO ")

fill <- "#4271AE"
line <- "#1F3552"


# CPU benchmark: boxplot of batch duration by server
p1 <- ggplot(ca_CPU, aes(x = ServerName, y = BatchDurationMS)) +
  geom_boxplot(fill = fill, colour = line) +
  scale_x_discrete(name = "Server Name") +
  scale_y_continuous(name = "Benchmark Batch Duration (milliseconds)") +
  ggtitle("Boxplot of CPU Benchmark Duration by Server Name") +
  facet_wrap(~ BenchmarkPerspective)
p1

# IO benchmark: boxplot of batch duration by server
p2 <- ggplot(ca_IO, aes(x = ServerName, y = BatchDurationMS)) +
  geom_boxplot(fill = fill, colour = line) +
  scale_x_discrete(name = "Server Name") +
  scale_y_continuous(name = "Benchmark Batch Duration (milliseconds)") +
  ggtitle("Boxplot of IO Benchmark Duration by Server Name") +
  facet_wrap(~ BenchmarkPerspective)
p2

# Five-number summaries of batch duration for each benchmark/server combination
summary(subset(ca_CPU$BatchDurationMS, ca_CPU$ServerName == "VM"))
summary(subset(ca_CPU$BatchDurationMS, ca_CPU$ServerName == "Azure"))
summary(subset(ca_IO$BatchDurationMS, ca_IO$ServerName == "VM"))
summary(subset(ca_IO$BatchDurationMS, ca_IO$ServerName == "Azure"))
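As an aside, the four summary() calls can be collapsed into a single cross-tabulation with tapply. A minimal sketch on toy data, assuming the same column names as the script (the durations here are illustrative medians only, not real benchmark output):

```r
# Toy data frame with the same columns the benchmark script uses
ca_demo <- data.frame(
  BenchmarkPerspective = c("CPU", "CPU", "IO", "IO"),
  ServerName           = c("VM", "Azure", "VM", "Azure"),
  BatchDurationMS      = c(22890, 14570, 26315, 10562)
)

# Median batch duration for every perspective/server combination,
# returned as a perspective-by-server matrix
tapply(ca_demo$BatchDurationMS,
       list(ca_demo$BenchmarkPerspective, ca_demo$ServerName),
       median)
```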


References



Categories: Performance Benchmarking, SQL Azure
