PERFORMANCE CHALLENGES ALONG THE CONTINUOUS DELIVERY PIPELINE
Wolfgang Gottesheim
Compuware APM Dynatrace
COMPANY CONFIDENTIAL – DO NOT DISTRIBUTE
Is 10PM a good time to find out about performance problems?
When do YOU find performance problems?
Developers → Unit/Integration Tests (?) → Acceptance Tests (?) → Capacity Tests (?) → Release
When can we find them?

[Figure: release cycles shrinking – from one long waterfall pass (Requirements/Specification/Design → Development → (Load) Test/QA/Acceptance → Deployment/Production/Maintenance) to shorter Development → Test iterations, down to many small Dev → Test → Deploy cycles repeated over and over]
The Challenge
"I couldn't help but notice your pain."
"My pain?"
"It runs deep. Share it with me!"
(Star Trek V)
» Performance pain runs deep
» Architecture has enormous influence! You have to continuously ensure your performance requirements are met!
“But we have tests”
What are we learning from our tests?
Software testing tells us that our system
»meets the requirements that guided its design and development,
»responds correctly to all kinds of inputs,
»performs its functions within an acceptable time,
»is sufficiently usable,
»can be installed and run in its intended environments, and
»achieves the general result its stakeholders desire.
Source: Wikipedia
Let’s look at the tests we run
Test types: Unit Tests · Integration Tests · Acceptance Tests · Load Tests
Quality aspects: Meets requirements · Responds correctly to input · Performs in acceptable time · Usability · Deployment · Achieves correct result

Our tests are very focused on FUNCTIONAL aspects. Tests looking at non-functional aspects exist, but:
- High effort (they have to be created and maintained)
- Only possible at a rather late development phase
Focus on automatically analyzing performance
Unit Tests · Integration Tests · Acceptance Tests · Load Tests
Meets requirements · Responds correctly to input · Performs in acceptable time · Usability · Deployment · Achieves correct result
How can we measure performance of Unit and Integration tests?
What you usually get
[junit] Running com.dynatrace.sample.tests.FastUnitTest
[junit] Tests run: 15, Failures: 0, Errors: 0, Time elapsed: 34 sec
[junit] Running com.dynatrace.sample.tests.SlowUnitTest
[junit] Tests run: 17, Failures: 0, Errors: 1, Time elapsed: 2,457 sec
Looking at overall timings does not make sense. But what does?
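Per-test timing is the first useful metric. As an illustration only – plain Java, no test framework or vendor tooling assumed, and `TestTimer` plus the budget value are made-up names – a minimal sketch that times each test body individually and flags the ones over budget:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TestTimer {
    // Records the duration of each named test body in milliseconds.
    private final Map<String, Long> durationsMs = new LinkedHashMap<>();

    public void time(String testName, Runnable testBody) {
        long start = System.nanoTime();
        testBody.run();
        durationsMs.put(testName, (System.nanoTime() - start) / 1_000_000);
    }

    // True if the named test exceeded its per-test budget (illustrative threshold).
    public boolean exceedsBudget(String testName, long budgetMs) {
        return durationsMs.getOrDefault(testName, 0L) > budgetMs;
    }

    public Map<String, Long> durations() {
        return durationsMs;
    }
}
```

Tracking these numbers per test and per build is what makes a regression in a single test visible even when the suite total barely moves.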
Basic: Test duration
I don’t like endsWith – I like regex!
N+1 Queries
Metrics:
• # SQL Executions / Request
• # of “same” SQL Executions
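One way to make the N+1 pattern countable from a test is to record every statement the data-access layer executes. A hedged sketch – the `SqlCounter` class and its naive digit-based normalization are illustrative, not a real instrumentation API:

```java
import java.util.HashMap;
import java.util.Map;

public class SqlCounter {
    private final Map<String, Integer> countsByStatement = new HashMap<>();
    private int total = 0;

    // Called from the data-access layer each time a statement is executed.
    public void record(String sql) {
        // Normalize literals so "id = 1" and "id = 2" count as the "same" statement.
        String normalized = sql.replaceAll("\\d+", "?");
        countsByStatement.merge(normalized, 1, Integer::sum);
        total++;
    }

    // Metric: # SQL Executions / Request.
    public int totalExecutions() { return total; }

    // Metric: # of "same" SQL Executions -- a high repeat count of one
    // statement shape within a single request is the classic N+1 signature.
    public int maxSameStatement() {
        return countsByStatement.values().stream()
                .mapToInt(Integer::intValue).max().orElse(0);
    }
}
```

A unit test can then assert an upper bound on both counters instead of asserting on timings.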
Ignoring Architectural Rules
Metrics:
• # SQL Executions / Request per Tier
High Number of Requests to Backend System
Metrics:
• # Calls to Web Service
Memory Leak
[Figure: heap keeps growing across builds – “Problem fixed!”, “Fixed Version Deployed”, yet still crashing]
Metrics:
• Heap Size
• # Objects allocated
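Heap size can be sampled from the standard `java.lang.Runtime` API at the same point in repeated test iterations; used memory that keeps growing between snapshots is the leak signature. A minimal sketch (the class name and tolerance value are illustrative, not the tooling shown on the slide):

```java
public class HeapSnapshot {
    public final long heapBytes;
    public final long usedBytes;

    private HeapSnapshot(long heapBytes, long usedBytes) {
        this.heapBytes = heapBytes;
        this.usedBytes = usedBytes;
    }

    // Capture current JVM heap figures via the standard Runtime API.
    public static HeapSnapshot capture() {
        Runtime rt = Runtime.getRuntime();
        return new HeapSnapshot(rt.totalMemory(), rt.totalMemory() - rt.freeMemory());
    }

    // A leak shows up as used memory growing beyond a tolerance between
    // snapshots taken at the same point in repeated iterations.
    public static boolean grewBeyond(HeapSnapshot before, HeapSnapshot after, long toleranceBytes) {
        return after.usedBytes - before.usedBytes > toleranceBytes;
    }
}
```

In practice a GC run before each snapshot reduces noise; the tolerance absorbs the rest.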
Too Many Exceptions
Metrics:
• # Exceptions
Performance Metrics in your CI
What you currently measure vs. what you should measure
Performance Metrics in your CI
What you currently measure:
• # Test Failures
• Overall Duration
What you should measure:
• Execution Time per test
• # calls to API
• # executed SQL statements
• # Web Service Calls
• # JMS Messages
• # Objects Allocated
• # Exceptions
• # Log Messages
• …
We should not forget about ACCEPTANCE tests
Large Web Sites
17(!) JS Files – 1.7MB in Size
Useless information! It might even be a security risk!
Missing Resources Cause Delays
46(!) HTTP 403 requests for images on the landing page
Lots of time “wasted” due to roundtrips that just result in a 403
Metrics:
• HTTP 4xx & 5xx
• Total Number of Resources
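Counting error responses is straightforward once the status codes of a page load are recorded, e.g. from a proxy or a HAR export. An illustrative sketch (the `ResponseStats` name and the plain status-code list are assumptions, not a specific tool’s API):

```java
import java.util.List;

public class ResponseStats {
    // Metric: HTTP 4xx & 5xx -- count client and server errors
    // among the recorded response status codes of one page load.
    public static long errorCount(List<Integer> statusCodes) {
        return statusCodes.stream()
                .filter(code -> code >= 400 && code <= 599)
                .count();
    }

    // Metric: Total Number of Resources requested by the page.
    public static int resourceCount(List<Integer> statusCodes) {
        return statusCodes.size();
    }
}
```

An acceptance test can then fail the build when `errorCount` is non-zero or `resourceCount` jumps between builds.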
SLOW or Failing 3rd Party Content
Performance Metrics in your CI
What you currently measure:
• # Test Failures
• Overall Duration
What you should measure:
• Execution Time per test
• # calls to API
• # executed SQL statements
• # Web Service Calls
• # JMS Messages
• # Objects Allocated
• # Exceptions
• # Log Messages
• # HTTP 4xx/5xx
• Request/Response Size
• Page Load/Rendering Time
• …
Starting from…
[Diagram: Developers → CI Server → Testing Environment → Production Environment → Release]
…or maybe…
[Diagram: Developers → CI Server → Testing Environment → Production Environment → Release]
We get to…
Developers → Commit Stage → Automated Acceptance Testing → Automated Capacity Testing → Release
Performance as a Quality Gate
» Automated collection of performance metrics in test runs
» Comparison of performance metrics across builds
» Automated analysis of performance metrics to identify outliers
» Automated notifications on performance issues in tests
» Measurements accessible and shareable across teams
» Actionable data through deep transactional insight
» Integration with build automation tools and practices
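The comparison-across-builds step can be approximated by checking each metric of the current build against the previous build’s baseline. A minimal sketch – the names and the regression factor are illustrative, and it assumes lower metric values are better, which holds for the counts and durations listed above:

```java
import java.util.Map;

public class QualityGate {
    // Fails the gate if any metric regressed by more than the allowed
    // factor relative to the previous build (factor 1.1 = 10% headroom).
    // Metrics with no baseline (new tests) pass by default.
    public static boolean passes(Map<String, Double> previousBuild,
                                 Map<String, Double> currentBuild,
                                 double allowedFactor) {
        for (Map.Entry<String, Double> metric : currentBuild.entrySet()) {
            Double baseline = previousBuild.get(metric.getKey());
            if (baseline != null && metric.getValue() > baseline * allowedFactor) {
                return false;
            }
        }
        return true;
    }
}
```

Wired into the CI server, a `false` result breaks the build exactly like a failing functional test would.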
PERFORMANCE as part of our Continuous Delivery Process
Developers → Commit Stage → Automated Acceptance Testing → Automated Capacity Testing → Release
Performance Scalability
Collaborate – Measure – Verify
When CAN we find performance problems?
Developers → Unit/Integration Tests → Acceptance Tests → Capacity Tests → Release
Who Cares About Performance?
Developers?
Architects?
Testers?
Operators?
Business?
Everyone!
Developers
Architects
Testers
Operators
Business
But remember:
Check out our trial: http://bit.ly/dttrial
Get your demo and a T-Shirt at Booth #5107!
How can this look in real life?
Performance Focus in Test Automation
Analyzing All Unit / Performance Tests
Analyzing Metrics such as DB Exec Count
Jump in DB Calls from one Build to the next
Performance Focus in Test Automation
Cross Impact of KPIs
Performance Focus in Test Automation
Here is the difference! Compare the Build that shows BAD Behavior with the Build that shows GOOD Behavior!
Performance Focus in Test Automation
Embed your Architectural Results in Jenkins
Performance Focus in Test Automation
CalculateUserStats is the new Plugin that causes problems
Check out our trial: http://bit.ly/dttrial
Get your demo and a T-Shirt at Booth #5107!