
Continuous Performance Testing and Monitoring in Agile Development

Deliver faster applications, faster, through continuous performance validation

Who We Are

Mission: Deliver faster applications, faster, through continuous performance validation

We're Driven by Your Challenges: user demands, process revolutions, technology transitions

AGENDA

Performance matters

Agile testing

From Agile to DevOps

DevOps pipeline

Dynatrace Integration

User Experience is Key

• 40% of users move to the competition following a bad experience
• 57% of users use their mobile device
• 44-61% of users share their feelings on social networks

What is User Experience?

• Usability
• Ergonomics
• Performance
• Accessibility
• Features

User Experience Impact on Business

• Etam reduced its average page load time from 1.2s to 500ms and increased conversions by 20%, time on site by 21%, and pages viewed per visit by 28%.
• Walmart saw up to a 2% increase in conversions for every 1 second of improvement in load time. Every 100ms improvement also resulted in up to a 1% increase in revenue.
• Shopzilla decreased load time by 5 seconds and saw a 12% increase in conversion rate, a 25% increase in page views, and a 50% reduction in infrastructure required.
• Etsy saw a 12% increase in bounce rate when they added 160kb of images to their mobile page.

AGENDA

Performance matters

Agile testing

From Agile to DevOps

DevOps pipeline

Dynatrace Integration

Performance Testing Used To Be ….

Performance Design is The Key

Standard Performance Testing Workflow

Strategy

Scenarios

Prepare

Execute

Analyze

Tuning

Testing Earlier

Early risk analysis
• Understand the application
• Performance needs to be included in the PRA
• ATAM (Architecture Tradeoff Analysis Method)

Automation: smarter and faster

Validation: follow our KPIs

Monitor: detect problems before the end users do

Performance is driven by:
• The end users
• Third-party systems
• …

We need to ask the right questions:
• How do the users work on the system?
• What are their habits?
• When, and how often?
• Are we going to expand into different geos?
• Is there a marketing plan to promote the application? If yes, what type of audience is targeted?

Take the Time to Understand the Application

Performance testing an existing application
• Ask for the support of the functional architect, HR, etc.
• Understand the behavior of the application with the help of the logs (see the sketch below)
• Migration project: don't underestimate the history of the application

Performance testing a new application/service
• Involve the project leader and the functional architect
• Try to understand the purpose and the relation to the business plan
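The bullet about learning the application's behavior from its logs can be made concrete. The snippet below is a minimal sketch that tallies requests per hour and the most-hit URLs from a web server access log, to help answer "when, and how often" users work on the system; the log path and the common-log-format assumption are illustrative, not taken from the deck.

```python
import re
from collections import Counter

# Assumed: a web server access log in common log format, e.g.
# 192.0.2.1 - - [20/Mar/2017:14:32:01 +0000] "GET /search?q=shoes HTTP/1.1" 200 512
LOG_FILE = "access.log"
LINE_PATTERN = re.compile(
    r'\[(?P<day>[^:]+):(?P<hour>\d{2}):[^\]]+\] "(?P<method>\S+) (?P<path>\S+)'
)

requests_per_hour = Counter()
hits_per_path = Counter()

with open(LOG_FILE) as handle:
    for line in handle:
        match = LINE_PATTERN.search(line)
        if not match:
            continue  # skip lines that are not request entries
        requests_per_hour[match.group("hour")] += 1
        hits_per_path[match.group("path").split("?")[0]] += 1

print("Busiest hours (hour of day, request count):")
for hour, count in requests_per_hour.most_common(5):
    print(f"  {hour}:00  {count}")

print("Most requested pages:")
for path, count in hits_per_path.most_common(5):
    print(f"  {path}  {count}")
```

The output gives a first cut of usage habits (peak hours, most popular transactions) that can seed the load-test scenarios.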

Testing scenarios

Unit Performance Testing

• Any part of the system
• Not a standard practice
• Do not wait until the system is assembled
• Test cases are simpler, with fewer variables
• Test-Driven Development may be an answer (see the sketch below)
• Many systems are monolithic
• Third-party components
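Since the deck leaves the tooling open, here is a minimal sketch of what a TDD-style unit performance test could look like using Python's unittest; search_catalog is a hypothetical stand-in for the component under test, and the 200 ms budget and 50 iterations are illustrative choices, not values from the slides.

```python
import time
import unittest


def search_catalog(query: str) -> list[str]:
    """Stand-in for the component under test; replace with the real unit."""
    return [item for item in ("running shoes", "trail shoes", "sandals") if query in item]


class SearchPerformanceTest(unittest.TestCase):
    """Unit-level performance check that runs long before the system is assembled."""

    BUDGET_SECONDS = 0.200   # illustrative response-time budget for one call
    ITERATIONS = 50          # repeat to smooth out one-off jitter

    def test_search_stays_within_budget(self):
        durations = []
        for _ in range(self.ITERATIONS):
            start = time.perf_counter()
            search_catalog("shoes")              # the unit being measured
            durations.append(time.perf_counter() - start)

        worst = max(durations)
        self.assertLess(
            worst, self.BUDGET_SECONDS,
            f"slowest call took {worst * 1000:.1f} ms, "
            f"budget is {self.BUDGET_SECONDS * 1000:.0f} ms",
        )


if __name__ == "__main__":
    unittest.main()
```

Because the test targets a single unit with few variables, it can run on every commit, long before an end-to-end load test is possible.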

Change Mentality

Before: late record/playback performance testing; system-level requirements; record/playback approach ("black box")

Now: early performance engineering; component-level requirements; programming to generate load and create stubs ("grey box")

Workflow

[Diagram: project lifecycle with 1) specification, 2) risk analysis, 3) development life cycle, where component testing is driven by business test cases.]

Neotys Continuous Performance Validation Solution

[Diagram: requirements and performance qualification feed the project; component testing runs in a daily cycle through continuous integration; end-to-end performance testing runs at sprint assembly and deployment; production monitoring provides feedback into the next cycle.]

AGENDA

Performance matters

Agile testing

From Agile to DevOps

DevOps pipeline

Dynatrace Integration

Organizations Adopt Agile and DevOps as an Answer

• 99% of organizations have adopted Agile development methods

• 88% of CIOs are using DevOps

World Quality Report 2016-17

But Speed is Nothing Without Quality

2/3 of business leaders say their company's future DEPENDS on the QUALITY of their software
("Surviving Disruption, Leading Change: Winning in the Application Economy," 2015)

DevTestOps Rather Than DevOps

"The term everyone is using is DevOps, but I think it should really be DevTestOps. Testing is really crucial for achieving quality & speed."
Diego Lo Giudice, Principal Analyst

AGENDA

Performance matters

Agile testing

From Agile to DevOps

DevOps pipeline

Dynatrace integration

DevOps Toolchain

Planning
• Provide transparency to the stakeholders

Source control
• Track changes to the code (infrastructure, application, etc.)

Configuration management
• Infrastructure is treated exactly like code

Continuous integration
• Tools to automate the build and the deployment

Deployment tools
• The application can be released to production at any time, improving time to market

Testing and validation
• Validate the quality of the release

APM
• Applications are commonly tested and monitored with APM tools to ensure high availability, low response time, and quality of service

Stage 1: Build
Build request → prepare build → compile → packaging

Stage 2: Deploy
Deploy to DEV, QA, and UAT

Stage 3: Test
Receive task → integration test on DEV, QA, and UAT
Tests: integration, functional, performance, security

Stage 4: Release
Release to PROD

DevOps toolchain end to end: build request → prepare build → compile → packaging → deploy (DEV, QA, UAT) → test (integration, functional, performance, security) → release (PROD)
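The stages above are tool-agnostic, so here is a minimal sketch, in Python rather than any specific CI tool's DSL, of how such a staged pipeline with a quality gate might be wired; the deploy.sh and run_tests.sh commands are hypothetical placeholders, not APIs from Jenkins, NeoLoad, or Dynatrace.

```python
import subprocess
import sys


def build() -> None:
    """Stage 1: prepare, compile, and package (placeholder command)."""
    subprocess.run(["make", "package"], check=True)


def deploy(environment: str) -> None:
    """Stage 2: deploy the package to DEV, QA, or UAT (placeholder command)."""
    subprocess.run(["./deploy.sh", environment], check=True)


def test(environment: str) -> bool:
    """Stage 3: run the automated test suites; return True only if all pass."""
    suites = ["integration", "functional", "performance", "security"]
    return all(
        subprocess.run(["./run_tests.sh", suite, environment]).returncode == 0
        for suite in suites
    )


def release() -> None:
    """Stage 4: promote the validated build to PROD (placeholder command)."""
    subprocess.run(["./deploy.sh", "PROD"], check=True)


if __name__ == "__main__":
    build()
    for env in ("DEV", "QA", "UAT"):
        deploy(env)
        if not test(env):
            # A failed suite (including performance) stops the pipeline here,
            # so a bad build never reaches production.
            sys.exit(f"Quality gate failed in {env}; stopping the pipeline.")
    release()
```

The point of the sketch is the gate: performance tests sit next to functional and security tests, and any failure blocks the release stage.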

AGENDA

Performance matters

Agile testing

From Agile to DevOps

DevOps pipeline

Dynatrace integration

Why Combine Load Testing and APM During a Load Test?

• Capture performance metrics that go beyond response time and infrastructure behavior
• Actionable data in case of problems: failures, bottlenecks, slower-than-expected response times
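As an illustration only (the deck names no specific API), the sketch below polls a hypothetical APM metrics endpoint while a load test runs, so infrastructure and application metrics can later be correlated with the load test's response times; the URL, metric names, response shape, and polling interval are all assumptions.

```python
import csv
import time
from datetime import datetime, timezone

import requests

# Hypothetical APM metrics endpoint and metric names: replace with the real
# API of your monitoring tool (for example the Dynatrace metrics API).
METRICS_URL = "https://apm.example.com/api/metrics"
METRICS = ["cpu.usage", "gc.suspension", "db.connection.pool.busy"]
POLL_INTERVAL_SECONDS = 10
TEST_DURATION_SECONDS = 600


def collect_metrics(out_path: str = "apm_during_loadtest.csv") -> None:
    """Sample APM metrics for the duration of the load test and store them with timestamps."""
    end = time.time() + TEST_DURATION_SECONDS
    with open(out_path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["timestamp", "metric", "value"])
        while time.time() < end:
            now = datetime.now(timezone.utc).isoformat()
            for metric in METRICS:
                response = requests.get(METRICS_URL, params={"metric": metric}, timeout=5)
                response.raise_for_status()
                # Assumed response shape: {"value": <number>}
                writer.writerow([now, metric, response.json()["value"]])
            time.sleep(POLL_INTERVAL_SECONDS)


if __name__ == "__main__":
    collect_metrics()
```

With the samples timestamped, a slow transaction in the load test report can be lined up against CPU, garbage collection, or connection-pool behavior at the same moment.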

How Does the Integration Work?

[Diagram: NeoLoad generates virtual-user load against the web model, mobile model, and REST interface; requests flow through the presentation, business, and data tiers (business logic, data-access logic) to XML, legacy, and relational back ends.]

NeoLoad/Dynatrace Integration

Web request tagging (see the sketch below)
• Transaction name, virtual user, script name, unique id
• Easier correlation between the load testing solution and Dynatrace AppMon

Session recording
• Start & stop recording of data to cover exactly one test run

Registering individual test executions
• Each web request is tagged with a test id
• Allows distinguishing between requests if two tests are run in parallel
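For illustration, here is a minimal sketch of what tagging each web request could look like if you were driving HTTP calls yourself. NeoLoad performs this tagging automatically; the x-dynaTrace header fields shown (NA, VU, SN, ID, TE) follow the pattern described on the slide (transaction name, virtual user, script name, unique id, test id), but the exact field names depend on the Dynatrace AppMon version, so treat them as assumptions and verify them against the AppMon documentation.

```python
import uuid

import requests

# One id per test execution, so parallel test runs can be told apart on the APM side.
TEST_ID = str(uuid.uuid4())


def tagged_get(url: str, transaction: str, virtual_user: int, script: str) -> requests.Response:
    """Send a GET request carrying load-test metadata so the APM side can
    correlate it with the load testing tool's transaction."""
    header_value = (
        f"NA={transaction};VU={virtual_user};SN={script};"
        f"ID={uuid.uuid4()};TE={TEST_ID}"
    )
    return requests.get(url, headers={"x-dynaTrace": header_value}, timeout=10)


if __name__ == "__main__":
    # Example: one virtual user executing the 'Search' transaction of a script.
    response = tagged_get(
        "https://shop.example.com/search?q=shoes",
        transaction="Search",
        virtual_user=1,
        script="checkout_script",
    )
    print(response.status_code)
```

The value of the tag is correlation: every PurePath captured by AppMon during the run can be traced back to a specific virtual user, transaction, and test execution.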

Call to Action: Next Steps

Start load testing with NeoLoad now
• Download our free version of NeoLoad, including the integration with Dynatrace

Ask for a personalized demo or a dedicated workshop
• Contact us

[email protected]

Visit Neotys Community at answers.neotys.com

Download NeoLoad Free Edition from www.neotys.com

Questions?

Visit neotys.com

Follow us on Twitter @hrexed

[Diagram: full-stack, broad, hyperscale monitoring covering mobile, browser, network, multi-geo, third parties, cloud, containers, services, code, hosts, synthetic, logs, business transactions, applications, and SDN.]

Global Performance Network

Synthetic agents deployed and managed by Dynatrace:
• Backbone: web performance management, 100+ locations
• Last Mile: web performance management and load testing, 10,000s+ locations
• Mobile: dozens of countries connected to real wireless carriers

[Diagram: user experience management and deep transaction analysis. Business transactions such as Login, Search, and Order Stock are traced across the front end (0.2s), app tier (1.1s), middleware (0.3s), and database (0.1s), from the first tier through n-tier layers down to database and mainframe. Also shown: agentless monitoring, inspection of single users, and visibility across the network and enterprise apps.]

It's not about blind automation of pushing more bad code through a shiny pipeline.

It's not about blindly giving everyone Ops power to deploy changes only tested locally.

Level-Up your Functional Tests with Metrics

Test & monitoring framework results (Build # / Test Case / Status):
• Build 17: testPurchase OK, testSearch OK
• Build 18: testPurchase FAILED, testSearch OK (we identified a regression)
• Build 19: testPurchase OK, testSearch OK (problem solved)

Let's look behind the scenes and add architectural data (# SQL, # exceptions, CPU) for each test:

Build #   Test Case     Status  # SQL  # Excep  CPU
Build 17  testPurchase  OK      12     0        120ms
Build 17  testSearch    OK      3      1        68ms
Build 18  testPurchase  FAILED  12     5        60ms
Build 18  testSearch    OK      3      1        68ms
Build 19  testPurchase  OK      75     0        230ms
Build 19  testSearch    OK      3      1        68ms
Build 20  testPurchase  OK      12     0        120ms
Build 20  testSearch    OK      3      1        68ms

• Build 18: the exceptions are probably the reason for the failed test
• Build 19: the problem is fixed, but now we have an architectural regression (75 SQL statements, 230ms CPU)
• Build 20: now we have both functional and architectural confidence

Add Metrics into Continuous Integration

#1: Analyzing every unit, integration & REST API test
#2: Key architectural metrics for each test
#3: Detecting regressions based on measurements per check-in
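The deck doesn't show how the per-check-in comparison is implemented, so the following is a minimal sketch of the idea: compare each test's architectural metrics (# SQL statements, # exceptions, CPU time) against the previous build and flag regressions; the data structures and the 20% tolerance are assumptions for illustration, while the sample values come from the builds shown above.

```python
from dataclasses import dataclass


@dataclass
class ArchMetrics:
    sql_count: int       # number of SQL statements executed by the test
    exceptions: int      # number of exceptions thrown during the test
    cpu_ms: float        # CPU time consumed, in milliseconds


def find_regressions(previous: dict[str, ArchMetrics],
                     current: dict[str, ArchMetrics],
                     tolerance: float = 0.2) -> list[str]:
    """Flag tests whose architectural metrics degraded versus the last build.

    A metric 'regresses' if it grows by more than `tolerance` (20% by default)
    or if new exceptions appear. Thresholds are illustrative, not prescriptive.
    """
    problems = []
    for test, cur in current.items():
        prev = previous.get(test)
        if prev is None:
            continue  # new test, no baseline yet
        if cur.exceptions > prev.exceptions:
            problems.append(f"{test}: exceptions {prev.exceptions} -> {cur.exceptions}")
        if cur.sql_count > prev.sql_count * (1 + tolerance):
            problems.append(f"{test}: SQL statements {prev.sql_count} -> {cur.sql_count}")
        if cur.cpu_ms > prev.cpu_ms * (1 + tolerance):
            problems.append(f"{test}: CPU {prev.cpu_ms}ms -> {cur.cpu_ms}ms")
    return problems


if __name__ == "__main__":
    # Values taken from the slide's example: build 18 vs build 19.
    build_18 = {"testPurchase": ArchMetrics(12, 5, 60), "testSearch": ArchMetrics(3, 1, 68)}
    build_19 = {"testPurchase": ArchMetrics(75, 0, 230), "testSearch": ArchMetrics(3, 1, 68)}
    for issue in find_regressions(build_18, build_19):
        print("Architectural regression:", issue)
```

Run against the example data, the check flags build 19's testPurchase for the jump from 12 to 75 SQL statements and from 60ms to 230ms of CPU, even though the test itself passed.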

#8: Integrate into your Delivery Pipeline (CI/CD)

Quality Overview by Build in Dynatrace … allows you to stop a bad build in Jenkins, NeoLoad, …
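As a sketch of the "stop a bad build" idea rather than the actual Dynatrace or Jenkins plugin behavior, the script below could run as a delivery-pipeline step: it reads a hypothetical build_quality.json produced by the test stage and exits with a non-zero status so the CI server marks the build as failed and stops the pipeline.

```python
import json
import sys

# Hypothetical results file produced by the test stage; the schema is an
# assumption for illustration (one entry per test with a pass/fail status,
# plus a list of detected architectural regressions).
RESULTS_FILE = "build_quality.json"


def main() -> int:
    with open(RESULTS_FILE) as handle:
        results = json.load(handle)

    failed = [t["name"] for t in results["tests"] if t["status"] != "OK"]
    regressions = results.get("architectural_regressions", [])

    if failed or regressions:
        # A non-zero exit status is what makes Jenkins (or any CI server
        # running this step) mark the build as failed and stop the pipeline.
        print("Stopping the build:")
        for name in failed:
            print(f"  failed test: {name}")
        for reg in regressions:
            print(f"  architectural regression: {reg}")
        return 1

    print("Quality gate passed: build can be promoted.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

The same exit-code convention works whether the step is triggered by Jenkins, by NeoLoad's CI task, or by any other pipeline orchestrator.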

One goal: deliver better features to customers faster

Two fundamental components: speed + quality

Take the next step with a Dynatrace free trial!

Sign up at: dynatrace.com/trial

Q & A
Henrik Rexed, Performance Engineer, @hrexed
Asad Ali, Product Specialist, @AsadThoughts

Download NeoLoad Free Edition at: www.neotys.com
Try Dynatrace for FREE: www.dynatrace.com/trial