Continuous Testing - The New Normal


TRANSCRIPT

T1 Session 10/27/2016 10:15:00 AM

Continuous Delivery with Cloud-Based Testing Methodologies

Presented by:

Tom Wissink

Intervise, Inc.

Brought to you by:

350 Corporate Way, Suite 400, Orange Park, FL 32073 · 888-268-8770 · 904-278-0524 · [email protected] · http://www.starcanada.techwell.com/

Tom Wissink Intervise, Inc.

Tom Wissink worked for Lockheed Martin (LM) for 35 years, managing, developing, and testing software-intensive systems. He was a Lockheed Martin Senior Fellow when he left Lockheed Martin to become a consultant for Intervise, Inc. in February 2016. From January 2010 until becoming a consultant, he was the LM Corporate Director of Integration, Test and Evaluation. Throughout his Lockheed Martin career he worked on programs like the Space Shuttle, Hubble Telescope, GPS, and several Satellite Command and Control centers. Tom is a member of the National Defense Industrial Association (NDIA) and was a past chairperson of the Industrial Committee on Test & Evaluation (ICOTE) as well as a past co-chair of the Development Test & Evaluation (DT&E) committee. He has presented at the Aerospace Testing Seminar as well as at the STAREAST and STARWEST conferences.

9/22/2016


SESSION ABSTRACT

Some notions of continuous testing (CT) have been applied in software development methodologies for a while, but it was never called by that term. Another term sometimes used for CT is parallel testing. While some have mastered CT, most of us struggle with how to transform our current testing approaches to CT approaches and align them with evolving development methodologies. Join Tom Wissink as he discusses current examples of CT implementations across different software development methodologies (agile, waterfall, incremental) and describes where parallel or CT-type testing yields the best benefits. Arguably the most challenging methodology that demands CT is DevOps. DevOps requires all phases of testing to be done quickly and in parallel with the development process, and some contend that testing continues into actual operations. Leave this session with a better understanding of CT and how this approach can be best leveraged in your development environment.


AGENDA

• INTRODUCTION

• SURVEY OF ROLES AND METHODS

• TEST TYPES DEFINED

• DEVELOPMENT METHODOLOGIES

• CONTINUOUS TESTING (CT) APPROACHES

• “THE TOOLCHAIN”

• SUCCESSES WITH CT

• WRAP-UP


SURVEY

• WHAT IS YOUR ROLE?
• TEST/QA ENGINEER
• SW DEVELOPER
• PROJECT LEADER
• BUSINESS ANALYST/USER
• MANAGER
• OTHER

• WHAT DEVELOPMENT METHOD(S) ARE YOU USING?
• WATERFALL
• AGILE
• DEVOPS
• HYBRID
• OTHER

• WHAT TYPE OF SOFTWARE?
• WEB APPS
• CLIENT/SERVER APPS
• MOBILE APPS
• LARGE SOFTWARE SYSTEMS
• OTHER?


TEST TYPES DEFINED

Goal: High test coverage for each test type, or identify risk for reduced coverage

Unit Testing (White-Box)
Statement, Condition, Decision & Path
Metric is Logic Coverage – tools available

Integration Testing (Interface Testing: this is generally very weak)
Identify threads through system
Threads part of incremental/build plans
Early look at Technical Performance Measures – TPMs (stability, performance, RMA, etc.)
Metric is Interface coverage – external & internal interfaces
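The white-box coverage goal above can be sketched in a few lines. A minimal example, assuming Python; `classify_triangle` is an invented function used only for illustration, and in practice a tool such as coverage.py reports which statements and branches a test run actually exercised:

```python
# Hypothetical example: decision coverage for a small white-box unit test.
def classify_triangle(a, b, c):
    """Classify a triangle by its side lengths."""
    if a + b <= c or b + c <= a or a + c <= b:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# One test per decision outcome, so every branch is exercised at least once.
assert classify_triangle(1, 2, 9) == "invalid"      # degenerate sides
assert classify_triangle(3, 3, 3) == "equilateral"
assert classify_triangle(3, 3, 5) == "isosceles"
assert classify_triangle(3, 4, 5) == "scalene"
```

Dropping any one of these four cases leaves a decision outcome untested, which is exactly the gap a logic-coverage metric makes visible.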


TEST TYPES DEFINED (CONT’D)

Functional Testing (Black-Box)
Boundary Value Analysis, Output Forcing, Equivalence Class Partitioning, Cause & Effect Graphing, Combinatorial, etc.
Metric is Function coverage – SRS requirements, etc.

System Testing (Official Sell-Off)
Types – Scenario-Based, Risk-Based, Exploratory, Model-Based, etc.
Includes several categories of requirements – HMI, Non-Functional (Performance, RMA, Configuration, Installation), etc.
Metric is Requirements coverage – System Spec, Ops Concept/Scenarios, etc.
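Two of the black-box techniques named above, equivalence class partitioning and boundary value analysis, can be illustrated with a toy input rule. A minimal sketch, assuming Python; the `accept_quantity` function and its 1-100 rule are invented for illustration:

```python
# Hypothetical example: equivalence classes and boundary values for the
# rule "quantity must be an integer from 1 to 100".
def accept_quantity(q):
    return isinstance(q, int) and 1 <= q <= 100

# Equivalence classes: below range, in range, above range.
# Boundary values: 0/1 at the lower edge, 100/101 at the upper edge.
for value, expected in [(0, False), (1, True), (2, True),
                        (99, True), (100, True), (101, False)]:
    assert accept_quantity(value) == expected
```

Six cases cover all three equivalence classes plus both boundaries, which is how these techniques keep function coverage high without exhaustively testing every value.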


Acceptance Testing (Customer Testing)
Operational Test & Evaluation (OT&E), Beta, etc.
Metric typically owned by Customer but good to know


DEVELOPMENT METHODOLOGIES


Effective and Efficient Test Lifecycle in any Software Development Methodology (WF, Incremental, etc.)


Effective and Efficient Test Lifecycle in any Software Development Methodology (Agile, DevOps, etc.)


Effective and Efficient Test Lifecycle in any Software Development Methodology (Agile, DevOps, etc.)

See SAFe at www.scaledagileframework.com

ART – Agile Release Trains
ATDD – Acceptance Test Driven Development
WSJF – Weighted Shortest Job First


CONTINUOUS TESTING APPROACHES


Defects – Are teams spending too much time logging, triaging, or analyzing defects? What about time spent on defects that aren’t “real” defects, where there is a misunderstanding between the test and the code? Or what if they could prevent entire schools of defects from ever being created in the first place?

“Continuous Testing: Shift Left and Find the Right Balance”


Test Management – Are teams spending time manually crafting status reports and rolling up test execution results? Or do they have a tool that provides that information in real time and allows stakeholders to drill down as needed? How do teams know if they are on schedule with their test effort, behind schedule, or even ahead of schedule?

“Continuous Testing: Shift Left and Find the Right Balance”
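The roll-up described above can be sketched simply. A minimal example, assuming Python; the result records are made-up data standing in for a test-management tool's raw execution log:

```python
# Hypothetical sketch: rolling up raw test-execution results into the kind
# of status summary a test-management tool would report in real time.
from collections import Counter

results = [("login", "pass"), ("billing", "fail"),
           ("search", "pass"), ("export", "skip")]

rollup = Counter(status for _, status in results)
total = len(results)
summary = {s: f"{rollup[s]}/{total}" for s in ("pass", "fail", "skip")}
assert summary == {"pass": "2/4", "fail": "1/4", "skip": "1/4"}
```

A dashboard that computes this continuously replaces the manual status report the quote warns about.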


Test Automation – How efficient are teams at re-executing existing tests? Do they run most or even all of them manually? If they’ve automated tests, are they focused only on functional tests at the user interface layer, or are they running functional API-layer tests, performance tests, and even security tests? Do they have a robust and maintainable test automation framework?

“Continuous Testing: Shift Left and Find the Right Balance”
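The API-layer testing the quote favors can be sketched without any UI at all. A minimal example, assuming Python; `FakeOrdersApi` and its method names are invented stand-ins for a real service client:

```python
# Hypothetical sketch: a functional test at the API layer, which is faster
# and more stable to automate than the same check driven through the UI.
class FakeOrdersApi:
    """In-memory stand-in for an orders service client (names assumed)."""
    def __init__(self):
        self._orders, self._next_id = {}, 1

    def create_order(self, item, qty):
        order_id, self._next_id = self._next_id, self._next_id + 1
        self._orders[order_id] = {"item": item, "qty": qty, "status": "open"}
        return order_id

    def get_order(self, order_id):
        return self._orders[order_id]

def test_created_order_is_retrievable():
    # Exercises create/read through the API layer; no browser, no screens,
    # so the test runs in milliseconds inside a CI pipeline.
    api = FakeOrdersApi()
    order_id = api.create_order("widget", 3)
    order = api.get_order(order_id)
    assert order["qty"] == 3 and order["status"] == "open"

test_created_order_is_retrievable()
```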


Analytics – How do teams know which tests they should run, when, and even why they are running those tests at those times? How good is their test effectiveness – meaning, are they running the fewest number of tests that find the largest number of problems? Impact analysis is critical in selecting the right sets of tests to execute whenever they get a new build.

“Continuous Testing: Shift Left and Find the Right Balance”
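The impact analysis the quote calls critical can be sketched as a mapping from changed files to the tests that cover them. A minimal example, assuming Python; the coverage mapping is invented data that a real tool would collect from instrumented test runs:

```python
# Hypothetical sketch of impact analysis: select only the tests whose
# covered files overlap with the files changed in the new build.
TEST_COVERAGE = {
    "test_login.py":   {"auth.py", "session.py"},
    "test_billing.py": {"billing.py", "tax.py"},
    "test_search.py":  {"search.py"},
}

def select_tests(changed_files, coverage=TEST_COVERAGE):
    changed = set(changed_files)
    return sorted(test for test, files in coverage.items() if files & changed)

assert select_tests(["auth.py"]) == ["test_login.py"]
assert select_tests(["tax.py", "search.py"]) == ["test_billing.py", "test_search.py"]
```

Running only the selected subset is how teams approach "the fewest number of tests that find the largest number of problems" for a given change.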


Test Environments – Are teams constantly waiting on test environments to be provisioned and configured properly? Do they run tests and discover after the fact that the test environment wasn’t “right,” so they have to fix the environment and then re-run all the tests again? Do they hear from developers, “It works on my machine!” but it doesn’t work in the test environment?

“Continuous Testing: Shift Left and Find the Right Balance”
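One way to avoid discovering "after the fact" that an environment wasn't right is to verify it as code before the tests start. A minimal sketch, assuming Python; the required settings listed are invented examples:

```python
# Hypothetical sketch: a pre-flight environment check run before the test
# suite, so a misconfigured environment fails fast instead of wasting a run.
def check_environment(env):
    """Return the list of required settings missing from this environment."""
    required = ("DB_URL", "API_KEY", "TEST_DATA_PATH")
    return [key for key in required if key not in env]

good = {"DB_URL": "postgres://test", "API_KEY": "k", "TEST_DATA_PATH": "/tmp"}
assert check_environment(good) == []
assert check_environment({"DB_URL": "postgres://test"}) == ["API_KEY", "TEST_DATA_PATH"]
```

In a real pipeline this check would gate the test stage, and the same scripted definition would provision every environment identically, which is what removes "it works on my machine".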


Service Virtualization – Are teams waiting for dependent systems to become available before they can “really” test? Are they using a “big bang” approach to conduct end-to-end system testing, where they throw all the components together and hope they work and interact properly? Can teams test exception and error scenarios before going to production? Are they testing the easiest parts first just because they are available, and delaying the high-risk areas for the end of the testing effort?

“Continuous Testing: Shift Left and Find the Right Balance”
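The idea can be sketched as a stub that stands in for an unavailable dependent system. A minimal example, assuming Python; `VirtualPaymentService` and the checkout flow are invented for illustration:

```python
# Hypothetical sketch of service virtualization: a stub replaces a
# dependent system so exception paths can be tested before production.
class VirtualPaymentService:
    """Stand-in for a real payment system (all names are assumptions)."""
    def __init__(self, fail_with=None):
        self.fail_with = fail_with  # inject an outage or error on demand

    def charge(self, amount):
        if self.fail_with:
            raise self.fail_with
        return {"status": "charged", "amount": amount}

def checkout(payment_service, amount):
    try:
        return payment_service.charge(amount)["status"]
    except TimeoutError:
        return "retry-later"  # error path exercised long before go-live

assert checkout(VirtualPaymentService(), 10) == "charged"
assert checkout(VirtualPaymentService(fail_with=TimeoutError()), 10) == "retry-later"
```

Because the stub can simulate failures on demand, the high-risk error scenarios no longer have to wait for the real system, or for production, to be testable.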


Test Data – Do teams have the needed sets of production-like test data to ensure they are covering the right test scenarios? Are there exception and error scenarios that they can’t execute because they don’t have the right sets of test data?

“Continuous Testing: Shift Left and Find the Right Balance”


“Continuous Testing: Shift Left and Find the Right Balance” by Marianne Hollier on DevOps.com

“Understanding DevOps – Part 4: Continuous Testing and Continuous Monitoring” by Sanjeev Sharma on sdarchitect.wordpress.com


A VENDOR’S VIEW OF CD WITH CT

Continuous Delivery (CD): a SW engineering approach in which teams produce software in short cycles…

• Continuous Delivery is not Continuous Deployment
• Relies on 3 foundations:
• Configuration Management
• Continuous Integration
• Continuous Testing
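The three foundations can be pictured as gated pipeline stages. A minimal sketch, assuming Python; the stage functions are empty stand-ins for the real configuration, build, and test steps:

```python
# Hypothetical sketch: CD's foundations as sequential pipeline stages;
# a failing stage stops the pipeline, so broken builds never move forward.
def configuration_management():  # pin versions/config (assumed stand-in)
    return True

def continuous_integration():    # compile + unit tests (assumed stand-in)
    return True

def continuous_testing():        # functional/API/perf tests (assumed stand-in)
    return True

def run_pipeline(stages):
    for stage in stages:
        if not stage():
            return f"failed at {stage.__name__}"
    return "ready for delivery"

assert run_pipeline([configuration_management,
                     continuous_integration,
                     continuous_testing]) == "ready for delivery"
```

The gating is the point: delivery readiness is a property the pipeline proves on every cycle, not a decision made at the end.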


A VENDOR’S VIEW OF A TOOLCHAIN


SOME OPEN-SOURCE CATEGORIES/TOOLS IN A TOOLCHAIN


MY SET OF TOOL CATEGORIES IN A TOOLCHAIN

• Project Management
• Requirements Gathering & Management
• Versioning
• Configuration Management
• SW Development & Testing* Tools
• Continuous Integration
• Monitoring

* Test Automation tools work and provide value


SUCCESS WITH CT IN WATERFALL AND AGILE


10 COMPANIES KILLING IT AT DEVOPS FROM TECHBEACON.COM

•1. AMAZON

•2. NETFLIX

•3. TARGET

•4. WALMART

•5. NORDSTROM

•6. FACEBOOK

•7. ETSY

•8. ADOBE

•9. SONY PICTURES ENTERTAINMENT

•10. FIDELITY WORLDWIDE INVESTMENT


WRAP-UP

• CT can be used in any development methodology and provide value

• A robust toolchain is crucial to successful CT and CD

• Good testing practices are still important, and maybe more so in CT than in any other development methodology

• DevOps demands CT to achieve desired outcomes of faster delivery with high quality