Tools. Techniques. Trouble?


  • Tools. Techniques. Trouble? Why test automation is getting more difficult and what can be done about it.

    Gordon McKeown

    Group Director, Product Management, TestPlant UK

    Northern Lights

    Manchester 27 April 2016

  • Why?

    Source: Forrester

  • (Response) time is money

  • Creative tension

    (diagram) Pressure to drive testing costs down (costs--) collides with an ever-increasing rate of change and complexity (rate of change++, complexity++).

  • The burden of testing

    Are we condemned to be like Sisyphus, to push a rock up a hill for eternity?

    Museo del Prado, Madrid, Spain

  • Historical precedent

    The division of labour and automation are the twin foundations of both a modern economy and effective software testing.

  • Automation within testing

    Process & personal productivity.

    Test automation

    Driving the System Under Test (SUT).

    Automating test creation

    Creating scripts or test data (a small data-generation sketch follows below).
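    As an illustration of automating test creation, here is a minimal Python sketch that generates a CSV of synthetic user records; the field names and record count are assumptions for the example, not something from the talk.

      # Illustrative only: generating synthetic test data (field names are invented).
      import csv
      import random
      import string

      def random_user(i):
          name = "".join(random.choices(string.ascii_lowercase, k=8))
          return {"id": i, "username": name,
                  "email": f"{name}@example.com",
                  "age": random.randint(18, 90)}

      with open("users.csv", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=["id", "username", "email", "age"])
          writer.writeheader()
          for i in range(100):
              writer.writerow(random_user(i))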

  • Business context

    Two trends are in conflict

    Increasing frequency of releases

    requires more testing and therefore better test automation.

    Increasing numbers of suppliers involved with system delivery

    complicate technical support for testing and the creation of test infrastructure.

    (The challenges of more complex technology stacks will be examined later.)

  • Multi-vendor challenges

    Contracts should explicitly require that suppliers provide technical and logistic support for testing.

    Testability should be a requirement

    Testability should be an important technical criterion when choosing technology.

    Components should be testable

    Apply this principle to internal development.

    Request (demand?) testability of third-party components.

  • Technical challenges

    Mashup

    Multiple services.

    Multiple vendors.

    Multiple technology stacks.

    Heterogeneous clients and interfaces

    Desktops (Windows, OS X, Linux).

    Mobile (iOS, Android and more).

    Service consumers (many types, many frameworks).

    IoT, embedded systems.

    Web technology is getting complicated!

    Increasing rate of technical change

    Did I mention business pressures for short release cycles?

  • The mashup paradigm

    A mashup, in web development, is a web page, or web application, that uses content from more than one source to create a single new service displayed in a single graphical interface. (Wikipedia)

    Originated with public facing web sites

    Influencing internal enterprise applications.

    SOA (Service Oriented Architecture) and micro-service approaches create similar issues for testing.


  • Automating mashup apps

    Move up the protocol stack to give a holistic test (user experience).

    Multi-level / multi-protocol testing may also be required

    Background load (performance testing).

    Individual services / subset focus.

    More about this topic anon.

  • Shifting boundaries: the SUT

    "It means just what I choose it to mean, neither more nor less." (Humpty Dumpty in Lewis Carroll, Through the Looking-Glass)

    Defining the SUT precisely and completely is essential.

    Get explicit agreement from all stake-holders!

    You may need to supply missing services

    Stubs or mocks (a minimal sketch follows below). http://martinfowler.com/articles/mocksArentStubs.html

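    By way of illustration, here is a minimal Python sketch of standing in for a missing service with unittest.mock; the service and method names are invented for the example.

      # Illustrative sketch: a stub for a missing pricing service (names are invented).
      from unittest.mock import Mock

      # The real PricingService is not available in the test environment,
      # so a Mock stands in and returns a canned response.
      pricing_service = Mock()
      pricing_service.get_price.return_value = 100

      def checkout_total(quantity, service):
          # Code under test: depends on the (missing) pricing service.
          return quantity * service.get_price("SKU-123")

      assert checkout_total(3, pricing_service) == 300
      pricing_service.get_price.assert_called_with("SKU-123")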

  • Test automation challenges

    Tester productivity

    Coverage

    Script re-use & maintenance overhead across:

    Time (software releases, technology changes).

    Device types / platforms.

  • Heterogeneous clients

    Public facing applications

    Multiple mobile platforms plus desktops. Web services.

    Range of devices, variable power, screen size & resolution.

    Native apps plus web site.

    Internal / Enterprise

    Increased use of mobile, so all of the above can apply.

  • Adding petrol to the flames

    Test executions = functional tests x client types x releases (a worked illustration follows below).

    53% of respondents cite frequency of application functionality changes as a concern in the 2015/2016 World Quality Report (Capgemini, HP, Sogeti).

    https://www.uk.capgemini.com/thought-leadership/world-quality-report-2015-16
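    For illustration (these numbers are assumed, not from the talk): 200 functional tests x 5 client types x 12 releases a year is 12,000 test executions a year, whereas the same suite on a single client type with two releases a year would need only 400.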

  • GUI level automation may help

    High level GUI automation

    Test across services and components

    User experience, end to end

    Declining hardware costs and more flexible resource management (VMs, containers, cloud) mean it is becoming more relevant for load testing.

    Not all SUTs can be tested via a GUI!

    Multi-paradigm testing for complex systems

    E.g. Web services, APIs and GUIs across devices.

  • Intelligent image recognition and OCR

    User-centric GUI test automation

  • Objects versus images

    For discussion: potential advantages of image based (+ OCR) versus object based (a minimal scripted sketch follows this list):

    Total device control including reboot.

    Test whole GUI not just the app.

    Minimal intrusion.

    One script many devices (vary images or use image collections).

    Images may be less volatile than objects as system evolves.

    P.S. Assume I'm biased!
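    A minimal sketch of image-based GUI automation in Python, using pyautogui for image matching and pytesseract for OCR; this is a generic illustration with assumed image names and screen coordinates, not the TestPlant tooling discussed in the talk.

      # Illustrative image-based GUI automation (library choice, image names and
      # coordinates are assumptions for the example).
      import pyautogui          # screen capture, image matching, input synthesis
      import pytesseract        # OCR on captured screen regions

      # Locate a button on screen from a reference image and click its centre.
      # (confidence matching needs OpenCV installed; a miss returns None or raises,
      # depending on the pyautogui version.)
      login_button = pyautogui.locateCenterOnScreen("login_button.png", confidence=0.9)
      if login_button is None:
          raise RuntimeError("login_button.png not found on screen")
      pyautogui.click(login_button)

      # Verify a result by reading text from a region of the screen with OCR.
      region = pyautogui.screenshot(region=(100, 200, 400, 50))  # left, top, width, height
      text = pytesseract.image_to_string(region)
      assert "Welcome" in text, f"Unexpected screen text: {text!r}"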

  • Multi-user functional testing

    Today very little software runs in glorious isolation.

    Most functional testing is single path and misses whole classes of errors.

    Errors are often exposed by low concurrency load tests intended to debug scripts. This confirms that there is a systematic problem.

    Most load testing covers a small subset of functionality.

    We need low-concurrency (compared to load testing) parallel execution of functional tests; a sketch follows below.

    Shared components, servers, networks should be included in detailed functional testing.

    Multi-user functional testing is the missing link.
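    A minimal sketch of running functional tests in parallel at low concurrency with Python's concurrent.futures; the individual test functions are placeholders for whatever functional scripts are actually in use.

      # Illustrative multi-user functional testing: a handful of users acting at the
      # same time, so shared components, servers and networks are exercised together.
      from concurrent.futures import ThreadPoolExecutor, as_completed

      def create_order(user_id):
          ...  # drive the SUT as this user (placeholder)
          return f"user {user_id}: order created"

      def search_catalogue(user_id):
          ...  # placeholder
          return f"user {user_id}: search ok"

      tests = [(create_order, 1), (search_catalogue, 2), (create_order, 3)]
      with ThreadPoolExecutor(max_workers=3) as pool:
          futures = [pool.submit(fn, uid) for fn, uid in tests]
          for future in as_completed(futures):
              print(future.result())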

  • Network behaviour in scope or out?

    The network is (back) on the application testing agenda

    Twenty years ago the network was often in scope.

    The last decade: fast intranets + relatively simple web applications meant network was out of scope for much testing. However, it could be argued that this was sometimes the wrong decision!

    The rise of mobile devices and the imminent arrival of the IoT mean that how software reacts to network characteristics should be an important concern.

  • Network emulation

    Provides real-world network behaviour when the actual test environment has high bandwidth and low latency.

    Using (a sample of) real networks is expensive and difficult to control.

    Relevant for both functional and load testing (a minimal Linux sketch follows below).
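    A minimal sketch of one way to emulate a wide-area network on a Linux test machine, driving tc/netem from Python's subprocess; the interface name and the delay/loss figures are assumptions for the example, and this does not describe any particular commercial tool.

      # Illustrative network emulation with Linux tc/netem (requires root; eth0 and
      # the figures below are assumptions).
      import subprocess

      IFACE = "eth0"

      def start_emulation(delay_ms=90, loss_pct=0.1):
          # Add delay and packet loss to all traffic leaving the interface.
          subprocess.run(
              ["tc", "qdisc", "add", "dev", IFACE, "root", "netem",
               "delay", f"{delay_ms}ms", "loss", f"{loss_pct}%"],
              check=True)

      def stop_emulation():
          # Remove the emulation and restore normal behaviour.
          subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"], check=True)

      if __name__ == "__main__":
          start_emulation()
          try:
              pass  # run functional or load tests here
          finally:
              stop_emulation()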

  • Why network emulation?

    (diagram: test environment versus the real world)

    A 64MB file takes 5s to transfer on a LAN. On a fast network from London to New York the latency is just 90ms and the file takes 440s to transfer! There is nothing bandwidth can do about this!
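    To see why latency rather than bandwidth dominates here (illustrative reasoning; the block size is assumed, not from the talk): if the transfer protocol waits for an acknowledgement after each small block, total time is roughly blocks x round-trip time. 64MB sent in 8KB blocks is 8,192 round trips, and 8,192 x 90ms is about 737s of pure waiting. The exact figure depends on block size and windowing, but no amount of extra bandwidth removes the round trips.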

  • Load testing challenges

    All the issues discussed so far apply to both functional and load testing.

    They are more acute for load testing.

    The changing nature of Web technology is particularly challenging.

  • Load testing and Web evolution

    Load testing of Web servers has traditionally been based on replaying or constructing http traffic.

    This is done more or less intelligently.

    The evolution of Web technology is disrupting this approach.

  • HTTP traffic generation approaches

    Verbatim replay of N hours' worth of network traffic

    This is a niche approach and is only employed by network oriented test tools (often with specialist hardware). Problems with system state, clocks etc.

    Specify HTTP requests and the target throughput, and unleash worker threads across injector machines (a sketch follows below).

    OK for basic throughput testing and where HTTP requests are independent. Problematic when the application is based around conversation state.

    Virtual Users that model real users
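    A minimal sketch of the worker-thread approach above, pacing independent HTTP requests towards a target throughput with Python threads and the requests library; the URL, rates and duration are assumptions for the example.

      # Illustrative throughput-oriented load generation (URL and rates are assumptions).
      import threading
      import time
      import requests

      TARGET_URL = "http://test-system.example.com/ping"
      REQUESTS_PER_SEC_PER_WORKER = 5
      WORKERS = 4
      DURATION_SEC = 30

      def worker():
          interval = 1.0 / REQUESTS_PER_SEC_PER_WORKER
          end = time.time() + DURATION_SEC
          while time.time() < end:
              started = time.time()
              response = requests.get(TARGET_URL, timeout=10)
              response.raise_for_status()
              # Sleep off the rest of the interval to hold the target rate.
              time.sleep(max(0.0, interval - (time.time() - started)))

      threads = [threading.Thread(target=worker) for _ in range(WORKERS)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()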

  • The Virtual User advantage

    Mimics user activity (user may be software agent).

    Maintains conversation state.

    sessions, multi-step transactions, security authentication and authorisation

    More natural software model

    Variable test data, timings etc. (a sketch of a simple virtual user follows below).
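    A minimal sketch of a virtual user that keeps conversation state across a multi-step journey, using a requests.Session in Python; the URLs, field names, credentials and think times are assumptions for the example.

      # Illustrative virtual user: one scripted user journey with conversation state
      # (URLs, field names and timings are assumptions).
      import random
      import time
      import requests

      BASE = "http://test-system.example.com"

      def virtual_user(username, password):
          session = requests.Session()   # holds cookies/session state between steps
          # Step 1: log in; authentication state is kept by the session.
          session.post(f"{BASE}/login",
                       data={"user": username, "password": password}).raise_for_status()
          time.sleep(random.uniform(1, 3))    # think time between steps
          # Step 2: a multi-step transaction using the same conversation.
          session.post(f"{BASE}/basket/add",
                       data={"item": "SKU-123", "qty": 1}).raise_for_status()
          time.sleep(random.uniform(1, 3))
          session.post(f"{BASE}/checkout").raise_for_status()
          session.get(f"{BASE}/logout")

      virtual_user("testuser01", "secret")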

  • Protocol level load testing (emulation)

    (diagram: the load testing tool emulates clients at the protocol level)

  • Application level load testing

    (diagram: the load testing tool drives real client applications)

  • Application level versus emulation

    Application level

    VU instance drives the real client-side technology.

    E.g. Web Browser, Application GUI or client-side non-GUI application code like a Java remote service stub.

    Emulation

    The tool emulates client-side behaviour.

    For Web testing the more sophisticated tools will emulate some browser features like security, redirects, cookies and data caching out of the box.

    The rest of the emulation will be based on the recorded HTTP traffic supported by an internal API.

  • Web scripts from network recordings

    The traditional approach for high concurrency testing.

    Simple replay only works for simple sites.

    Key challenges:

    Parameterisation.

    Conversation state

    Correlation of dynamic data originating from the server (a small sketch follows below).

    Dynamic data originating from client-side scripting.
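    A minimal sketch of dynamic data correlation in a hand-built Web script: a value generated by the server is extracted from one response and replayed in the next request. The URLs, parameter name and regular expression are assumptions for the example.

      # Illustrative dynamic data correlation (URLs and token format are assumptions).
      import re
      import requests

      session = requests.Session()

      # Step 1: the server returns a page containing a per-conversation token,
      # e.g. <input type="hidden" name="csrf_token" value="...">.
      page = session.get("http://test-system.example.com/order/new")
      match = re.search(r'name="csrf_token" value="([^"]+)"', page.text)
      if match is None:
          raise RuntimeError("Could not correlate csrf_token from the response")
      token = match.group(1)

      # Step 2: the extracted value is replayed in the follow-up request; without this
      # correlation a recorded script would resubmit a stale token and be rejected.
      result = session.post("http://test-system.example.com/order/submit",
                            data={"csrf_token": token, "item": "SKU-123"})
      result.raise_for_status()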

  • The Web technology new wave

    Ajax.

    HTTP/