How is Testing Changing?

PREPARING FOR NEXTGEN TESTING AND AUTOMATION

In the new Digital Age, the old ways of testing do not seem to work as well as they used to. The pressure on IT from business means more dynamic, rapid, event-driven software development and change processes are required. This creates ever-changing IT environments, which puts pressure on testing and testers to do more in less time, and there is a risk that testing becomes an unacceptable bottleneck. Something must change, and it is the approach to testing that has to be re-thought.

AUGUST 2020


© 2020 Worksoft


Overview

This paper sets out the pressures on testing and offers a new, logistics-neutral model that describes how testers think. The model aligns with event-driven approaches and can be used to refine how testers think and behave in project teams. The skills needed by people who test are changing, and the human aspects of collaboration emerge as more important. Better communication, persuasion, coaching, analytical and critical-thinking skills are required.

A major challenge to testers in large project environments is how to fit the requirements of end-to-end and regression testing into shorter delivery times. We offer some guidance on skills, attitudes and refined behaviors as well as a range of guidelines for implementing more responsive, automated regression regimes.

Contents

Influences on Testing
Challenges Faced by Testers and Teams
Rethinking Testing
The New Tester Perspective
Integration in the (sometimes very) Large
What Does This Mean for Testers?
About the Author


Influences on Testing

Business Change

Across the business world, there is a revolution in the way that IT is being specified, developed, implemented, and used. There is hype surrounding the “Digital Transformation” phenomenon, but digital programs are affecting business across all industry and government sectors. There is no doubt that it also affects people in their daily lives.

Unlike Agile, which is an ongoing IT initiative, digital is driven by business. Agile has taken nearly 20 years to get halfway implemented in the software industry. Digital has taken no time at all — perhaps 2–3 years — and it is all-pervasive. Digital is the buzz-phrase of the moment.

Automation (and not just test automation) is critical. What business needs is IT responsiveness — what you might call true agility. This doesn’t necessarily mean hundreds of releases every day; but it does mean business wants rapid, regular turnaround from ideas to software delivery.

With continuous integration/deployment and DevOps, developers can now promise Continuous Delivery.

Testers need to provide Continuous Assurance.

This means automation through the (shortened) lifecycle. What exactly is possible and impossible with automation, right here, right now? Are Continuous Delivery and DevOps the route to success? Could testing be the bottleneck that prevents success? How do testers operate in dynamic, high-paced, automation-dominated environments?

As digital transformation accelerates, larger companies are finding it harder to evolve fast enough to keep pace.

Challenges Faced by Testers and Teams

In a recent assignment for a consortium of software businesses in Ireland, a group of 20 software QA managers identified the challenges they have with recruitment, development and retention of testers. The goal was to identify the skills gaps that needed to be filled and define what we’ve called the Tester Skills Program. In the table that follows, each challenge is mapped to a skill area requiring attention.



Each challenge below is followed by the relevant skills area(s) in parentheses:

• Tester candidates look great on paper, but they do not seem to be able to analyze a requirement and be a tester (Requirements Test Analysis, Testing Fundamentals)
• College kids want to write code, but not test their own work (Test Motivation, Testing Fundamentals)
• Agile teams and dev and test teams work as “Agile” but not together (Testing in Teams)
• Cafeteria Agile: teams choose to do what teams like to do (Adapting Testing to Change, Agile Testing)
• Who leads on testing in an Agile team? (Test Leadership, Testing in Teams)
• Is the tester the lead on quality? If not, who is? (Testing in Teams)
• Everyone does their own thing, but who sets the strategy? Gaps and overlaps? (Testing in Teams, Test Strategy)
• Brief sprints: devs want to hand off asap, but testers are left behind, left with questions (Testing in Teams)
• Changed focus from testing to quality engineering, but what is quality engineering? (Adapting Testing)
• Should testers do more coaching than testing in Agile teams? (Coaching)
• TDD and its role in test strategy (Developer Testing)
• Good design for test: what is it, and how to recognize and encourage it in developers? (Developer Testing, Testability)
• Exploratory testing, critical mindset, seeing the difference between confirming and challenging the product (Exploratory Testing, Critical Thinking, Test Motivation)
• Leading retrospectives, continuous improvement, leaving room for innovation and improvisation (Coaching, Process Improvement)
• Balance between control and innovation (Process Improvement)
• Being pro-active in test is a matter of critical thinking but also assertiveness (Testing in Teams, Assertiveness, Critical Thinking)
• Communication skills: how to articulate questions and information (Communication)
• Critical thinking and influencing (Critical Thinking, Communication)
• Lack of team and collaboration skills (Collaboration, Communication)
• Testing as an activity, not a role, but organizations exist to achieve outcomes (Testing and Stakeholders)
• Understanding what devs do well and what testers/SDETs do well (and not so well) (Developer Testing, SDET Role)
• High-performing teams have devs writing automation (Test Automation)
• A lot of testers lack the confidence/aptitude to explore and to question software (Test Motivation, Testing Fundamentals, Exploratory Testing)
• Testers unwilling to learn and improve (Test Motivation, Adapting Testing)
• Reluctance to test below the UI (Technical Testing, Testability)
• Unwillingness to test “outside the box” (Test Motivation, Modelling)
• Inability to demonstrate the value of testing (Testing and Stakeholders, Test Motivation)
• Testing large interconnected systems (Testing in the Very Large)

We’ll discuss the new range of skills required for future testers later.

Does Continuous Delivery Mean Continuous Assurance?

Continuous Delivery, or an adapted version of it, is becoming increasingly popular in digital projects, and if digital is the “future” for the majority of organizations, then we had better prepare for it. Testers need to adapt to fit into their continuous delivery regimes, so let’s look at how continuous approaches are normally described.

The most common diagram one sees is the figure eight or infinite loop below. The principle is that the plan, code, build, test through release, deploy, operate, and monitor phases are sequential but are repeated for every release.

But there’s a problem here. If you unwrap the infinite loop, you can see that the phases are very much like the stages of a Waterfall development. There are no feedback loops, and you have to assume one phase completes before another starts.



So, it appears that Continuous Delivery is just Waterfall in the small. What do we know about Waterfall-style developments?

• They are sequential — one stage follows another — no variation

• Dependencies rule — you can’t start one stage before the previous stage is done

• They are not re-entrant — no flexibility to react to external events

• Testing has stages itself — the thinking and activities of testing are spread through the process

• Only one phase of testing — but there are developer and tester manual and automated test execution activities

• Testing is squeezed — timeboxed activities — the thinking, preparation and execution time is all limited

• No feedback loop(s) — we know that testing finds bugs — but the continuous process has no feedback loop.

If Agile has taught us anything, it’s that the dependence on staged approaches made Waterfall unsuccessful in more dynamic environments. Staged thinking won’t work in a continuous process. We need another way of looking at process to make Continuous Delivery work.

There are two problems to solve here:

• The first is that there is no one true way or best practice approach to implementing, for example, continuous delivery. Everyone does it slightly differently.

• The second is that employers must take on the role of explaining local practices and embedding skills.

These local practices are what we call logistics. Logistics are how a principled approach is applied locally. Locally might mean “across an entire organization” or it could mean every project you work on works differently. If you work on multiple projects, therefore, you will have to adapt to different practices — even if you are working in the same team.

Principles and thinking are global; logistics are local. We must separate the principles and thinking processes from the logistics, but how do we do this?



Rethinking Testing

We need to think clearly and remove logistics from our thinking. The simplest way to do this is to identify the aspects of the local environment and descope them, so to speak. Here are the key logistical aspects that we must remove “to clear our minds.”

• Document or not? We don’t care about documentation. Whether and how you record your tests is not germane to the testing thought process.

• Automated or manual? We don’t care whether you run tests by hand, so to speak, or use a tool, or use some form of magic. It isn’t relevant to the thought process. The mechanism for test execution is a logistical choice.

• Agile vs Waterfall? We don’t care whether we work in an Agile team or in a staged, Waterfall project or are part of a team doing continuous delivery. It’s not relevant to the testing thought process.

• Business? We don’t care what business it is. It doesn’t matter.

• Technology? We don’t care what technology we work with. It’s just not relevant to the thought process.

Thinking and Logistics

If we dismiss all these logistics — what’s left? Some people might think we have abandoned everything, but we haven’t. If you set aside logistics, what’s left is what might be called the universal principles and the thought process.

The New Model for Testing is an attempt to identify the critical testing thought processes. A webinar [1] and white paper [2] give a full explanation of the thinking behind the model, which is reproduced below.

The model doesn’t represent a process with activities, inputs, outputs, entry and exit criteria and procedures. Rather, it represents the modes of thinking that people who test go through to achieve their goals. Our brains are such wonderful modelling engines that we can be thinking in multiple modes at the same time and process thoughts in parallel. It might not be comfortable, but from time to time, we must do this.

The New Model suggests that our thinking is dynamic and event-driven, not staged. It seems like it could be a good model for testing in dynamic and event-driven approaches like continuous delivery.

Using the New Model as the basis for thinking fits our new world of testing.

The 10 thinking activities all have associated real activities (logistics usually) to implement them and if we can improve the way we think about the testing problem, we are better able to make informed choices of how we logistically achieve our goals.



Figure 1 The New Model for Testing

There are several consequences of using the New Model. We’ll look in particular at two: how we assign status and the impact on skills.

What is Status?

As a collaborative team, all members of the team must have a shared understanding of the status of, for example, features in the pipeline. Now, the feature may be placed somewhere on a Kanban or other board, but does the location on the board truly represent the status of the feature?

Consider the three roles of user, developer and tester. When it comes to agreeing status, it is possible that the user believes their job is done — the requirement is agreed. The developer might be writing code — it’s a work-in-progress. But the tester might say, “I have some outstanding challenges for the user to consider. There are some gaps in the story and some ambiguous scenarios we need to look at.”

[1] Agile Testing Days webinar, “The New Model for Testing”, https://youtu.be/1Ra1192OpqY, 19 July 2019.

[2] “A New Model for Testing” white paper, https://gerrardconsulting.com/mainsite/wp-content/uploads/2019/06/NewModelTestingIntro.pdf, 19 July 2019.



Figure 2 Status is what we are thinking.

What is the status of the feature? Done? Work in progress? Or under suspicion? When all three participants are thinking the same way, then there is a consensus on the status of the feature.

Continuous Collaboration is an essential part of continuous delivery. The New Model provides a framework for meaningful discussion and consensus.
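The status negotiation above can be made concrete. The sketch below is purely illustrative: the three roles, the three status values and the “most cautious view wins” rule are our assumptions, not part of the New Model itself. It computes a team’s effective status as the most cautious opinion held by any role:

```python
from enum import Enum

class Status(Enum):
    DONE = "done"
    IN_PROGRESS = "work in progress"
    UNDER_SUSPICION = "under suspicion"

# Rank statuses by caution: the most cautious view wins a disagreement.
CAUTION = {Status.DONE: 0, Status.IN_PROGRESS: 1, Status.UNDER_SUSPICION: 2}

def team_status(views: dict) -> Status:
    """Effective status of a feature: the most cautious opinion
    held by any role on the team."""
    return max(views.values(), key=CAUTION.get)

views = {
    "user": Status.DONE,              # "the requirement is agreed"
    "developer": Status.IN_PROGRESS,  # "I'm writing the code"
    "tester": Status.UNDER_SUSPICION, # "there are gaps and ambiguities"
}
effective = team_status(views)  # the tester's caution prevails
```

In this example the tester’s “under suspicion” view prevails, which is exactly the consensus conversation the New Model is meant to trigger.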

New Skillsets

The initial work of the Tester Skills Program was to define the range of skills required for a testing practitioner. It was understood at the outset that the range of skills meant that there had to be a graduated set of L&D schemes. The inventory of skills would be a “shopping list” of topics that could be part of Foundation, Advanced or Mastery level schemes.

The summary Topic Areas in the inventory appear below:

• Adapting Testing
• Advanced Testing
• Agile Testing Approaches
• Assertiveness
• Certification
• Challenging Requirements
• Coaching
• Collaboration
• Communication
• Critical Thinking
• Developer Testing
• Exploratory Testing
• Exploring Sources of Knowledge
• Facilitation
• Hiring Testers
• Instrumentation
• Modelling
• Monitoring
• Non-Functional Testing
• Planning
• Process Improvement
• Reconciliation
• Regression Testing
• Requirements Test Analysis
• Risk Management
• SDET Role
• Systems Thinking
• Technical Testing
• Technology Skills
• Test Assurance
• Test Automation
• Test Automation Frameworks
• Test Design — Domain
• Test Design — Logic
• Test Design — Model-Based
• Test Design — Purposeful Activity
• Test Design — State-Based
• Test Motivation
• Test Strategy
• Testability
• Testing and Stakeholders
• Testing Fundamentals
• Testing in Teams
• Testing in the Very Large
• Working Remotely

There are quite a few topics that you won’t find on common test training courses. Personal and professional development topics include: Critical Thinking, Assertiveness, Collaboration, Communication, Facilitation, Hiring Testers, Process Improvement, Systems Thinking, Coaching, Testing and Stakeholders, Testing in Teams and Working Remotely.

The New Tester Perspective

The tester of the future will have a different profile to the tester of the past.

The first consideration is that the tester’s role in some organisations will shift from being primarily concerned with running tests to providing an assurance service to their team. The scope of this assurance service will vary from project to project but will typically include:

• Risk identification, assessment and monitoring. Product risks, namely the potential failure modes of the product, will be identified, evaluated and assigned a test approach: perhaps automated by developers, performed manually by users or testers, or automated at the API or UI level.

• Alignment of testing to business goals. Typically, the system features that contribute towards achieving business goals will be tested to ensure they perform as required. In this way, testing will focus on the most valuable aspects of the new system. Testers may define these tests or oversee/monitor them.

• Definition and execution of test strategy. Test strategy is a blend of the risk-based and goal-based approaches above. The tester will define or support the definition of the strategy and ensure testing follows the strategy.

• Coaching/mentoring of the team in testing. Much of the testing in a project is performed by non-testers, usually the developers and expert users. The tester will support their testing and provide guidance, coaching and direction.

• Oversight of the team’s testing. Policing is perhaps too strong a label for the oversight role. The tester is traditionally the conscience of the project, and in this sense, provides an independent assessment of the testing to assure stakeholders that their concerns are being addressed.



• Conducting testing. Of course, the tester’s main focus could be to plan and perform testing on the system.

The tester’s role will be a (usually unique) blend of all the above activities.

Emergence of Interpersonal Skills

One noticeable difference between the new tester skills and what might be called legacy testing skills is the emergence of professional and personal development aspects. Technology leaders are recognising that personal and team skills, in the interpersonal and business sense, are much more important than previously regarded.

Testers have to extract knowledge from multiple sources to understand the requirement and the implementation; they also need to assess the significance of risks and the relevance of business goals, and to construct a strategy that addresses stakeholder concerns. The strategy, planning, execution and the meaning of results have to be obtained from people, but also negotiated, interpreted and presented. Interpersonal skills are critical to a tester’s performance of their role.

Integration in the (sometimes very) Large

In a previous paper, “Large Scale Integration: Risks and Testing,” we discussed the approach to the integration and testing of larger systems. These systems are often hugely complex customised COTS products that are integrated with a mix of larger, legacy systems as well as more recent web-based and mobile applications. The integration “problem” does not go away just because modern approaches and technologies are being adopted in sub-projects that feed into larger programs of work requiring integration and testing.

There are major challenges for organizations implementing large-scale systems with changes being introduced in a rapid sequence. In principle, any change in a tightly integrated system could introduce regression effects in unchanged parts of the system or other integrated systems. There are two key problems to overcome:

1. How to reduce or, ideally, eliminate the risks of smaller but rapid changes.

2. How to reduce the scope, scale and duration of end-to-end testing so as not to be a bottleneck.

The problem is not new. In Waterfall or staged projects, the burden of acceptance and regression testing has always been large. For as long as large systems have existed, project managers and stakeholders have regretted the time and cost of this activity — but it has usually been seen as an inevitable cost of change. Continuous, rapid changes mean the problem is exacerbated because small-scale changes, even when well-tested pre-integration, can cause unknown effects so the need to regression test does not scale down in proportion.




The techniques available to address this problem are mostly well-known. Taken together, they bring several beneficial effects (e.g. reduce defects, identify/reduce scope, reduce maintenance overhead, accelerate testing):

• Conduct, where possible, a more thorough impact-analysis of the changes before the changes are implemented to pinpoint areas of high integration risk. This would include both technical impact and business impact (reduce defects, identify scope)

• Reduce or eliminate the coupling between components, sub-systems and systems (reduce scope)

• Test-first approaches — extended to large scale integration — make the creation and maintenance of test packs easier (reduce maintenance burden)

• Coordinate a regression test strategy across component, integration in the small and large (efficiency)

• Structure regression testing so that it can be aligned with risks identified above (focus/reduce scope)

• Automate regression testing, including environment, test data set ups and tear down (accelerate)

• Select regression tests to demonstrate "functional equivalence" rather than expose defects; design tests to act as trip wires to detect different, not perfect behavior (efficiency)

• Use analyses of production data to focus on "real" scenarios rather than synthetic scenarios (efficiency)

• Review the performance of your test packs periodically and cull useless tests (efficiency)

• Base regression tests on a model of the integrated system(s) so that what-if questions can be asked (explore scope) and test failures can be traced and diagnosed faster (accelerate)
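The “trip wire” idea above is easy to sketch. In this hypothetical example (the function names and pricing rules are invented for illustration), regression tests do not judge correctness; they simply flag any behavioral divergence between a baseline and a candidate release for human diagnosis:

```python
def equivalence_tripwire(baseline_fn, candidate_fn, scenarios):
    """Run the same scenarios through two versions of a system and
    collect every divergence. The goal is not to prove correctness
    but to detect *different* behavior: any difference trips the
    wire and is handed to a person to diagnose."""
    diffs = []
    for scenario in scenarios:
        before = baseline_fn(scenario)
        after = candidate_fn(scenario)
        if before != after:
            diffs.append((scenario, before, after))
    return diffs

# Hypothetical system under test: a pricing rule that changed
# between releases.
def price_v1(qty):
    return qty * 10

def price_v2(qty):
    return qty * 10 if qty < 100 else qty * 9  # new bulk discount

diffs = equivalence_tripwire(price_v1, price_v2, [1, 50, 100, 500])
# The 100- and 500-unit scenarios trip the wire; testers then decide
# whether the divergence is the intended discount or a regression.
```

Note that the scenarios here would ideally come from analyses of production data, as the list above suggests, rather than being synthetic.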


The connected activities above need to be blended to suit different organizational goals, but the common theme is that tool support in the following areas will be required to implement them:

• Discovery of business and system processes and modelling

• Use of business models to illustrate proposed changes and ask what-if questions

• Use of the same models as the basis of a covering set of end-to-end tests

• Single source of knowledge of required and implemented business models

• Selection of subsets of tests to address specific integration risk concerns

• Reporting of executed tests against the original business model

The Worksoft Connected Automation approach aligns directly with these automation areas.

What Does this Mean for Testers?

There are six changes in attitude and behaviour required:

1. Accept Automation

Historically, most testers who test without tools have been wary of automation. Often, their experience of automation has been one of failure. Managers with inflated expectations procure tools without understanding their capabilities or the need for training, implementation, management, and an evolution of process. Failures leave a sour taste, and testers usually say, “I told you so.”

But the advance of automation in testing is inevitable and testers need to embrace the changes it forces upon them. They need to adapt to the new ways of thinking and working and change their mindset and attitude. They need to become advocates of “automation in context.” That is, they need to accept and promote automation used in an effective, focused way.

2. Adopt Event-Driven Processes

Continuous, distributed approaches advance and evolve, and organisations are adopting them. Stage-based, phase-based and Waterfall thinking just won’t work anymore. The only alternative is to be far more responsive to interruptions and make collaborative decisions on available evidence.

To make this work, processes need to be event-driven and evidence-driven. Three things are essential:

• Since processes are continuous, feedback loops must be continuous too, which means evidence must be collected constantly, and where events demand, that evidence collection process (automated tests, logging, etc.) needs to change too.



• People need to respond to events much faster and be prepared for some responses to be automated. This will require better defined processes and human v. automated response mechanisms. People must trust those processes implicitly.

• Testers using exploration or automation to generate evidence need to be flexible and responsive to constantly changing circumstances.

Humans are very adaptable already when it comes to exploratory approaches. It comes naturally to many people. But managers must adapt to allow exploratory approaches to be the fallback position when information is needed quickly. The bigger challenge is to be as responsive when adapting automation to collect evidence.
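One way to picture the human-versus-automated response mechanisms described above is a policy table that routes pipeline events to either a tool or a person. This is only a sketch; the event names and actions are hypothetical, not a prescribed process:

```python
AUTOMATED = "automated"
HUMAN = "human"

# Assumed policy: which events tooling can answer on its own, and
# which need a person to explore and judge. A real team would agree
# and evolve this table so it can be trusted implicitly.
RESPONSE_POLICY = {
    "build_failed":        (AUTOMATED, "re-run unit and smoke packs"),
    "regression_diff":     (HUMAN, "explore the divergence; intended or a bug?"),
    "new_story_merged":    (AUTOMATED, "extend evidence collection to new flows"),
    "production_incident": (HUMAN, "exploratory session on the affected area"),
}

def respond(event: str):
    """Return who responds (tool or person) and the expected action.
    Unknown events default to a human: the safe, exploratory fallback."""
    return RESPONSE_POLICY.get(event, (HUMAN, "triage the unrecognized event"))
```

The default-to-human branch reflects the point above: exploratory approaches are the natural fallback when information is needed quickly and no automated response exists yet.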

3. Take Advantage of Adaptable, Model-based Automation

Test execution automation technologies rely on one form or another of execution engine that drives the system under test using a graphical or programmable interface. Execution engines rely on higher- or lower-level programming or scripting languages to control them. A new model-driven testing metaphor is required for the future. These models will articulate the structure of business and computer systems at three levels:

1. Models of the human-computer interactions in terms of workflows, usually implemented as end-to-end (E2E) tests.

2. Models of the human-computer interactions described with a domain specific language (DSL) in the style of given/when/then triplets or traditional, logical test cases or something similar.

3. Models of the user interface (usually GUI) or API. Typical GUI models identify the interactive elements on web, mobile or other screen formats, often called GUI maps. APIs — usually web services — can be modelled or, more precisely, specified using a formal language such as WSDL.

Historically, script languages implemented the human-computer interactions directly, and abstracted models (other than tabulated test data values) were never created. Increasingly, however, tools are enabling the separation and modelling of E2E flows, human-computer interactions and lower-level object interactions.
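The three model levels can be illustrated with a small sketch. Everything here is hypothetical: the GUI map entries, step names and recording driver stand in for a real execution engine. The point is that E2E flows, DSL-style steps and a GUI map stay separate, so a locator change touches only one layer:

```python
# Layer 3: a "GUI map": logical names mapped to concrete locators,
# so steps and flows never mention raw selectors. (Hypothetical values.)
GUI_MAP = {
    "username": "#login-user",
    "password": "#login-pass",
    "submit": "button[type=submit]",
}

# Stand-in execution engine that records what it is asked to do;
# a real engine would drive the UI or API instead.
class RecordingDriver:
    def __init__(self):
        self.actions = []

    def fill(self, locator, value):
        self.actions.append(("fill", locator, value))

    def click(self, locator):
        self.actions.append(("click", locator))

# Layer 2: human-computer interactions as given/when/then-style steps.
def when_credentials_submitted(driver, user, pwd):
    driver.fill(GUI_MAP["username"], user)
    driver.fill(GUI_MAP["password"], pwd)
    driver.click(GUI_MAP["submit"])

# Layer 1: an end-to-end workflow composed from interaction steps.
def e2e_login_flow(driver):
    when_credentials_submitted(driver, "alice", "secret")
    # ...further workflow steps (search, order, checkout) would follow

driver = RecordingDriver()
e2e_login_flow(driver)
# If the submit button's selector changes, only GUI_MAP changes;
# the step and the E2E flow are untouched.
```

This separation is what makes model-based test packs maintainable: maintenance cost is proportional to the size of the changed layer, not the size of the whole pack.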

4. Separation of the Duties of Humans and Tools

The explanation of automated v. manual testing has never been satisfactory. The usual thinking is that, because we have tools and people, it is the nature of the tests that determines whether tests are automated or run by humans. But this looks at the problem from the wrong perspective. A better lens is to see the “separation of duties” in terms of the status of the software under consideration:

• Software that is changing (unfinished or defective) needs behaviours to be explored and assessed. Some behaviours are never modelled, so tests can’t be automated; other behaviours might not yet be well-enough defined or agreed upon. In either case, only humans (for now) are able to assess these aspects of software.

• Software that is stable (or believed so) needs to be checked for functional equivalence (does the behaviour of this version match the previous version?).

It is the status of the models we use that mostly defines whether software is changing or stable.

5. Abstract Models, Design and Testability

For the multi-layered, multi-modelled approach above to work, designers, developers and testers need tools that implement these models. But modelling systems after they have been built is expensive and unlikely to work. As systems evolve, documentation (which is itself a model) tends to degrade over time and no longer aligns with the systems it represents.

Modeling must be a design activity, not a documentary activity after the event. Designers use models to define the system and its interactions, feed examples back to challenge requirements, and evidence interactions before implementation. If the model changes to adapt the system, whether to enhance functionality or to fix a bug, the tests based on those models must adapt too. If this adaptation is not automatic, at least the impact of changes can be flagged to testers, pointing out where tests need to adapt in line.

A consequence of this approach, which elevates test-first or test-driven design to the system level (beyond the program level), is that better testability becomes a pre-requisite of test automation: it is assumed from the outset, not an afterthought.

6. Coach Your Management

The use of tools to implement and execute tests is a very simple concept — to managers who don’t understand testers and believe the marketing messages of some tool vendors. Although the execution of tests is simple if you regard testing as simply button-pushing, the New Model suggests that the application of tests is just one activity in ten that testers must perform.

It may be that some managers can learn how automation fits into the overall test process and take a much more informed view on how automation can benefit teams and how it should be implemented. But it is more likely that testers must appreciate these matters first, and then articulate to their management: how automation benefits testing, how to separate the workload of tools and people, and how tools and people can provide the rapid feedback that projects require.

Strong critical thinking, communication and persuasion skills may be required.



About the Author

Paul Gerrard

Paul Gerrard is a consultant, teacher, author, webmaster, developer, tester, conference speaker, rowing coach and publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specializing in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the United States, Australia and South Africa, and has occasionally won awards for them.

Educated at the University of Oxford and Imperial College London, Paul won the EuroSTAR European Testing Excellence Award in 2010 and The European Software Testing Awards (TESTA) Lifetime Achievement Award in 2013.

Paul has written several books on testing and assurance: “Risk-Based E-Business Testing,” “The Tester’s Pocketbook” and “Digital Assurance.”

He is Principal of Gerrard Consulting Limited and is the host of the UK Assurance Leadership Forum.

Mail: [email protected] Twitter: @paul_gerrard Blog: gerrardconsulting.com Web: gerrardconsulting.com

CONTACT US

15851 Dallas Parkway, Suite 855, Addison, Texas 75001
(972) 993-0400 | (866) 836-1773
[email protected]
worksoft.com