Test Automation: It's a Journey, Not a Project
Post on 22-Apr-2015
W7 Concurrent Session — 4/9/2014, 12:45 PM
"Test Automation: It's a Journey, Not a Project"
Presented by: Paul Maddison, The CUMIS Group
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Paul Maddison The CUMIS Group
Senior quality assurance analyst Paul Maddison has more than ten years of experience in automated testing using a variety of tools. Working with business analysts and testers to identify automation candidates, Paul is continually expanding testing coverage, increasing the return on investment, and reducing regression testing timeframes. He has coordinated the automation team's development and maintenance of a regression test bed of more than 8,000 scenarios, representing about 75 percent of the overall test effort. Recently, Paul designed and developed a self-serve approach to automation execution for use by business analysts and testers, allowing the automation team to focus on coding efforts to replace script execution. Contact Paul at paul.maddison@cumis.com.
Test Automation: It's a Journey, Not a Project
Paul Maddison, The CUMIS Group
paul.maddison@cumis.com
The CUMIS Group
The CUMIS Group Limited (CUMIS) partners with credit unions to deliver competitive insurance and financial solutions. In doing so, it creates financial security and promotes the growth and success of the credit union system in Canada.
As the leading provider of insurance-related products and services to the Canadian credit union system, CUMIS serves approximately 380 credit unions, with a total of more than five million members.
Getting Started
» Resources
– Experienced developers.
– Aptitude for testing.
– Strong unit testing track record.
» Tool Selection
– Establish your requirements.
– Demo on your software.
– Take a test drive.
– Report generation & augmentation.
– Training availability.
Getting Started
» Management Involvement
– Visible management support.
– Current staff may look at the project as a threat.
– Establish development milestones.
– Celebrate successes.
– Communicate, communicate, communicate.
Return On Investment
Test Candidate Selection
» Prerequisites
– Reliable test environment.
– Existence of effective manual test cases.
– Availability of subject matter experts.
» Failure Impact
– Company credibility.
– Effect on bottom line.
» Manual Testing Effort
– High number of resource-intensive test cases.
– Similar test cases with various data combinations.
Test Candidate Selection
Test Candidate Selection
» Effort Savings Formula
minutes per test case × number of test cases × number of test cycles
e.g. 10 minutes × 50 test cases × 3 test cycles = 25 hours of manual testing effort
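The formula above is straightforward to sketch in code. This is a minimal illustration, assuming effort is simply minutes per case times cases times cycles; the function name is invented:

```python
# Minimal sketch of the effort-savings formula above. Assumes effort
# is minutes-per-case x test cases x test cycles; name is invented.

def manual_effort_hours(minutes_per_case, num_cases, num_cycles):
    """Total manual testing effort, in hours, for one release."""
    return minutes_per_case * num_cases * num_cycles / 60

# The slide's example: 10 minutes x 50 test cases x 3 cycles.
print(manual_effort_hours(10, 50, 3))  # 25.0
```

Every cycle automated away recovers this many hours, which is the basis of the ROI numbers shown to management.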
Test Candidate Selection
» Insurance Premium Calculations
[Slide table: test combinations by Coverage and Term]
Script Design
» Framework For Reusable Code
– Flexible functions for data population and workflow.
– Script maintenance is reduced.
– Dynamic environment selection.
» Coding Standards
– Common naming conventions.
– Internal & external documentation.
» Importing Data
– Allows for creation using other tools and their features.
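A reusable framework centers on a small number of flexible functions rather than per-screen scripts. The sketch below is hypothetical — the environment table, field names, and setter-callable convention are all invented — but it shows the shape: one data-population function plus a run-time environment lookup.

```python
# Hypothetical sketch of a reusable-framework helper. Assumes the UI
# automation tool exposes a setter callable per field; names invented.

ENVIRONMENTS = {                       # dynamic environment selection
    "qa": "https://qa.example.internal",
    "uat": "https://uat.example.internal",
}

def populate_form(field_setters, data):
    """Drive any screen from a data dictionary.

    `field_setters` maps field names to setter callables supplied by
    the automation tool. One flexible function serves every screen,
    which is what keeps script maintenance low.
    """
    for field, value in data.items():
        if field not in field_setters:
            raise KeyError(f"no setter registered for field '{field}'")
        field_setters[field](value)

# Usage with stand-in setters that just record what was "typed":
typed = {}
setters = {"name": lambda v: typed.update(name=v),
           "term": lambda v: typed.update(term=v)}
populate_form(setters, {"name": "Paul", "term": 12})
print(typed)  # {'name': 'Paul', 'term': 12}
```

Because the workflow is data-driven, adding a field to a screen means adding one setter and one data column, not rewriting a script.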
Data Management
» Importance of Data-Driven Tests
– Ease of expansion & maintenance.
– Embedded formulas.
– Automation script with multiple data sets.
» Watch Out For Dates
– Use of day or year offsets, e.g. birthdate vs. age.
» Formatting
– True/False.
– Large numeric values.
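The date advice above can be made concrete: store an age (a year offset) in the test data instead of a fixed birthdate, so a scenario still means "a 30-year-old" whenever it runs. This sketch also normalizes a True/False text field, one of the formatting watch-outs; all field names are invented:

```python
# Sketch of the day/year-offset idea: derive the birthdate at run
# time from a stored age, instead of hard-coding a date that ages out.

from datetime import date

def birthdate_for_age(years, today=None):
    """Derive a birthdate that yields the requested age on `today`.
    (Naive year subtraction; a Feb-29 `today` would need special care.)"""
    today = today or date.today()
    return today.replace(year=today.year - years)

row = {"age": 30, "smoker": "False"}          # one data-driven test row
row["birthdate"] = birthdate_for_age(row["age"])
row["smoker"] = row["smoker"] == "True"       # True/False formatting watch-out
```

The same normalization step is where large numeric values get parsed out of whatever text format the data source uses.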
Automated Script Reporting
» Purpose of Reports
– Must be detailed enough to manually reproduce failing test cases.
» Levels of Granularity
– Summary Report Contents
» Description of each test scenario and the execution result.
» Number of verifications performed.
» Timeframe required for execution.
– Single Scenario Report Contents
» Data used in test scenario, expected and generated values.
– Execution Log Contents
» Identification of failing field, expected and generated value.
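The granularity levels above can be sketched in a few lines: one summary line per scenario with its verification count, plus execution-log detail naming the failing field with its expected and generated values. The data shapes and scenario names here are invented for illustration:

```python
# Hypothetical sketch of two report levels: summary lines plus
# execution-log detail sufficient to reproduce a failure by hand.

results = [
    {"scenario": "New policy premium", "checks": {"premium": ("125.00", "125.00")}},
    {"scenario": "Renewal premium",    "checks": {"term": ("12", "6")}},
]

def report_lines(results):
    lines = []
    for r in results:
        fails = [(f, exp, got) for f, (exp, got) in r["checks"].items() if exp != got]
        status = "FAIL" if fails else "PASS"
        lines.append(f"{r['scenario']}: {status} ({len(r['checks'])} verifications)")
        for field, exp, got in fails:          # execution-log granularity
            lines.append(f"  field '{field}': expected {exp}, generated {got}")
    return lines

print("\n".join(report_lines(results)))
```

The test: given only the report, could a tester rerun the failing case manually? If not, the report is missing a level of detail.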
Summary Report
Single Scenario Report
Execution Log
Management Reports
» Report Generation
– Source of Metrics
» Test execution summary reports.
» Manual testing candidate evaluations.
– Graphics vs. Numbers
» Use of illustrations.
» Additional metrics can be supplied if requested.
» Slice & dice results to generate different views.
– Granularity
» Differentiate between functional & regression testing.
» Ensure total automation savings are included.
Management Reports
Taking It Further
» Test Data Creation
– Manufacture data files with correct formatting for use in automated tests or for processing in other applications.
» Data Extraction
– Extract and save data with specified formatting.
» Environment Smoke Testing
– Test connectivity between applications and verify application functionality before starting a test cycle.
» Response Metrics
– Compile response metrics for business team review.
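Test data creation is the most code-like of these extensions: manufacturing a file whose formatting exactly matches what the automated tests or a downstream application expect. A minimal sketch, with invented field names, widths, and formats:

```python
# Hypothetical sketch of "test data creation": write a CSV whose
# field formatting (zero-padded ids, two-decimal currency, True/False
# text) matches what downstream consumers expect. Fields are invented.

import csv
import io

def write_member_file(rows, out):
    """Write member records as CSV with strict field formatting."""
    writer = csv.DictWriter(out, fieldnames=["member_id", "premium", "active"])
    writer.writeheader()
    for r in rows:
        writer.writerow({
            "member_id": f"{r['member_id']:08d}",   # zero-padded id
            "premium": f"{r['premium']:.2f}",       # two-decimal currency
            "active": "True" if r["active"] else "False",
        })

buf = io.StringIO()
write_member_file([{"member_id": 42, "premium": 125.5, "active": True}], buf)
print(buf.getvalue())
```

The same formatting helpers can serve data extraction in reverse: parse a saved file back into typed values for comparison.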
Questions?