
Copyright (c) Cem Kaner, 1998. All rights reserved. kaner@kaner.com 408-244-7000

STAR 98

Avoiding Shelfware:

A Manager’s View of Automated GUI Testing

Cem Kaner J.D., Ph.D., ASQ-CQE


Overview

• The GUI regression test paradigm

• 19 common mistakes

• 27 requirements questions

• Planning for short-term ROI

• Successful architectures

• Conclusions from LAWST


GUI Regression Test Paradigm

• Create a test case.

• Run it and inspect the output.

• If the program fails, report a bug and try again later.

• If the program passes, save the resulting outputs.

• In future tests, run the program and compare the output to the saved results. Report an exception when the current output and the saved output don’t match.
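As an illustration only, here is a minimal sketch of this save-and-compare loop in Python. The "app" command line and the baselines directory are assumptions for the sketch, not part of the original talk.

```python
import subprocess
from pathlib import Path

BASELINE_DIR = Path("baselines")

def run_case(input_file: Path) -> str:
    """Run the application under test on one input file and return its output."""
    result = subprocess.run(
        ["app", "--input", str(input_file)],   # hypothetical CLI for the product
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def regression_check(input_file: Path) -> bool:
    """First run: save the inspected output as a baseline. Later runs: compare."""
    baseline = BASELINE_DIR / (input_file.stem + ".out")
    output = run_case(input_file)
    if not baseline.exists():
        BASELINE_DIR.mkdir(exist_ok=True)
        baseline.write_text(output)            # save the result after manual inspection
        return True
    return output == baseline.read_text()     # a mismatch is the exception to report
```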


Common Mistakes and Problems

• Underestimated cost

• Delayed testing, adding even more cost (albeit hidden cost).


• Low power of regression testing


• Failure to recognize that we are writing applications.

• Adoption of appalling design and programming practices.

» Embedded constants

» No modularity

» No source control

» No documentation

» No requirements analysis

» No wonder we fail.


• Maintainability is a core issue because our main payback is usually in the next release, not this one.


19 Common Mistakes

• Don’t underestimate the cost of automation.

• Don’t underestimate the need for staff training.

• Don’t expect to be more productive over the short term.

• Don’t spend so much time and effort on regression testing.

• Don’t use instability of the code as an excuse.

• Don’t put off finding bugs in order to write test cases.

• Don’t write simplistic test cases.

• Don’t shoot for “100% automation.”

• Don’t use capture/replay to create tests.

• Don’t write isolated scripts in your spare time.

• Don’t create test scripts that won’t be easy to maintain over the long term.

• Don’t make the code machine-specific.

• Don’t fail to treat this as a genuine programming project.

• Don’t “forget” to document your work.

• Don’t deal unthinkingly with ancestral code.

• Don’t give the high-skill work to outsiders.

• Don’t insist that all of your testers be programmers.

• Don’t put up with bugs and crappy support for the test tool.

• Don’t forget to clear up the fantasies that have been spoonfed to your management.


Requirements Analysis

• Requirement: “Anything that drives design choices.”

• 27 questions on later slides

• HERE’S ONE KEY EXAMPLE:

» Will the user interface of the application be stable or not?


Short Term ROI

• Smoke testing

• Configuration testing

• Stress testing

• Performance benchmarking

• Other tests that extend your reach


27 Requirements Questions

• Will the user interface of the application be stable or not?

• To what extent are oracles available?

• To what extent are you looking for delayed-fuse bugs (memory leaks, wild pointers, etc.)?

• Does your management expect to recover its investment in automation within a certain period of time? How long is that period and how easily can you influence these expectations?

• Are you testing your own company’s code or the code of a client? Does the client want (is the client willing to pay for) reusable test cases, or will it be satisfied with bug reports and status reports?

• Do you expect this product to sell through multiple versions?

• Do you anticipate that the product will be stable when released, or do you expect to have to test Release N.01, N.02, N.03 and other bug-fix releases on an urgent basis after shipment?

• Do you anticipate that the product will be translated to other languages? Will it be recompiled or relinked after translation (do you need to do a full test of the program after translation)? How many translations and localizations?

• Does your company make several products that can be tested in similar ways? Is there an opportunity for amortizing the cost of tool development across several projects?

• How varied are the configurations (combinations of operating system version, hardware, and drivers) in your market? (To what extent do you need to test compatibility with them?)


• What level of source control has been applied to the code under test? To what extent can old, defective code accidentally come back into a build?

• How frequently do you receive new builds of the software?

• Are new builds well tested (integration tests) by the developers before they get to the tester?

• To what extent have the programming staff used custom controls?

• How likely is it that the next version of your testing tool will have changes in its command syntax and command set?

• What are the logging/reporting capabilities of your tool? Do you have to build these in?

• To what extent does the tool make it easy for you to recover from errors (in the product under test), prepare the product for further testing, and re-synchronize the product and the test (get them operating at the same state in the same program)?

• (In general, what kind of functionality will you have to add to the tool to make it usable?)

• Is the quality of your product driven primarily by regulatory or liability considerations or by market forces (competition)?

• Is your company subject to a legal requirement that test cases be demonstrable?

• Will you have to be able to trace test cases back to customer requirements and to show that each requirement has associated test cases?


• Is your company subject to audits or inspections by organizations that prefer to see extensive regression testing?

• If you are doing custom programming, is there a contract that specifies the acceptance tests? Can you automate these and use them as regression tests?

• What are the skills of your current staff?

• Do you have to make it possible for non-programmers to create automated test cases?

• To what extent are cooperative programmers available within the programming team to provide automation support such as event logs, more unique or informative error messages, and hooks for making function calls below the UI level?

• What kinds of tests are really hard in your application? How would automation make these tests easier to conduct?


Six Successful Architectures

1. Quick & dirty

2. Data-driven

3. Framework

4. Application-independent data-driven

5. Real-time simulator with event logs

6. Equivalence testing: Random tests using an oracle


Data-Driven Architecture

• The program’s variables are data

• The program’s commands are data

• The program’s UI is data

• The program’s state is data


<<< Big Graphic Goes Here >>> (figure labels: Caption, Tall row, Short row, bounding box)


# | Caption location | Caption typeface | Caption style | Caption Graphic (CG) | CG format | CG size | Bounding box width
1 | Top   | Times   | Normal | Yes | PCX | Large | 3 pt.
2 | Right | Arial   | Italic | No  | -   | -     | 2 pt.
3 | Left  | Courier | Bold   | No  | -   | -     | 1 pt.
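For illustration (this is a sketch, not the original tooling), a data-driven driver can read each row of a table like the one above and turn it into UI actions. The CSV file name and the ui_* helper functions below are hypothetical placeholders for whatever your GUI test tool provides.

```python
import csv

# Hypothetical wrappers around the GUI test tool; each drives one part of the UI.
def ui_set_caption(location, typeface, style): ...
def ui_insert_graphic(fmt, size): ...
def ui_set_bounding_box(width): ...

def run_row(row):
    """Translate one data row into UI actions; evaluation is left to the tester."""
    ui_set_caption(row["Caption location"], row["Caption typeface"], row["Caption style"])
    if row["Caption Graphic (CG)"] == "Yes":
        ui_insert_graphic(row["CG format"], row["CG size"])
    ui_set_bounding_box(row["Bounding box width"])

with open("caption_tests.csv", newline="") as f:   # the table above, saved as CSV
    for row in csv.DictReader(f):
        run_row(row)
```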


Note with this example:

• we never ran tests twice

• we automated execution, not evaluation

• we saved SOME time

• we focused the tester on design and results, not execution.


Frameworks

Code libraries:

• modularity

• reuse of components

• hide design evolution of UI or tool commands

• partial salvation from the custom control problem

• important utilities such as error recovery
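A sketch of what such a library layer can look like. The tool_* functions stand in for whatever raw commands your GUI test tool exposes; test scripts call the wrappers instead, so a change in the UI or in the tool's command syntax is absorbed in one place. Everything here is a hypothetical placeholder, not a specific tool's API.

```python
# Hypothetical raw tool commands (placeholders for the real tool's API).
def tool_select_menu(*items): ...
def tool_set_text(control, text): ...
def tool_click(control): ...

def open_file(path):
    """Open a document; hides the menu path and dialog control names."""
    tool_select_menu("File", "Open")
    tool_set_text("FileNameEdit", path)
    tool_click("OK")

def recover_to_known_state():
    """Error-recovery utility: return the product to a known base state
    before the next test (one of the 'important utilities' listed above)."""
    tool_select_menu("File", "Close")
    tool_click("Don't Save")
```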


Application-Independent Example

Numeric Input Field:

• Nothing

• LB of value

• UB of value

• LB of value - 1

• UB of value + 1

• 0

• Negative

• LB number of digits or chars

• UB number of digits or chars

• Empty field (clear the default value)

• Outside of UB number of digits or chars

• Non-digits
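These values can be generated mechanically from the field's specification. A sketch in Python; the bounds passed in at the end are illustrative, not from the original slide.

```python
def numeric_field_cases(lb, ub, lb_digits, ub_digits):
    """Application-independent test inputs for a numeric field, derived from
    its lower/upper bound (lb, ub) and its allowed number of digits."""
    return {
        "nothing (accept the default)": None,
        "LB of value": lb,
        "UB of value": ub,
        "LB of value - 1": lb - 1,
        "UB of value + 1": ub + 1,
        "zero": 0,
        "negative": -1,
        "LB number of digits": "1" * lb_digits,
        "UB number of digits": "9" * ub_digits,
        "empty field (clear the default)": "",
        "outside UB number of digits": "9" * (ub_digits + 1),
        "non-digits": "abc",
    }

# Example: a field that accepts 1..9999 (one to four digits).
for name, value in numeric_field_cases(1, 9999, 1, 4).items():
    print(f"{name}: {value!r}")
```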


Two More Non-Regression Architectures

• Simulator with event logs

• Random testing using an oracle.
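A minimal sketch of the oracle idea: feed the same random inputs to the product under test and to a trusted reference, and flag any disagreement. The product_add function here is a hypothetical, deliberately buggy stand-in for driving the real product through its API or UI.

```python
import random

def oracle_add(a, b):
    """Trusted reference implementation: the oracle."""
    return a + b

def product_add(a, b):
    """Hypothetical stand-in for the feature under test; deliberately buggy
    for illustration (in practice this would call the product's API or UI)."""
    return a + b if abs(a) < 500_000 else a + b + 1

def random_equivalence_test(trials=10_000, seed=1998):
    """Generate random inputs, run both implementations, report disagreements."""
    rng = random.Random(seed)
    failures = []
    for _ in range(trials):
        a, b = rng.randint(-10**6, 10**6), rng.randint(-10**6, 10**6)
        if product_add(a, b) != oracle_add(a, b):
            failures.append((a, b))      # inputs worth reporting and keeping
    return failures

print(len(random_equivalence_test()), "mismatches found")
```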


Think About:

• Automation is software development.

• Regression automation is expensive and inefficient.

• Automation need not be regression testing; you can run new tests instead of old ones.

• Maintainability is essential.

• Set management expectations with care.


Los Altos Workshop on Software Testing (LAWST)

Much of this material was developed at the first 3 meetings of LAWST. These are non-profit (no charge, invitation-only) meetings of experienced consultants and practitioners, in which we share good practices and lessons learned on tightly defined issues. LAWST 1-3 participants were:

Chris Agruss, Tom Arnold, James Bach, Richard Bender, Jim Brooks, Karla Fisher, Chip Groder, Elizabeth Hendrickson, Doug Hoffman, Keith Hooper, III, Bob Johnson, Cem Kaner (host / founder of LAWST), Brian Lawrence (facilitator & co-host of LAWST), Tom Lindemuth, Brian Marick, Thanga Meenakshi, Noel Nyman, Jeffery E. Payne, Bret Pettichord, Drew Pritsker, Johanna Rothman, Jane Stepak, Melora Svoboda, Jeremy White, and Rodney Wilson.
