SWE 3643 Lesson 1 This material serves to introduce you to testing.
Please read Ch. 1 of your textbook for next lecture.
Ch. 1 A Perspective on Testing - Spr2016Bernal 1
Please come to class with pen/pencil, paper and with the assigned reading/work done.
Participation points (PP) are earned by answering the PP questions interlaced in the class lecture.
At the end of each class lecture, always turn in your class participation sheet to Bernal's front desk, formatted with:
Name, SWE 3643, Date
1. Your answer to PP1 (you do not need to copy the question)
2. ...etc...
Definitions and Concepts of Testing and Quality
What is Testing?
What is Quality?
How are they related?
PP1. Does finding a lot of problems through testing improve quality?
Testing is a relatively new discipline, although programmers have always "debugged" their programs.
◦ Testing was conducted to show that software "works."
In the 1970s Glenford Myers wrote The Art of Software Testing (1979).
◦ He believed that the main purpose of testing is to find "faults."
Are these ("works" vs. "faults") opposing views?
PP2. Why do we, and why should we, test?
Because we can't assume that the software "works."
◦ How do we answer the question "Does this software work?"
  - Is it reliable? (e.g., the system functions continuously for > 720 hrs)
  - Is it functional? Complete? Consistent?
  - Is it responsive? (e.g., response time < 1 second)
In general, "what is the quality of our software?"
◦ How do we answer this?
PP3. How would you answer this about the program that you wrote? How would you answer this about the program your friend wrote?
Error
◦ Represents mistakes made by people.
Fault
◦ The result of an error. May be categorized as:
  - Fault of commission: we enter something into the representation that is incorrect.
  - Fault of omission: the designer makes an error of omission; the resulting fault is that something that should have been present in the representation is missing.
Failure
◦ Occurs when a fault executes.
Incident
◦ The behavior of a fault. An incident is the symptom(s) associated with a failure that alerts the user to the occurrence of a failure.
Test case
◦ Associated with program behavior. It carries a set of inputs and a list of expected outputs.
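The chain from error to incident can be sketched in code. This is a hypothetical illustration (the `average` function and its test values are invented, not from the slides): a human error leaves a fault in the code, and the fault only becomes a visible failure when a test case executes it.

```python
# Hypothetical illustration: an error (human mistake) leaves a fault
# (a defect) in the code; the fault produces a failure only when executed.

def average(values):
    """Intended spec: return the arithmetic mean of a non-empty list."""
    # Fault of commission: the developer's error was typing len(values) - 1
    # instead of len(values), so an incorrect expression sits in the code.
    return sum(values) / (len(values) - 1)

def run_test_case(test_input, expected):
    """A minimal test case: a set of inputs plus the expected output."""
    actual = average(test_input)
    return {"input": test_input, "expected": expected, "actual": actual,
            "verdict": "pass" if actual == expected else "fail"}

# The fault becomes a failure only when this input executes it;
# the observed "fail" verdict is the incident that alerts us.
incident = run_test_case([2, 4, 6], expected=4.0)
print(incident["verdict"])
```

Until some input drives execution through the faulty expression, the fault sits latent in the code with no failure and no incident.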
[Figure: development and testing life cycle. Errors made during Requirement Specs, Design, and Coding become faults; during Testing, a fault that executes produces an incident, which leads to Fault Classification, Fault Isolation, Fault Resolution, and a Fix.]
PP4. What is the result of an error?
Verification
◦ The process of determining whether the output of one phase of development conforms to its previous phase.
Validation
◦ The process of determining whether a fully developed system conforms to its SRS document.
Verification versus Validation
◦ Verification is concerned with phase containment of errors.
◦ Validation is concerned with the final product being error free.
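The distinction can be made concrete with a small sketch. All the artifacts here are hypothetical (requirement IDs R1-R3 and the design/behavior names are invented): verification compares one phase's output against the previous phase, while validation checks the finished system against the SRS.

```python
# Hypothetical SRS and design artifacts for illustration only.
srs_requirements = {"R1": "login", "R2": "logout", "R3": "audit log"}
design_elements = {"R1": "LoginController", "R2": "LogoutController"}

def verify(previous_phase_ids, current_phase_ids):
    """Verification: does this phase's output cover the previous phase's?"""
    return sorted(rid for rid in previous_phase_ids
                  if rid not in current_phase_ids)

def validate(implemented_features, srs):
    """Validation: does the finished system exhibit every SRS behavior?"""
    return sorted(rid for rid, feature in srs.items()
                  if feature not in implemented_features)

# Verification catches the gap at the design phase (phase containment):
print(verify(srs_requirements, design_elements))
# Validation checks the delivered system against the SRS:
print(validate({"login", "logout"}, srs_requirements))
```

Both checks flag R3 here, but verification would have caught it at design time, before any code was written; that is the point of phase containment.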
Depends on what we include as software.
◦ Just the executable code? How about:
  - the pre-populated data in the database
  - the help text and error messages
  - the source logic code
  - the design document
  - the test scenarios and test results
  - the reference manuals
  - the requirements document
When we talk about quality, how much of the above do we (and should we) include?
How would you "test" these different artifacts?
Some of us "hope" that our software works as opposed to "ensuring" that it works. Why?
- Just foolish
- Lazy
- Believe that it's too costly (time, resources, effort, etc.)
- Lack of knowledge
◦ DO NOT use the "I feel lucky" or "I feel confident" approach to testing, although you may feel that way sometimes. Use a methodical approach to testing to back up the "I feel lucky/confident" feeling.
◦ Methods and metrics utilized must show VALUE.
◦ Value, unfortunately, is often expressed in negative terms:
  - Severe problems that cost loss of lives or business
  - Problems that cost more than testing expenses and effort
1. Traditionally, testing includes executing the code with test cases (assuming code is the main software artifact).
2. What do we do with the non-executable software artifacts? Reviews and inspections:
   - Expensive in terms of human resources
   - A lot to maintain and keep updated
3. Can we "prove" that the software works or is defect free?
   - Theoretically, given an arbitrary program, we cannot show that it has no bugs.
   - We can use formal proofs to show the behavior of code.
Professionals taking a course in testing:
◦ 60% were new to testing
◦ 20% had 1 to 5 years of experience in testing
◦ 20% were expert testers
Metrics used in testing:
◦ a) Most regularly used: counting bugs found and ranking them by severity
◦ b) A small number used: bug find rate or bug fix rate
Formal methods used:
◦ Almost none were formally trained in inspection or analysis
Test tools:
◦ 76% had been exposed to some automated test tool
Test definitions:
◦ Most of the practicing testers could not supply good definitions of testing terms; they just did the work!
1980's & Before
**Product Reliability & Quality was required and expected** (even though we didn't get good quality)
- Large host & centralized systems
- Single vendor (hardware-software-services)
- Long-term development and long-term investment (10 yrs)
- Single platform
- Systems run by computer professionals

1990's
**New product was fashionable & "reboot" became acceptable.**
- PC and desktop computing became ubiquitous
- Multiple vendors
- Quicker product development & shorter-term investment
- Systems run by non-computer individuals

Late 1990's & 2000's
**Product Reliability & Quality is once again important** --especially SECURITY--
- Web availability to many
- Business conducted on the Web
- Software and systems are not hobbies but a "business" again
1. Software is written by many with the "entrepreneurial" spirit:
◦ Speed to market
◦ New & innovative is treasured
◦ Small organizations that can't afford much more than "coders"
2. Embracing the "Agile" process and mistaking it as "fast production regardless of what":
◦ Not much documented (requirements/design)
◦ Hard to test without documented material
3. Lack of trained/good/experienced testers
◦ Testers are not quite rewarded "as equals" to designers, but are definitely gaining on and sometimes surpassing programmers (it takes "good" cops to catch the thieves)
4. Improvement in tools and standards is making development easier and less error prone
PP5. Why? How?
What is Quality? Some common comments: ◦ “I know it when I see it” ◦ Reliable ◦ Meets requirements ◦ Fit for use ◦ Easy to use ◦ Responsive ◦ Full function / full feature ◦ Classy and luxurious
PP6. How would you define quality?
Pioneers:
◦ Joseph M. Juran: Quality => fitness for use
◦ W. Edwards Deming: Quality => non-faulty system
More recently:
◦ Philip Crosby: Quality => conformance with requirements
  - Achieve quality via "prevention" of errors
  - The target of quality is "zero defects"
  - Measure success via "cost of quality"
Quality is composed of several characteristics: ◦ Functionality Completeness/Consistency ◦ Reliability ◦ Usability ◦ Efficiency ◦ Maintainability ◦ Portability
What do these mean? How do you know if each is achieved, and how would you "test" for them? Do they have to be specified in the requirements? If they are not, do we ask for them during review/inspection?
Quality is a characteristic or attribute ◦ Needs to be clearly defined and agreed to May have sub-attributes (e.g. previous page) Needs specific metrics for each of the sub-attributes ◦ Needs to be measured with the defined metric(s) Needs to be tracked and analyzed Needs to be projected Needs to be controlled
Testing and measurement are two key activities that help us manage quality.
PP7. Explain the relationship.
Imagine what it is like without this.
Schedule (first to market) Requirements (“bad” or “missing”) New and Exciting (demands of “WANT” not “need”) Price and Availability (retail customers)
Quality requirements do NOT always dictate schedule!
◦ Market conditions often dictate schedule (especially for small companies), BUT
◦ For large and multiple-release software, quality is still a factor and may affect schedule, albeit the schedule is seldom changed for quality.
◦ Software development processes today incorporate both the need for speed and quality, incorporating the notions of a) service cost and b) rewriting for a replacement new product.
Quality does not require "zero defect" reliability.
◦ Commercial (non-life-critical or non-mission-critical) products are not developed with a "zero defect" goal in mind. They are much more market driven; the market prefers but does not "demand" zero defects.
  - Focus on proper support
  - Focus on main functions and heavily used areas (not all defects are the same)
  - Focus on customer productivity (e.g., easy to learn and use)
  - Zero defects is a very expensive proposition (time & resources)
◦ Users may not know all the requirements (especially for large, complex systems which require professional or legal knowledge).
◦ Users may not have the time or interest to "really focus" on the requirements at the time when asked (a timing problem). Users have their own full-time jobs.
◦ Users may not know how to prioritize needs versus wishes.
◦ Users may not know how to articulate all the requirements clearly. (They are non-software-development people.)
◦ Developers may not listen well or may misinterpret the users' statements. (They are not industry specialists.)
Requirements are a key factor in software development. Why? How do they affect software quality? Think about the definitions of quality: "meets requirements."
◦ Not all requirements are technically feasible; sometimes the “desired” new technology needs to be prototyped first.
◦ Sometimes the requirements are changed, causing re-design or re-code without proper assessment of schedule impact.
◦ Requirements are not always reviewed and signed off, but sometimes given in verbal form --- especially small changes.
◦ People mistake iterative development to mean continuous change of requirements.
What’s the danger here? – cost, schedule, quality
◦ "If the product has all the 'needed' features it will sell" is not necessarily true; people often WANT new & extra features.
◦ Reliability is not always enough; sometimes customers will sacrifice quality for new and exciting features.
  - The story of IBM's OS/2 operating system and Microsoft's DOS operating system (even though both were commissioned by IBM): IBM went for the reliability of the old host machines for desktop PCs; Microsoft went for exciting individual user interfaces.
Over-emphasis on "exciting features" is one reason why we have regressed a little in software quality in the last ten years!
**Still, consider the Apple iPhone's success in spite of its activation & other problems.**
◦ At the commodity level of software, the customers are individuals who want the product NOW at a competitive price (much like shopping for a home appliance such as a coffee maker, a TV, or an iPhone).
◦ Sophisticated, full-feature software needs to be balanced and sometimes traded off for price and speed.
◦ Customers don't always "need" all the functions and product maturity they think they require, if the price is right!
Schedule (first to market); Requirements ("bad" or "missing"); New and Exciting (demands of "WANT" not "need"); Price and Availability (retail customers)
Customer-Oriented "Goals" (Example):
◦ Show that your product "works" (perhaps x%): test all main paths and show that there is no problem.
◦ Show that your intent is customer satisfaction: test and find as many problems as possible and fix them before release, focusing on attributes such as "usability," "reliability," "functional completeness," "price," and "innovation."
Developer-Oriented "Goals" (Example):
◦ Focus on both product and process:
  - Process includes ample "testing" activities, within cost/schedule.
  - Product is maintainable, easy to understand, reliable, complete, etc.
1. Define the sub-attributes of the "quality" interest.
2. Define a metric, or use an existing metric, for that sub-attribute.
3. Set a "goal" for that quality interest, a quantitative one.
4. Measure, collect/record, and analyze the collected data as we progress through the project.
5. Relate to (or prognosticate from) the data and assess the general quality of the product.
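The five steps above can be sketched with one concrete metric. This is a minimal illustration, and every figure in it is hypothetical: it uses defect density (defects per KLOC) as the metric for a "reliability" sub-attribute, sets a quantitative goal, and assesses the collected data against it.

```python
# Step 1: sub-attribute of interest = reliability
# Step 2: metric = defect density (defects per thousand lines of code)
# Step 3: quantitative goal (hypothetical figure)
goal_defects_per_kloc = 2.0

# Step 4: data collected/recorded as the project progresses (hypothetical)
measurements = [
    {"build": 1, "defects": 40, "kloc": 10.0},
    {"build": 2, "defects": 24, "kloc": 12.0},
    {"build": 3, "defects": 15, "kloc": 12.5},
]

def defect_density(m):
    return m["defects"] / m["kloc"]

# Step 5: analyze the trend and assess against the goal
densities = [round(defect_density(m), 2) for m in measurements]
meets_goal = densities[-1] <= goal_defects_per_kloc
print(densities, meets_goal)
```

The point is not the particular numbers but the shape of the process: without a defined metric and a quantitative goal, "quality" cannot be tracked, projected, or controlled.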
As software size and complexity increased in the 1960's and 1970's, many software projects started to fail.
◦ Much software did not perform the required functions.
◦ Other software had performance (speed) problems.
◦ Some had a large number of defects that prevented users from completing their work; some just flat out would not even install!
The software "quality" crisis was recognized, and Quality Assurance was born, along with the term Software Engineering (1968 NATO conference).
Software QA focused on 2 main areas:
1. Software product
2. Software process
The focus on the process areas "borrowed" many techniques from the traditional i) manufacturing and ii) systems engineering areas:
◦ The concept of reliability (number of defects, mean time to failure, probability of failure, etc. metrics)
◦ The concept of process control in terms of looking at "repeatability" of the process: a repeatable process produces "similar" products (or controllable results).
emphasis today : e.g. CMM & CMMI
A period of i) heavy emphasis on software development process and ii) excessive documentation dominated QA; initially this improved the "software crisis."
1. Process included multiple steps of reviews.
2. Process included multiple steps of test preparation, test execution, and test result analysis.
3. Process was controlled by many documents and document flow, which also improved project communications.
But the price paid was:
◦ a) speed and
◦ b) some innovation.
Very Small Enterprises (≤ 25 people) could not afford process & documentation!
(**** Lots of energy spent on process, LESS on product ****)
Software development is extremely labor intensive.
◦ BUT people are not uniform like the machines used in manufacturing.
Software development often requires some innovation.
◦ Every piece of software seems to be one of a kind, although more and more are becoming "standardized by domain."
◦ The same set of people do not get to repeatedly develop the exact same software multiple times.
Improve the creation and maintenance of documents via an on-line repository or configuration manager
◦ Centrally protected data ◦ Sharable data ◦ Managed data (data relationships are managed)
Improve the “usability” and “productivity” by providing more and better visualization of data
◦ Replace numbers with graphs and figures ◦ Replace words with pictures
track, project, control
Test Methodology Improvements ◦ Test coverage analysis ◦ Test case generation ◦ Test-Fix-Integrate Process ◦ Test results analysis ◦ Test metrics definition and measurements process ◦ Etc.
Test tools improvements ◦ Test coverage computation ◦ Test trace ◦ Test script generator ◦ Test result records keeping and automated analysis ◦ Build and integration (daily builds) ◦ Etc.
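The "test coverage computation" item above can be sketched in miniature. This toy example (the `classify` function and its branches are invented for illustration) instruments each branch with a probe, runs a test suite, and reports the fraction of branches exercised; real coverage tools do this instrumentation automatically.

```python
# A toy branch-coverage computation: each branch records itself in `hits`
# when executed, and coverage is the fraction of known branches hit.
hits = set()

def classify(n):
    if n < 0:
        hits.add("negative")
        return "negative"
    if n == 0:
        hits.add("zero")
        return "zero"
    hits.add("positive")
    return "positive"

all_branches = {"negative", "zero", "positive"}

# A test suite that forgets the zero case:
for case in (-3, 7):
    classify(case)

coverage = len(hits) / len(all_branches)
print(round(coverage, 2), all_branches - hits)  # the uncovered branch is exposed
```

Coverage analysis does not prove the tested branches correct; it only tells you which parts of the code your test cases never reached.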
Today we test because we know that systems have problems - - - we are fallible.
1. To find problems and find the parts that do not work
2. To understand and show the parts that do work
3. To assess the quality of the over-all product (A major QA and release management responsibility)
Error
◦ A mistake made by a human.
◦ The mistake may be in the requirements, design, code, fix, integration, or install.
Fault
◦ A defect or defects in the artifact that resulted from an error.
◦ There may be defects caused by errors that may or may not be detectable (e.g., an error of omission).
Failure
◦ The manifestation of faults when the software is "executed" (running code).
  - May show up in several places
  - May be non-code related (e.g., the reference manual)
Incident
◦ The detectable symptom of failures.
(includes error of omission and “no-code?”)
Example? (bank account)
Testing is concerned with all of these, but may not be able to detect all:
◦ Errors ◦ Faults ◦ Failures ◦ Incidents
Testing utilizes the notion of test cases to perform the activities of testing:
◦ Inspection of non-executables
◦ Executing the code
◦ Analyzing and formally "proving" the non-executables and the executable in a business workflow (or user) setting
[Figure: errors made during Requirements, Design, and Code become faults in those artifacts; Inspection catches faults in the requirements and design, Testing surfaces faults and failures in the code, and Fixing feeds corrections back into the artifacts.]
Note that in "fixing" failures, one can commit errors and introduce faults.
A test case record includes:
◦ Test case number
◦ Test case author
◦ A general description of the test purpose
◦ Pre-condition
◦ Test inputs
◦ Expected outputs (if any)
◦ Post-condition
Test case history:
◦ Test execution date
◦ Test execution person
◦ Test execution result(s)
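One way to capture those fields is as a simple record type. This is a sketch, not any particular tool's schema; the field names mirror the list above, and the sample values (test case TC-001, a bank-deposit scenario) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    number: str
    author: str
    purpose: str
    precondition: str
    inputs: list
    expected_outputs: list
    postcondition: str
    history: list = field(default_factory=list)  # executions appended over time

    def record_run(self, date, person, result):
        """Append one execution (date, person, result) to the history."""
        self.history.append({"date": date, "person": person, "result": result})

# Hypothetical example record:
tc = TestCase("TC-001", "Bernal", "Deposit updates balance",
              "Account exists with balance 100", [50], [150],
              "Balance is 150")
tc.record_run("2016-01-20", "student", "pass")
print(tc.history[-1]["result"])
```

Keeping the execution history with the test case is what lets you later report pass/fail trends rather than just a one-time verdict.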
[Figure: Venn diagram of Specified (expected) Behavior vs. Programmed (observed) Behavior. Behavior that is specified but not programmed is Missing Functionality (faults of omission); behavior that is programmed but not specified is Extra Functionality (faults of commission); the overlap is the correct portion.]
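The diagram above is just set algebra, which can be written out directly. The feature names here are hypothetical, chosen only to make the three regions visible.

```python
# Specified (expected) behavior vs. programmed (observed) behavior as sets.
S = {"login", "logout", "password reset"}   # specified behavior
P = {"login", "logout", "debug backdoor"}   # programmed behavior

missing_functionality = S - P   # specified but never implemented (omission)
extra_functionality = P - S     # implemented but never specified (commission)
correct_portion = S & P         # where expectation and reality agree

print(sorted(missing_functionality))
print(sorted(extra_functionality))
```

Note that the extra-functionality region is often the more dangerous one (an unspecified "debug backdoor" is a security hole), and it is exactly the region that spec-derived tests never look at.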
Correctness
◦ Impossible to demonstrate in general.
◦ A term from "classical" computer science: "proofs" derived from the code, not from the specification, can only prove that the code does what it does!
◦ A better viewpoint: correctness is a relative term; program P is correct with respect to specification S.
◦ Bottom line: do the specification and the program meet the customer/user's expectations?
There are two levels of classification:
◦ One distinguishes by granularity level: unit level, integration level, system level.
◦ The other classification (mostly for the unit level) is based on methodology: black box (functional) testing and white box (structural) testing.
[Figure: Venn diagram of three overlapping sets with numbered regions 1-8. Area S (yellow) = Specification -> Expected behavior; Area P (blue) = Program -> Actual behavior; Area T (pink) = Test cases.]
The ideal place is where expectation and actual behavior "match" and are tested. The other areas are of concern, especially to testers.
What do these numbered regions mean to you?
Black box testing (functional testing)
◦ Looks mainly at the inputs and outputs.
◦ Mainly uses the specification as the source for designing test cases.
◦ The internals of the implementation are not included in the test case design.
◦ Hard to detect "extraneous" implementation that was never specified.
White box testing (structural testing)
◦ Looks at the internals of the implementation.
◦ Designs test cases based on the design and code implementation.
◦ Hard to detect "missing" functionality that was specified but never implemented.
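The two viewpoints can be contrasted on one tiny function. This is an illustrative sketch (the `absolute` function and its cases are invented): the black-box cases come only from the spec "return |x|", while the white-box cases come from reading the code and covering each branch.

```python
def absolute(x):
    """Spec: return the absolute value of x."""
    if x < 0:
        return -x
    return x

# Black-box test cases: derived from the specification alone
# (positive, negative, and boundary input).
black_box_cases = [(5, 5), (-5, 5), (0, 0)]

# White-box test cases: derived from the code, one per branch.
white_box_cases = [(-1, 1),  # exercises the x < 0 branch
                   (2, 2)]   # exercises the fall-through branch

results = [absolute(x) == want for x, want in black_box_cases + white_box_cases]
print(all(results))
```

For this trivial function the two suites overlap heavily; on real code they diverge, which is why the two approaches are complementary rather than alternatives.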
Use the same "form" for describing the test case (see the earlier slide on recording test cases) and expand the "results" to include:
◦ State Pass or Fail on the Execution Result line.
◦ If "failed":
  1. Show output or some other indicator to demonstrate the fault or failure.
  2. Assess and record the severity of the fault or failure found.
Very High severity – brings the systems down or a function is non-operational and there is no work around
High severity – a function is not operational but there is a manual work around
Medium severity – a function is partially operational but the work can be completed with some work around
Low severity – minor inconveniences but the work can be completed
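The four-level scale above can be encoded as an ordered type so severities can be compared and failure reports triaged. This is a sketch (the failure descriptions are hypothetical); the level names and their meanings follow the slide.

```python
from enum import IntEnum

class Severity(IntEnum):
    LOW = 1          # minor inconvenience; work can be completed
    MEDIUM = 2       # function partially operational; workaround exists
    HIGH = 3         # function not operational; manual workaround exists
    VERY_HIGH = 4    # system down or function dead with no workaround

# Hypothetical failure reports awaiting triage:
failures = [("typo in banner", Severity.LOW),
            ("export hangs", Severity.VERY_HIGH),
            ("slow search", Severity.MEDIUM)]

# Because IntEnum members are ordered, we can sort worst-first:
worst_first = sorted(failures, key=lambda f: f[1], reverse=True)
print(worst_first[0][0])
```

Making the scale an ordered type (rather than free-text labels) is what allows tooling to sort, filter, and count failures by severity consistently.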
Mild – misspelled word
Moderate – misleading or redundant info
Annoying – truncated names; billing for $0.00
Disturbing – some transactions not processed
Serious – lose a transaction
Very serious – incorrect transaction execution
Extreme – frequent very serious errors
Intolerable – database corruption
Catastrophic – system shutdown
Infectious – shutdown that spreads to others
Increasing severity
Input/output faults
Logic faults
Computation faults
Interface faults
Data faults
Why do you care about these "types" of faults (the results of errors made)?
Because they give us some idea of what to look for when designing future test cases, in this project or for other products developed by the same people.
[Figure: testing levels. Program units A, B, ..., T undergo Unit Testing; Functions 1, 2, ..., 8 undergo Functional Testing; Components 1, 3, ... undergo Component Testing; and the Whole System undergoes System Testing.]
Catastrophic problems (e.g., life- or business-ending ones) do not need any measurements, but others do:
◦ Measure the cost of problems found by customers:
  - Cost of problem reporting/recording
  - Cost of problem re-creation
  - Cost of problem fix and retest
  - Cost of solution packaging and distribution
  - Cost of managing the customer problem-to-resolution steps
◦ Measure the cost of discovering the problems and fixing them prior to release:
  - Cost of planning reviews and testing
  - Cost of executing reviews and testing
  - Cost of fixing the problems found and retesting
  - Cost of inserting fixes and updates
  - Cost of managing problem-to-resolution steps
◦ Compare the above two costs AND include the loss of customer "good-will."
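The two cost buckets can be compared with simple arithmetic. Every figure below is hypothetical, invented purely to show the shape of the comparison: per-defect cost when a customer finds the problem versus when it is found before release.

```python
# Hypothetical per-defect costs (dollars), post-release vs. pre-release.
cost_found_by_customer = {
    "reporting": 200, "re-creation": 400, "fix_and_retest": 800,
    "packaging_and_distribution": 500, "resolution_management": 300,
}
cost_found_before_release = {
    "planning": 100, "review_and_test_execution": 250,
    "fix_and_retest": 300, "insert_fixes": 50, "resolution_management": 100,
}

per_defect_post = sum(cost_found_by_customer.values())
per_defect_pre = sum(cost_found_before_release.values())

# Even before pricing lost good-will, the post-release defect costs more:
print(per_defect_post, per_defect_pre, per_defect_post > per_defect_pre)
```

The slide's point is the comparison itself: once both buckets are measured in the same units, the value of testing stops being a feeling and becomes a number, and customer good-will only widens the gap.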
Test as much as time allows
◦ Execute as many test cases as (schedule) time allows?
Validate all the "key" areas
◦ Test only the designated "key" requirements?
Find as many problems as possible
◦ Test all the likely error-prone areas and maximize the test problems found?
Validate the requirements
◦ Test all the requirements?
Reach a quality target
State your goal(s) for testing. What would you like people to say about your system? Your goals may dictate your testing process.
Very important to think about this.
Quality target?