
1

Software Testing and Quality Assurance

Lecture 30 – Testing Systems

2

Lecture Objectives

Learn how to write test cases from use cases

Learn how to quantify the quality of testing

3

Complementary strategies for selecting test cases: Orthogonal Defect Classification (ODC)

ODC captures information about the types of faults present in a software system under development.

4

Use cases as sources of test cases

Constructing use profiles

Using scenarios to construct test cases

The expected result section of a test case

5

Use cases as sources of test cases: constructing use profiles

The construction of a use profile begins with the actors in the use case diagram.

After deployment, the use case frequencies can be updated based on actual usage data and used for regression testing (a minimal sketch follows).
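A use profile can be represented as relative frequencies per use case and used to weight test selection toward the most common uses. A minimal sketch in Python, where the use case names and frequencies are purely illustrative:

```python
import random

# Hypothetical use profile: relative frequency of each use case,
# summing to 1.0. All names and numbers here are illustrative.
use_profile = {
    "withdraw_cash":  0.55,
    "check_balance":  0.30,
    "transfer_funds": 0.10,
    "change_pin":     0.05,
}

def select_use_case(profile):
    """Pick the next use case to test, weighted by its frequency."""
    names = list(profile)
    weights = [profile[n] for n in names]
    return random.choices(names, weights=weights, k=1)[0]

# Selecting many test runs this way exercises the system roughly in
# proportion to how users actually exercise it.
print(select_use_case(use_profile))
```

After deployment, the weights can simply be replaced with the frequencies observed in the field.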

6

Use cases as sources of test cases: Using scenarios to construct test cases

The process for identifying specific values for the variables mentioned in a use case has four steps (a sketch follows the list):

Identify all of the values that will be supplied by the actors in the use case.

Identify equivalence classes of values for each input data type.

Construct tables that list combinations of values from the various equivalence classes.

Construct test cases that combine a single permutation of values with the necessary environmental constraints.
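A minimal sketch of steps 2 through 4 in Python, assuming a hypothetical "withdraw cash" use case; the inputs, equivalence-class values, and environmental constraint are all illustrative:

```python
from itertools import product

# Step 2: equivalence classes of values for each input supplied by the
# actor. The classes chosen here (invalid, boundary, typical, over limit)
# are examples only.
equivalence_classes = {
    "amount": [-10, 0, 50, 10_000],
    "account": ["checking", "savings", "closed"],
}

# Step 3: a table listing combinations of values from the equivalence classes.
combinations = list(product(*equivalence_classes.values()))

# Step 4: each single permutation of values, plus the environmental
# constraints it needs, becomes one test case.
test_cases = [
    {"amount": amount, "account": account, "env": {"atm_cash_available": True}}
    for amount, account in combinations
]

for tc in test_cases[:3]:
    print(tc)
```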

7

Use cases as sources of test cases: Using scenarios to construct test cases (example)

8

Use cases as sources of test cases: Using scenarios to construct test cases (example, cont...)

Each row specifies a test.
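One way to realize "each row specifies a test" is to parametrize a test function over the rows of the combination table. A sketch, where withdraw() is a hypothetical stand-in for the system under test and the expected results are illustrative:

```python
import pytest

def withdraw(amount, account):
    """Hypothetical stand-in for the system under test."""
    if account == "closed" or amount <= 0:
        return "rejected"
    if amount > 1_000:
        return "over_limit"
    return "dispensed"

# Each row of the combination table: (amount, account, expected result).
ROWS = [
    (-10,    "checking", "rejected"),
    (50,     "checking", "dispensed"),
    (50,     "closed",   "rejected"),
    (10_000, "savings",  "over_limit"),
]

@pytest.mark.parametrize("amount,account,expected", ROWS)
def test_withdraw(amount, account, expected):
    assert withdraw(amount, account) == expected
```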

9

Use cases as sources of test cases: the expected result section of a test case

For complex systems it is very difficult to determine the expected results of a test run, e.g., spacecraft control software.

10

Use cases as sources of test cases: the expected result section of a test case

Techniques to reduce the amount of effort required to develop expected results:

Construct the results incrementally: the test cases are written to cover some subset of a use of the system, e.g., a database growing from 50 to 1,000 records.

Grand tour test cases: the results of one test case are the inputs for the next test case (see the sketch below).

Disadvantage: if test case 1 fails, test case 2 cannot be run.
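A minimal sketch of a grand tour chain, assuming a hypothetical in-memory record store; each test consumes the previous test's output, which is exactly why an early failure blocks the rest of the tour:

```python
class RecordStore:
    """Hypothetical system under test."""
    def __init__(self):
        self.records = {}
    def insert(self, key, value):
        self.records[key] = value
        return key
    def fetch(self, key):
        return self.records[key]
    def delete(self, key):
        del self.records[key]
        return len(self.records)

def grand_tour():
    store = RecordStore()
    key = store.insert("r1", "payload")   # test 1: its output feeds test 2
    assert store.fetch(key) == "payload"  # test 2: fetch what test 1 inserted
    assert store.delete(key) == 0         # test 3: delete what test 2 verified

grand_tour()
```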

11

Testing multiple representations

Systems are written as a composite of compiled servers (C++), interpreted clients (Java), and multiple data sources.

Important features of these systems from a testing perspective:

Interactions between two data models.

Interactions between pieces written in different languages.

Interactions between static and dynamic portions of the program.

12

Testing multiple representations (cont...)

To address these features, perform a thorough unit test process:

All pieces of the system should be exercised.

Test the interactions across language and representation boundaries.

Test the interfaces between two languages to determine that a complete object is transferred (see the sketch below).
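A minimal sketch of checking that a complete object crosses a representation boundary intact, here assuming JSON as the interchange format; the Order type and its fields are illustrative:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Order:
    """Hypothetical object sent from one language/representation to another."""
    order_id: int
    customer: str
    items: list
    total: float

def to_wire(order: Order) -> str:
    return json.dumps(asdict(order))

def from_wire(payload: str) -> Order:
    return Order(**json.loads(payload))

def test_complete_object_transfer():
    original = Order(42, "alice", ["widget", "gadget"], 19.95)
    received = from_wire(to_wire(original))
    # Compare field by field so a dropped or misinterpreted field is caught.
    assert asdict(received) == asdict(original)

test_complete_object_transfer()
```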

13

Testing multiple representations (cont...)

Test the following:

Be certain that each connection in either direction is exercised.

Validate the results by examining the transformed objects in detail to ensure that nothing is missing or misinterpreted.

Measure coverage by covering all of the primitive types of each language (see the sketch below).
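A minimal sketch of primitive-type coverage, again assuming a JSON boundary: send one representative value of each primitive type through the boundary and confirm that both the value and its type survive. The sample values are illustrative:

```python
import json

PRIMITIVE_SAMPLES = {
    "int":    7,
    "float":  3.14,
    "bool":   True,
    "string": "text",
    "null":   None,
}

def round_trip(value):
    # Stand-in for sending a value across the boundary and reading it back.
    return json.loads(json.dumps(value))

for name, value in PRIMITIVE_SAMPLES.items():
    result = round_trip(value)
    assert result == value and type(result) is type(value), name
print("all primitive types survived the boundary")
```

The same loop can be run in the opposite direction so that each connection is exercised both ways.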

14

What needs to be tested: testing for qualitative system attributes

Translate each qualitative claim into a measurable attribute.

Design test cases that can detect the presence or absence of these measurable attributes.

Execute the test cases and analyze the results.

Aggregate these results to determine whether a specific claim is justified.

15

What needs to be tested: testing for qualitative system attributes (cont...)

Validate performance claims (see the sketch below):

Quantify the term "acceptable performance" (e.g., a number of transactions per second).

Create new data, or use historical data, for the tests.

Run the tests; collect results and timing data.

Make a pass/fail determination for the claim.
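A minimal sketch of a throughput check, where handle_transaction() is a hypothetical stand-in for the operation under test and 100 transactions per second is an illustrative threshold:

```python
import time

REQUIRED_TPS = 100  # the quantified "acceptable performance" claim

def handle_transaction(i):
    """Hypothetical stand-in for one transaction."""
    return i * i

def measure_tps(n=10_000):
    start = time.perf_counter()
    for i in range(n):
        handle_transaction(i)
    elapsed = time.perf_counter() - start
    return n / elapsed

tps = measure_tps()
print(f"measured {tps:.0f} tps -> {'PASS' if tps >= REQUIRED_TPS else 'FAIL'}")
```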

16

What needs to be tested: testing the system deployment

Testing the system deployment: required for configurable systems and for applications that require dynamic interaction with the environment.

Deployment testing is intended to ensure that the packaging used for the system provides adequate setup and delivers a product in working condition.

17

What needs to be tested: testing the system deployment (cont...)

The initial test case is a full, complete installation.

Interactions between options:

Certain options may not be installed.

Libraries or drivers may be needed for other options.

18

What needs to be tested: testing the system deployment (cont...)

Levels of installation:

Typical

Custom

Full

19

What needs to be tested: deployment testing technique

Identify categories of platforms on which the system will be deployed.

Locate at least one system of each type that has a typical environment but that has not had the system installed on it.

Install the system using the deployment mechanism.

Run a regression set of system tests and evaluate the results (a minimal post-install smoke check is sketched below).
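A minimal post-installation smoke check, assuming a hypothetical command-line entry point named myapp that the installer is supposed to place on the PATH:

```python
import shutil
import subprocess

def deployment_smoke_test():
    # The deployment mechanism, not the developer, must have put myapp here.
    exe = shutil.which("myapp")
    assert exe is not None, "installer did not put myapp on the PATH"
    result = subprocess.run([exe, "--version"], capture_output=True, text=True)
    assert result.returncode == 0, result.stderr
    print(f"installed at {exe}, version: {result.stdout.strip()}")

deployment_smoke_test()
```

The full regression set would follow the same pattern: run the installed system, never a development copy.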

20

What needs to be tested: testing system security

Three categories of issues can be classified as security:

The ability of the application to allow access by authorized persons and to prevent access by unauthorized persons (sketched below).

The ability of the code to access all of the resources that it needs to execute.

The ability of the application to prevent unauthorized access to other system resources not related to the application.
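A minimal sketch of the first category, with a hypothetical user/role table standing in for the application's access control:

```python
# Illustrative users and roles; in a real test these come from the application.
AUTHORIZED = {"alice": "admin", "bob": "viewer"}

def can_open(user, resource):
    role = AUTHORIZED.get(user)
    if role == "admin":
        return True
    if role == "viewer":
        return resource == "reports"
    return False  # unknown users get nothing

def test_security_access():
    assert can_open("alice", "accounts")       # authorized access is allowed
    assert can_open("bob", "reports")
    assert not can_open("bob", "accounts")     # beyond bob's role: denied
    assert not can_open("mallory", "reports")  # unauthorized person: denied

test_security_access()
```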

21

What needs to be tested: testing system security (cont...)

The modularity of executables and the dynamic aspects of the code raise security issues:

Permissions for files deployed in directories: test one resource in each directory and one user from each security class (see the sketch below).
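A minimal sketch of that permission check: for each deployment directory, examine one representative resource and compare its actual permissions with what the security class "all other users" should have. The directories and expectations are hypothetical:

```python
import os
import stat

CHECKS = [
    # (directory to sample,     should it be world-readable?)
    ("/usr/local/myapp/bin",    True),
    ("/usr/local/myapp/conf",   False),  # configuration may hold secrets
]

def world_readable(path):
    return bool(os.stat(path).st_mode & stat.S_IROTH)

for directory, expected in CHECKS:
    sample = os.path.join(directory, os.listdir(directory)[0])  # one resource
    actual = world_readable(sample)
    status = "OK" if actual == expected else "VIOLATION"
    print(f"{sample}: world-readable={actual}, expected={expected} -> {status}")
```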

22

Key points

Testing the system deployment: required for configurable systems and for applications that require dynamic interaction with the environment.

23

Announcement

SWE Revised Program Discussion: Monday, December 29, 2008, 12:10-1:00, in 24/141.