International Conference On
Software Testing, Analysis & Review
November 19 - 23, Stockholm, Sweden
Presentation
W1 - Wednesday 21st November, 2001
Software Testing Requirements in
Safety-Related Standards
Stuart C Reid
Wednesday 21 November 2001
W1
Software Testing Requirements in Safety-Related Standards
Stuart C Reid
Stuart Reid is a Senior Lecturer in Software Engineering for Cranfield University at the Royal Military College of Science. His research interests include software testing and process improvement. He is Chair of the BCS SIGIST Standards Working Party, which was responsible for the development of the software component testing standard (BS 7925-2) and a vocabulary of software testing terms (BS 7925-1). This working party is now developing a standard on non-functional testing techniques. He is also Chair of the ISEB Software Testing Certificate Board. He has a BSc (Hons) in Aeronautics and Astronautics from Southampton University, an MSc in Computing from the University of Wales (Cardiff), and a PhD in Software Testing from the University of Glamorgan. He has presented papers and tutorials at conferences in Europe, America and Asia.
1
Software Testing Requirements in Safety-Related Standards
Stuart Reid Cranfield University RMCS, Shrivenham
UK
This paper initially introduces the role of standards in specifying the testing requirements for safety-related software. It explains that there is an obvious requirement for industry-specific non-software standards to specify those features requiring standardisation that are peculiar to certain application areas. For instance, regulations regarding pressurisation and depressurisation in passenger aircraft are inappropriate for inclusion in standards for automobile or train manufacture, but wholly appropriate for inclusion in a civil aircraft standard. However, there is less justification for application-specific software standards, which appear to have emerged for historical reasons. In the safety-related arena many of the software standards are concerned with the development and testing of real-time control software. It is argued that there are no significant differences between the approaches for developing and testing this type of software, whether it is for use in an aircraft, automobile, or train, say, and thus there is no logical reason for separate standards to describe the approach. A classification scheme based on the ‘type’ of software (operating system, real-time control, database, knowledge-based, etc.) is suggested as a more reasonable alternative, where variations of software standard may be more easily justified.
The benefits of using software integrity levels in software testing standards are presented next. Safety-related standards were the first to include the concept of integrity levels, which allow a single standard to define different requirements dependent on the integrity level of the product to which the standard is being applied. Integrity levels are assigned on the basis of some form of risk analysis. Once the integrity level is determined then the corresponding requirements (techniques, coverage level, etc.) are selected based on that integrity level.
Many industries in the safety-related domain have their own set of software development standards that include software testing requirements. Examples of such industry-specific standards are DO-178B (avionics) [7], MISRA (automotive) [5], Def Stan 00-55 (defence) [1], and IEC 880 (nuclear) [2]. More generically, IEC 61508 [3] includes requirements for the software development and testing of safety-related systems in general, rather than for a particular industry area. IEC 61508 [3] may also be used as a ‘template’ for sector-specific standards, such as pr EN 50128 (railway control and protection systems) [6]. The paper investigates and compares the software testing requirements of each of the aforementioned standards. The differences and inconsistencies presented in the software testing requirements of the investigated standards demonstrate the difficulty of identifying (and communicating) the most efficient set of test techniques and completion criteria.
2
The paper goes on to identify further problems caused by different standards setting different software test requirements. These include difficulties with staff mobility and the reuse of software components when moving between industry sectors, as well as the extra complexity involved with multi-domain systems, which are required to comply with more than one standard. Given the argument that diverse, application-specific software standards are generally unnecessary and cause the problems mentioned above, a common set of test requirements for a standard set of integrity levels is proposed as a potential solution. These are based on the results of this investigation, the results of earlier studies on test effectiveness, and experience.
The argument for a single set of software testing requirements matched to a standard set of integrity levels is extended to propose that the expertise gained from the field of safety-related applications be transferred for use in other (non-safety-related) industry sectors. Finally, a framework of standards to support software testing across all application domains is presented. The proposed framework includes application-specific standards for generating risk criteria, which are then applied to a standard for generating software integrity levels (such as [9]). Once integrity levels for software have been determined, they can be applied to the verification and validation standard, which will be used to decide which testing phases and activities to apply. Recently updated, IEEE 1012 [4], which specifically defines software verification and validation requirements based on integrity levels, is suitable for this task. The integrity levels could then be used in individual test phase standards to determine which techniques and completion criteria to apply in that phase. All testing standards would use a common terminology defined in a single vocabulary, an expanded version of BS 7925-1 [10]. Further details of this proposed framework and supplementary information on non-safety-related standards are available in [8].
References
[1] Def Stan 00-55, Requirements for Safety-Related Software in Defence Equipment, Issue 2, 1997.
[2] IEC 880, Software for computers in the safety systems of nuclear power stations, 1986.
[3] IEC 61508:1998, Functional safety of electrical/electronic/programmable electronic safety-related systems.
[4] IEEE Std 1012-1998, Standard for Software Verification and Validation, 1998.
[5] Motor Industry Software Reliability Association (MISRA) Development Guidelines for Vehicle Based Software, 1994.
[6] pr EN 50128, Software for railway control and protection systems, Draft European Standard, Nov 1995.
[7] RTCA DO-178B, Software Considerations in Airborne Systems and Equipment Certification, RTCA, 1992.
[8] Reid, S., Software Testing Standards - do they know what they're talking about?, Proceedings of EuroSTAR 2000, Copenhagen, Dec 2000.
[9] ISO/IEC 15026:1998, Information Technology – System and software integrity levels.
[10] BS 7925-1:1998, Software Testing - Vocabulary.
1
© Stuart Reid, 2001
Software Testing Requirements
in Safety-Related Standards
Stuart Reid
Cranfield University
RMCS, Shrivenham
UK
2
Overview
• Background on Safety-Related Standards
• Integrity levels and risk-based testing
• Test requirements in various standards
• Problems (and solutions) for Safety-Related Standards
• A framework for software testing standards
• Conclusion and questions
3
Current Position
• Many industries/application areas have their own safety-related standards
• Alternatively, IEC 61508 is a generic safety-related standard (also used as a template for sector-specific standards)
• Software testing requirements in safety-related standards
– are different for each industry sector
– are often incomplete, inconsistent, and/or contradictory
– are generally based on risk
4
Application-Specific non-Software Standards
• There are good reasons for application-specific non-software standards
– Different application areas have self-evident reasons for different hardware requirements
• For instance, collision management standards are obviously different for aircraft (negligible), automobiles (bumpers) and trains (buffers).
5
Application-SpecificSoftware Standards
• Industries that were justifiably producing application-specific hardware standards started to produce software standards as their products began to include significant software content
[Graph: COSTS against time, 1950-2000, for Hardware and Software]
6
Application-Specific Software Standards?
• But, were these industries justified in producing unique software standards?
• The only justification for application-specific software standards would be if software for different applications was justifiably developed and tested in different ways
• Many safety-related software standards cover real-time control software, the development and testing of which is very similar whichever industrial sector you are in
• For example, software developers and testers from the aerospace, railway and automotive industries find it relatively easy to move between these application areas
7
A Different Classification of Software Standards
• In fact, in modern complex safety-related systems there can be many types of software
– real-time control
– database
– operating systems
– knowledge-based
– etc.
• There is a far better case for different development and testing approaches for each of the above types of software than there is for the current application-specific software standards
8
Safety-Related Standards
• DO-178B Software Considerations in Airborne Systems and Equipment Certification
• MISRA Development Guidelines for Vehicle Based Software
• IEC 61508 Functional Safety of electrical/ electronic/programmable Safety-Related Systems
• pr EN 50128 Software for Railway Control and Protection Systems
• Def Stan 00-55 Requirements for Safety-Related Software in Defence Equipment
• IEC 880 Software for computers in the safety systems of nuclear power stations
9
Integrity Levels
• The Software Integrity Level (SIL) is based on a risk analysis
• Allows a single standard to define a hierarchy of levels of testing (and development)
• Systems are normally partitioned into subsystems for determining software integrity levels
– so the whole system is not defined at the highest level
• Initially restricted to safety-related standards, e.g. avionics, automotive, defence
– but if you base the risk analysis on factors other than safety, such as economic ones, the approach is universally applicable - commercial risk-based testing
10
Assignment of Integrity Levels
[Diagram: system context & behaviour and system risks feed a Risk Analysis, which assigns an integrity level to each of subsystems A-E (SIL 1, SIL 3, SIL 4, SIL 2, SIL 3)]
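The assignment flow (system risks and context feed a risk analysis, which assigns an integrity level to each subsystem) can be sketched in code. Everything here - the risk scores, the thresholds, and the function name `assign_sil` - is a hypothetical illustration, not taken from any of the standards:

```python
# Toy sketch: assign a Software Integrity Level (SIL) per subsystem
# from a hypothetical risk score produced by risk analysis.
# Thresholds and scores are illustrative only.

def assign_sil(risk_score: float) -> int:
    """Map a normalised risk score in [0, 1] to SIL 1-4."""
    if risk_score >= 0.75:
        return 4
    if risk_score >= 0.5:
        return 3
    if risk_score >= 0.25:
        return 2
    return 1

# Invented risk-analysis output for subsystems A-E
subsystem_risks = {"A": 0.2, "B": 0.6, "C": 0.9, "D": 0.3, "E": 0.55}

# Each subsystem gets its own level, so the whole system
# does not have to be developed at the highest level.
sils = {name: assign_sil(score) for name, score in subsystem_risks.items()}
```

With these invented scores the subsystems receive different levels (here 1, 3, 4, 2 and 3), so only the genuinely high-risk subsystems carry the most demanding requirements.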
11
Example of the application of Integrity Levels
The "Really Interesting Software Testing Standard" (which requires a software integrity level as input) maps each level to techniques and measures:
– SIL 1: technique 1, measure a
– SIL 2: technique 2, measure b
– SIL 3: techniques 1+2, measures a+b
– SIL 4: techniques 2+3, measures b+c
12
Integrity Levels
Equivalent levels across the standards' schemes (one standard defines no levels - just one standard):
–    IL0   –      0   E
S1   IL1   SIL1   1   D
S2   IL2   SIL2   2   C
S3   IL3   SIL3   3   B
S4   IL4   SIL4   4   A
13
Test Independence
[Charts comparing the required level of test independence against integrity level for each standard: a 1-3 independence scale against an unspecified ('?') integrity level scheme; a D-A scale ranging from no independence up to independent verification of all white box testing; an IL1-IL4 scale ranging from no independence, through tester ≠ developer, to an independent test group; an IL0-IL4 scale topped by an independent validation group (or equivalent); and an S1-S4 scale distinguishing the development team from a separate V&V team.]
14
Black Box Testing - I
[Charts of required black box techniques and coverage against integrity level, per standard: for the S1-S4 scheme, suitable techniques are left unspecified ('????'); one scheme requires EP and BVA as the level of coverage; a D-A scheme labelled '"Black Box Testing"' (with a 'not white or black box testing?' category) requires high-level requirements coverage, then high- and low-level requirements coverage; a 0-4 scheme requires EP, BVA and STT.]
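Equivalence partitioning (EP) and boundary value analysis (BVA), which several of the schemes above call for, can be illustrated with a small sketch for a hypothetical input field that accepts integers 1 to 100 (the range and all names are invented for illustration):

```python
# Sketch of EP and BVA test-case selection for a hypothetical
# input that is valid only for integers in the range 1..100.

VALID_MIN, VALID_MAX = 1, 100

# Equivalence partitioning: one representative value per partition.
ep_cases = {
    "invalid_below": VALID_MIN - 50,          # any value below the range
    "valid": (VALID_MIN + VALID_MAX) // 2,    # any value inside the range
    "invalid_above": VALID_MAX + 50,          # any value above the range
}

# Boundary value analysis: values on and either side of each boundary.
bva_cases = [VALID_MIN - 1, VALID_MIN, VALID_MIN + 1,
             VALID_MAX - 1, VALID_MAX, VALID_MAX + 1]

def is_valid(x: int) -> bool:
    """The (hypothetical) behaviour under test."""
    return VALID_MIN <= x <= VALID_MAX
```

EP keeps the test set small (one value per partition), while BVA concentrates tests where experience shows defects cluster - at the edges of the partitions.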
15
Black Box Testing - II

[Two recommendation tables (HR = highly recommended, R = recommended, '-' = no recommendation) covering equivalence partitioning (EP), boundary value analysis (BVA), cause-effect graphing, and error guessing: one against SIL1-SIL4, one against IL0-IL4. EP and BVA each appear twice, as EP (1)/BVA (1) and EP (2)/BVA (2).]

EP (1) and BVA (1) are defined as part of "Functional / black box testing", while EP (2) and BVA (2) are defined as part of "Dynamic analysis and testing". However, both are required as part of "software module testing and integration"?
16
White Box Testing

[Charts of required white box coverage against integrity level, per standard: an S1-S4 scheme ranging from no white box requirement through 100% branch and statement to 100% decision and statement coverage; a D-A scheme ranging from no white box requirement through statement and 'decision plus...' to 'MC/DC plus...'; SIL1-SIL4 and IL4-IL1 schemes marked only 'HR - but ???' and 'R - but ???' (with IL0 having no white box requirement); and a 4-0 scheme requiring 100% coverage (LCSAJ/MCDC?) at the top, a 'Defined Level = ????' below, and no white box requirement at 0.]
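The gap between statement coverage and decision (branch) coverage, which these requirements turn on, can be shown with a tiny hand-instrumented function. The instrumentation here is an illustrative sketch, not how real coverage tools work:

```python
# One test can reach 100% statement coverage while only
# exercising half of the decision outcomes.

executed_statements = set()
decision_outcomes = set()

def classify(x: int) -> str:
    executed_statements.add("s1")
    result = "non-positive"
    decision_outcomes.add(x > 0)   # record which outcome the decision took
    if x > 0:
        executed_statements.add("s2")
        result = "positive"
    executed_statements.add("s3")
    return result

classify(5)
# All three statements have now run, but only the True outcome of
# the decision was taken: 100% statement, 50% decision coverage.
# A second test, classify(-5), is needed for 100% decision coverage.
```

This is why a standard that mandates only statement coverage can be satisfied by a weaker test set than one mandating decision coverage (and weaker still than MC/DC, which also exercises each condition within a decision independently).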
17
Problems with the Standards
• Some are internally inconsistent
• They disagree on the test requirements
– Is one correct (and all the rest wrong)?
– Are the correct req'ts all there - but spread around?
– Are some req'ts correct?
– Are none of the req'ts correct?
– WE DON'T KNOW!
• Research on which approach is best is inconclusive
• So, why not let each industry go its own way (as long as it's not too weird)?
18
The Difficulty with many Safety-Related Standards
• Staff Mobility
– Testers find it difficult to move between different areas due to the different requirements
• Reuse
– We want as much reuse as possible - but different requirements mean components become tied to a particular standard
• Multi-disciplinary Systems
– What do we do when we meet the problem of multi-application area systems?
• Duplicated Standards Effort
– Standards authors waste time re-inventing subtly-different safety-related test requirements
19
Recommendations
• Standardise on just one of the current standards?
– This errs on the side of caution - feedback from projects should allow the requirements to be loosened over time
Integrity Level | Independence           | Black Box | White Box
STU4            | Independent Test Group | BVA+EG    | Condition
STU3            | Independent Test Group | BVA+EG    | Branch
STU2            | Independent Testers    | BVA+EG    | Statement
STU1            | Independent Testers    | EP+EG     | Statement
STU0            | No Independence        | EP+EG     | Defined Level
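A common set of requirements like this could also be captured in machine-readable form; a minimal sketch, with an illustrative dictionary layout (`stu_requirements` and `requirements_for` are invented names):

```python
# Sketch: the proposed common test requirements per integrity
# level (STU0-STU4), transcribed from the table above.
# EP = equivalence partitioning, BVA = boundary value analysis,
# EG = error guessing.

stu_requirements = {
    0: {"independence": "No independence",
        "black_box": ["EP", "EG"], "white_box": "Defined level"},
    1: {"independence": "Independent testers",
        "black_box": ["EP", "EG"], "white_box": "Statement"},
    2: {"independence": "Independent testers",
        "black_box": ["BVA", "EG"], "white_box": "Statement"},
    3: {"independence": "Independent test group",
        "black_box": ["BVA", "EG"], "white_box": "Branch"},
    4: {"independence": "Independent test group",
        "black_box": ["BVA", "EG"], "white_box": "Condition"},
}

def requirements_for(level: int) -> dict:
    """Look up the test requirements for a given STU integrity level."""
    return stu_requirements[level]
```

A single lookup like this is exactly what the integrity-level approach buys: once the level is fixed, the techniques, coverage criteria, and independence requirements follow mechanically.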
20
Integrity Levels for all Software Testing
• Base all testing on risk
– If we consider non-safety risks, such as economic risks, then the same approach applies to all software testing (although the integrity levels may need to be extended to expand the lower levels into more classifications)
– Technology transfer from safety-related
• There are 'generic' standards based on integrity levels already available
– V&V - IEEE 1012-1998
• Work is required on matching test techniques and completion criteria with integrity levels
21
In the longer term... A New Framework

[Diagram: several Application-Specific Standards supply risk criteria to an Integrity Levels Standard (ISO 15026), which supplies integrity levels to a V&V Standard (IEEE 1012); the V&V standard defines the phases covered by individual Testing Phase Standards, which take test techniques from a Testing Techniques & Measures Standard and define test criteria; all of the standards reference a common Testing Terms Vocabulary.]
22
Conclusions
• Stop generating application-specific test requirements
• Any new framework must encompass all application areas – not just safety-related
• We need to define a set of explicit and generic test requirements based on integrity levels
• Any questions?