MTP for Manual Testing
Uploaded by craig-kris, posted 06-Apr-2018

  • 8/3/2019 MTP for Manual Testing

    1/25

    Software Testing Master Test Plan for Functional Testing

Table of Contents

Revision History
Testing Framework
1.0 Introduction
1.2 Traditional Testing Cycle
2.0 Verification and Validation Testing Strategies
2.1 Verification Strategies
2.1.1 Reviews
2.1.2 Inspections
2.1.3 Walkthroughs
2.2 Validation Strategies
3.0 Testing Types
3.1 White Box Testing
    White Box Testing Types
3.1.1 Basis Path Testing
3.1.2 Flow Graph Notation
3.1.3 Cyclomatic Complexity
3.1.4 Graph Matrices
3.1.5 Control Structure Testing
3.1.5.1 Condition Testing
3.1.5.2 Data Flow Testing
3.1.6 Loop Testing
3.1.6.1 Simple Loops
3.1.6.2 Nested Loops
3.1.6.3 Concatenated Loops
3.1.6.4 Unstructured Loops
3.2 Black Box Testing
    Black Box Testing Types
3.2.1 Graph Based Testing Methods
3.2.2 Equivalence Partitioning
3.2.3 Boundary Value Analysis
3.2.4 Comparison Testing
3.2.5 Orthogonal Array Testing
3.3 Scenario Based Testing (SBT)
3.4 Exploratory Testing
4.0 Structural System Testing Techniques
5.0 Functional System Testing Techniques
4.0 Testing Phases
4.2 Unit Testing
4.3 Integration Testing
4.3.1 Top-Down Integration

    Software Testing Framework V2.0 2 of 25


4.3.2 Bottom-Up Integration
4.4 Smoke Testing
4.5 System Testing
4.5.1 Recovery Testing
4.5.2 Security Testing
4.5.3 Stress Testing
4.5.4 Performance Testing
4.5.5 Regression Testing
4.6 Alpha Testing
4.7 User Acceptance Testing
4.8 Beta Testing
5.0 Metrics
6.0 Test Models
6.1 The V Model
6.2 The W Model
6.3 The Butterfly Model
7.0 Defect Tracking Process
8.0 Test Process for a Project
9.0 Deliverables


Revision History

Version 1.0, August 6, 2003, Harinath: Initial document creation and posting on the web site.
Version 2.0, December 15, 2003, Harinath: Renamed the document to Software Testing Framework V2.0; modified the structure of the document; added the Testing Models section; added the SBT and ET testing types.

The next version of this framework will include test estimation procedures and more metrics.


Testing Framework

Through experience, practitioners have determined that there should be roughly 30 defects per 1000 lines of code. If testing does not uncover close to that many defects, a logical conclusion is that the test process was not effective.

1.0 Introduction

Testing plays an important role in today's System Development Life Cycle. During testing, we follow a systematic procedure to uncover defects at various stages of the life cycle. This framework aims to give the reader the various test types, test phases, test models and test metrics, and to guide the reader in performing effective testing on a project. All the definitions and standards mentioned in this framework are existing ones; I have not altered any definitions, but wherever possible I have tried to explain them in simple words. The framework, approach and suggestions, however, are my own experience. My intention with this framework is to help test engineers understand the concepts of testing and its various techniques, and apply them effectively in their daily work. This framework is not for publication or monetary distribution. If you have any queries, suggestions for improvement, or points you find missing, kindly write back to me.

1.2 Traditional Testing Cycle

Let us look at the traditional software development life cycle; the figure below depicts it.

[Figure: two life cycle diagrams. Fig A: Requirements -> Design -> Code -> Test -> Maintenance, with testing as a single phase. Fig B: Requirements -> Design -> Code -> Maintenance, with a Test activity attached to every phase.]

In the above diagram (Fig A), the testing phase comes after coding is complete, before the product is launched and goes into maintenance.


But the recommended test process involves testing in every phase of the life cycle (Fig B). During the requirements phase, the emphasis is upon validation, to determine that the defined requirements meet the needs of the project. During the design and program phases, the emphasis is on verification, to ensure that the design and programs accomplish the defined requirements. During the test and installation phases, the emphasis is on inspection, to determine that the implemented system meets the system specification. The chart below describes the life cycle verification activities.

Requirements:
    Determine verification approach. Determine adequacy of requirements. Generate functional test data.
Design:
    Determine consistency of design with requirements. Determine adequacy of design. Generate structural and functional test data.
Program (Build):
    Determine consistency with design. Determine adequacy of implementation. Generate structural and functional test data for programs.
Test:
    Test application system.
Installation:
    Place tested system into production.
Maintenance:
    Modify and retest.

Throughout the entire life cycle, neither development nor verification is a straight-line activity. Modifications or corrections to a structure at one phase will require modifications or re-verification of structures produced during previous phases.

2.0 Verification and Validation Testing Strategies

2.1 Verification Strategies

The verification strategies, the persons/teams involved in the testing, and the deliverable of each phase of testing are summarized below:

Requirements Reviews
    Performed by: Users, Developers, Test Engineers.
    Explanation: Requirement reviews help in baselining desired requirements to build a system.
    Deliverable: Reviewed and approved statement of requirements.

Design Reviews
    Performed by: Designers, Test Engineers.
    Explanation: Design reviews help in validating whether the design meets the requirements and builds an effective system.
    Deliverable: System Design Document, Hardware Design Document.

Code Walkthroughs
    Performed by: Developers, Subject Specialists, Test Engineers.
    Explanation: Code walkthroughs help in analyzing the coding techniques and whether the code meets the coding standards.
    Deliverable: Software ready for initial testing by the developer.


Code Inspections
    Performed by: Developers, Subject Specialists, Test Engineers.
    Explanation: Formal analysis of the program source code to find defects, as defined by meeting the system design specification.
    Deliverable: Software ready for testing by the testing team.

2.1.1 Reviews

The focus of a review is on a work product (e.g., a requirements document, code, etc.). After the work product is developed, the Project Leader calls for a review. The work product is distributed to the personnel involved in the review. The main audience for the review should be the Project Manager, the Project Leader and the producer of the work product. Major reviews include the following:
1. In-Process Reviews
2. Decision-Point or Phase-End Reviews
3. Post-Implementation Reviews

Let us discuss the above-mentioned reviews in brief. Statistics indicate that reviews uncover over 65% of defects, while testing uncovers around 30%, so it is very important to maintain reviews as part of the V&V strategies.

In-Process Review
An in-process review looks at the product during a specific time period of the life cycle, such as an activity. They are usually limited to a segment of a project, with the goal of identifying defects as work progresses, rather than at the close of a phase or even later, when they are more costly to correct.

Decision-Point or Phase-End Review
This review looks at the product for the main purpose of determining whether to continue with planned activities. They are held at the end of each phase, in a semiformal or formal way. Defects found are tracked through resolution, usually by way of the existing defect tracking system. The common phase-end reviews are the Software Requirements Review, the Critical Design Review and the Test Readiness Review.

The Software Requirements Review is aimed at validating and approving the documented software requirements for the purpose of establishing a baseline and identifying analysis packages. The Development Plan, Software Test Plan and Configuration Management Plan are some of the documents reviewed during this phase. The Critical Design Review baselines the detailed design specification; test cases are reviewed and approved. The Test Readiness Review is performed when the appropriate application components are nearing completion. This review determines the readiness of the application for system and acceptance testing.

Post-Implementation Review
These reviews are held after implementation is complete, to audit the process based on actual results. Post-implementation reviews are also known as postmortems; they are held to assess the success of the overall process after release and to identify any opportunities for process improvement. They can be held up to three to six months after implementation, and are conducted in a formal format.


There are three general classes of reviews:
1. Informal, or Peer Review
2. Semiformal, or Walk-Through
3. Formal, or Inspections

A Peer Review is generally a one-to-one meeting between the author of a work product and a peer, initiated as a request for input regarding a particular artifact or problem. There is no agenda, and results are not formally reported. These reviews occur on an as-needed basis throughout each phase of a project.

2.1.2 Inspections

Inspections are facilitated by a knowledgeable individual called a moderator, who is not a member of the team or the author of the product under review. A recorder, who records the defects found and actions assigned, assists the moderator. The meeting is planned in advance, material is distributed to all the participants, and the participants are expected to attend the meeting well prepared. The issues raised during the meeting are documented and circulated among the members present and the management.

2.1.3 Walkthroughs

A walk-through is facilitated by the author of the material being reviewed. The participants are led through the material in one of two formats: either the presentation is made without interruptions and comments are made at the end, or comments are made throughout. In either case, the issues raised are captured and published in a report distributed to the participants. Possible solutions for uncovered defects are not discussed during the review.

2.2 Validation Strategies

The validation strategies, the persons/teams involved in the testing, and the deliverable of each phase of testing are summarized below:

Unit Testing
    Performed by: Developers / Test Engineers.
    Explanation: Testing of a single program, module, or unit of code.
    Deliverable: Software unit ready for testing with other system components.

Integration Testing
    Performed by: Test Engineers.
    Explanation: Testing of integrated programs, modules, or units of code.
    Deliverable: Portions of the system ready for testing with other portions of the system.

System Testing
    Performed by: Test Engineers.
    Explanation: Testing of the entire computer system. This kind of testing usually includes functional and structural testing.
    Deliverable: Tested computer system, based on what was specified to be developed.

Production Environment Testing
    Performed by: Developers, Test Engineers.
    Explanation: Testing of the whole computer system before rolling out to UAT.
    Deliverable: Stable application.


User Acceptance Testing
    Performed by: Users.
    Explanation: Testing of the computer system to make sure it will work in the system regardless of what the system requirements indicate.
    Deliverable: Tested and accepted system based on the user needs.

Installation Testing
    Performed by: Test Engineers.
    Explanation: Testing of the computer system during installation at the user's place.
    Deliverable: Successfully installed application.

Beta Testing
    Performed by: Users.
    Explanation: Testing of the application after installation at the client's place.
    Deliverable: Successfully installed and running application.

3.0 Testing Types

There are two types of testing:
1. Functional, or Black Box Testing
2. Structural, or White Box Testing

Before Project Management decides on the testing activities to be performed, it should decide which test type it is going to follow. If it is black box, then the test cases should be written addressing the functionality of the application. If it is white box, then the test cases should be written for the internal and functional behavior of the system.

Functional testing ensures that the requirements are properly satisfied by the application system; the functions are those tasks that the system is designed to accomplish. Structural testing ensures sufficient testing of the implementation of a function.

3.1 White Box Testing

White box testing, also known as glass box testing, is a testing method where the tester is involved in testing the individual software programs using tools, standards, etc. Using white box testing methods, we can derive test cases that:
1) Guarantee that all independent paths within a module have been exercised at least once,
2) Exercise all logical decisions on their true and false sides,
3) Execute all loops at their boundaries and within their operational bounds, and
4) Exercise internal data structures to ensure their validity.

White box testing matters for the following reasons:
1) Logic errors and incorrect assumptions are inversely proportional to the probability that a program path will be executed.
2) Often, a logical path is believed unlikely to be executed when, in fact, it may be executed on a regular basis.
3) Typographical errors are random.


White Box Testing Types

There are various types of white box testing. In this framework I will address the most common and important types.

3.1.1 Basis Path Testing

Basis path testing is a white box testing technique first proposed by Tom McCabe. The basis path method enables the tester to derive a logical complexity measure of a procedural design and use this measure as a guide for defining a basis set of execution paths. Test cases derived to exercise the basis set are guaranteed to execute every statement in the program at least one time during testing.

3.1.2 Flow Graph Notation

The flow graph depicts logical control flow using a diagrammatic notation. Each structured construct has a corresponding flow graph symbol.

3.1.3 Cyclomatic Complexity

Cyclomatic complexity is a software metric that provides a quantitative measure of the logical complexity of a program. When used in the context of the basis path testing method, the value computed for cyclomatic complexity defines the number of independent paths in the basis set of a program, and provides us with an upper bound on the number of tests that must be conducted to ensure that all statements have been executed at least once. An independent path is any path through the program that introduces at least one new set of processing statements or a new condition.

Computing Cyclomatic Complexity

Cyclomatic complexity has a foundation in graph theory and provides us with an extremely useful software metric. Complexity is computed in one of three ways:
1. The number of regions of the flow graph corresponds to the cyclomatic complexity.
2. Cyclomatic complexity, V(G), for a flow graph G is defined as V(G) = E - N + 2, where E is the number of flow graph edges and N is the number of flow graph nodes.
3. Cyclomatic complexity, V(G), for a flow graph G is also defined as V(G) = P + 1, where P is the number of predicate nodes contained in the flow graph G.

3.1.4 Graph Matrices

The procedure for deriving the flow graph, and even determining a set of basis paths, is amenable to mechanization. To develop a software tool that assists in basis path testing, a data structure called a graph matrix can be quite useful. A graph matrix is a square matrix whose size is equal to the number of nodes in the flow graph. Each row and column corresponds to an identified node, and matrix entries correspond to connections between nodes.

3.1.5 Control Structure Testing

Described below are some of the variations of control structure testing.

3.1.5.1 Condition Testing

Condition testing is a test case design method that exercises the logical conditions contained in a program module.

3.1.5.2 Data Flow Testing

The data flow testing method selects test paths of a program according to the locations of definitions and uses of variables in the program.
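As a sketch of the second formula above, V(G) = E - N + 2 can be computed directly from an edge list; the flow graph used here is a hypothetical example, not one taken from this document:

```python
def cyclomatic_complexity(edges):
    """V(G) = E - N + 2, computed from a flow graph's edge list."""
    nodes = {n for edge in edges for n in edge}  # N: distinct nodes
    return len(edges) - len(nodes) + 2           # E - N + 2

# Hypothetical flow graph with 6 edges and 5 nodes: V(G) = 6 - 5 + 2 = 3,
# i.e. three basis paths suffice to cover every statement at least once.
edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 5), (4, 1)]
print(cyclomatic_complexity(edges))  # 3
```

A straight-line graph with no decisions gives V(G) = 1, matching the intuition that a single test exercises it completely.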


3.1.6 Loop Testing

Loop testing is a white box testing technique that focuses exclusively on the validity of loop constructs. Four classes of loops can be defined: simple loops, nested loops, concatenated loops, and unstructured loops.

3.1.6.1 Simple Loops

The following set of tests can be applied to simple loops, where n is the maximum number of allowable passes through the loop:
1. Skip the loop entirely.
2. Only one pass through the loop.
3. Two passes through the loop.
4. m passes through the loop, where m < n.
5. n-1, n, and n+1 passes through the loop.
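The simple-loop rules above can be turned into a mechanical list of pass counts; the loop bound n = 10 in the example is an arbitrary illustration:

```python
def simple_loop_pass_counts(n, m=None):
    """Pass counts that exercise a simple loop with at most n passes:
    skip it, one pass, two passes, a typical m < n, and n-1, n, n+1."""
    if m is None:
        m = n // 2  # any representative value strictly between 2 and n - 1
    return [0, 1, 2, m, n - 1, n, n + 1]

print(simple_loop_pass_counts(10))  # [0, 1, 2, 5, 9, 10, 11]
```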


1. If an input condition specifies a range, one valid and two invalid equivalence classes are defined.
2. If an input condition requires a specific value, one valid and two invalid equivalence classes are defined.
3. If an input condition specifies a member of a set, one valid and one invalid equivalence class are defined.
4. If an input condition is Boolean, one valid and one invalid class are defined.

3.2.3 Boundary Value Analysis

BVA is a test case design technique that complements equivalence partitioning. Rather than selecting any element of an equivalence class, BVA leads to the selection of test cases at the edges of the class. Rather than focusing solely on input conditions, BVA derives test cases from the output domain as well. Guidelines for BVA are similar in many respects to those provided for equivalence partitioning.

3.2.4 Comparison Testing

There are situations where independent versions of software are developed for critical applications, even when only a single version will be used in the delivered computer-based system. These independent versions form the basis of a black box testing technique called comparison testing, or back-to-back testing.

3.2.5 Orthogonal Array Testing

The orthogonal array testing method is particularly useful in finding errors associated with region faults, an error category associated with faulty logic within a software component.
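As a minimal sketch of BVA for a range condition (the accepted range 1..100 is hypothetical), test cases are taken at each edge of the class, just inside it, and just outside it:

```python
def boundary_values(lo, hi):
    """BVA candidates for an input range [lo, hi]: each edge,
    a value just inside it, and a value just outside it."""
    return sorted({lo - 1, lo, lo + 1, hi - 1, hi, hi + 1})

# Hypothetical input field that accepts values 1..100
print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```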

3.3 Scenario Based Testing (SBT)

Dr. Cem Kaner, in "A Pattern for Scenario Testing", has explained scenario based testing in great detail; it can be found at www.testing.com. What scenario based testing is, and how and where it is useful, is an interesting question; I shall explain these two points in brief.

Scenario based tests are categorized under black box tests and are most helpful when the testing is concentrated on the business logic and functional behavior of the application. Adopting SBT is effective when testing complex applications. Now, since every application is complex, it is the team's call whether to implement SBT or not. I would personally suggest using SBT when the functionality to test includes various features and functions. A good example is testing a banking application. As banking applications require utmost care while testing, handling various functions in a single scenario produces effective results. A sample transaction (scenario) can be: a customer logging into the application, checking his balance, transferring an amount to another account, paying his bills, checking his balance again and logging out.

In brief, use scenario based tests when:
1. Testing complex applications.
2. Testing business functionality.

When designing scenarios, keep in mind:
1. The scenario should be close to a real-life scenario.
2. Scenarios should be realistic.
3. Scenarios should be traceable to any functionality, or combination of functionality.
4. Scenarios should be supported by sufficient data.
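The banking transaction above can be sketched as a single scenario-based test. The BankSession class here is entirely hypothetical, a stand-in for whatever driver the application under test would expose:

```python
class BankSession:
    """Simulated application under test (hypothetical, for illustration)."""
    def __init__(self, balance):
        self.balance = balance          # balance seen after login

    def transfer(self, amount):
        assert amount <= self.balance, "insufficient funds"
        self.balance -= amount          # transfer to another account

    def pay_bill(self, amount):
        self.transfer(amount)           # a bill payment debits the account

def scenario_customer_transaction():
    """One scenario chaining several functions, as described above."""
    session = BankSession(balance=500)  # log in, check balance
    session.transfer(200)               # transfer an amount
    session.pay_bill(100)               # pay a bill
    assert session.balance == 200       # check balance again before logout
    return session.balance

print(scenario_customer_transaction())  # 200
```

The point of the design is that one scenario exercises several functions together, rather than one test per function.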


3.4 Exploratory Testing

Exploratory tests are categorized under black box tests and are aimed at situations where sufficient time is not available for testing or proper documentation is not available. Exploratory testing is "testing while exploring": when you have no idea how the application works, exploring the application with the intent of finding errors can be termed exploratory testing.

Performing Exploratory Testing

This is a big question for many people. The following can be used to perform exploratory testing:
- Learn the application.
- Learn the business the application addresses.
- Learn, to the maximum extent possible, the technology on which the application has been designed.
- Learn how to test.
- Plan and design tests as per the learning.

4.0 Structural System Testing Techniques

The following are the structural system testing techniques.

Stress
    Description: Determine system performance with expected volumes.
    Example: Sufficient disk space allocated.
Execution
    Description: System achieves desired level of proficiency.
    Example: Transaction turnaround time adequate.
Recovery
    Description: System can be returned to an operational status after a failure.
    Example: Evaluate adequacy of backup data.
Operations
    Description: System can be executed in a normal operational status.
    Example: Determine systems can run using documented procedures.
Compliance
    Description: System is developed in accordance with standards and procedures.
    Example: Standards followed.
Security
    Description: System is protected in accordance with its importance to the organization.
    Example: Access denied.

5.0 Functional System Testing Techniques

The following are the functional system testing techniques.

Requirements
    Description: System performs as specified.
    Example: Prove system requirements.
Regression
    Description: Verifies that anything unchanged still performs correctly.
    Example: Unchanged system segments function.
Error Handling
    Description: Errors can be prevented or detected, and then corrected.
    Example: Error introduced into the test.
Manual Support
    Description: The people-computer interaction works.
    Example: Manual procedures developed.
Intersystems
    Description: Data is correctly passed from system to system.
    Example: Intersystem parameters changed.
Control
    Description: Controls reduce system risk to an acceptable level.
    Example: File reconciliation procedures work.


Parallel
    Description: The old system and the new system are run and the results compared to detect unplanned differences.
    Example: Old and new systems can reconcile.

4.0 Testing Phases

[Figure: the testing phases and their work products. The Requirement Study, checked against a Requirement Checklist, yields the Software Requirement Specification; the Functional Specification Document is prepared against a Functional Specification Checklist; these feed Architecture Design, the Detailed Design Document and Coding. Unit Test Case Documents are derived from the Functional Specification Document; the System, Integration and Regression Test Case Documents draw on the Unit Test Case Document, Design Document and Functional Specification Document; Performance Test Cases and Scenarios draw on the Functional Specification Document, Performance Criteria and Software Requirement Specification; User Acceptance Test Case Documents/Scenarios complete the phases.]


4.2 Unit Testing

The goal of unit testing is to uncover defects using formal techniques like Boundary Value Analysis (BVA), equivalence partitioning, and error guessing. Defects and deviations in date formats, special requirements on input conditions (for example, a text box where only numerics or alphabets should be entered), and selections based on combo boxes, list boxes, option buttons and check boxes would be identified during the unit testing phase.
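A minimal sketch of such a unit test, assuming a hypothetical validate_numeric_field helper behind a numerics-only text box:

```python
def validate_numeric_field(text):
    """Hypothetical unit under test: accept only non-empty digit strings."""
    return text.isdigit()

# Equivalence partitioning plus error guessing at the unit level
assert validate_numeric_field("123") is True    # valid class: digits only
assert validate_numeric_field("12a") is False   # invalid class: alphabets mixed in
assert validate_numeric_field("") is False      # error guess: empty input
print("unit checks passed")
```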

    4.3 Integration Testing
    Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing. The objective is to take unit-tested components and build a program structure that has been dictated by design. Usually, one of the following methods of integration testing is followed:
    1. Top-down integration.
    2. Bottom-up integration.

    4.3.1 Top-down Integration
    Top-down integration testing is an incremental approach to construction of the program structure. Modules are integrated by moving downward through the control hierarchy, beginning with the main control module. Modules subordinate to the main control module are incorporated into the structure in either a depth-first or breadth-first manner. The integration process is performed in a series of five steps:
    1. The main control module is used as a test driver, and stubs are substituted for all components directly subordinate to it.
    2. Depending on the integration approach selected, subordinate stubs are replaced one at a time with actual components.
    3. Tests are conducted as each component is integrated.
    4. On completion of each set of tests, another stub is replaced with the real component.
    5. Regression testing may be conducted to ensure that new errors have not been introduced.

    4.3.2 Bottom-up Integration
    Bottom-up integration testing begins construction and testing with atomic modules (i.e., components at the lowest levels in the program structure). Because components are integrated from the bottom up, the processing required for components subordinate to a given level is always available, and the need for stubs is eliminated. A bottom-up integration strategy may be implemented with the following steps:
    1. Low-level components are combined into clusters that perform a specific software subfunction.
    2. A driver is written to coordinate test case input and output.
    3. The cluster is tested.
    4. Drivers are removed, and clusters are combined moving upward in the program structure.
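The roles that stubs and drivers play in the two approaches can be sketched in a few lines. All module names below are illustrative; the point is only the mechanics of substitution.

```python
# Top-down: the real main module is exercised while a stub stands in for
# an unfinished subordinate module. Bottom-up: a throwaway driver feeds
# a finished low-level cluster. All names here are hypothetical.

def tax_stub(amount):
    """Stub: canned answer replacing the unfinished tax module."""
    return 0.0

def main_control(total, tax_fn):
    """Real top-level module under test; the tax module is injected."""
    return total + tax_fn(total)

# Top-down step 1: test the main control module against the stub.
assert main_control(100.0, tax_stub) == 100.0

def real_tax(amount):
    """Finished atomic module at the bottom of the hierarchy."""
    return round(amount * 0.08, 2)

def tax_driver():
    """Driver: coordinates test-case input/output for the cluster."""
    return [real_tax(a) for a in (0.0, 100.0)]

# Bottom-up: the driver exercises the cluster before integration.
assert tax_driver() == [0.0, 8.0]

# Later the stub is replaced with the real component and tests re-run.
assert main_control(100.0, real_tax) == 108.0
```

Note how top-down needs the stub (callee missing) while bottom-up needs the driver (caller missing), which is why bottom-up eliminates stubs entirely.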

    Software Testing Framework V2.0


    4.4 Smoke Testing
    Smoke testing might be characterized as a rolling integration strategy. It is an integration testing approach that is commonly used when shrink-wrapped software products are being developed. It is designed as a pacing mechanism for time-critical projects, allowing the software team to assess its project on a frequent basis. The smoke test should exercise the entire system from end to end. Smoke testing provides benefits such as:
    1. Integration risk is minimized.
    2. The quality of the end product is improved.
    3. Error diagnosis and correction are simplified.
    4. Progress is easier to assess.
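In practice a smoke test is often just a short script that drives the main path of each build end to end and fails loudly on any error. The three-function "application" below is a stand-in I invented for the sketch; a real smoke test would call the deployed system instead.

```python
# Minimal end-to-end smoke test sketch. The tiny in-memory "application"
# is a placeholder for the real build under test.

def create_order(orders, item, qty):
    """Stand-in for the application's order-entry function."""
    order_id = len(orders) + 1
    orders[order_id] = {"item": item, "qty": qty, "status": "open"}
    return order_id

def close_order(orders, order_id):
    """Stand-in for the application's order-completion function."""
    orders[order_id]["status"] = "closed"

def run_smoke_test():
    """Exercise the whole flow end to end; any failure blocks the build."""
    orders = {}
    order_id = create_order(orders, "widget", 3)   # step 1: create
    assert orders[order_id]["qty"] == 3            # step 2: read back
    close_order(orders, order_id)                  # step 3: update
    assert orders[order_id]["status"] == "closed"  # step 4: verify
    return "SMOKE PASS"

print(run_smoke_test())  # run daily against the latest build
```

Run against every daily build, this gives the team the frequent pass/fail pacing signal described above.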

    4.5 System Testing
    System testing is a series of different tests whose primary purpose is to fully exercise the computer-based system. Although each test has a different purpose, all work to verify that system elements have been properly integrated and perform allocated functions. The following tests can be categorized under system testing:
    1. Recovery testing.
    2. Security testing.
    3. Stress testing.
    4. Performance testing.

    4.5.1. Recovery Testing
    Recovery testing is a system test that forces the software to fail in a variety of ways and verifies that recovery is properly performed. If recovery is automatic, reinitialization, checkpointing mechanisms, data recovery, and restart are evaluated for correctness. If recovery requires human intervention, the mean time to repair (MTTR) is evaluated to determine whether it is within acceptable limits.

    4.5.2. Security Testing
    Security testing attempts to verify that protection mechanisms built into a system will, in fact, protect it from improper penetration. During security testing, password cracking, unauthorized entry into the software, and network security are all taken into consideration.

    4.5.3. Stress Testing
    Stress testing executes a system in a manner that demands resources in abnormal quantity, frequency, or volume. The following types of tests may be conducted during stress testing:
    1. Special tests designed to generate ten interrupts per second, when one or two is the average rate.
    2. Input data rates increased by an order of magnitude to determine how input functions will respond.
    3. Test cases that require maximum memory or other resources.
    4. Test cases that may cause excessive hunting for disk-resident data.
    5. Test cases that may cause thrashing in a virtual operating system.

    4.5.4. Performance Testing
    Performance tests are coupled with stress testing and usually require both hardware and software instrumentation.


    4.5.5. Regression Testing
    Regression testing is the re-execution of some subset of tests that have already been conducted, to ensure that changes have not propagated unintended side effects. Regression testing may be conducted manually, by re-executing a subset of all test cases, or by using automated capture/playback tools. The regression test suite contains three different classes of test cases:
    1. A representative sample of tests that will exercise all software functions.
    2. Additional tests that focus on software functions that are likely to be affected by the change.
    3. Tests that focus on the software components that have been changed.
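Selecting those three classes from a test inventory can be sketched as a simple filter. The tagging scheme below (a `sample` flag plus component names) is illustrative, not prescribed by any tool.

```python
# Sketch: pick the regression suite as the union of the three classes.
# Test names, components, and tags are all invented for the example.

tests = [
    {"name": "t1_login",   "component": "auth",    "sample": True},
    {"name": "t2_search",  "component": "ui",      "sample": False},
    {"name": "t3_billing", "component": "billing", "sample": False},
    {"name": "t4_export",  "component": "reports", "sample": True},
]

changed = {"billing"}          # components changed in this release
likely_affected = {"reports"}  # functions likely affected by the change

suite = [t["name"] for t in tests
         if t["sample"]                        # class 1: representative sample
         or t["component"] in likely_affected  # class 2: likely affected
         or t["component"] in changed]         # class 3: changed components

print(suite)  # ['t1_login', 't3_billing', 't4_export']
```

Everything not matching any class (here `t2_search`) is deferred, which is what keeps a regression round cheaper than a full re-run.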

    4.6 Alpha Testing
    Alpha testing is conducted at the developer's site, in a controlled environment, by the end user of the software.

    4.7 User Acceptance Testing
    User acceptance testing occurs just before the software is released to the customer. The end users, along with the developers, perform the user acceptance testing with a certain set of test cases and typical scenarios.

    4.8 Beta Testing

    Beta testing is conducted at one or more customer sites by the end user of the software. The beta test is a live application of the software in an environment that cannot be controlled by the developer.

    5.0 Metrics
    Metrics are the most important responsibility of the test team. Metrics allow for a deeper understanding of the performance of the application and its behavior. The fine-tuning of the application can be enhanced only with metrics. In a typical QA process there are many metrics which provide information. The following can be regarded as the fundamental metrics:

    Functional (Test) Coverage Metric
    IEEE Std 982.2-1988 defines a functional or test coverage metric. It can be used to measure test coverage prior to software delivery. It provides a measure of the percentage of the software tested at any point during testing. It is calculated as follows:

        Function Test Coverage = FE / FT

    where FE is the number of test requirements that are covered by test cases that were executed against the software, and FT is the total number of test requirements.

    Software Release Metrics
    The software is ready for release when:
    1. It has been tested with a test suite that provides 100% functional coverage, 80% branch coverage, and 100% procedure coverage.
    2. There are no level 1 or 2 severity defects.
    3. The defect-finding rate is less than 40 new defects per 1000 hours of testing.
    4. The software reaches 1000 hours of operation.
    5. Stress testing, configuration testing, installation testing, naive-user testing, usability testing, and sanity testing have been completed.
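The coverage formula above is straightforward to compute; a minimal sketch, with invented requirement IDs:

```python
# Function Test Coverage = FE / FT, expressed here as a percentage.
# Requirement IDs are illustrative.

def function_test_coverage(executed_reqs, all_reqs):
    """FE = requirements covered by executed test cases; FT = total."""
    fe = len(set(executed_reqs) & set(all_reqs))
    ft = len(set(all_reqs))
    return 100.0 * fe / ft

all_reqs = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]
executed = ["REQ-1", "REQ-2", "REQ-4"]

print(function_test_coverage(executed, all_reqs))  # 75.0
```

Tracked at each test cycle, this percentage shows how far testing has progressed toward the 100% functional-coverage release criterion.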


    IEEE Software Maturity Metric
    IEEE Std 982.2-1988 defines a Software Maturity Index that can be used to determine the readiness for release of a software system. This index is especially useful for assessing release readiness when changes, additions, or deletions are made to existing software systems. It also provides a historical index of the impact of changes. It is calculated as follows:

        SMI = [Mt - (Fa + Fc + Fd)] / Mt

    where SMI is the Software Maturity Index value, Mt is the number of software functions/modules in the current release, Fc is the number of functions/modules that contain changes from the previous release, Fa is the number of functions/modules that are additions to the previous release, and Fd is the number of functions/modules that were deleted from the previous release.

    Reliability Metrics
    Perry offers the following equation for calculating reliability:

        Reliability = 1 - (number of errors, actual or predicted) / (total number of lines of executable code)

    This reliability value is calculated for the number of errors during a specified time interval. Three other metrics can be calculated during extended testing or after the system is in production:

    MTTFF (Mean Time to First Failure) = the number of time intervals the system is operable until its first failure.
    MTBF (Mean Time Between Failures) = the sum of the time intervals the system is operable, divided by the number of failures for the time period.
    MTTR (Mean Time to Repair) = the sum of the time intervals required to repair the system, divided by the number of repairs during the time period.
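These release-readiness formulas are easy to automate. The figures below (module counts, uptime and repair intervals) are invented for illustration:

```python
# SMI (IEEE Std 982.2-1988) plus MTBF/MTTR, for an illustrative release.

def smi(mt, fa, fc, fd):
    """Software Maturity Index: [Mt - (Fa + Fc + Fd)] / Mt."""
    return (mt - (fa + fc + fd)) / mt

def mtbf(uptime_intervals, failures):
    """Mean Time Between Failures: total operable time / failure count."""
    return sum(uptime_intervals) / failures

def mttr(repair_intervals):
    """Mean Time To Repair: total repair time / number of repairs."""
    return sum(repair_intervals) / len(repair_intervals)

# 120 modules in the current release: 5 added, 10 changed, 2 deleted.
print(round(smi(120, 5, 10, 2), 3))  # 0.858 (closer to 1.0 = more stable)
print(mtbf([200, 150, 250], 3))      # 200.0 hours between failures
print(mttr([4, 2, 6]))               # 4.0 hours per repair
```

An SMI approaching 1.0 across successive releases indicates the code base is stabilizing, which supports the release decision.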


    6.0 Test Models
    There are various models of software testing. Here in this framework I explain the three most commonly used models:
    1. The V model.
    2. The W model.
    3. The Butterfly model.

    6.1 The V Model
    In the V model, each development phase on the left arm of the V is paired with a test phase on the right arm:

        Requirements     <->  Acceptance Tests
        Specification    <->  System Tests
        Architecture     <->  Integration Tests
        Detailed Design  <->  Unit Tests
                     Coding

    For an easy understanding, look at the following table:

    SDLC Phase           Test Phase
    1. Requirements      1. Build test strategy. 2. Plan for testing. 3. Acceptance test scenario identification.
    2. Specification     1. System test case generation.
    3. Architecture      1. Integration test case generation.
    4. Detailed Design   1. Unit test case generation.


    6.2 The W Model
    In the W model, each phase of the first V (the development artifacts) is paired with a review in the second V, and the build is then driven through successive test and regression rounds:

        [Fig: The W Model — Requirements <-> Requirements Review,
         Specification <-> Specification Review, Architecture <->
         Architecture Review, Detailed Design <-> Design Review,
         Code <-> Code Walkthrough; then Unit Testing, Integration
         Testing, Regression Round 1, System Testing, Regression
         Round 2, Performance Testing, Regression Round 3]

    The W model depicts that testing starts from day one of the initiation of the project and continues till the end. The following table illustrates the phases of activities that happen in the W model:

    SDLC Phase           The first V                  The second V
    1. Requirements      1. Requirements review       1. Build test strategy. 2. Plan for testing. 3. Acceptance (beta) test scenario identification.
    2. Specification     2. Specification review      1. System test case generation.
    3. Architecture      3. Architecture review       1. Integration test case generation.
    4. Detailed Design   4. Detailed design review    1. Unit test case generation.
    5. Code              5. Code walkthrough          1. Execute unit tests. 2. Execute integration tests. 3. Regression round 1. 4. Execute system tests. 5. Regression round 2. 6. Performance tests. 7. Regression round 3. 8. Performance/beta tests.


    In the second V, I have mentioned Acceptance/Beta Test Scenario Identification. This is because the customer might want to design the acceptance tests; in that case, as the development team executes the beta tests at the client's site, the same team can identify the scenarios. Regression rounds are performed at regular intervals to re-test the defects that have been raised and fixed, confirming that they stay fixed.

    6.3 The Butterfly Model
    Testing activities for software products are preferably carried out following the Butterfly model:

        [Fig: Butterfly Model — left wing: Test Analysis; right wing:
         Test Design; body: Test Execution]

    In the Butterfly model of test development, the left wing of the butterfly depicts Test Analysis, the right wing depicts Test Design, and the body of the butterfly depicts Test Execution. How this exactly happens is described below.

    Test Analysis
    Analysis is the key factor which drives any planning. During the analysis, the analyst does the following:
    1. Verify that each requirement is tagged in a manner that allows correlation of the tests for that requirement to the requirement itself (establish test traceability).
    2. Verify traceability of the software requirements to system requirements.
    3. Inspect for contradictory requirements.
    4. Inspect for ambiguous requirements.
    5. Inspect for missing requirements.
    6. Check to make sure that each requirement, as well as the specification as a whole, is understandable.
    7. Identify one or more measurement, demonstration, or analysis methods that may be used to verify the requirement's implementation (during formal testing).
    8. Create a test sketch that includes the tentative approach and indicates the test's objectives.

    During test analysis the required documents are carefully studied by the test personnel, and the final Analysis Report is documented. The following documents would usually be referred to:
    1. Software Requirements Specification.
    2. Functional Specification.
    3. Architecture Document.


    4. Use Case Documents.

    The Analysis Report would consist of the understanding of the application, the functional flow of the application, the number of modules involved, and the effective test time.

    Test Design
    The right wing of the butterfly represents the act of designing and implementing the test cases needed to verify the design artifact as replicated in the implementation. Like test analysis, it is a relatively large piece of work. Unlike test analysis, however, the focus of test design is not to assimilate information created by others, but rather to implement procedures, techniques, and data sets that achieve the test's objectives. The outputs of the test analysis phase are the foundation for test design. Each requirement or design construct has had at least one technique (a measurement, demonstration, or analysis) identified during test analysis that will validate or verify that requirement. The tester must now implement the intended technique.

    Software test design, as a discipline, is an exercise in the prevention, detection, and elimination of bugs in software. Preventing bugs is the primary goal of software testing. Diligent and competent test design prevents bugs from ever reaching the implementation stage. Test design, with its attendant test analysis foundation, is therefore the premier weapon in the arsenal of developers and testers for limiting the cost associated with finding and fixing bugs.

    During test design, based on the Analysis Report, the test personnel would develop the following:
    1. Test Plan.
    2. Test Approach.
    3. Test Case documents.
    4. Performance Test Parameters.
    5. Performance Test Plan.

    Test Execution
    Any test case should adhere to the following principles:
    1. Accurate: tests what the description says it will test.
    2. Economical: has only the steps needed for its purpose.
    3. Repeatable: results should be consistent, no matter who executes the test or when.
    4. Appropriate: should be apt for the situation.
    5. Traceable: the functionality the test case exercises should be easily found.

    During the test execution phase, keeping to the project and test schedules, the test cases designed would be executed. The following documents are handled during the test execution phase:
    1. Test Execution Reports.
    2. Daily/Weekly/Monthly Defect Reports.
    3. Person-wise defect reports.

    After the test execution phase, the following documents would be signed off:
    1. Project Closure Document.
    2. Reliability Analysis Report.
    3. Stability Analysis Report.
    4. Performance Analysis Report.
    5. Project Metrics.


    7.0 Defect Tracking Process
    The defect tracking process should answer the following questions:
    1. When was the defect found?
    2. Who raised the defect?
    3. Is the defect reported properly?
    4. Is the defect assigned to the appropriate developer?
    5. When was the defect fixed?
    6. Is the defect re-tested?
    7. Is the defect closed?

    The defect tracking process has to be handled carefully and managed efficiently. The following flow illustrates the defect tracking process:

    1. The tester/developer finds the bug.
    2. The defect is reported in the defect tracking tool; status: Open.
    3. The concerned developer is informed.
    4. The developer fixes the defect.
    5. The developer changes the status to Resolved.
    6. The tester re-tests the fix and changes the status to Closed.
    7. If the defect reoccurs, the status changes to Re-Open.
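The defect life cycle can be sketched as a small state machine. The status names mirror the tracking flow; the transition table is illustrative and not tied to any particular tracking tool.

```python
# Defect life-cycle state machine sketch: Open -> Resolved -> Closed,
# with Re-Open on a failed re-test or a recurrence.

TRANSITIONS = {
    "Open":     {"fix": "Resolved"},
    "Resolved": {"pass_retest": "Closed", "fail_retest": "Re-Open"},
    "Re-Open":  {"fix": "Resolved"},
    "Closed":   {"reoccurs": "Re-Open"},
}

def next_status(status, event):
    """Return the new status, rejecting illegal transitions."""
    try:
        return TRANSITIONS[status][event]
    except KeyError:
        raise ValueError(f"illegal transition: {status} --{event}-->")

# A defect that fails its first re-test, then passes the second.
s = "Open"
for event in ("fix", "fail_retest", "fix", "pass_retest"):
    s = next_status(s, event)
print(s)  # Closed
```

Rejecting illegal transitions is what keeps the tracking data answerable for questions 5 to 7 above (when fixed, re-tested, closed).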

    Defect Classification
    This section defines a defect severity scale framework for determining defect criticality, and the associated defect priority levels to be assigned to errors found in software.


    The defects can be classified as follows:

    Classification   Description
    Critical         There is a functionality block. The application is not able to proceed any further.
    Major            The application is not working as desired. There are variations in the functionality.
    Minor            There is no failure reported due to the defect, but it certainly needs to be rectified.
    Cosmetic         Defects in the user interface or navigation.
    Suggestion       A feature which can be added for betterment.

    Priority Level of the Defect
    The priority level describes the time for resolution of the defect. The priority levels are classified as follows:

    Classification   Description
    Immediate        Resolve the defect with immediate effect.
    At the Earliest  Resolve the defect at the earliest, on priority at the second level.
    Normal           Resolve the defect in the normal course of work.
    Later            Could be resolved at a later stage.
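The two scales can be represented directly in code, together with a default severity-to-priority mapping. The mapping below is an illustrative triage default only; real projects would prioritize each defect individually.

```python
# Severity and priority scales from the tables above, with an
# illustrative default mapping from severity to priority.

from enum import Enum

class Severity(Enum):
    CRITICAL = "Critical"
    MAJOR = "Major"
    MINOR = "Minor"
    COSMETIC = "Cosmetic"
    SUGGESTION = "Suggestion"

class Priority(Enum):
    IMMEDIATE = "Immediate"
    AT_THE_EARLIEST = "At the Earliest"
    NORMAL = "Normal"
    LATER = "Later"

DEFAULT_PRIORITY = {  # illustrative triage defaults, not prescribed
    Severity.CRITICAL: Priority.IMMEDIATE,
    Severity.MAJOR: Priority.AT_THE_EARLIEST,
    Severity.MINOR: Priority.NORMAL,
    Severity.COSMETIC: Priority.LATER,
    Severity.SUGGESTION: Priority.LATER,
}

print(DEFAULT_PRIORITY[Severity.CRITICAL].value)  # Immediate
```

Keeping severity (impact) and priority (resolution time) as separate enumerations makes explicit that the two scales measure different things.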

    8.0 Test Process for a Project
    In this section, I explain how to go about planning your testing activities effectively and efficiently. The process is explained in a tabular format giving the phase of testing, the activity, and the person responsible. For this, I assume that the project has been identified and that the testing team consists of five personnel: a Test Manager, a Test Lead, a Senior Test Engineer, and two Test Engineers.

    SDLC Phase           Testing Phase/Activity                                                                              Personnel
    1. Requirements      1. Study the requirements for testability. 2. Design the test strategy. 3. Prepare the test plan. 4. Identify scenarios for acceptance/beta tests.    Test Manager / Test Lead
    2. Specification     1. Identify system test cases/scenarios. 2. Identify performance tests.                             Test Lead, Senior Test Engineer, and Test Engineers
    3. Architecture      1. Identify integration test cases/scenarios. 2. Identify performance tests.                        Test Lead, Senior Test Engineer, and Test Engineers
    4. Detailed Design   1. Generate unit test cases.                                                                        Test Engineers


    9.0 Deliverables
    The deliverables from the test team would include the following:
    1. Test Strategy.
    2. Test Plan.
    3. Test Case Documents.
    4. Defect Reports.
    5. Status Reports (Daily/Weekly/Monthly).
    6. Test Scripts (if any).
    7. Metric Reports.
    8. Product Sign-off Document.
