testing material.doc
TRANSCRIPT
8/14/2019 Testing Material.doc
1/117
SOFTWARE TESTING
Software Testing: Testing is a process of executing a program with the intent of finding errors.
Software Engineering: Software engineering is the establishment and use of sound engineering principles in order to obtain, economically, software that is reliable and works efficiently on real machines.
Software engineering is based on computer science, management science, economics, communication skills and an engineering approach.
What should be done during testing?
Confirming the product as:
a product that has been developed according to specifications
working perfectly
satisfying customer requirements
Why should we do testing?
Error-free, superior product
Quality assurance to the client
Competitive advantage
Cut down costs
How to test?
Testing can be done in the following ways:
Manually
Automation (by using tools like WinRunner, LoadRunner, TestDirector, etc.)
A combination of manual and automation
Software Project: A problem solved by some people through a process is called a project.
Information Gathering → Requirements Analysis → Design → Coding → Testing → Maintenance: together these phases are called a project.
FIRSTMAN COMPUTERS, 4th Floor, White House, M.G. Road, Vijayawada - 1
Software Development Phases:
Information Gathering: It encompasses requirements gathering at the strategic business level.
Planning: To provide a framework that enables the management to make reasonable estimates of:
Resources
Cost
Schedules
Size
Requirements Analysis: Data, functional and behavioral requirements are identified.
Data Modeling: Defines data objects, attributes, and relationships.
Functional Modeling: Indicates how data are transformed in the system.
Behavioral Modeling: Depicts the impact of events.
Design: Design is the engineering representation of the product that is to be built.
Data Design: Transforms the information domain model into the data structures that will be required to implement the software.
Architectural Design: Relationship between the major structural elements of the software. Represents the structure of data and program components that are required to build a computer-based system.
Interface Design: Creates an effective communication medium between a human and a computer.
Component-Level Design: Transforms structural elements of the software architecture into a procedural description of software components.
Coding: Translation into source code (machine-readable form).
Testing: Testing is a process of executing a program with the intent of finding errors.
Unit Testing: It concentrates on each unit (module, component) of the software as implemented in source code.
Fig: A Software Project - Problem → Process → Product.
Integration Testing: Putting the modules together and constructing the software architecture.
System and Functional Testing: The product is validated together with the other system elements and tested as a whole.
User Acceptance Testing: Testing by the user to collect feedback.
Maintenance: Changes associated with error correction, adaptation and enhancements.
Correction: Changes the software to correct defects.
Adaptation: Modifies the software to accommodate changes to its external environment.
Enhancement: Extends the software beyond its original functional requirements.
Prevention: Changes the software so that it can be more easily corrected, adapted and enhanced.
Business Requirements Specification (BRS): Consists of definitions of customer requirements. Also called CRS / URS (Customer Requirements Specification / User Requirements Specification).
Software Requirements Specification (S/wRS): Consists of the functional requirements to develop and the system requirements (S/w & H/w) to use.
Review: A verification method to estimate the completeness and correctness of documents.
High Level Design Document (HLDD): Consists of the overall hierarchy of the system in terms of modules.
Low Level Design Document (LLDD): Consists of every sub-module in terms of structural logic (ERD) and backend logic (DFD).
Prototype: A sample model of an application without functionality (screens) is called a prototype.
White Box Testing: A coding-level testing technique to verify the completeness and correctness of the programs with respect to the design. Also called glass box testing or clear box testing.
Black Box Testing: An executable-level (.exe) testing technique to validate the functionality of an application with respect to customer requirements. During this test, the engineer validates internal processing through the external interface.
Grey Box Testing: A combination of white box and black box testing.
Build: A .exe form of an integrated module set is called a build.
Verification: are we building the system right?
Validation: are we building the right system?
Software Quality Assurance (SQA): SQA concepts are monitoring and measuring the strength of the development process. Ex: LCT (Life Cycle Testing).
Quality:
Meet customer requirements
Meet customer expectations (cost to use, speed in process or performance, security)
Possible cost
Time to market
For developing quality software we need LCD and LCT.
LCD (Life Cycle Development): Multiple stages of development, where every stage is verified for completeness.
V Model:
Build: When coding-level testing is over and the modules are completely integration-tested, the result is called a build. A build is created after integration testing (.exe).
Test Management: Testers maintain documents related to every project. They refer to these documents for future modifications.
Port Testing: This is to test the installation process.
Fig: V Model - Information Gathering & Analysis: assessment of the development plan, prepare test plan, requirements phase testing; Design and Coding: design phase testing, program phase testing (WBT); Install Build: functional & system testing, user acceptance testing, test environment process; Maintenance: port testing, test software changes, test efficiency.
Change Request: The request made by the customer to modify the software.
Defect Removal Efficiency:
DRE = a / (a + b), where a = total number of defects found by testers during testing, and b = total number of defects found by the customer during maintenance.
DRE is also called DD (Defect Deficiency).
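The DRE formula above can be turned into a small helper; the function name and the sample defect counts below are illustrative, not from the original text.

```python
def defect_removal_efficiency(found_in_testing, found_in_maintenance):
    """DRE = a / (a + b): a = defects found by testers during testing,
    b = defects found by the customer during maintenance."""
    total = found_in_testing + found_in_maintenance
    if total == 0:
        raise ValueError("no defects recorded")
    return found_in_testing / total

# Example: testers found 90 defects; the customer later found 10 more.
print(defect_removal_efficiency(90, 10))  # 0.9
```

A DRE close to 1.0 means the testing team caught almost everything before release.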
BBT, UAT and the test management process are where the independent testers or testing team will be involved.
Refinement Form of V-Model: From a cost and time point of view, the V-model is not applicable to small-scale and medium-scale companies. These organizations maintain a refined form of the V-model.
Fig: Refinement Form of V-Model
Development starts with information gathering. After requirements gathering, the BRS/CRS/URS is prepared. This is done by the Business Analyst.
During requirements analysis, all the requirements are analyzed; at the end of this phase the S/wRS is prepared. It consists of the functional (customer) requirements and the system requirements (H/w + S/w). It is prepared by the System Analyst.
During the design phase, two types of designs are done: HLDD and LLDD. Tech Leads are involved.
During the coding phase, programs are developed by programmers.
Fig: Refinement form of V-Model - BRS/URS/CRS → S/wRS → HLDD → LLDD → Code on the development side, with Unit Testing → Integration Testing → Functional & System Testing → User Acceptance Testing on the testing side.
During unit testing, the developers conduct program-level testing with the help of WBT techniques.
During integration testing, the testers and programmers (or test programmers) integrate the modules and test them with respect to the HLDD.
During system and functional testing, the actual testers are involved and conduct tests based on the S/wRS.
During UAT, customer-site people are also involved, and they perform tests based on the BRS.
With the above model, small-scale and medium-scale organizations also conduct life cycle testing, but they maintain a separate team only for functional and system testing.
Reviews during Analysis:
The Quality Analyst decides on 5 topics. After completion of information gathering and analysis, a review meeting is conducted to decide the following 5 factors:
1. Are they complete?
2. Are they correct? (Or: are they the right requirements?)
3. Are they achievable?
4. Are they reasonable? (with respect to cost & time)
5. Are they testable?
Reviews during Design:
After the completion of analysis of customer requirements and their reviews, technical support people (Tech Leads) concentrate on the logical design of the system. At this stage they develop the HLDD and LLDD.
After completing these design documents, the Tech Leads concentrate on reviewing the documents for correctness and completeness. In this review they can apply the factors below:
Is the design good? (understandable or easy to refer to)
Are they complete? (whether all the customer requirements are satisfied)
Are they correct? Are they the right requirements? (whether the design flow is correct)
Are they followable? (whether the design logic is correct)
Does the design handle errors? (the design should specify the positive and the negative flow)
Unit Testing:
After the completion of design and design reviews, programmers concentrate on coding. During this stage they conduct program-level testing with the help of WBT techniques. This WBT is also known as glass box or clear box testing.
WBT is based on the code. Senior programmers conduct testing on the programs; WBT is applied at the module level.
There are two types of WBT techniques:
1. Execution Testing
Basis path coverage (correctness of every statement's execution)
Loops coverage (correctness of loop termination)
Program technique coverage (fewer memory cycles and CPU cycles during execution)
2. Operations Testing: whether the software runs on the customer-expected environment platforms (OS, compilers, browsers and other system software).
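A minimal sketch of what execution testing checks: the function below (hypothetical, not from the text) has two decision branches and one loop, and the assertions exercise every branch and the loop's termination for zero, one and many iterations.

```python
def classify_total(values):
    """Sum a list and label the result: one loop plus two branches."""
    total = 0
    for v in values:          # loops coverage: must terminate for any input
        total += v
    if total >= 100:          # basis path coverage: both branches below
        return "high"
    return "low"

# Basis path coverage: take each branch at least once.
assert classify_total([60, 50]) == "high"
assert classify_total([10, 20]) == "low"
# Loops coverage: the loop terminates for empty, single and longer inputs.
assert classify_total([]) == "low"
assert classify_total([100]) == "high"
```

Each assertion forces a distinct execution path, which is the idea behind basis path and loop coverage.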
Integration Testing: After the completion of unit testing, development people concentrate on integration testing once the dependent modules complete unit testing. During this test, programmers verify the integration of modules with respect to the HLDD (which contains the hierarchy of modules).
There are two approaches to conducting integration testing:
Top-down approach
Bottom-up approach
Stub: A called program. It sends control back to the main module instead of to the sub-module. Driver: A calling program. It invokes a sub-module instead of the main module.
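The difference can be sketched in code; the login/inbox modules here are hypothetical stand-ins, not from the original text. A stub replaces a not-yet-built sub-module called from above (top-down); a driver is a throwaway caller that invokes a finished sub-module from below (bottom-up).

```python
# Top-down: the main module (login) is real, but the sub-module (inbox)
# is not ready, so a stub stands in for it and returns control.
def inbox_stub(user):
    return f"[stub] inbox for {user} not implemented yet"

def login(user, password):          # real main module under test
    if password == "secret":        # hypothetical credential check
        return inbox_stub(user)     # calls the stub, not the real inbox
    return "invalid user"

# Bottom-up: the sub-module (inbox) is real, but the main module is not
# ready, so a driver calls it directly with a default user id.
def real_inbox(user):
    return f"inbox of {user}: 0 new messages"

def inbox_driver():
    return real_inbox("default_uid")   # driver supplies default test input

print(login("ram", "secret"))
print(inbox_driver())
```

The stub and driver are both thrown away once the real main and sub-modules exist.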
Top-down: This approach starts testing from the root.
Fig: Top-down example - User Login (main module) with sub-modules Inbox and User Information, and an Invalid User path.
Bottom-up: This approach starts testing from the lower-level modules; drivers are used to connect to the sub-modules. (Ex: for login, create a driver that supplies a default user id and password.)
Sandwich: This approach combines the top-down and bottom-up approaches of integration testing. Here the middle-level modules are tested using drivers and stubs.
System Testing:
Conducted by a separate testing team
Follows black box testing techniques
Depends on the S/wRS
Fig: Integration approaches - Top-down: Main calls Sub Module1 and Sub Module2, with a stub in place of a missing sub-module; Bottom-up: a driver in place of Main invokes Sub Module1 and Sub Module2; Sandwich: drivers and stubs surround the middle-level modules.
Build-level testing to validate internal processing through the external interface
After the completion of coding and the code-level tests (unit and integration), the development team releases a finally integrated module set as a build. After receiving a stable build from the development team, the separate testing team concentrates on functional and system testing with the help of BBT.
This testing is classified into 4 divisions:
Usability Testing (ease of use or not; low priority in testing)
Functional Testing (functionality correct or not; medium priority in testing)
Performance Testing (speed of processing; medium priority in testing)
Security Testing (trying to break the security of the system; high priority in testing)
Usability and functional testing are called core testing; the performance and security testing techniques are called advanced testing.
Usability testing is static testing; functional testing is dynamic testing.
From the tester's point of view, the functional and usability tests are the most important.
Usability Testing: User-friendliness of the application or build (WYSIWYG). Usability testing consists of the following subtests:
User Interface Testing
Ease of use (understandable to end users)
Look & feel (pleasantness or attractiveness of screens)
Speed in interface (fewer events to complete a task)
Manual Support Testing: In general, technical writers prepare user manuals after all possible test execution and the resulting modifications are complete. Nowadays, help documentation is released along with the main application.
Help documentation is also called the user manual, but user manuals are actually prepared after completing all other system test techniques and resolving all the bugs.
Functional Testing: During this stage of testing, the testing team concentrates on "meet customer requirements": whether the system provides the functionality it was developed for. For every project, functionality testing is the most important; most of the testing tools available in the market are of this type.
Fig: System testing flow - the development team releases the build → User Interface Testing → Manual Support Testing → the remaining system testing techniques (functionality, performance and security tests).
The functional testing consists of the following subtests:
System Testing → Functional Testing → Functionality / Requirements Testing
Functionality or Requirements Testing: During this subtest, the test engineer validates the correctness of every functionality in the application build through the coverages below. If the team has too little time for full system testing, it will do functionality testing only.
Functionality or Requirements Testing has the following coverages:
Behavioral coverage (object properties checking)
Input domain coverage (correctness of size and type of every input object)
Error handling coverage (preventing negative navigation)
Calculations coverage (correctness of output values)
Backend coverage (data validation & data integrity of database tables)
URLs coverage (link execution in web pages)
Service levels (order of functionality or services)
Successful functionality (combination of all the above)
All the above coverages are mandatory.
Input Domain Testing: During this test, the test engineer validates the size and type of every input object. In this coverage, the test engineer prepares boundary values and equivalence classes for every input object.
Ex: A login process accepts a user id and password. The user id allows alphanumerics, 4 to 16 characters long. The password allows alphabets, 4 to 8 characters long.
Boundary Value Analysis:
Boundary values are used for testing the size and range of an object.
Equivalence Class Partitions:
Equivalence classes are used for testing the type of the object.
Recovery Testing: This test is also known as reliability testing. During this test, test engineers validate whether the application build can recover from abnormal situations or not.
Ex: power failure during a process, network disconnection, server down, database disconnected, etc.
Recovery testing is an extension of error handling testing.
Compatibility Testing: This test is also known as portability testing. During this test, the test engineer validates the continuity of the application's execution on customer-expected platforms (OS, compilers, browsers, etc.).
During compatibility testing, two types of problems arise:
1. Forward compatibility 2. Backward compatibility
Forward compatibility: the application is ready to run, but the project technology or environment (such as the OS) does not yet support it.
Backward compatibility: the application is not ready to run on the (older) technology or environment.
Configuration Testing: This test is also known as hardware compatibility testing. During this test, the test engineer validates whether the application build supports different hardware technologies (devices) or not.
Fig: Recovery Testing - the build is driven from a normal state into an abnormal state, and backup & recovery procedures must return it to the normal state.
Inter-Systems Testing: This test is also known as end-to-end testing. During this test, the test engineer validates whether the application build can coexist with other existing software at the customer site to share resources (H/w or S/w).
In the first example below, one system is our application and the other is a sharable resource; the second example shows the same system with a different, newly added component.
System software level: Compatibility Testing. Hardware level: Configuration Testing. Application software level: Inter-Systems Testing.
Installation Testing: Testing the application's installation process in the customer-specified environment and conditions.
Fig: Inter-Systems Testing examples - (1) WBAS, EBAS, TPBAS and ITBAS (Water, Electricity, Telephone and Income Tax Bill Automation Systems) at a local e-Seva center sharing a local database server and remote servers through a new sharable server; (2) a Banking Information System with a newly added Bank Loans component.
The following conditions are tested during the installation process:
Setup program: whether setup starts or not
Easy interface: whether the installation provides an easy interface or not
Occupied disk space: how much disk space it occupies after the installation
Sanitation Testing: This test is also known as garbage testing. During this test, the test engineer looks for extra features in the application build with respect to the S/wRS. Most testers may not encounter this type of problem.
Fig: Installation Testing - a server supplies the build and the required S/w components to run the application on the test engineers' systems, in a customer-site-like environment; checks: 1. setup program 2. easy interface 3. occupied disk space.
Fig: Sample login screen - User Id, Password, Login, Forgot Password.
Parallel or Comparative Testing: During this test, the test engineer compares the application build with similar applications or with older versions of the same application to find competitiveness.
This comparative testing can be done in two views:
Similar applications in the market
Upgraded version of the application versus older versions
Performance Testing: An advanced testing technique, and expensive to apply. During this test, the testing team concentrates on the speed of processing.
Performance testing is classified into the subtests below:
1. Load Testing 2. Stress Testing 3. Data Volume Testing 4. Storage Testing
Load Testing:
This test is also known as scalability testing. During this test, the test engineer executes the application under the customer-expected configuration and load to estimate performance.
Load: the number of users trying to access the system at a time.
This test can be done in two ways:
1. Manual testing 2. By using a tool such as LoadRunner.
Stress Testing: During this test, the test engineer executes the application build under the customer-expected configuration and peak load to estimate performance.
Data Volume Testing: A tester conducts this test to find the maximum size of data allowable or maintainable by the application build.
Storage Testing: Executing the application under huge amounts of resources to estimate the storage limitations it can handle is called storage testing.
Technical Support / Technical Lead
Unit Testing - Senior Programmer
Integration Testing - Developer / Test Engineer
Functional & System Testing - Test Engineer
User Acceptance Testing - customer-site people, with involvement of the testing team
Port Testing - Release Team
Testing during Maintenance: Test Software Changes - Change Control Board (CCB)
Fig: Testing during maintenance - a Change Request from the customer goes through impact analysis by the CCB. For an enhancement: perform the change (developers), then test that S/w change (tester). For a missed defect: impact analysis, perform the change, review the old test process capability to improve, then test that S/w change.
Testing Team:
Following the refinement form of the V-Model, small-scale and medium-scale companies maintain a separate testing team for some of the stages in LCT. In these teams the organisation maintains the roles below:
Quality Control: defines the objectives of testing
Quality Assurance: defines the approach (owned by the Test Manager)
Test Manager: schedules and plans that approach
Test Lead: maintains the testing team with respect to the test plan and applies it
Test Engineer: follows the plan and conducts testing to find defects
Fig: Organisation hierarchy - Project Manager → Project Lead → Programmers; Test Manager → Test Lead → Test Engineer / QA Engineer; with Quality Control and Quality Assurance alongside.
Testing Terminology:
Monkey / Chimpanzee Testing: Covering only the main activities of the application during testing is called monkey testing (less time).
Gorilla Testing: Covering a single functionality with multiple possibilities is called gorilla testing (no rules and regulations to test an issue).
Exploratory Testing: Level-by-level coverage of activities in the application during testing is called exploratory testing (covering main activities first and other activities next).
Sanity Testing: This test is also known as the Tester Acceptance Test (TAT). Testers check whether the build the development team delivered is stable enough for complete testing.
Smoke Testing: An extra shakeup of sanity testing is called smoke testing. The testing team rejects a build back to the development team, with reasons, before starting testing.
Bebugging: The development team releases a build with known, seeded bugs to the testing team.
Big Bang Testing: A single stage of testing after completion of all modules' development is called big bang testing. It is also known as informal testing.
Incremental Testing: A multiple-stage testing process is called incremental testing. It is also known as formal testing.
Static Testing: Conducting a test without running the application. Ex: user interface testing.
Dynamic Testing: Conducting a test by running the application. Ex: functional testing, load testing, compatibility testing.
Manual vs Automation: A tester conducting a test on the application without any third-party testing tool is doing manual testing. A tester conducting a test with the help of a software testing tool is doing automation.
Fig: Development team releases the build → Sanity Test / Tester Acceptance Test → Functional & System Testing.
Need for Automation:
When tools are not available, teams do manual testing only. If the company already has testing tools, they may follow automation.
To verify the need for automation, they consider the following two factors:
Impact of the test: indicates test repetition.
Criticality: indicates that the test is too complex to apply manually (e.g. load testing for 1000 users).
Retesting: Re-execution of the application to conduct the same test with multiple test data is called retesting.
Regression Testing: Re-execution of our tests on a modified build, to ensure that the bug fix works and that no side effects occur, is called regression testing.
Any dependent modules may also cause side effects.
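Regression testing as described above can be sketched as re-running a fixed suite against each new build; the two-function calculator "builds" here are hypothetical, invented only to show that the whole suite, not just the failed test, is re-executed.

```python
# Two versions of a hypothetical module: the modified build fixes a bug
# in divide(); regression re-runs the whole suite to catch side effects.
def make_build(fixed):
    def add(a, b):
        return a + b
    def divide(a, b):
        if fixed and b == 0:
            return None          # bug fix: no longer crashes on b == 0
        return a / b
    return {"add": add, "divide": divide}

def regression_suite(build):
    """Re-execute every test, not only the one that found the bug."""
    results = {}
    results["add"] = build["add"](2, 3) == 5           # unrelated, re-checked
    try:
        results["divide_by_zero"] = build["divide"](1, 0) is None
    except ZeroDivisionError:
        results["divide_by_zero"] = False
    return results

print(regression_suite(make_build(fixed=False)))  # old build: fix test fails
print(regression_suite(make_build(fixed=True)))   # modified build: all pass
```

Re-checking `add` on the modified build is the point: a fix in one module must not break its dependents.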
Fig: Retesting example - multiply No1 and No2 to get the Result, re-executed with multiple test data. Fig: Impact & Criticality → Automation.
Selection of Automation: Before a separate testing team starts project-level testing, the project manager, test manager or quality analyst defines the need for test automation for that project, depending on the factors below.
Type of external interface: GUI → automation; CUI → manual.
Size of external interface: large → automation; small → manual.
Expected no. of releases: several releases → automation; few releases → manual.
Maturity between expected releases: more maturity → manual; less maturity → automation.
Tester efficiency: test engineers know the automation tools → automation; no knowledge of the tools → manual.
Support from senior management: management accepts → automation; management rejects → manual.
Fig: Regression - Development releases a build; after a defect is fixed, the modified build is re-tested on the impacted passed tests and the failed tests.
Testing Documentation (document - level / prepared by):
Testing Policy - company level - C.E.O.
Test Strategy - company level - Test Manager / QA / PM
Test Methodology - project level - Test Lead
Test Plan - Test Lead
Test Cases, Test Procedure, Test Script, Test Log, Defect Report, Test Summary Report - Test Lead & Test Engineer
Testing Policy: It is a company-level document, developed by QC people. This document defines the testing objectives for developing quality software.
It addresses:
Testing Definition: verification & validation of S/w
Testing Process: proper test planning before starting testing
Testing Standard: 1 defect per 250 LOC / 1 defect per 10 FP
Testing Measurements: QAM, TMM, PCM
C.E.O. Sign
QAM: Quality Assessment Measurements; TMM: Test Management Measurements; PCM: Process Capability Measurements.
Note: The test policy document indicates the trend of the organization.
Test Strategy:
1. Scope & Objective: definition, need and purpose of testing in your organization.
2. Business Issues: budget controlling for testing.
3. Test Approach: defines the testing approach between development stages and testing factors. TRM (Test Responsibility Matrix or Test Matrix) defines the mapping between test factors and development stages.
4. Test Environment Specifications: required test documents developed by the testing team during testing.
5. Roles and Responsibilities: defines the names of the jobs in the testing team with their required responsibilities.
6. Communication & Status Reporting: required negotiation between two consecutive roles in testing.
7. Testing Measurements and Metrics: to estimate work completion in terms of quality assessment and test management process capability.
8. Test Automation: possibilities for test automation with respect to the project requirements and the available testing facilities/tools (either complete or selective automation).
9. Defect Tracking System: required negotiation between the development and testing teams to fix and resolve defects.
10. Change and Configuration Management: required strategies to handle change requests from the customer site.
11. Risk Analysis and Mitigations: analyzing common future problems that appear during testing and possible solutions to recover.
12. Training Plan: need of training for the testing team to start, conduct and apply testing.
Test Factor: A test factor defines a testing issue. There are 15 common test factors in S/w testing.
Ex:
QC → Quality
PM / QA / TM → Test Factor
TL → Testing Techniques
TE → Test Cases
PM / QA / TM → Ease of use; TL → UI testing; TE → MS 6 rules
PM / QA / TM → Portable; TL → Compatibility testing; TE → run on different OS
Test Factors:
1. Authorization: validation of users to connect to the application - Security Testing; Functionality / Requirements Testing
2. Access Control: permission for a valid user to use a specific service - Security Testing; Functionality / Requirements Testing
3. Audit Trail: maintains metadata about operations - Error Handling Testing; Functionality / Requirements Testing
4. Correctness: meets customer requirements in terms of functionality - all black box testing techniques
5. Continuity in Processing: inter-process communication - Execution Testing; Operations Testing
6. Coupling: coexistence with other applications at the customer site - Inter-Systems Testing
7. Ease of Use: user-friendliness - User Interface Testing; Manual Support Testing
8. Ease of Operate: ease in operations - Installation Testing
9. File Integrity: creation of internal files or backup files - Recovery Testing; Functionality / Requirements Testing
10. Reliability: whether the system recovers from abnormal situations and uses backup files - Recovery Testing; Stress Testing
11. Portable: runs on customer-expected platforms - Compatibility Testing; Configuration Testing
12. Performance: speed of processing - Load Testing; Stress Testing; Data Volume Testing; Storage Testing
13. Service Levels: order of functionalities - Stress Testing; Functionality / Requirements Testing
14. Methodology: follows a standard methodology during testing - Compliance Testing
15. Maintainable: whether the application is serviceable to customers over the long term - Compliance Testing (mapping between quality factors and testing)
0uality "ap:$ conceptual gap between #uality 3actors and Testing process is called as#uality /ap.
Test 'ethodology:Test strategy defines oer all approach. To conert a oer allapproach into corresponding pro-ect leel approach, "uality analyst 6 PM defines testmethodology.
Step 1: Collect the test strategy.
Step 2: Identify the project type:

Project Type    Info Gathering & Analysis   Design   Coding   System Testing   Maintenance
Traditional     Yes                         Yes      Yes      Yes              Yes
Off-the-Shelf   No                          No       No       Yes              No
Maintenance     No                          No       No       No               Yes

Step 3: Determine the application type: depending on the application type and requirements, the QA decreases the number of columns in the TRM.
Step 4: Identify risks: depending on the tactical risks, the QA decreases the number of factors (rows) in the TRM.
Step 5: Determine the scope of the application: depending on future requirements / enhancements, the QA tries to add back some of the deleted factors (number of rows in the TRM).
Step 6: Finalize the TRM for the current project.
Step 7: Prepare the test plan for work allocation.
Testing Process:
PET (Process Experts Tools and Technology): An advanced testing process developed by HCL, Chennai. This process is approved by the QA forum of India. It is a refinement form of the V-Model.
Test Planning: After completion of test initiation, the test plan author concentrates on writing the test plan to define what to test, how to test, when to test and who tests.
What to test → Development Plan
How to test → S/wRS
When to test → Design Documents
Who tests → Team Formation
Team Formation: In general, the test planning process starts with testing team formation, which depends on the factors below:
Availability of testers
Test duration
Availability of test environment resources
The above three are dependent factors.
Test Duration:
Fig: PET testing process (refinement of the V-Model) -
Development track: Information Gathering (BRS) → Analysis (S/wRS) → Design (HLDD & LLDD) → Coding → Unit Testing → Integration Testing → Initial Build.
Testing track: Test Initiation (PM / QA) → Test Planning (Test Lead: Development Plan, S/wRS & Design Documents → TRM → Team Formation → Identify Tactical Risks → Prepare Test Plan → Review Test Plan → Test Plan) → study of the S/wRS & design docs → Test Design → Level-0 testing (Sanity / Smoke / TAT) → Test Automation → Test Batches Creation → select a batch and start execution (Level-1); if a mismatch is found, suspend that batch and raise a Defect Report; after defect fixing, bug resolving is verified by regression on the modified build (Level-2); otherwise → Test Closure → Final Regression / Pre-Acceptance / Release / Post-Mortem testing (Level-3) → User Acceptance Test → Sign Off.
Common market test team durations for various types of projects:
C/S, Web, ERP projects (SAP, VB, JAVA) - small - 3 to 5 months
System software (C, C++) - medium - 7 to 9 months
Machine-critical (Prolog, LISP) - big - 12 to 15 months
System software projects: network, embedded, compilers, etc.
Machine-critical software: robotics, games, knowledge base, satellite, air traffic.
1. Identify Tactical Risks:
After completion of team formation, the test plan author concentrates on risk analysis and mitigations.
1. Lack of knowledge of the domain
2. Lack of budget
3. Lack of resources (H/w or tools)
4. Lack of test data (amount)
5. Delays in deliveries (server down)
6. Lack of development process rigor
7. Lack of communication (ego problems)
2. Prepare Test Plan
Format:
1. Test plan id: unique number or name
2. Introduction: about the project
3. Test items: modules
4. Features to be tested: responsible modules to test
5. Features not to be tested: which ones and why not?
6. Feature pass/fail criteria: when is a feature pass or fail?
7. Suspension criteria: abnormal situations during testing of the above features
8. Test environment specifications: required documents to prepare during testing
9. Test environment: required H/w and S/w
10. Testing tasks: the necessary tasks to do before starting testing
11. Approach: list of testing techniques to apply
12. Staff and training needs: names of the selected testing team
13. Responsibilities: work allocation to the selected members
14. Schedule: dates and timings
15. Risks and mitigations: common non-technical problems
16. Approvals: signatures of PM/QA and the test plan author
3) Review Test Plan

After completion of test plan writing, the test plan author concentrates on a review of that document for completeness and correctness. In this review, selected testers are also involved to give feedback. In this review meeting, the testing team conducts coverage analysis:

SwRS based coverage (what to test)
Risks based coverage (analyze from a risks point of view)
TRM based coverage (whether this plan covers all the tests given in the TRM)
Test Design: After completion of the test plan and the required training days, every selected test engineer concentrates on test design for his or her responsible modules. In this phase the test engineer prepares a list of test cases to conduct the defined testing on those modules.

There are three basic methods to prepare test cases for core-level testing:

Business logic based testcase design
Input domain based testcase design
User interface based testcase design

Business Logic based testcase design: In general, test engineers write the list of test cases depending on the use cases / functional specifications in the SwRS. A use case in the SwRS defines how a user can use a specific functionality in your application.
To prepare test cases depending on use cases we can follow the approach below:

Step 1: Collect the responsible modules' use cases
Step 2: Select a use case and its dependencies (dependent & determinant)
Step 2-1: Identify the entry condition
Step 2-2: Identify the input required
Step 2-3: Identify the exit condition
Step 2-4: Identify the output / outcome
Step 2-5: Study the normal flow
Step 2-6: Study the alternative flows and exceptions
Step 3: Prepare the list of test cases depending on the above study
Step 4: Review the test cases for completeness and correctness
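The steps above can be sketched in code. This is an illustrative Python sketch only (the use case fields and names are hypothetical, not from the source): a studied use case yields one positive test case for the normal flow and one test case per alternative flow or exception.

```python
# Hypothetical sketch: a studied use case, represented as data, and a helper
# that derives test case titles from it (Steps 2 and 3 above).

def derive_testcases(usecase):
    """Derive a list of test case titles from a studied use case."""
    cases = []
    # One positive test case covering the normal flow (entry -> input -> exit).
    cases.append("verify %s: normal flow" % usecase["name"])
    # One test case per alternative flow or exception (Step 2-6).
    for alt in usecase["alternative_flows"]:
        cases.append("verify %s: %s" % (usecase["name"], alt))
    return cases

login = {
    "name": "login",
    "entry_condition": "login window is displayed",
    "inputs": ["user id", "password"],
    "exit_condition": "main window is displayed",
    "alternative_flows": ["wrong password", "blank user id"],
}

for tc in derive_testcases(login):
    print(tc)
```

Step 4 (the review) would then check this generated list for completeness against the SwRS.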
TestCase Format:

Fig: Use cases / functional specifications in the SwRS drive both the coding (the .exe build) and the test cases; test design feeds test execution.

After completion of test case selection for the responsible modules, the test engineer prepares an IEEE format for every test condition:

TestCase Id: Unique number or name
TestCase Name: Name of the test condition
Feature to be tested: Module / feature / service
TestSuite Id: Parent batch id(s), in which this case participates as a member
Priority: Importance of that test case
 P0 – basic functionality
 P1 – general functionality (i/p domain, error handling etc.)
 P2 – cosmetic test cases (Ex: look & feel)
Test Environment: Required h/w and s/w to execute the test cases
Test Effort: (person-hours) time to execute this test case (e.g. 20 mins)
Test Duration: Date of execution
Test Setup: Necessary tasks to do before starting this case's execution
Test Procedure: Step-by-step procedure to execute this test case:

Step No. | Action | i/p Required | Expected Result | Defect Id | Comments

TestCase Pass/Fail Criteria: When that test case passes and when it fails.
Input Domain based TestCase Design:

To prepare functionality and error-handling test cases, test engineers use the use cases or functional specifications in the SwRS. To prepare input domain test cases, test engineers depend on the data model of the project (ERD & LLD).

Step 1: Identify the input attributes in terms of size, type and constraints (size: range; type: int, float; constraint: primary key).
Step 2: Identify the critical attributes in that list, which participate in data retrievals and manipulations.
Step 3: Identify the non-critical attributes, which are input/output type only.
Step 4: Prepare BVA and ECP for every attribute:

Input Attribute | ECP (type): Valid, Invalid | BVA (size/range): Minimum, Maximum

Fig: Data Matrix
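Step 4 can be sketched as code. This is an illustrative Python sketch (the field and its 18-60 range are made-up examples, not from the source): BVA produces values at and around each boundary, while ECP picks one representative per equivalence class.

```python
# Hypothetical sketch of the data matrix for one integer attribute, e.g. an
# "age" field whose valid range is 18-60.

def bva(minimum, maximum):
    """Boundary Value Analysis: values at and just around each boundary."""
    return [minimum - 1, minimum, minimum + 1, maximum - 1, maximum, maximum + 1]

def ecp_classes():
    """Equivalence Class Partitioning for an integer attribute: one
    representative value per class instead of every possible input."""
    return {"valid": [30], "invalid": ["abc", 3.5, ""]}  # wrong type / blank

print(bva(18, 60))    # boundary test data for the range 18-60
print(ecp_classes())  # one valid representative, a few invalid classes
```

Each row of the data matrix above corresponds to one such attribute with its valid/invalid classes and minimum/maximum boundaries.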
User Interface based testcase design:

To conduct UI testing, the test engineer writes a list of test cases depending on organization-level UI rules and global UI conventions. For preparing these UI test cases they do not study the SwRS, LLDD etc. (the functionality test cases' source is the SwRS; the i/p domain test cases' source is the LLDD).

Test cases applicable for all projects:

Testcase 1: Spelling checking
Testcase 2: Graphics checking (alignment, font, style, text, size, the Microsoft 6 rules)
Testcase 3: Meaningful error messages or not (error-handling testing checks that the related message appears; here we test whether that message is easy to understand)
Testcase 4: Accuracy of data displayed (WYSIWYG) (amount, d.o.b.)
Testcase 5: Accuracy of data in the database as a result of user input (TC4 is at screen level, TC5 at database level)
Testcase 6: Accuracy of data in the database as a result of external factors
Testcase 7: Meaningful help messages or not (the first six test cases are UI testing and the seventh is manual support testing)
Review Testcases: After completion of test case design with the required documentation (IEEE) for the responsible modules, the testing team along with the test lead concentrates on a review of the test cases for completeness and correctness.

Fig: Example of an external factor (Testcase 6): a mail with a .gif attachment is imported and compressed into a DSN table, then decompressed again for the mail server page; the database content must stay accurate.

In this review the testing team conducts coverage analysis:

1. Business requirements based coverage
2. Use cases based coverage
3. Data model based coverage
4. User interface based coverage
5. TRM based coverage

Fig: Requirements Validation / Traceability Matrix.

Business Requirements | Sources (use cases, data model etc.) | TestCases
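The traceability matrix above can be sketched as a simple mapping. This is an illustrative Python sketch (the requirement and test case ids are hypothetical): each business requirement maps to the test cases that cover it, and the coverage analysis flags any requirement with no mapped test case.

```python
# Hypothetical Requirements Traceability Matrix: requirement -> covering
# test cases. The review's coverage analysis looks for empty mappings.

rtm = {
    "BR-1 create order": ["TC-01", "TC-02"],
    "BR-2 update order": ["TC-03"],
    "BR-3 delete order": [],           # not yet covered by any test case
}

uncovered = [req for req, tcs in rtm.items() if not tcs]
print("uncovered requirements:", uncovered)
```

A requirement appearing in `uncovered` means the test case list is incomplete and must go back to test design.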
Test Execution:

Test Execution Levels vs Test Cases:

Level 0: P0 test cases
Level 1: P0, P1 and P2 test cases as batches
Level 2: selected P0, P1 and P2 test cases with respect to modifications
Level 3: selected P0, P1 and P2 test cases at the final build

Test Harness = Test Environment + Test Bed

Build Version Control: A unique numbering system for builds (delivered via FTP or SMTP).

After defect reporting, the testing team may receive a modified build.
Fig: Development site vs testing site. The initial build is placed on the server (softbase) and transferred to the test environment over FTP. Level-0 (Sanity / Smoke / TAT) testing runs on the initial build; on a stable build the testing team performs test automation and Level-1 (comprehensive) testing; defect reports go to development for defect fixing and bug resolving; each modified build gets Level-2 (regression) testing, repeated n times, followed by Level-3 (final regression).
To maintain the original builds and the modified builds, the development team uses version control software.

Fig: Modified programs are embedded into the old build on the server and released to the test environment as a modified build.
Level-0 (Sanity / Smoke / TAT): After receiving the initial build from the development team, the testing team installs it into the test environment. After completion of dumping/installation, the testing team verifies the basic functionality of that build to decide the completeness and correctness of the build for test execution.

During this testing, the testing team observes the below factors on that initial build:

1. Understandable: The functionality is understandable to the test engineer.
2. Operable: The build works without runtime errors in the test environment.
3. Observable: Process completion and continuation in the build can be estimated by the tester.
4. Controllable: Able to start/stop processes explicitly.
5. Consistent: Stable navigations.
6. Maintainable: No need for re-installations.
7. Simplicity: Short navigations to complete a task.
8. Automatable: The interfaces support automation test script creation.

This Level-0 testing is also called Testability or Octangle testing (because it is based on 8 factors).
Test Automation: After receiving a stable build from the development team, the testing team concentrates on test automation. Test automation is of two types: complete and selective (selective = all P0 and carefully selected P1 test cases).

Level-1 (Comprehensive Testing): After receiving a stable build from the development team and completing automation, the testing team starts executing its test cases as batches. A test batch is also known as a test suite or test set. In every batch, the base state of one test case is the end state of the previous test case.

During test batch execution, test engineers prepare a test log with three types of entries:

1. Passed: All expected values are equal to the actual values.
2. Failed: Any expected value varies from the actual value.
3. Blocked: The test cases this case depends on have failed.
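The three log entries can be sketched as a small decision rule. This is an illustrative Python sketch (the expected/actual values are made up): a case passes only when every expected value equals the actual one, any variation fails it, and a case whose prerequisite failed is blocked.

```python
# Hypothetical sketch of a test-log entry decision for one executed case.

def log_entry(expected, actual, depends_on_failed=False):
    """Return the test log status for one test case execution."""
    if depends_on_failed:          # a prerequisite case in the batch failed
        return "Blocked"
    return "Passed" if expected == actual else "Failed"

print(log_entry([100, "OK"], [100, "OK"]))        # all values match
print(log_entry([100, "OK"], [101, "OK"]))        # one value varies
print(log_entry([100, "OK"], [100, "OK"], True))  # prerequisite failed
```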
Level-2 Regression Testing: This regression testing is actually part of Level-1 testing. During comprehensive test execution, the testing team reports mismatches to the development team as defects. After receiving a defect, the development team performs modifications in the coding to resolve the accepted defects. When they release a modified build, the testing team concentrates on regression testing before conducting the remaining comprehensive testing.

Severity: The seriousness of the defect, defined by the tester through severity (impact and criticality); it decides the importance of doing regression testing. Organizations commonly use three levels of severity: High, Medium and Low.

High: Without resolving this mismatch the tester is not able to continue the remaining testing (show stopper).
Medium: Able to continue testing, but the defect must be resolved.
Low: May or may not be resolved.

Ex: High: database not connecting. Medium: input domain wrong (accepting wrong values also). Low: spelling mistake.

x, y and z are three dependent modules. If you find a bug in z, then: the bug impacts z and its colleague modules: High; the full z module: Medium; part of the z module: Low.
Fig: Test case execution statuses in the test log: In Queue, Skip, Blocked, In Progress, Passed, Failed, Partial Pass/Fail, Closed.
Possible ways to do Regression Testing:

Case 1: If the development team resolved a bug and its severity is high, the testing team will re-execute all P0, all P1 and carefully selected P2 test cases with respect to that modification.

Case 2: If the development team resolved a bug and its severity is medium, the testing team will re-execute all P0, selected P1 (80-90%) and some P2 test cases with respect to that modification.

Case 3: If the development team resolved a bug and its severity is low, the testing team will re-execute some of the P0, P1 and P2 test cases with respect to that modification.

Case 4:
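Cases 1-3 can be sketched as a suite-selection rule. This is an illustrative Python sketch: the severity buckets and "all / most / some" proportions follow the text above, but the exact sampling fractions here are hypothetical.

```python
# Hypothetical regression-suite selector: how much of each priority bucket is
# re-executed depends on the severity of the resolved bug.

def regression_suite(severity, p0, p1, p2):
    if severity == "high":        # Case 1: all P0, all P1, selected P2
        return p0 + p1 + p2[:len(p2) // 2]
    if severity == "medium":      # Case 2: all P0, ~80-90% of P1, some P2
        return p0 + p1[:int(len(p1) * 0.9)] + p2[:len(p2) // 4]
    return p0[:len(p0) // 2]      # Case 3: only some cases overall

p0 = ["P0-%d" % i for i in range(4)]
p1 = ["P1-%d" % i for i in range(10)]
p2 = ["P2-%d" % i for i in range(8)]
print(len(regression_suite("high", p0, p1, p2)))
```

In practice the selection is always "with respect to that modification", i.e. restricted to the affected modules rather than taken blindly from the whole suite.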
Defect Reporting and Tracking:

During comprehensive test execution, test engineers report mismatches to the development team as defect reports in an IEEE format:

1. Defect Id: A unique number or name
2. Defect Description: Summary of the defect
3. Build Version Id: Parent build version number
4. Feature: Module / functionality
5. Testcase Name and Description: Failed test case name with description
6. Reproducible: (Yes / No)
7. If yes: attach the test procedure
8. If no: attach snapshots and strong reasons
9. Severity: High / Medium / Low
10. Priority
11. Status: New / Reopen
12. Reported by: Name of the test engineer
13. Reported on: Date of submission
14. Suggested fix: optional
15. Assigned to: Name of the PM
16. Fixed by: PM or team lead
17. Resolved by: Name of the developer
18. Resolved on: Date of solving
19. Resolution type
20. Approved by: Signature of the PM

Defect Age: The time gap between 'resolved on' and 'reported on'.
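The defect-age definition above is a simple date difference. A minimal Python sketch (the dates are made-up examples):

```python
# Defect age = gap between "resolved on" and "reported on".
from datetime import date

def defect_age(reported_on, resolved_on):
    """Age of a defect in days."""
    return (resolved_on - reported_on).days

print(defect_age(date(2019, 8, 1), date(2019, 8, 14)), "days")
```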
Defect Submission:

Fig: Large-scale organizations: Test Engineer → Test Lead → Test Manager → Project Manager → Team Lead → Developers, with QA involved and transmittal reports passed between the teams.
Defect Submission:

Fig: Small-scale organizations: Test Engineer → Test Lead → Project Manager → Team Lead → Developers (with transmittal reports).

Defect Status Cycle (Bug Life Cycle):

Fig: New → Fixed (Open / Reject / Deferred) → Closed → Reopen.
Resolution Type:

There are 12 resolution types, such as:

1. Duplicate: Rejected because the defect is the same as a previously reported defect.
2. Enhancement: Rejected because the defect relates to a future requirement of the customer.
3. H/w Limitation: Rejected; raised due to limitations of the hardware.
4. S/w Limitation: Rejected due to a limitation of the s/w technology.
5. Functions as designed: Rejected because the coding is correct with respect to the design documents.
6. Not Applicable: Rejected due to lack of correctness in the defect.
7. No plan to fix it: Postponed for the time being (neither accepted nor rejected).
8. Need more information: The developers want more information to fix it (neither accepted nor rejected).
9. Not Reproducible: The developer wants more information because the problem is not reproducible (neither accepted nor rejected).
10. User misunderstanding: Both sides argue the other is thinking wrong; extra negotiation between tester and developer.
Fig: Defect life. Testing side: detect defect → reproduce defect → report defect (the defect report goes to development). Development side: fix bug → resolve bug → close bug (the resolution type comes back to testing).

11. Fixed: A bug opened to resolve (accepted).
12. Fixed indirectly: Deferred to resolve (accepted).
Types of Bugs:

UI bugs (low severity): spelling mistake (high priority); wrong alignment (low priority).
Input domain bugs (medium severity): object not taking expected values (high priority); object taking unexpected values (low priority).
Error handling bugs (medium severity): error message not coming (high priority); error message coming but not understandable (low priority).
Calculation bugs (high severity): intermediate results failure (high priority); final outputs wrong (low priority).
Service level bugs (high severity): deadlock (high priority); improper order of services (low priority).
Load condition bugs (high severity): memory leakage under load (high priority); doesn't allow the customer-expected load (low priority).
Hardware bugs (high severity): printer not connecting (high priority); invalid printout (low priority).
Boundary related bugs (medium severity).
Id control bugs (medium severity): wrong version number, logo.
Version control bugs (medium severity): differences between two consecutive versions.
Source bugs (medium severity): mismatches in the help documents.
Test Closure:

After completion of all possible test case execution and the related defect reporting and tracking, the test lead conducts a test execution closure review along with the test engineers. In this review the test lead depends on coverage analysis:

BRS based coverage
Use cases based coverage (modules)
Data model based coverage (i/p and o/p)
UI based coverage (rules and regulations)
TRM based coverage (whether the PM-specified tests are covered or not)

Analysis of the deferred bugs: whether the deferred bugs are really postponable or not.

The testing team tries to execute the high-priority test cases once again to confirm the correctness of the master build.
Final Regression Process: gather requirements, effort estimation (person-hours), plan regression, execute regression, report regression.

Final Regression Testing:

Fig: Gather requirements → Effort estimation → Plan regression → Execute regression → Report regression.

User Acceptance Testing:
After completion of the test execution closure review and final regression, the organization concentrates on UAT to collect feedback from the customer / customer-site-like people. There are two approaches:

1. Alpha testing
2. Beta testing

Sign Off: After completion of UAT and the resulting modifications, the test lead creates a Test Summary Report (TSR). It is a part of the s/w release note. This TSR consists of:

1. Test strategy / methodology (what tests)
2. System test plan (schedule)
3. Traceability matrix (mapping requirements to test cases)
4. Automated test scripts (TSL + GUI map entries)
5. Final bug summary report:

Bug Id | Description | Found By | Status (Closed / Deferred) | Severity | Module / Functionality | Comments
Case Study (Schedule for 9 Months):

Deliverable | Responsibility | Completion Time
TestCase selection | Test Engineer | 20-30 days
TestCase review | Test Lead, Test Engineer | 4-5 days
RVM / RTM | Test Lead | 1 day
Sanity & test automation | Test Engineer | 20-30 days
Test execution as batches | Test Engineer | 40-60 days
Test reporting | Test Engineer & Test Lead | Ongoing during test execution
Communication and status reporting | Everyone in the testing team | Twice weekly
Final regression testing & closure review | Test Engineer and Test Lead | 4-5 days
User acceptance testing | Customer-site people (with involvement of the testing team) | 5-10 days
Test summary report (sign off) | Test Lead | 1-2 days
Testing Computer Software - Cem Kaner
Effective Methods for Software Testing - William E. Perry
Software Testing Tools - Dr. K.V.K.K. Prasad

Scott_testing@yahoo.com

What are you doing?
What type of testing process is going on in your company?
What type of test documentation is prepared by your organization?
What type of test documentation will you prepare?
What is your involvement in that?
What are the key components of your company's test plan?
What type of format do you prepare for test cases?
How does your PM select what types of tests are needed for your project?
When will you go for automation?
What is regression testing? When will you do it?
How do you report defects to the development team?
How do you know whether a defect was accepted or rejected?
What do you do when your defect is rejected?
How will you learn a project without documentation?
What is the difference between defect age and build interval period?
How will you test without documents?
What do you mean by green box testing?

Experience on WinRunner; exposure to TestDirector etc.; WinRunner 8/10, LoadRunner 7/10.
Auditing:

During testing and maintenance, the testing team conducts audit meetings to estimate status and required improvements. In this auditing process they can use three types of measurements and metrics.

Quality Measurement Metrics:

These measurements are used by QA or the PM to estimate the achievement of quality in the current project's testing (monthly once).

Product Stability:

Fig: Number of bugs vs duration: the first 20% of testing finds about 80% of the bugs; the remaining 80% of testing finds the other 20%.
Sufficiency:

Requirements coverage
Type-Trigger analysis (mapping between covered requirements and applied tests)

Defect Severity Distribution (organization trend limit check):

The distribution of defects by severity is checked against the organization's trend limits.

Test Management Measurements:

These measurements are used by the test lead during test execution of the current project (twice weekly):

Test status: executed tests, in progress, yet to execute
Delays in delivery: defect arrival rate, defect resolution rate, defect aging
Test effort: cost of finding a defect (Ex: 4 defects / person-day)

Process Capability Measurements:

These measurements are used by the quality analyst and test management to improve the capability of the testing process for upcoming projects' testing (it depends on old projects' maintenance-level feedback):

Test efficiency: Type-Trigger analysis, requirements coverage
Defect escapes: Type-Phase analysis (what types of defects the testing team missed in which phase of testing)
Test effort: cost of finding a defect (Ex: 4 defects / person-day)
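Two of the metrics quoted above can be sketched numerically. This is an illustrative Python sketch; all the figures are made up for the example (the "4 defects / person-day" target is the one quoted in the text).

```python
# Hypothetical sketch of two test-management metrics.

def defects_per_person_day(total_defects, testers, days):
    """Cost-of-finding-a-defect metric, expressed as defects per person-day."""
    return total_defects / (testers * days)

def defect_arrival_rate(defects_by_week):
    """Average number of defects reported per week during test execution."""
    return sum(defects_by_week) / len(defects_by_week)

print(defects_per_person_day(240, 4, 15))  # 240 defects, 4 testers, 15 days
print(defect_arrival_rate([30, 50, 40]))   # weekly arrival counts
```

A falling arrival rate alongside a stable resolution rate is one sign that the build is stabilizing.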
WinRunner 8.0:

Developed by Mercury Interactive.
A functionality testing tool (not suitable for performance, usability or security testing).
Supports C/S and web technologies (VB, C++, Java, D2K, Power Builder, Delphi, HTML etc.).
WinRunner does not support .NET, XML, SAP, PeopleSoft, Maya, Flash or Oracle Applications etc. To support them we can use QTP (QuickTest Professional).
QTP is an extension of WinRunner.
WinRunner Recording Process:

Learning: Recognition of the objects and windows in your application by the testing tool is called learning.
Recording: The test engineer records the manual process in WinRunner to automate it.
Edit Script: The test engineer inserts the required checkpoints into the recorded test script.
Run Script: The test engineer executes the automated test script to get results.
Analyze Results: The test engineer analyzes the test results to concentrate on defect tracking.

Fig: Learning → Recording → Edit Script → Run Script → Analyze the test results (sample login window with User Id, Password, OK and Cancel).
Exp: OK is enabled only after entering the user id and password.

Explain the icons in WinRunner.

Note: WinRunner 7.0 provides an auto-learning facility to recognize the objects and windows in your project without your interaction.

Every statement ends with ; as in C.

Test Script: A test script consists of navigational statements and checkpoints. The WinRunner scripting language is called TSL (Test Script Language) and is C-like.

Add-in Manager: This window provides the list of WinRunner-supported technologies with respect to our purchased license.
Note: If all options in the Add-in Manager are off, by default it supports VB and VC++ interfaces (Win32 API).

Recording Modes: To record our business operations (navigations) in WinRunner we can use two types of recording modes:

1. Context Sensitive mode (the default mode)
2. Analog mode

Analog Mode: To record mouse pointer movements on the desktop, we can use this mode. In analog mode the tester maintains a constant monitor resolution and application position during recording and running.

Application areas: digital signatures, graph drawing, image movements.

Note:
1. In analog mode, WinRunner records mouse pointer movements with respect to desktop co-ordinates. For this reason, the test engineer keeps the corresponding context-sensitive-mode window in its default position during recording and running.
2. To use analog mode, keep the monitor resolution constant during recording and running.

move_locator_track(): WinRunner uses this function to record mouse pointer movements on the desktop in one unit of time.
Syntax: move_locator_track(track_number);

By default it starts with 1. It is not based on time but on operations.

mtype(): WinRunner uses this function to record mouse button operations on the desktop.

Syntax: mtype("<T track_number> <k key_on_the_mouse_used> + / -");

Ex: mtype("<T20><kLeft>+");

Track number: the desktop co-ordinates in which you operate the mouse. It stores the mouse co-ordinates; it is actually a memory location.

type(): We can use this function to record keyboard operations in analog mode.

Syntax: type("typed characters / ASCII notation");
Context Sensitive mode: To record mouse and keyboard operations on our application build, we can use this mode. It is the default mode. In general the functionality test engineer creates automation test scripts in context-sensitive mode with the required checkpoints. In this mode WinRunner records the application operations with respect to objects and windows:

Focus to window: set_window("Window Name", time);
Text box: edit_set("Edit Name", "typed characters");
Password text box: password_edit_set("Pwd Object", "encrypted pwd");
Push button: button_press("Button Name");
Radio button: button_set("Button Name", ON);
Check box: button_set("Button Name", ON); / button_set("Button Name", OFF);
List / combo box: list_select_item("List1", "Selected Item");
Menu: menu_select_item("Menu Name;Option Name");
Base State: An application state from which a test starts is called the base state.
End State: An application state at which a test stops is called the end state.
Call State: An intermediate state of the application between the base state and the end state is called a call state.

Functionality or Requirements Testing has the following coverages:

Behavioral coverage (object properties checking)
Input domain coverage (correctness of the size and type of every i/p object)
Error handling coverage (preventing negative navigation)
Calculations coverage (correctness of o/p values)
Backend coverage (data validation and data integrity of the database tables)
URLs coverage (links execution in web pages)
Service levels (order of functionality or services)
Successful functionality (a combination of all the above)
Check Points: WinRunner is a functionality testing tool; it provides a set of facilities to cover the sub-tests above. To automate them, we can use 4 checkpoints in WinRunner:

1. GUI checkpoint
2. Bitmap checkpoint
3. Database checkpoint
4. Text checkpoint

GUI Checkpoint: To automate behavioral testing of objects we can use this checkpoint. It consists of 3 sub-options:

1. For Single Property
2. For Object/Window
3. For Multiple Objects

For Single Property: To test a single property of an object we can use this option.

Navigation: select a position in the script, Create menu, GUI Checkpoint, For Single Property, select the testable object (double click), select the required property with its expected value, click Paste.

Ex: the Update Order button:
Focus to window: disabled
Open a record: disabled
Perform a change: enabled

Syntax: obj_check_info("Object Name", "Property", expected value);
Ex: button_check_info("Update Order", "enabled", 0);

If the checkpoint is on a numeric value, double quotes are not needed; if it is on a string value, place the data between double quotes. (WinRunner takes any value by default as a string with double quotes.)

Problem: Focus to window: Item No should be focused; OK is enabled only after filling Item No and Quantity.
Fig: 'Scott Shopping' window with Item No and Quantity fields and an OK button.
Expected: The number of items in 'Fly To' equals the number of items in 'Fly From' minus 1, once you select an item in 'Fly From'.

Ex: If you select an item in one list box, the number of items in the next list box decreases by 1.

Fig: 'Scott Journey' window with Fly From and Fly To list boxes and an OK button.

Problem: Focus to window: OK should be disabled. Enter Roll No: OK should be disabled. Enter Name: OK should be disabled. Enter Class: OK should be disabled.
Problem: If Type is A, Age is focused; if Type is B, Gender is focused; if Type is C, Qualification is focused; else Others is focused (use a switch statement).

switch(x)
{
    case "A":
        edit_check_info("Age", "focused", 1);
        break;
    # similarly: case "B" checks "Gender", case "C" checks "Qualification",
    # and default checks "Others"
}

Fig: 'Scott Journey' window with a Type list and Age, Gender, Qualification and Others fields.
Exp: The selected item in the list box appears in the text box after clicking the OK button.

Exp: The selected item in the Sample1 list box appears in the Sample2 text object after clicking the Display button.

Problem: If basic salary >= 10000 then commission = 10% of basic salary. Else if basic salary is between 5000 and 10000 then commission = 5% of basic salary. Else if basic salary < 5000 then commission = Rs. 200.

Problem: If total >= 800 then grade = A. Else if total is between 700 and 800 then grade = B. Else grade = C.
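To compute the expected values for checkpoints on these two problems, the business rules can be written out as plain functions. This is an illustrative Python sketch of the rules stated above, not WinRunner code.

```python
# The two calculation problems above as plain functions, so a tester can
# compute the expected Comm and Grade values for any input.

def commission(basic_salary):
    if basic_salary >= 10000:
        return basic_salary * 0.10   # 10% of basic salary
    elif basic_salary >= 5000:       # between 5000 and 10000
        return basic_salary * 0.05   # 5% of basic salary
    else:                            # below 5000: flat Rs. 200
        return 200

def grade(total):
    if total >= 800:
        return "A"
    elif total >= 700:               # between 700 and 800
        return "B"
    else:
        return "C"

print(commission(12000), grade(750))
```

Good test data for these problems comes from the earlier BVA step: values at and around 5000, 10000, 700 and 800.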
Fig: sample windows: a list box with a Display button and a text object; the 'Scott Employee' window with Emp No, Dept No, B Sal and Comm fields.
For Object/Window: To test more than one property of a single object, we can use this option.

Ex: the Update Order button:
Focus to window: disabled
Open a record: disabled
Perform a change: enabled and focused

Syntax: obj_check_gui("object name", "checklist file.ckl", "expected values file", time to create);

In the above syntax the checklist file specifies the list of properties to test on a single object; its extension is .ckl. The expected values file specifies the list of expected values for those selected (testable) properties; its extension is .txt.

Ex: obj_check_gui("Update Order", "list1.ckl", "gui1", 1);

For Multiple Objects: To test more than one property of more than one object in a single checkpoint we can use this option. To create this checkpoint the tester selects multiple objects in a single window.

Ex:
                  Insert Order | Update Order | Delete Order
Focus to window:  disabled     | disabled     | disabled
Open a record:    disabled     | disabled     | enabled
Perform a change: disabled     | enabled and focused | enabled

Navigation: select a position in the script, Create menu, GUI Checkpoint, For Multiple Objects, click Add, select the testable objects, right-click to release, specify expected values for the required properties of every selected object, click OK.

Fig: 'Scott' window with Roll No, Total and Grade fields and an OK button.
Syntax: win_check_gui("Window Name", "checklist file.ckl", "expected values file", time to create);

Ex: win_check_gui("Flight Reservation", "list3.ckl", "gui3", 1);

Case Study: What types of properties do you check for which objects?

Object Type | Properties
Push button | Enabled, Focused
Radio button | Status (ON, OFF)
Check box | Status (ON, OFF)
List box | Count (number of items in the list box), Value (currently selected value)
Table grid | Rows, Columns, Table content
Text / edit box | Enabled, Focused, Value, Range, Regular Expression, Date Format, Time Format
Changing Check Points:

WinRunner allows us to perform changes in existing checkpoints. There are 2 types of changes to existing checkpoints, made due to sudden project changes or tester mistakes:

1. Change expected values
2. Add new properties to test

Change expected values: WinRunner allows you to change the expected values in existing checkpoints.

Navigation: execute the test script, click Results, change the expected values in the results window where required, click OK, re-execute the test script to get the right result.

Add new properties to test: Sometimes a test engineer adds extra properties to an existing checkpoint due to incompleteness of the test, through the navigation below.

Navigation: Create menu, Edit GUI Checklist, select the checklist file name, click OK, select the new properties to test, click OK to overwrite, change the run mode to Update, click Run (the default values are selected as expected values), click Run in Verify mode to get the results, change the results if required.
Running Modes in WinRunner:

Verify mode: In this mode WinRunner compares our expected values with the actual values.
Update mode: In this run mode, the default values are selected as the expected values.
Debug mode: To run our test scripts line by line.

During GUI checkpoint creation WinRunner creates checklist files and expected values files on the hard disk. WinRunner maintains the test scripts by default in the tmp folder:

Script: c:\program files\mi\wr\tmp\testname\script
Checklists: c:\program files\mi\wr\tmp\testname\chklist\list1.ckl
Exp values: c:\program files\mi\wr\tmp\testname\exp\gui1

Input Domain Coverage: Range and Size

Navigation: Create menu, GUI Checkpoint, For Object/Window, select the object, select the range property, enter the From and To values, click OK.

Syntax: obj_check_gui("obj name", "checklist file.ckl", "expected values file", time to create);
Ex: obj_check_gui("Update Order", "list1.ckl", "gui1", 1);

Input Domain Coverage: Valid and Invalid Classes

Navigation: Create menu, GUI Checkpoint, For Object/Window, select the object, select the Regular Expression property, enter the expected expression, click OK.

Fig: 'Scott Sample' window with an Age field.
Problem: The Name text box should allow only:

1. Alphabets in lower case with an initial capital only
2. Alphanumerics, starting and ending with alphabets only
3. Alphabets in lower case, but starting with R and ending with o only
4. Alphabets in lower case with an underscore in the middle
5. Alphabets in lower case with a space or an underscore in the middle
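The five rules can be expressed as regular expressions. These are hypothetical patterns of my own (WinRunner would accept similar expressions in the Regular Expression property); here they are checked with Python's re module.

```python
# Candidate regular expressions for the five Name-field rules above.
import re

rules = {
    1: r"^[A-Z][a-z]*$",             # initial capital, then lower case
    2: r"^[a-z]([a-z0-9]*[a-z])?$",  # alphanumeric, starts/ends with alphabet
    3: r"^R[a-z]*o$",                # lower case, starts with R, ends with o
    4: r"^[a-z]+_[a-z]+$",           # lower case with underscore in middle
    5: r"^[a-z]+[ _][a-z]+$",        # lower case, space or underscore in middle
}

print(bool(re.match(rules[1], "Scott")))  # True
print(bool(re.match(rules[3], "Ramo")))   # True
print(bool(re.match(rules[4], "ab_cd")))  # True
```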
(itap &hec. Point:t is an optional checkpoint in functionality testing tool. Tester canuse this checkpoint to compare images, logos, graphs and other graphical ob-ects.& )ikesignatures
This checkpoint consists of two sub types%1. 3or @b-ect6!indow &Entire mage Testing5. 3or Screen $rea &Part of mage Testing
These options supports testing on static images only. !in(unner doesnUt supportdynamic images deeloped using 3lash, Maya+
For Object/Window: To compare our expected image with the actual image in your application build, we can use this option.

Navigation: select a position in script, create menu, bitmap checkpoint, for object/window, select the image object.

Syntax: obj_check_bitmap(<image object name>, <expected image file .bmp>, <time to create the image checkpoint>);

Ex: win_check_bitmap("About Flight Reservation System", "Img1", 1);
Run on different versions.
Expected → Record time
Actual → Run time
Differences → what are the differences
For Screen Area (Part of Image Testing): To compare our expected image part with the actual image in your application build, we can use this option.

Navigation: select a position in script, create menu, bitmap checkpoint, for Screen Area, select the required region in the testable image, right click to release.

Syntax: obj_check_bitmap(<image object name>, <image file .bmp>, <time to create the checkpoint>, x, y, width, height);
Ex: win_check_bitmap("About Flight Reservation System", "Img2", 1, 11, 2, 122, 71);
Run on different versions.
Expected → Record time
Actual → Run time
Differences → what are the differences
Note: TSL supports a variable size of parameter line, like function overloading. For every project's functionality testing, the GUI checkpoint is obligatory to use; whether the bitmap checkpoint is used by the tester depends on requirements.
Database Check Point: To conduct backend testing using WinRunner we can use this option.

Back End Testing: Validating the completeness and correctness of the front-end operations' impact on the backend tables. This process is also known as database testing. In general, backend testing is also known as validation of data and integrity of data.
To automate this test, the Database checkpoint provides three sub options:
1. Default Check (depends on content)
2. Custom Check (depends on rows count, columns count and content)
3. Runtime Record Check (new option in WinRunner 7.0)
Default Check: To check data validation and data integrity in the database, depending on content, we can use this option.
DSN: Data Source Name. It is a connection string between the front end and the back end. It maintains the connection process.
Steps:

1. Connect to the database
2. Execute the select statement
3. Return the results in an Excel sheet
4. Analyze the results manually
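The four steps can be sketched with WinRunner's database TSL functions. This is a hedged sketch: the session name, DSN string, query, and output path are all assumptions, not from the source.

```
# Sketch of the backend-testing steps (names and paths are assumed).
db_connect("session1", "DSN=Flight32");                          # 1. connect to the database
db_execute_query("session1", "select * from orders", rec_num);   # 2. execute the select statement
db_write_records("session1", "c:\\results.xls", TRUE, 10);       # 3. return results in an Excel sheet
db_disconnect("session1");                                       # 4. analyze the results manually
```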
In bitmap checking we test between two versions of images.

In GUI checking we test the same application but with expected behavior. In database checking we test twice on the original data.
To conduct this testing, the test engineer collects some information from the development team:

Connection String or DSN

Table definitions or Data dictionary

Mapping between front end forms and backend tables.
[Diagram: the DSN connects the front-end application to the back-end database; the Database Check Point Wizard selects through it.]
Database Testing Process:

Create Database checkpoint (the current content of the database is selected as Expected).
Insert / Delete / Update operations through the front end.
Execute Database checkpoint (the current content of the database is selected as Actual).
Navigation: As in GUI & Bitmap checkpoints, we start by selecting the position in the script. Create Menu, Database Checkpoint, default checkpoint, specify connection to database (ODBC / Data Junction), select sql statement (c:\PF\MI\WR\temp\testname\msqr1.sql), click next, click create to select DSN, write the select statement (select * from orders), click finish.
Syntax: db_check(<check list file .cdl>, <query result file .xls (Excel file)>);

Ex: db_check("list5.cdl", "dbf5");
Criteria: Expected difference → Pass; Wrong difference → Fail.
What was updated: Data Validation.
Who and when updated: Data Integrity.
New Record → Green Color.
Modified Record → Yellow Color.
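The process above can be sketched in TSL. The window, menu, and field names below are assumptions based on the Flight Reservation examples used elsewhere in this material; the expected database content is captured when the checkpoint is created.

```
# Sketch: insert a record through the front end, then run the default database check.
set_window("Flight Reservation", 10);      # assumed window name
menu_select_item("File;New Order");        # insert a record through the front end
edit_set("Name", "scott");                 # assumed field and value
button_press("Insert Order");
db_check("list1.cdl", "dbf1");             # current content becomes Actual; a new row shows green
```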
Custom Check: The test engineer uses this option to conduct backend testing depending on rows count, columns count, table content, or a combination of the above three properties.
Default Checkpoint: Content is the Property & Content is the Expected.
Custom Checkpoint: Rows Count is the Property & No. of rows is the Expected.
During Custom checkpoint creation, WinRunner provides a facility to select these properties. In general, test engineers use the default check option as much as possible, because content is also suitable for finding the number of rows and columns.
Syntax: db_check(<check list file .cdl>, <query result file .xls>);

Ex: db_check("list11.cdl", "dbf");
Front End → Programmers (Programming Division)

Back End → Database Administrators (DBA Division)

The front-end object names should be understandable to the end user. (WYSIWYG)
Runtime Record Checkpoint: Sometimes the test engineer uses this option to find the mapping between front end objects and backend columns; it is an optional checkpoint.

Navigation: Create Menu, Database Checkpoint, runtime record check, specify SQL statement, click next, click create to select DSN, write a select statement with the doubtful columns (select orders.order_number, orders.customername from orders), select the doubtful front end objects for those columns, click next, select any of the below options:
Exactly one match
One or more match
No match record

Click finish.
Note: For custom and default checkpoints you have to give ; at the end of the sql statement, but in the Runtime record checkpoint you need not give it.
Syntax: db_record_check(<check list file name .cvr>, DVR_ONE_MATCH / DVR_ONE_OR_MORE_MATCH / DVR_NO_MATCH, variable);

Ex: db_record_check("list1.cvr", DVR_ONE_MATCH, record_num);

In the above syntax the checklist specifies the expected mapping to test and the variable specifies the number of records matched. If the mapping is correct, the same values will be presented.
The runtime record checkpoint allows you to perform changes in the existing mapping, through the navigation below.
[Diagram: expected mapping between front-end objects (A, B) and back-end columns (K, L).]

Create menu, edit runtime record list, select checklist file name, click next, change the query (if you want to test on new columns), click next, change the object selection for new objects testing, click finish.
Synchronization:

To define the time mapping between the testing tool and the application, we can use synchronization point concepts.
Wait(): To define a fixed waiting time during test execution, the test engineer uses this function.

Syntax: wait(<time in seconds>);

Ex: wait(10);
Drawback: This function defines a fixed waiting time, but our applications take variable times to complete, depending on the test environment.
Change Runtime Settings:

During our test script execution, WinRunner doesn't depend on the recording time parameters. To maintain any waiting state, in WinRunner we can use the wait() function or change the runtime settings.
It maintains mainly the following information:
Delay: Time to wait between window focusing.
Timeout: How much time the application should wait for context sensitive statements and checkpoints.

There are two runtime settings time parameters:
Delay: for window synchronization
Timeout: for executing context sensitive statements and checkpoints
Window based statements can wait up to: Delay + Timeout.
Object based statements can wait up to: Timeout.
Navigation: Settings, General options, run tab, change delay & timeout depending on requirement, click apply, click ok.
Window statements: delay → 1 sec; to focus → 10 sec.

1. set_window("<window name>", 6);
   → time ≤ 11 sec
2. button_press("OK");
   → time ≤ 10 sec
3. button_check_info("OK", "enabled", 1);
   → time ≤ 10 sec
Drawbacks in Change Settings: If you change the settings once, they will be applied to each and every test without user specifications. Due to this, most of the time the change runtime settings option is not used.

Nowadays most test engineers use the "for object/window property" synchronization point for avoiding time mismatch problems.
For Object/Window Property:

Navigation: Select position in script, create menu, synchronization point, for object/window property, select object, specify the property with its expected value (Ex: Status / Progress Bar → 100% completed and enabled, etc.), specify maximum time to wait, click ok.

Syntax: obj_wait_info(<object name>, <property>, <expected value>, <maximum time to wait>);

Ex: obj_wait_info("Insert Done...", "enabled", 1, 10);
For Object/Window Bitmap:

Sometimes the test engineer defines the time mapping between tool and project depending on images in that application.

Navigation: Select position in script, create menu, synchronization point, for object/window Bitmap, select the required image.

Syntax: obj_wait_bitmap("Object Name", "image1.bmp", <maximum time to wait>);
For Screen Area Bitmap:

Sometimes the test engineer defines the time mapping between tool and project depending on an image area in that application.

Navigation: Select position in script, create menu, synchronization point, for Screen Area Bitmap, select the required image region, right click to release.

Syntax: obj_wait_bitmap("Object Name", "image1.bmp", <maximum time to wait>, x, y, width, height);
Text Check Point:

To cover calculation and other text based tests, we can use this option in WinRunner.

To create this type of checkpoint, we use the "Get Text" option from the create menu.

This option consists of two sub options:
1. From object/window
2. From screen area
From object/window: To capture object values into variables we can use this option.

Navigation: Create Menu, Get Text, From Object/Window, select the required object (double click).

Syntax: obj_get_text(<object name>, variable);

Ex: obj_get_text("Flight No:", text);

Syntax: obj_get_info(<object name>, <property>, variable);

Ex: obj_get_info("ThunderTextBox_6", "value", v1);
From Screen Area: To capture static text in your application build screen we can use this option.

Navigation: Create Menu, Get Text, From Screen area, select the required region to capture the value, right click to release.

Syntax: obj_get_text(<object name>, variable, x1, y1, x2, y2);

Ex: obj_get_text("Flight No:", text, 2, 3, 95, 5);
Retesting: Re-execution of our test on the same application build, with multiple test data, is called retesting. In WinRunner, retesting is also called Data Driven Testing (DDT).

The data is driven, or changed, to test the application.
In WinRunner, test engineers conduct retesting in 4 ways:
1. Dynamic test data submission
2. Through flat file (notepad)
3. From front end grids (list box)
4. Through excel sheet
During test execution based on the first type, the tester gives values and based on them the test execution is completed (like scanf() in C). The remaining three types can be done without tester interaction.
Dynamic test data submission: To conduct retesting, to validate functionality, the test engineer submits the required test data to the tool dynamically. To read keyboard values during the test execution, the test engineer uses the below TSL statements.

Syntax: create_input_dialog(<message>);

Ex: create_input_dialog("Enter Your Account Number:");
Exp: res = no1 * no2
tl_step(): tl stands for test log. Test log means the test result. We can use this function to define a user defined pass or fail message.

Pass → green → 0
Fail → red → 1
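Putting dynamic data submission and tl_step() together, a hedged sketch of the multiply check follows. The window and object names ("Multiply", "No1", "No2", "Result") are assumptions taken from the diagram, not confirmed names.

```
# Sketch: read test data from the keyboard at run time, then log a user-defined result.
no1 = create_input_dialog("Enter No1:");
no2 = create_input_dialog("Enter No2:");
set_window("Multiply", 10);                 # assumed window name
edit_set("No1", no1);
edit_set("No2", no2);
button_press("Multiply");
obj_get_text("Result", actual);             # capture the value shown by the build
res = no1 * no2;                            # expected: res = no1 * no2
if (actual == res)
    tl_step("multiply check", 0, "result is correct");   # 0 = pass (green)
else
    tl_step("multiply check", 1, "result is wrong");     # 1 = fail (red)
```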
password_edit_set(<password field>, password_encrypt(<value>));
[Diagram: the build's No1, No2 and Result fields with a Multiply button; the test script reads the test data from the keyboard.]
Problem:

First enter EmpNo and click the OK button. Then it displays bsal, comm and gsal.
Exp: gsal = bsal + comm
If bsal ≤ 15000 then comm is 15%.
If bsal is between 15000 and …000 then commission is 5%.
If bsal > …000 then comm is 200.
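A sketch of automating the gsal check follows. The object names are assumptions, and the garbled slab boundaries above are left out — only the relation gsal = bsal + comm stated in the problem is checked.

```
# Sketch: verify gsal = bsal + comm after entering an employee number.
edit_set("Emp No", "1001");                 # assumed field name and test value
button_press("OK");
obj_get_text("BSal", bsal);
obj_get_text("Comm", comm);
obj_get_text("GSal", gsal);
if (gsal == bsal + comm)
    tl_step("gsal check", 0, "gsal = bsal + comm");
else
    tl_step("gsal check", 1, "gsal calculation is wrong");
```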
Through flat file (notepad):

Sometimes the test engineer conducts data driven testing depending on multiple test data in flat files (like notepad .txt files).
[Diagrams: sample screens — a login form (User Id, Password, Login) and an employee form (Emp No, Dept No → BSal, Comm).]
To manipulate file data for testing, the test engineer uses the below TSL functions.

file_open(): To load the required flat file into RAM, with specified permissions, we can use this function.
Syntax: file_open(<path of the file>, FO_MODE_READ / FO_MODE_WRITE / FO_MODE_APPEND);

file_getline(): We can use this function to read a line from an opened file.
Syntax: file_getline(<path of the file>, variable);
As in C, the file pointer is incremented automatically.

file_close(): We can use this function to swap an opened file out of RAM.
Syntax: file_close(<path of the file>);

file_printf(): We can use this function to write specified text into a file opened in WRITE or APPEND mode.
Syntax: file_printf(<path of the file>, <format>, <values or variables you want to write>);

%d → integer, %s → string, %f → floating point, \n → new line, \t → tab, \r → carriage return

substr(): We can use this function to separate a substring from a given string.
Syntax: substr(main string, start position, length of substring);

split(): We can use this function to divide a string into fields.
Syntax: split(main string, array name, separator);
In the above syntax the separator must be a single character.

file_compare(): To compare two file contents.
Syntax: file_compare(<path of file1>, <path of file2>, <path of file3>);
File3 is optional, and it specifies the concatenation of the contents of both files.
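The file functions above combine into a flat-file data-driven loop. This is a sketch under stated assumptions: the file path, the comma-separated field layout, the object names, and the return value 0 (E_OK) from file_getline() on a successfully read line are all assumptions.

```
# Sketch: drive the login form once per line of an assumed file "td.txt",
# where each line holds "userid,password".
f = "c:\\testdata\\td.txt";
file_open(f, FO_MODE_READ);
while (file_getline(f, line) == 0) {        # 0 assumed to mean a line was read
    split(line, fields, ",");               # fields[1] = user id, fields[2] = password
    edit_set("User Id", fields[1]);
    password_edit_set("Password", password_encrypt(fields[2]));
    button_press("Login");
}
file_close(f);
```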
Exp: res = no1 * no2

[Diagram: the test script reads No1 and No2 values from a file (O.txt), multiplies them and checks the Result in the build.]
From Front End Grids (ListBox):

Sometimes the test engineer conducts retesting depending on multiple test data objects (like a list box).

To manipulate list data for testing, the test engineer uses the below TSL functions.

list_get_item(): We can use this function to capture a specified list box item through its item number.
Syntax: list_get_item(<list box name>, <item no>, variable);

list_select_item(): We can use this function to select a specified list box item through a given variable.
Syntax: list_select_item(<list box name>, variable);

list_get_info(): We can use this function to get information about the specified property (like enabled, focused, count) of a list box into a given variable.
Syntax: list_get_info(<list box name>, <property>, variable);
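The three list functions combine into a loop over every item in a list box. A sketch follows; the list name "Fly From:" and the 0-based item numbering are assumptions.

```
# Sketch: drive the test once per item in the list box.
list_get_info("Fly From:", "count", n);     # number of items in the list
for (i = 0; i < n; i++) {
    list_get_item("Fly From:", i, item);    # items assumed to be numbered from 0
    list_select_item("Fly From:", item);
    # ... perform the transaction with this item ...
}
```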
[Diagrams: sample screens driven from list-box test data — a login form (User Id, Password) and a form with Display, Text, Type, Age, Gender, Qualification and Others fields fed from List1.]
Data Driven Testing: In general, test engineers create data driven tests depending on excel sheet data.
From Excel Sheet:

In general, test engineers create retest scripts depending on multiple test data in an excel sheet. To generate this type of script,