MANUAL TESTING
Prasanna
‘Testing’ is a process in which defects are identified, isolated and subjected to rectification, and it is ensured that the product is defect free, in order to produce a quality product in the end and hence customer satisfaction.
BIDDING THE PROJECT
‘Bidding the Project’ is defined as the request for proposal, estimation and sign-off.
KICKOFF MEETING:
‘Kick-Off Meeting’ is the initial meeting held in the software company soon after the project is signed off, in order to discuss the overall view of the project and to select a Project Manager.
NOTE: Project Managers (PM), Team Managers (TM), Software Quality Managers, Test Leads and high-level management will be involved in this meeting.
SDLC (SOFTWARE DEVELOPMENT LIFE CYCLE)
There are 6 phases in the development cycle.
1. Initial Phase / Requirement Phase.
2. Analysis Phase.
3. Design Phase.
4. Coding Phase.
5. Testing Phase.
6. Delivery & Maintenance Phase.
1. INITIAL PHASE:
TASK: Interacting with the customer and gathering the requirements.
ROLES: Business Analyst (BA), Engagement Manager (EM).
PROCESS:
First of all, the Business Analyst will take an appointment from the customer, collect the template from the company, meet the customer on the appointed date and gather the requirements with the help of the ‘Template’.
If at all any extra requirements are given by the customer, then the Engagement Manager is responsible for the excess cost of the project. He is also responsible for the prototype demonstration.
A ‘Template’ is a predefined format which is used to prepare any document.
A ‘Prototype’ is a rough and rapidly developed model which is used for gathering clear requirements and to win the confidence of a customer.
PROOF:
The proof document of the initial phase is the FRS (Functional Requirement Specification). This document can also be called:
CRS (Customer Requirement Specification)
BRS (Business Requirement Specification)
BDD (Business Development Document)
BD (Business Document)
NOTE: Some companies put the overall business flow of the application in the BRS and the detailed requirement information in the FRS.
2. ANALYSIS PHASE:
TASK:
Feasibility Study, Tentative Planning, Technology Selection, Requirement Analysis
ROLES: SA (System Analyst), PM (Project Manager), TM (Team Manager)
‘Feasibility Study’ is a detailed study of the requirements in order to check whether they are possible or not.
‘Tentative Planning’ – Resource planning and time planning are done temporarily in this session.
‘Technology Selection’ – All the technologies that are required to accomplish the project successfully are selected and listed out in this session.
‘Requirement Analysis’ – All the requirements that are required to accomplish the project successfully are analyzed and listed out in this session.
SRS (System Requirement Specification): – The proof document of the Analysis Phase is the SRS; the requirements may be hardware requirements or software requirements.
3. DESIGN PHASE:
TASK: High Level Designing (HLD), Low Level Designing (LLD)
ROLES: HLD is done by the CA (Chief Architect)
LLD is done by the TL (Technical Lead)
PROCESS:
‘High Level Designing’ is the process of dividing the whole project into modules with the help of some diagrams.
‘Low Level Designing’ is the process of dividing a module into sub-modules with the help of some diagrams.
NOTE:
These diagrams are designed using a language called the Unified Modeling Language (UML).
The proof document of this phase is the ‘Technical Diagram Document’ (TDD). The TDD contains some diagrams and ‘Pseudo Code’.
PSEUDO CODE:
‘Pseudo Code’ is not real code, but a set of English statements which are used by the developers to develop the actual code.
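For illustration only, a minimal sketch of how pseudo code maps to actual code, using the login example that appears later in this manual; the function and message texts are hypothetical, not from any real project:

    # Pseudo code (plain English steps, as a developer might receive it):
    #   1. Read the user name and password entered by the user.
    #   2. If they match a valid record, open the home page.
    #   3. Otherwise, display an error message.

    # Actual code derived from the pseudo code (illustrative only):
    def login(username, password, valid_users):
        """Return the page to display for the given credentials."""
        if valid_users.get(username) == password:
            return "Home Page"
        return "Invalid Username or Password, Please Try Again"

    print(login("Suresh", "QTP", {"Suresh": "QTP"}))   # Home Page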
4. CODING PHASE:
TASK: Developing.
ROLES: Developers.
PROCESS:
The developers will develop the actual code using the pseudo code and following the coding standards, with proper indentation, color coding, proper commenting, etc.
5. TESTING PHASE:
TASK: Testing.
ROLES: Test Engineers.
PROCESS:
1. FRS review.
2. While reviewing, if at all the test engineer gets any doubts, he will list out all the doubts in a review report.
3. He will send the review report to the author of the document for clarification.
4. After understanding the requirements very clearly, he will take the test case template and write the ‘Test Cases’.
5. Once the ‘Build’ is released, he will execute the test cases.
6. If at all any defects are identified, he will isolate them. Once the defect profile is ready he will send it to the development department.
7. Once the next build is released, he will ensure that the product is defect free by re-executing the test cases. This process continues till the product is defect free.
The proof of the Testing Phase is a Quality Product.
6. DELIVERY & MAINTENANCE PHASE:
DELIVERY:–
TASK: Deployment (Installation)
ROLES: Deployment Engineer or Senior Test Engineer.
PROCESS: A Deployment Engineer will deploy the application into the client’s environment by following the guidelines given by the development department in the deployment document.
MAINTENANCE:–
Whenever a problem arises, that problem becomes a task. Depending upon the problem, the corresponding role is appointed, the problem is defined and the problem is solved.
Q) Where exactly does testing come into practice? What sort of testing is expected?
A: There are 2 sorts of testing.
1. Unconventional Testing.
2. Conventional Testing.
‘Unconventional Testing’ is a process of testing conducted on each and every outcome document, right from the initial phase, by the Quality Assurance people.
‘Conventional Testing’ is a process of testing the application in the testing phase by the test engineers.
A ‘Test Case’ is a checklist of all the different presumptions of a test engineer to test a specific feature or functionality.
TESTING METHODOLOGY
TESTING METHODS OR TESTING TECHNIQUES:
There are 3 methods of testing.
1. BLACK BOX TESTING:
If one performs testing only on the functional part of an application without having any structural knowledge, then that method of testing is known as ‘Black Box Testing’. It is usually done by Test Engineers.
2. WHITE BOX TESTING:
If one performs testing on the structural part of an application, then that method of testing is known as ‘White Box Testing’. It is usually done by Developers or Test Engineers.
3. GRAY BOX TESTING:
If one performs testing on both the functional as well as the structural part of the application, then it is known as ‘Gray Box Testing’.
NOTE:
Gray Box Testing is done by Test Engineers with structural knowledge.
LEVELS OF TESTING
There are 5 levels of Testing.
1. Unit level Testing.
2. Module level Testing.
3. Integration Testing.
4. System level Testing.
5. User Acceptance Testing. (UAT)
1. Unit level Testing:–
A ‘Unit’ is defined as the smallest part of an application.
If one performs testing on a unit, then that level is known as ‘Unit Level Testing’. It is White Box Testing and is usually done by White Box Testers or Developers.
2. Module level Testing:–
If one performs testing on a module, then that level of testing is known as ‘Module Level Testing’. It is Black Box Testing and is usually done by Test Engineers.
3. Integration Testing:–
Once the modules are ready, they will be integrated with the help of interfaces (linking programs) by the developers, and those interfaces are tested by the developers in order to check whether the modules are integrated properly or not.
It is White Box Testing and is usually done by the developers. The developers will integrate the modules in the following approaches:
– Top Down Approach (TDA)
– Bottom Up Approach (BUA)
– Hybrid Approach
– Big Bang Approach
[Diagram: the Top Down Approach (TDA), Bottom Up Approach (BUA) and Hybrid Approach, shown as linkings of modules M1–M4.]
In the TDA, the parent modules are linked with the sub-modules.
In the BUA, the sub-modules are linked with the parent modules.
The Hybrid Approach is a mixture of both TDA and BUA.
Big Bang Approach: – Integrating all the modules at a time, once all of them are ready, is known as the ‘Big Bang Approach’.
STUB
While integrating the modules in the TDA, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a ‘STUB’.
DRIVER
While integrating the modules in the BUA, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a ‘DRIVER’.
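As an illustration only (the module and function names below are hypothetical, not from any real project), a stub in top-down integration can be as simple as a function returning a fixed value so the finished parent module can still be exercised; a driver in bottom-up integration is the mirror image, a throwaway caller that stands in for the missing parent:

    # Top-down integration: the parent module is ready, the child module is not.
    def get_account_balance_stub(account_id):
        # STUB: temporary stand-in for the unfinished balance module.
        return 1000.0

    def show_balance(account_id, balance_provider=get_account_balance_stub):
        # Parent module under test; it calls whichever child implementation is supplied.
        return "Balance of %s: %.2f" % (account_id, balance_provider(account_id))

    # Bottom-up integration: the child module is ready, the parent is not.
    def real_get_account_balance(account_id):
        return 2500.0      # finished child module

    def driver():
        # DRIVER: throwaway caller that exercises the child because the real parent is missing.
        return real_get_account_balance("AC-101")

    print(show_balance("AC-101"))   # parent tested with the help of the stub
    print(driver())                 # child tested with the help of the driver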
4. System Level Testing:–
If one performs testing on the complete application after deploying it into the environment, then it is known as ‘System Level Testing’.
5. User Acceptance Testing:–
If one performs the same system testing in the presence of the user, then it is known as ‘User Acceptance Testing’ (UAT). This is Black Box Testing and is done by Test Engineers.
SOFTWARE DEVELOPMENT MODELS
WATERFALL MODEL

PHASE | ACTIVITY | OUTCOME
Initial | Req. gathering | BRS
Analysis | Sys. Design | SRS
Design | S/W Design | TDD, GUI
Implementation | Coding, Unit Test | UTR
Testing | Mod. Test, Int. Test, Sys. Test, UAT (Black Box Testing) | MTR, STR
Delivery & Maintenance | Delivery to Client | –
Whenever the client is very clear with the requirements, the company will choose this model.
Advantages: – Simple model and transparency.
Disadvantages: – New changes or extra requirements cannot be accepted in this model.
PROTOTYPE MODEL
[Diagram: unclear requirements → H/W and S/W prototypes developed → demo to client → requirements are refined → BRS and SRS documents base-lined → confirmation of the client environment.]
Whenever the client is not clear with their requirements, the company will choose this model to gather the requirements.
Advantage:– This model is very much used for gathering the requirements whenever the client is confused.
Disadvantages:– Time consuming; the prototype is built at the company’s cost; it is not a full-fledged model; the user may stick to the prototype.
EVOLUTIONARY MODEL
[Diagram: initial requirements → development → application → user validation; if the user does not accept, feedback with new requirements goes back into development; if the user accepts, the cycle ends.]
Advantage: –
This is the best suited model if at all the client keeps evolving the requirements.
Disadvantages:– Deadlines are not defined; no transparency.
SPIRAL MODEL
[Diagram: spiral quadrants — defining the objectives / alternatives / constraints; risk analysis (root cause analysis, estimation, contingencies); implementation; refining & planning for the next cycle.]
Advantages: – This is the best suited model for risk based projects.
Disadvantages:–Time consuming, expensive
FISH MODEL
[Diagram: Fish model — phases: requirement gathering, analysis, design, coding, system testing, delivery & maintenance. Verification runs along the documents (BRS review, SRS review, HLD/LLD TDD review); validation runs along the execution (white box testing of the SCD, black box / system testing, test S/W changes).]
Advantages:– As both verification and validation are ensured, the outcome of the fish model is a quality product.
Disadvantages: – Time consuming, expensive.
Verification:
Verification is the process of checking whether the product is being developed in the right manner or not.
Validation:–
Validation is the process of checking whether the product is right or not.
‘V’ MODEL
[Diagram: V-Model — Verification (left side): initial & analysis (BRS, SRS) with preparation of the project plan and test plan and requirement phase testing; design & coding (TDD, SCD) with design phase testing and program phase testing. Validation (right side): the S/W build goes through system testing, user acceptance testing, the test management process, test S/W changes and port testing, then delivery & maintenance. S/W efficiency is measured as DRE = A / (A + B).]
A = defects found by the Testing Team.
B = defects raised by the Customer.
Advantages:– As verification and validation are ensured and the test management process is taken care of, the outcome of the V-Model is a quality product.
Disadvantages:– Time consuming, expensive.
DRE – Defect Removal Efficiency.
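As a small illustration of the DRE formula above (the defect counts are made-up sample numbers):

    def defect_removal_efficiency(a, b):
        # DRE = A / (A + B), where A = defects found by the testing team
        # and B = defects raised by the customer.
        return a / (a + b)

    print(defect_removal_efficiency(90, 10))   # 0.9 -> 90% of defects were caught before delivery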
ENVIRONMENT
Environment is a combination of 3 layers.
A. Presentation Layer
B. Business Layer
C. Database Layer
TYPES OF ENVIRONMENT
There are 4 types of environment.
1. Stand-alone Environment [or] One-tier Architecture.
2. Client-Server Environment [or] Two-tier Architecture.
3. Web Environment [or] Three-tier Architecture.
4. Distributed Environment [or] N-tier Environment.
1. Stand-alone Environment [or] One-tier Architecture
If at all the three layers are present in a single system, then it is known as a ‘Stand-alone Environment’.
[Diagram: one system containing all three layers — PL, BL and DBL.]
2. Client-Server Environment
In this environment the clients reside in one tier and the database server resides in another tier. The client contains the presentation layer as well as the business layer, so the corresponding layer logic is installed there. The database server contains the database layer, so the corresponding logic is installed there.
[Diagram: several clients (PL + BL) on the client side connected to a database server (DBL).]
3. Web Environment
This environment contains 3 tiers. The clients reside in the first tier, the application server resides in the middle tier and the database server resides in the other tier. The client contains the presentation layer, the application server contains the business layer and the database server contains the database layer, so the corresponding logics are installed there.
[Diagram: clients → Application Server (AS) → Database Server (DBS).]
4. Distributed Environment
The ‘Distributed Environment’ is just similar to the web environment, but the number of application servers is increased in order to distribute the business logic, so the business logic is distributed across a number of layers.
NOTE: Each and every application server should be represented in one tier.
N = Number of Application Servers + 2
[Diagram: clients connected through a web server (WS) to application servers AS1 and AS2, which connect to the database server (DBS).]
TYPES OF TESTING
1. Build Verification Testing (BVT) [or] Sanity Testing
BVT is a type of testing in which a test engineer will perform overall testing on the released build in order to check whether everything is available and proper for further detailed testing.
2. Regression Testing
‘Regression Testing’ is a type of testing in which one will perform testing on the already tested functionality once again. It is usually done in 2 scenarios.
a. Whenever the test engineer raises a defect, the developer rectifies it and the next build is released to the testing department. The test engineer will then test the rectified functionality as well as the related functionality once again, in order to ensure that the related functionality was not affected while rectifying the defect.
b. Whenever new changes proposed by the customer are incorporated by the developers and the build is released to the testing department, the test engineer will test the already tested related functionality in order to ensure that the old functionality remains the same despite the new changes.
3. Re-Testing
It is a type of testing in which one will perform testing on the already tested functionality again and again with
multiple sets of data in order to ensure the functionality is working fine or the defect is reproduced with multiple
sets of data.
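In automated form, re-testing with multiple sets of data is commonly expressed as a parameterized test. A minimal sketch, assuming pytest is available and using a hypothetical login function; the data values are borrowed from the Valid Inputs Table later in this manual:

    import pytest
    from myapp import login   # hypothetical application module, for illustration only

    @pytest.mark.parametrize("username, password, expected_page", [
        ("Suresh", "QTP",   "Admin page"),
        ("Santos", "Bunny", "Home page"),
        ("Madhav", "Mmd",   "Home page"),
    ])
    def test_login_with_multiple_data_sets(username, password, expected_page):
        # The already tested login functionality is exercised again and again with
        # different data to confirm it works fine (or that a defect reproduces).
        assert login(username, password) == expected_page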
4. Alpha (α) Testing
It is a type of User Acceptance Testing done in the company by our test engineers.
Advantage: – If at all any defects are identified, there is a chance of rectifying them immediately.
5. Beta (β) Testing
It is also a type of User Acceptance Testing done at the client’s place, either by third-party test engineers or by the end users.
Disadvantage:– If at all any defects are identified, there is no chance of rectifying them immediately.
6. Static Testing
It is a type of testing in which one will perform testing on an application or its related factors when it is not being executed.
Ex: Document testing, code analysis, GUI testing.
7. Dynamic Testing
It is a type of testing in which one will perform testing on the application whenever it is being executed.
Ex: Functionality Testing.
8. Installation Testing
It is a type of testing in which a test engineer will try to install the application into the environment by following the guidelines given in the deployment document by the developers. If the installation is successful, then he will come to the conclusion that the guidelines are correct; otherwise he will conclude that there are some problems in the guidelines.
9. Compatibility Testing
It is a type of testing usually done for products, in which a test engineer may have to deploy the application into environments prepared with multiple combinations of environmental components, in order to check whether it is compatible with those environments or not.
10. Monkey Testing [or] Gorilla Testing
It is a type of testing in which one will perform abnormal actions intentionally on the application in order to check its stability.
11. End to End Testing
It is a type of testing in which one will perform testing on a complete transaction or an end to end scenario.
Ex :
Login → Balance Enquiry → Withdraw → Balance Enquiry → Logout
12. Usability Testing
It is a type of testing in which one will concentrate on the user friendliness of the application.
13. Exploratory Testing
It is a type of testing in which one will perform testing on the application without any requirement document support, by exploring the functionality. It is usually done by domain experts.
14. Port Testing
It is a type of compatibility testing done at the client’s place after deploying the application, in order to check whether it is compatible with that environment or not.
15. Security Testing
It is a type of testing in which one will concentrate on the following areas.
Authentication, Direct URL testing, Firewall Testing.
16. Reliability Testing (Soak Testing)
It is a type of testing in which one will perform testing for a longer period of time in order to check the stability of the application.
17. Mutation Testing
It is a White Box Testing done by the Developers where they do some changes to the program and check for
its performance. Since it is associated with multiple mutations, it is known as ‘Mutation Testing’.
18. Ad Hoc Testing
It is a type of testing in which one will perform testing on the application in his own style, after understanding the requirements very clearly.
STLC [Software Testing Life Cycle]
STLC contains 6 phases.
(1) Test Planning
(2) Test Development
(3) Test Execution
(4) Result Analysis
(5) Bug Tracking
(6) Reporting
TEST PLANNING
Plan: – It is a strategic document which describes how to perform a task in an effective, efficient and optimized way.
Test Plan: – It is a strategic document which describes how to perform testing on an application in an effective and
optimized way.
Optimization: – It is a process of reducing the inputs while getting the same output or even more output.
NOTE: The Test Plan is prepared by the Test Lead.
TEST PLAN INDEX [or] CONTENTS
1.0 INTRODUCTION
    1.1 Objective.
    1.2 Reference Doc.
2.0 COVERAGE OF TESTING
    2.1 Features to be tested.
    2.2 Features not to be tested.
3.0 TEST STRATEGY
    3.1 Levels of Testing.
    3.2 Types of Testing.
    3.3 Test Design Techniques.
    3.4 Configuration Management.
    3.5 Test Metrics.
    3.6 Terminology.
    3.7 Automation Plan.
    3.8 List of Automated Tools.
4.0 BASE CRITERIA
    4.1 Acceptance Criteria.
    4.2 Suspension Criteria.
5.0 TEST DELIVERABLES
6.0 TEST ENVIRONMENT
7.0 RESOURCE PLANNING
8.0 SCHEDULING
9.0 STAFFING & TRAINING
10.0 RISKS & CONTINGENCIES
11.0 ASSUMPTIONS
12.0 APPROVAL INFORMATION

1.0 INTRODUCTION
Objective: – The purpose of the test plan document is clearly described here in this section.
Reference Documents: – The list of all the documents that are referred to in order to prepare the test plan is listed out here in this section.
2.0 COVERAGE OF TESTING
Features to be Tested: – The list of all the features that are within the scope is listed out here in this section.
Features not to be Tested: – The list of all the features that are not planned for testing, based on the following criteria, is listed out here in this section.
o Features out of scope.
o Low risk features.
o Features that are to be skipped based on the time constraints.
o Features functionality.
3.0 TEST STRATEGY
It is an organization level term which is used for testing all the projects in the organization.
NOTE: Usually the test strategy is common for all the projects, but upon customer request there may be slight changes in it.
Test Plan: – The test plan is defined as a project level term which is used for testing a specific project.
Levels of Testing: – The list of all the levels of testing that are followed by the company is listed out here in this section.
Types of Testing: – The list of all the types of testing that are followed by the company is listed out here in this section.
Test Design Techniques: – A technique is something that is used to accomplish a complex task in an easy manner. The list of all the techniques that are followed by the company is listed out here in this section.
    Boundary Value Analysis (BVA)
    Equivalence Class Partition (ECP)
Configuration Management: –
Test Metrics: – The list of all the metrics that are maintained in the organization is listed out here in this section.
Terminology: – The list of all the terms that are followed in the company, along with their meanings, is listed out here in this section.
Automation Plan: – The list of all the areas that are planned for automation is listed out here in this section.
List of Automated Tools: – The list of all the automated tools that are used by the company is listed out here in this section.
4.0 BASE CRITERIA
4.1 Acceptance Criteria: – When to stop testing in a full-fledged manner is clearly described here in this section.
4.2 Suspension Criteria: – When to reject or suspend testing is clearly described here in this section.
5.0 TEST DELIVERABLES – The list of all the documents that are about to be delivered is listed out here in this section.
Ex: Test Case Doc, Review Report, Defect Profile Doc …….
6.0 TEST ENVIRONMENT – The client specified environment is clearly described here in this section.
7.0 RESOURCE PLANNING – ‘Who has to do what’ is clearly described in this section.
8.0 SCHEDULING – The starting date and the ending date of each and every task are clearly described in this section.
9.0 STAFFING & TRAINING – How much staff is to be recruited, and what kind of training should be provided for the newly recruited staff and for the existing employees to accomplish the project successfully, is clearly described in this section.
10.0 RISKS & CONTINGENCIES – The list of all the potential risks and the corresponding solutions is listed out here in this section.
RISKS
– Unable to deliver the project within the deadline.
– Customer imposed deadlines.
– Employees leave the company in the middle of the project.
– Unable to test all the features within the time due to lack of expertise.
CONTINGENCIES [or] SOLUTIONS
– Ensure a proper plan.
– What is not to be tested will be increased in case of customer imposed deadlines.
– People should be maintained on the bench.
– Severity / priority based execution.
– Training should be provided.
11.0 ASSUMPTIONS
What are all the things that a test engineer should assume is mentioned here in this section.
12.0 APPROVAL INFORMATION
Who has to approve what is clearly described here in this section.
TEST DEVELOPMENT PHASE
A ‘Use Case’ is a description of the functionality of a certain feature of an application in terms of actors, actions and responses.
INPUT INFORMATION REQUIRED FOR PREPARING THE USE CASES
[Login screen mock-up: an application window with ‘User Name’, ‘Password’ and ‘Connect To’ fields and ‘Login’, ‘Clear’ and ‘Cancel’ buttons.]
Functional Requirements:
1. The ‘LOGIN’ screen should contain Username, Password and Connect To fields, and Login, Clear and Cancel buttons.
2. ‘Connect To’ is not a mandatory field, but it should allow the user to select a database object.
3. Upon entering a valid user name and password and clicking on the ‘Login’ button, the corresponding page must be displayed.
4. Upon entering some information into any of the fields and clicking on the ‘Clear’ button, all the fields must be cleared and the cursor should be placed in the user name field.
5. Upon clicking on the ‘Cancel’ button, the login screen should be closed.
Special Requirements [or] Validations [or] Business Rules (a minimal sketch of these rules as code follows the list):
1. Initially, whenever the login screen is opened, the ‘Login’ and ‘Clear’ buttons must be disabled.
2. The ‘Cancel’ button must always be enabled.
3. Upon entering the user name and password, the ‘Login’ button must be enabled.
4. Upon entering information into any of the fields, the ‘Clear’ button must be enabled.
5. The tabbing order must be User Name, Password, Connect To, Login, Clear and Cancel.
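A minimal sketch of the enable/disable rules above as a small helper function; the function and field names are illustrative, not the application’s real code:

    def button_states(username="", password="", connect_to=""):
        """Return which login-screen buttons should be enabled for the current input."""
        any_input = bool(username or password or connect_to)
        return {
            "Login":  bool(username and password),   # rule 3: user name and password entered
            "Clear":  any_input,                      # rules 1 & 4: enabled only once something is typed
            "Cancel": True,                           # rule 2: always enabled
        }

    print(button_states())                      # rule 1: Login and Clear start disabled
    print(button_states("Suresh", "QTP"))       # Login becomes enabled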
TEMPLATE OF THE ‘USE CASE’
1. Name of the Use Case.
2. Brief description of the Use Case.
3. Actors involved.
4. Special Requirements.
5. Pre Conditions.
6. Post Conditions.
7. Flow of Events.
USE CASE DOCUMENT
1. Name of the Use Case: – ‘Login’ Use Case.
2. Brief Description of the Use Case: – This Use Case is used for describing the functionality of all the features in the Login screen.
3. Actors Involved: – Admin, Normal User.
4. Special Requirements: –
   a) Explicit Requirements – copy the requirements which are given by the client.
   b) Implicit Requirements – implicit requirements are the requirements that are analyzed by the Business Analyst in order to provide value to the application.
      Ex: Once the login screen is invoked, the cursor should be placed in the user name field.
5. Preconditions – The login screen must be available.
6. Postconditions – Either the Home page or the Admin page for a valid user, and an error message for an invalid user.
7. Flow of Events
MAIN FLOW
Action: Actor invokes the application.
Response: The login screen is displayed with the following fields: 1. User Name, 2. Password, 3. Connect To.

Action: Actor enters a valid user name and password and clicks on the Login button.
Response: Authenticates; either the Home page or the Admin page is displayed depending on the actor entered.

Action: Actor enters a valid user name and valid password, selects a database option and clicks on the Login button.
Response: Authenticates; either the Home page or the Admin page is displayed with the mentioned database connection, depending upon the actor entered.

Action: Actor enters an invalid user name and a valid password and clicks on the Login button.
Response: Go to Alternative Flow Table ‘1’.

Action: Actor enters a valid user name and an invalid password and clicks on the Login button.
Response: Go to Alternative Flow Table ‘2’.

Action: Actor enters an invalid user name and password and clicks on the Login button.
Response: Go to Alternative Flow Table ‘3’.

Action: Actor enters some information into any of the fields and clicks on the ‘Clear’ button.
Response: Go to Alternative Flow Table ‘4’.

Action: Actor clicks on the ‘Cancel’ button.
Response: Go to Alternative Flow Table ‘5’.
Alternative Table 1: [Invalid User Name]
Action: Actor enters an invalid user name and a valid password and clicks on the Login button.
Response: Authenticates; an error message is displayed: “Invalid User Name, Please Try Again”.

Alternative Table 2: [Invalid Password]
Action: Actor enters a valid user name and an invalid password and clicks on the Login button.
Response: Authenticates; an error message is displayed: “Invalid Password, Please Try Again”.

Alternative Table 3: [Invalid User Name & Password]
Action: Actor enters an invalid user name and an invalid password and clicks on the Login button.
Response: Authenticates; an error message is displayed: “Invalid User Name & Password, Please Try Again”.

Alternative Table 4:
Action: Actor enters some information into any of the fields and clicks on the ‘Clear’ button.
Response: All the fields are cleared and the cursor is placed in the user name field.

Alternative Table 5:
Action: Actor clicks on the ‘Cancel’ button.
Response: The login screen is closed.
The guidelines to be followed by a test engineer to develop the test cases from a given use case:
1. Identify the module to which the use case belongs.
   – Security Module.
2. Identify the functionality of the use case with respect to the total functionality of the application.
   – Authentication.
3. Identify the functional points and prepare the functional point document.
4. Identify the actors involved.
   – Admin, Normal User.
5. Identify the inputs required to perform the use case.
   – Valid and invalid inputs.
6. Identify whether the use case is linked with any other use case.
   – Home Page, Admin Page.
7. Identify the ‘Pre condition’.
   – The login screen must be available.
8. Identify the ‘Post condition’.
   – The Home page or Admin page for valid users and an ‘Error message’ for invalid users.
9. Understand the main flow of the use case.
10. Understand the alternative flows of the use case.
11. Understand the special requirements or business rules.
12. Document the test cases for the main flow.
13. Document the test cases for the alternative flows.
14. Document the test cases for the special requirements.
15. Prepare the cross reference matrix (Traceability Matrix).
Functional Point: – The point where the user can perform an action can be considered as a functional point.
[Diagram: FRS → Functional Point Document (user name entry, password entry, D.B entry; validations for login, connect to, cancel, clear) → Master Test Case Doc (MTCD) → Detailed Test Case Doc (DTD) → Defect Profile Doc (DPD); the Traceability Matrix links the FRS, UCD, FPD, MTCD, DTD and DPD.]
Traceability / Cross Reference Matrix: – Traceability is a table which contains some information used for tracing back for reference, by linking the corresponding documents, in any kind of situation.
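As a toy sketch, a traceability matrix can be thought of as a mapping from requirement / functional-point identifiers to the test cases that cover them; all the identifiers below are invented for illustration:

    # Hypothetical identifiers: requirement -> use case and covering test cases.
    traceability = {
        "FRS-1 (Login screen fields)": {"use_case": "UC-Login", "test_cases": ["TC-1", "TC-2", "TC-3"]},
        "FRS-3 (Valid login)":         {"use_case": "UC-Login", "test_cases": ["TC-6", "TC-7"]},
        "FRS-4 (Clear button)":        {"use_case": "UC-Login", "test_cases": ["TC-9", "TC-10"]},
    }

    def requirements_for(test_case_id):
        # Tracing back: given a failed test case, find the requirement(s) it belongs to.
        return [req for req, info in traceability.items() if test_case_id in info["test_cases"]]

    print(requirements_for("TC-10"))   # ['FRS-4 (Clear button)']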
Types of Test Cases: – The test cases are broadly classified into 2 types.
(i) User Interface Test Cases.
(ii) Functional Test Cases.
The functional test cases are further classified into two types.
(i) +ve Test Cases.
(ii) –ve Test Cases.
Guidelines for developing the user interface Test Cases.
1. Check for the availability of all the objects.
2. Check for the alignment of the objects.
3. Check for the consistency of the objects [Size, Color, Font, Type].
4. Check for the spelling & grammar.
Guidelines for developing the +ve Test Cases:
1. A test engineer should have a +ve perception.
2. A test engineer should consider the +ve flow of the application.
3. He should always use only valid inputs.
Guidelines for developing the –ve Test Cases:
1. A test engineer should have a –ve perception.
2. A test engineer should consider the –ve flow of the application.
3. He should use invalid inputs.
TEST CASE TEMPLATE
1. Test Objective.
2. Test Scenario.
3. Test Procedure (a functional level term used to test a particular functionality).
4. Test Data.
5. Test Cases.

1. Test Objective: – The main purpose of the document is described in this section.
2. Test Scenario: – The situations that are to be tested are described in this section.
3. Test Procedure: – It is a functional level term which describes how to perform testing on a functionality.
4. Test Data: – The data that is required for testing is described in this section.
5. Test Cases:
TEST CASES

T.C No | T.C Type | Description | Expected Value | Actual Value | Result
1 | UI | Check for the availability of all the objects. | All the objects must be available as per the OBJ Table. | All the objects are available as per the OBJ Table. | Pass
2 | UI | Check for the consistency of all the objects. | All the objects must be consistent. | All the objects are consistent. | Pass
3 | UI | Check for the spellings of all the objects. | All the objects must be spelled properly as per the OBJ Table. | All the objects are spelled properly as per the OBJ Table. | Pass
4 | UI | Check for the enable property of the Login, Clear and Cancel buttons. | The Login and Clear buttons must be disabled and the Cancel button must be enabled. | The Login, Clear and Cancel buttons are enabled. | Pass
5 | UI | Check for the cursor placement in the application. | The cursor must be placed in the user name field. | The cursor is placed in the user name field. | Pass
6 | +ve | Enter username and password as per the VIT and click on the Login button. | The corresponding page must be displayed as per the VIT. | The corresponding page is displayed as per the VIT. | Pass
7 | +ve | Enter username and password as per the VIT, select a database option and click on the Login button. | The corresponding page must be displayed as per the VIT with the mentioned database connection. | The corresponding page is displayed with the mentioned database connection. | Pass
8 | +ve | Enter username and password and check for the enable property of the Login button. | The Login button must be enabled. | The Login button is enabled. | Pass
9 | +ve | Enter some information into any of the fields and check for the enable property of the Clear button. | The Clear button must be enabled. | The Clear button is enabled. | Pass
10 | +ve | Enter some information into any of the fields and click on the Clear button. | All the fields must be cleared and the cursor should be placed in the user name field. | All the fields are cleared but the cursor is not placed in the user name field. | Pass
11 | +ve | Click on the Cancel button. | The login screen must be closed. | The login screen is closed. | Pass
12 | +ve | Check for the tabbing order. | The tabbing order must be username, password, connect to, login, clear and cancel. | The tabbing order is working properly. | Pass
13 | –ve | Enter username and password as per the IVIT and click on the Login button. | The corresponding error message should be displayed as per the IVIT. | The corresponding pages are displayed as per the IVIT. | Pass
14 | –ve | Enter either a username or a password, or select a database option, and check for the enable property of the Login button. | The Login button must be disabled. | The Login button is enabled. | Fail
OBJ TABLE

S.no | Object | Type
1 | User Name | Text Box
2 | Password | Text Box
3 | Connect To | Combo Box
4 | Login | Button
5 | Clear | Button
6 | Cancel | Button
VALID INPUTS TABLE (VIT)

S.no | User Name | Password | Expected Page | Actual Page
1 | Suresh | QTP | Admin page | Admin page
2 | Santos | Bunny | Home page | Home page
3 | Admin | Admin | Admin page | Admin page
4 | Madhav | Mmd | Home page | Home page
INVALID INPUTS TABLE (IVIT)

S.no | User Name | Password | Expected Page | Actual Page
1 | Sures | QTP | “Invalid Username, Please try again” | Error message displayed.
2 | Santos | Bun | “Invalid Password, Please try again” | Error message displayed.
3 | Test | Test | “Invalid Username & Password, Please try again” | Error message displayed.
TEST EXECUTION PHASE
In this phase the test engineer will do the following actions.
1. He will perform the action that is described in the description column.
2. He will observe the actual behavior of the application.
3. He will document the observed value under the actual value column of the test case document.
RESULT ANALYSIS PHASE
In this phase the test engineer will compare the actual value with the expected value; if both are matched he will mention the result as ‘Pass’, otherwise ‘Fail’.
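A minimal sketch of the comparison done in this phase; the two records are sample data shaped like the test case table above:

    test_cases = [
        {"tc_no": 11, "expected": "Login screen closed",   "actual": "Login screen closed"},
        {"tc_no": 14, "expected": "Login button disabled", "actual": "Login button enabled"},
    ]

    for tc in test_cases:
        # The result is 'Pass' only when the actual value matches the expected value.
        tc["result"] = "Pass" if tc["actual"] == tc["expected"] else "Fail"
        print(tc["tc_no"], tc["result"])    # 11 Pass, 14 Fail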
BUG TRACKING
‘Bug tracking’ is a process in which the defects are identified, isolated and maintained.
(1) Defect ID.
(2) Defect Description.
(3) Steps for reproducibility.
(4) Submitter.
(5) Date of Submission.
(6) Build Number.
(7) Version Number.
(8) Assigned to.
(9) Severity.
(10) Priority.
(11) Status.
(1) Defect ID: – The sequence of defect numbers is given in this field.
(2) Defect Description: – What exactly the defect is, is clearly described here.
(3) Steps for Reproducing: – The list of all the steps that are followed by the test engineer to identify the defect is listed out here, so that the developer can follow the same steps in order to reproduce the defect.
(4) Submitter: – The name of the test engineer who submitted the defect is mentioned here.
(5) Date of Submission: – The date on which the defect is submitted is mentioned here.
(6) Build Number: – The corresponding build number is mentioned here.
(7) Version Number: – The corresponding version number is mentioned here.
(8) Assigned To: – This field is not filled by the test engineer; it is filled by the Project Manager or Project Lead with the name of the developer to whom the defect is assigned.
(9) Severity: – Severity describes how serious the defect is. It is classified into 4 types:
a) Fatal  b) Major  c) Minor  d) Suggestion.
a) Fatal: – If at all the problem is related to a navigational block or the unavailability of functionality, then such a problem is treated as a ‘Fatal’ defect.
b) Major: – If at all a major functionality is not working fine, then such a problem is treated as a ‘Major’ defect.
c) Minor: – If at all the problem is related to the ‘look and feel’ of the application, then such a defect is treated as a ‘Minor’ defect.
(10) Priority: – The priority defines the sequence in which the defects have to be rectified. Priority is classified into 4 types:
a) Critical.  b) High.  c) Medium.  d) Low.
Usually the ‘Fatal’ defects are given ‘Critical’ priority,
‘Major’ defects are given ‘High’ priority,
‘Minor’ defects are given ‘Medium’ priority, and
‘Suggestion’ defects are given ‘Low’ priority.
But there are some situations wherein the priority changes.
Case 1: – Low severity, high priority. In case of a client visit, all the look and feel defects are given the highest priority.
Case 2: – High severity, low priority. Whenever a functionality is unavailable, the test engineer will raise it as a ‘Fatal’ defect. But if that functionality is still under development and will take some more time, then in such a situation it is given ‘Low’ priority by the Project Manager or Project Lead.
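A small sketch of the default severity-to-priority mapping described above, with an override for the two exceptional cases; the field values are illustrative:

    DEFAULT_PRIORITY = {
        "Fatal": "Critical",
        "Major": "High",
        "Minor": "Medium",
        "Suggestion": "Low",
    }

    def assign_priority(severity, override=None):
        # The Project Manager / Project Lead can override the default, e.g. a Minor
        # look-and-feel defect raised to Critical before a client visit (Case 1), or a
        # Fatal defect lowered because the functionality is still under development (Case 2).
        return override if override else DEFAULT_PRIORITY[severity]

    print(assign_priority("Minor", override="Critical"))   # Case 1: low severity, high priority
    print(assign_priority("Fatal", override="Low"))        # Case 2: high severity, low priority
    print(assign_priority("Major"))                        # default mapping: High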
BUG LIFE CYCLE
[Flowchart: a defect raised while testing build B#1 is set to New/Open → the development team asks “Is it really a defect?” → if not, the status becomes Testers Error, As per Design (requirement) or Hold; if yes, it goes through rectification and is set to Fixed for Verification → in the next build B#2 the test engineer verifies it → if it is rectified properly the status is set to Closed and testing stops, otherwise it is set to Reopen and the cycle repeats.]
New/Open: – Whenever the test engineer identifies a defect newly, for the first time, he will set the status as New/Open. Some companies call it ‘New’, and once the developer accepts it as a defect he will set it as ‘Open’.
Fixed for Verification: – Whenever the test engineer raises a defect and the developer rectifies it, the developer will set the status of the defect as ‘Fixed for Verification’ before releasing the next build.
Reopen and Closed: – Whenever the defects are rectified and the next build is released to the testing department, the test engineer will check whether the defects are rectified properly or not. If he feels a defect is not rectified properly he will set the status as ‘Reopen’; if at all he feels the defect is rectified properly he will set the status as ‘Closed’.
Hold: – Whenever the developer is in a confused situation about whether to accept or reject the defect, he will set the status as ‘Hold’.
As per Design: – Whenever new requirements are given and the developers incorporate those new changes and release the build to the testing department, the test engineers, not being aware of those new changes, will raise them as defects. The developers will then set the status as ‘As per Design’.
Testers Error: – If at all the developer feels it is not at all a defect, then he will set the status as ‘Testers Error’.
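A hedged sketch of the status flow described above, written as a tiny transition table; the exact set of transitions varies from company to company, so this is only one reasonable reading of the diagram:

    # Allowed status transitions in the bug life cycle (illustrative).
    TRANSITIONS = {
        "New/Open":               ["Fixed for Verification", "Hold", "As per Design", "Testers Error"],
        "Fixed for Verification": ["Closed", "Reopen"],
        "Reopen":                 ["Fixed for Verification"],
    }

    def move(current, new):
        if new not in TRANSITIONS.get(current, []):
            raise ValueError("%s -> %s is not a valid transition" % (current, new))
        return new

    status = "New/Open"
    status = move(status, "Fixed for Verification")   # developer rectifies the defect
    status = move(status, "Reopen")                   # tester finds it is not rectified properly
    status = move(status, "Fixed for Verification")   # developer fixes it again
    status = move(status, "Closed")                   # tester confirms the fix
    print(status)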
REPORTING PHASE
(1) Classical Bug Reporting Process:
Drawbacks: – Time consuming, Redundancy, insecurity.
(2) Common Repository oriented Bug reporting Process:
Drawbacks: – Time consuming, Redundancy.
(3) Bug Tracking Tool oriented Bug Reporting Process:
A bug tracking tool is software which can be accessed by authorized persons only. It is used for the complete bug tracking process.
TEST DESIGN TECHNIQUES
Whenever the test engineer is developing the test cases he may face some difficulties. In order to overcome those difficulties and complete the task easily, there are some techniques known as ‘Test Design Techniques’.
(1) Boundary Value Analysis: – Whenever the test engineer needs to develop the test cases for a range kind of input, he will use a technique called ‘Boundary Value Analysis’. Using this technique one will concentrate on the values at and around the boundaries of the range instead of testing every value.
(2) Equivalence Class Partition: – It is a technique used by the test engineer in order to develop the +ve and –ve test cases easily for a functionality which has a large number of validations. Using this technique one can divide the inputs into valid class inputs and invalid class inputs.
Case Study: – Develop the test cases to test a text box which has the following validations (a sketch applying ECP and BVA to this case study follows the list).
(a) It should accept a minimum of 4 characters and a maximum of 20 characters.
(b) It should accept small letters a – z only.
(c) It should accept the special symbols ‘@’ and ‘_’ only.
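A minimal sketch of how ECP and BVA could be applied to this case study; the regular expression is one possible reading of the three rules, and all sample values are invented:

    import re

    # Valid class (ECP): lower-case a-z plus '@' and '_', length 4 to 20.
    VALID_PATTERN = re.compile(r"^[a-z@_]{4,20}$")

    def is_valid(text):
        return bool(VALID_PATTERN.match(text))

    # Boundary Value Analysis on the length range 4..20: test at and just beyond the boundaries.
    for length in (3, 4, 5, 19, 20, 21):
        print(length, is_valid("a" * length))   # expect: False, True, True, True, True, False

    # Equivalence Class Partition: one representative value per class is enough.
    print(is_valid("user_name@x"))   # valid class: allowed characters, allowed length
    print(is_valid("UserName"))      # invalid class: upper-case letters
    print(is_valid("user#name"))     # invalid class: other special symbols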