Tuesday, January 22, 2008

test manual

In the earlier decades of software development, the development process was done through an ad-hoc methodology, which means people used to develop software without any proper planning. This finally led to an increase in the maintenance cost of projects: organizations were spending a lot of money on maintenance because of improper planning. At that point software engineering came into the picture, which guides you in a proper way to develop software with complete planning from start to end. This results in a decrease in the maintenance cost of the project.
Software Engineering: It is a systematic approach which helps in developing software and maintaining it until it becomes an error-free, superior product. In this, the SDLC plays a major role and specifies the systematic approach through its phases:

Requirements Gathering → Analysis → Designing → Coding → Testing → Release and Maintenance

Among those 6 phases, the Testing phase plays a vital role before releasing the software product to the customer.

What is testing?
A: Software testing is the process of seeking errors. Mod: Testing is a process of confirming that the product is working perfectly, meeting customer requirements and satisfying the customer.

Why should we do testing?
A: Because the cost of not testing is much greater than the cost of testing, and to meet the customer requirements. Mod: for quality assurance, an error-free superior product, and competitive advantage.

What is the need for testing in SDLC?
A: It is the assurance given to the customer that our product is a quality product. Quality means meeting customer requirements: requirements in terms of functionality, requirements in terms of expectations, cost to purchase, and time to release.

How to do testing?
A: Software Testing is done in 2 ways
1. Manual Testing
2. Automation Testing

The SDLC helps us develop the software by using one of its four different models:
1. Water fall model
2. Prototype model
3. Spiral model
4. Agile model

Waterfall model: this model is preferred when customer requirements are clear and complete.
Prototype model: this model is preferred when customer requirements are not clear. Development people help the customer get a clear and complete idea by preparing samples or prototypes, eg: through a PowerPoint presentation.

Spiral Model: this model is preferred when the product needs to be released in the form of versions for future enhancement. Eg: ver1, ver2, ver3.

Agile model: when customer requirements are changing dynamically, this model is preferred. Eg: a cricket web site, where the customer needs the web pages modified minute to minute; at that stage the agile model is preferred.

SQA: Software Quality Assurance: monitoring and measuring the strength of the development process is called software quality assurance. From these SQA concepts, every organization plans for multiple stages of testing and a separate testing team.

Fish Model: it is a theoretical model which maps each development phase to a verification activity:

Requirements Gathering [BRS] → Reviews
Analysis [SRS: Software Requirement Specification] → Reviews
Design [HLD's and LLD's] → Prototype reviews
Coding [Programs] → White Box testing
System Testing [Build] → Black Box testing
Maintenance → Test software changes

Program: a set of instructions passed into the computer in order to perform an operation is known as a program.
Module: a collection of programs is known as a module.
Software Project: a collection of modules is known as a software project.

BRS: Business Requirement Specification: it defines the requirements of the customer to be developed as a software product. This document is also known as the customer requirement specification or user requirement specification. Prepared by: Business Analyst.

SRS or FRS: Software Requirement Specification: it defines the functional requirements to be developed and the system requirements to be used. It is an implemented form of the BRS. Prepared by: Business Analyst.

Review: it is a static testing technique. The responsible people use this technique to estimate the completeness and correctness of the corresponding documents. Eg: walkthroughs, inspections and peer reviews.

The SRS is just an English-like document. It does not need dynamic testing; only a review is needed.

Prototype: it is a sample model of an application. Eg: a PowerPoint presentation.

HLD's: High Level Design documents, which specify the entire architecture of the system to be developed, from root to leaf functionalities. Also known as architectural design or external design.

LLD's: Low Level Design documents, which specify the internal logic of the corresponding module or functionality. These LLD's are also known as detailed design or internal design documents.

Build: after completion of development, the finally integrated set of all programs in executable form. A prototype is a sample; a build is the real software.

White box testing: it is a program-level testing technique. During white box testing, responsible people verify the internal structure of the corresponding program. This testing is also known as open box testing.

Black Box Testing: it is a build-level testing technique.
During this test, responsible people validate the external functionality of the complete software.
Black box testing is also known as closed box testing. The combination of white box and black box is called grey box testing.

Software Testing: the verification of the software development process and the validation of the software product before release to the customer is called software testing.

Reviews in analysis: in general, the software development process starts with requirements gathering and analysis. In this phase, business analyst category people develop the BRS and SRS. Before going to design, the same analysts conduct review meetings to estimate the completeness and correctness of those documents. In this review the business analysts apply the below checklist on the BRS and SRS (SRS → BRS):
1. Are they complete?
2. Are they the right requirements?
3. Are they achievable?
4. Are they reasonable?
5. Are they testable?
Reviews in design: after completion of analysis and its reviews, designing category people develop the HLD's and LLD's. After developing these documents, the same designers conduct a review meeting to estimate the completeness and correctness of the design documents (LLD → HLD):
1. Are they understandable?
2. Are they complete?
3. Do they meet the right requirements?
4. Are they followable [w.r.t. coding]?
5. Are they handling errors?
Unit Testing: after completion of design and its reviews, the corresponding programmers start coding to construct the software physically. In this coding stage, the programmers verify the internal structure of every program with the help of white box testing techniques. The white box testing techniques are classified into 4 divisions:

1. Basis Path Testing
2. Control Structure Testing
3. Program Technique Testing
4. Mutation Testing
Basis Path Testing: after completing a program, the corresponding programmer follows the below approach to verify the execution of every statement in that program:
1. Draw a flow graph for the corresponding program
2. Calculate the cyclomatic complexity [the number of independent paths in the program]
3. Run that program more than once to cover all independent paths
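As an illustrative sketch (the grading function and its names are hypothetical, not from any specific project), a program with two simple decisions has a cyclomatic complexity of 2 + 1 = 3, so three runs cover all independent paths:

```python
# Hypothetical example: a small grading function with two decisions,
# so its cyclomatic complexity is 2 + 1 = 3 independent paths.
def grade(score):
    if score < 0:        # decision 1: invalid-input path
        return "invalid"
    if score >= 50:      # decision 2: passing path
        return "pass"
    return "fail"        # fall-through path

# Running the program once per independent path covers every statement:
assert grade(-5) == "invalid"   # path 1
assert grade(75) == "pass"      # path 2
assert grade(30) == "fail"      # path 3
```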
Control Structure Testing: after executing the program from different angles, the corresponding programmer concentrates on the correctness of that program's functionality with the help of the below coverages.

1. Conditional coverage
2. Data flow coverage
3. Loops coverage

1. Conditional coverage: the correctness of every condition in that program.
2. Data flow coverage: the creation and utilization of every variable in that program.
3. Loops coverage: the termination of every loop after executing n number of times.
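A minimal sketch of loops coverage (the summing function is hypothetical): the loop is exercised zero times, once, and n times, and each run must terminate with the right result.

```python
# Hypothetical function whose loop is the structure under test.
def total(values):
    result = 0
    for v in values:     # loops coverage target
        result += v
    return result

# Loops coverage: zero iterations, one iteration, n iterations.
assert total([]) == 0            # loop body never runs
assert total([7]) == 7           # loop runs exactly once
assert total([1, 2, 3]) == 6     # loop runs n times and terminates
```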

Program Technique Testing: checks whether the corresponding program takes less time to complete execution or not. Programmers use Java and .NET profilers to calculate the execution time of the program.

Mutation Testing: mutation means a change in program logic/coding. Programmers use this technique to confirm whether a program is completely tested or not. After the change, if the existing tests detect the difference, then the program was completely tested.
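A sketch of the idea (all functions here are hypothetical): a mutant copies the program with one logic change; if the existing tests fail on the mutant ("kill" it) while passing on the original, that logic was genuinely tested.

```python
# Original program under test (hypothetical):
def is_adult(age):
    return age >= 18

# Mutant: the same logic with ">=" deliberately changed to ">".
def is_adult_mutant(age):
    return age > 18

def suite_passes(fn):
    # The unit test suite, applied to either version of the program.
    return fn(18) is True and fn(17) is False

assert suite_passes(is_adult) is True          # suite passes the original
assert suite_passes(is_adult_mutant) is False  # suite kills the mutant
```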

Integration testing: after completion of individual programs' development and unit testing, programmers interconnect or integrate them to form a system (i.e. complete software). Integration testing is also known as interface testing. There are 4 approaches to integration:

1. Top down approach [ STUB]
2. Bottom up approach[ DRIVER]
3. Hybrid approach
4. Big bang approach

Top down approach: in this approach, programmers interconnect the main module with some of the sub modules. In place of the remaining under-construction sub modules, programmers use temporary programs called STUBS. These STUBS are also known as called programs.

Bottom up approach: in this approach, programmers interconnect sub modules without using the under-construction main module. Instead of the main module, programmers use a temporary program called a DRIVER, which is also known as a calling program.

Hybrid approach: it is also known as the sandwich approach. It is a combination of the top down and bottom up approaches.

Big Bang Approach: it is also known as the system approach. In this approach, programmers interconnect the programs only after completing all program development and unit testing.
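A sketch of stubs and drivers in Python (the module and function names are invented for illustration):

```python
# Top-down: the main module is ready but a sub module is not, so a
# STUB (a temporary called program) stands in for the missing callee.
def tax_stub(amount):
    return 0.0                     # hard-coded placeholder result

def billing_main(amount, tax_fn=tax_stub):
    return amount + tax_fn(amount)

assert billing_main(100.0) == 100.0   # main module tested against the stub

# Bottom-up: the sub module is ready but main is not, so a DRIVER
# (a temporary calling program) exercises the sub module directly.
def real_tax(amount):
    return amount * 0.10

def tax_driver():
    return real_tax(200.0)            # the driver just calls the sub module

assert abs(tax_driver() - 20.0) < 1e-9
```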

Case Study:
Top down approach: followable when the requirements are changing dynamically.
Bottom up approach: followable when the customer requirements are constant but the architecture is changing.
Hybrid approach: followable when the requirements and the architecture are both constant.
Big bang approach: followable when our system consists of a small number of interconnections between the modules.

Black box testing:

System Testing: after completion of integration testing, the development people release an initial build to the separate testing team in the organization. This separate testing team follows black box testing to validate the customer's requirements in that build. System testing is classified into three divisions:

1. Usability testing
2. Functional testing
3. Non Functional testing.

Usability testing: in general, the separate testing team starts software testing by validating user-friendliness. Usability testing is classified into the below sub-tests.
User interface testing:

1. Ease of use [understandability of the screen elements]
2. Look and feel [attractiveness of the screen elements]
3. Speed of interface [fewer events to complete a task]

During this test, test engineers concentrate on the above factors as applied to every screen [i.e. window] of the build.

Manual support testing: in this, the separate testing team validates the correctness and completeness of the corresponding user manual or help document with respect to the build operations.

The usability testing flow: receive build from developers → user interface testing → manual support testing → then functional and non-functional testing.

Functional Testing:

This part is a mandatory level in the system testing phase. In this part the separate testing team concentrates on meeting the requirements in the build.
Functionality testing: it is also known as "requirement testing". During this test, test engineers validate the correctness of every functionality through the below coverages:

1. GUI coverage: changes in the properties of objects on the screens. Eg: first the cursor is highlighted in username, then it comes to password and after that the OK button; this whole flow falls under this coverage.
2. Error handling coverage: preventing wrong operations. If we enter a wrong user id and password, it gives an error message such as "invalid password".
3. Input domain coverage: correctness of the size and type of inputs.
4. Output manipulations coverage: correctness of functionality outputs.
5. Back end coverage or database coverage: the impact of front end operations on back end tables.
6. Service level coverage [order of functionality]: when an object should respond and when it should not.

Input domain coverage:
It is a part of functionality testing, but test engineers give it special treatment with the help of mathematical notations such as:
BVA: Boundary Value Analysis
ECP: Equivalence Class Partition
BVA-like notations are used for the range or size of object values, and ECP is used for the type of the object values.
BVA checks:
Min → pass
Min-1 → fail
Min+1 → pass
Max → pass
Max-1 → pass
Max+1 → fail

Example
A login process allows a user id and password to authorize users. The user id object takes alphanumerics in lower case, 4 to 18 characters long. The password object takes alphabets in lower case, 4 to 8 characters long. Prepare BVA and ECP for the user id and password.

User Id:
BVA [size]
Min → 4 characters → pass
Min-1 → 3 characters → fail
Min+1 → 5 characters → pass
Max → 18 characters → pass
Max-1 → 17 characters → pass
Max+1 → 19 characters → fail
ECP [type]
Valid: a–z, 0–9
Invalid: A–Z, special characters and blank fields

Password:
BVA [size]
Min → 4 characters → pass
Min-1 → 3 characters → fail
Min+1 → 5 characters → pass
Max → 8 characters → pass
Max-1 → 7 characters → pass
Max+1 → 9 characters → fail
ECP [type]
Valid: a–z
Invalid: A–Z, 0–9, special characters and blank spaces
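The six BVA checks above follow a fixed pattern, so they can be generated mechanically; a small sketch (the helper function is hypothetical):

```python
# Generate the six standard BVA checks for a size or range rule:
# min-1, min, min+1, max-1, max, max+1.
def bva_cases(min_val, max_val):
    return {
        min_val - 1: "fail",   # just below the minimum
        min_val:     "pass",
        min_val + 1: "pass",
        max_val - 1: "pass",
        max_val:     "pass",
        max_val + 1: "fail",   # just above the maximum
    }

# Password rule from the example above: 4 to 8 characters.
cases = bva_cases(4, 8)
assert cases[3] == "fail" and cases[4] == "pass" and cases[5] == "pass"
assert cases[7] == "pass" and cases[8] == "pass" and cases[9] == "fail"
```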

Sanitation testing:
It is also known as garbage testing. During this test, test engineers detect extra functionalities in the application build w.r.t. the requirements.

Non functional testing:
After completing all possible usability and functional tests, the testing team concentrates on other quality factors to validate in the application or product [build].

Recovery testing:
It is also known as reliability testing. During this test, test engineers validate whether our application build changes from an abnormal state back to a normal state or not.

Normal state → abnormal state → [backup / recovery procedure] → normal state

Inter system testing: it is also known as end to end testing. During this test, test engineers validate whether our application build shares common resources with other application builds. If there are no common resources, test engineers do not apply this testing.

Security testing: it is also known as penetration testing. During this test, test engineers validate:

1. Authorization
2. Access control
3. Encrypt and decrypt


1. Authorization: user validation to connect to the application. Eg: login with password, digital signature, PIN number, fingerprints etc.
2. Access Control: the authority of valid users to access specified services or functionalities. Eg: administrator vs. user.
3. Encrypt and decrypt: code conversion between the client and server processes.

Client encrypts the request → server decrypts it; server encrypts the response → client decrypts it.
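To make the idea concrete, here is a toy sketch of code conversion between client and server; the XOR scheme and the key are purely illustrative (real systems use TLS/AES, never XOR):

```python
KEY = 23  # shared secret between client and server (hypothetical)

def encrypt(text, key=KEY):
    return bytes(b ^ key for b in text.encode())

def decrypt(data, key=KEY):
    return bytes(b ^ key for b in data).decode()

request = "GET /balance"
cipher = encrypt(request)            # client encrypts the request
assert cipher != request.encode()    # the wire form is not plain text
assert decrypt(cipher) == request    # server decrypts and recovers it
```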

Compatibility testing: it is also known as portability testing. During this test, test engineers validate whether our build runs on all customer-expected platforms or not. Platform means the operating system, compiler, browser and other system software.

Configuration Testing: it is also known as hardware compatibility testing. During this test, test engineers validate whether our build runs with hardware devices of different technologies or not. Eg: different technology networks, different topologies [arrangements of computers], different technology printers etc.

Performance testing:

Load Testing: the execution of our build under the customer-expected configuration and customer-expected load to estimate performance is called load testing, scalability testing or capability testing. Load or scale means the number of concurrent users accessing or operating our build.

Stress Testing: the execution of our build under the customer-expected configuration and various load levels, from low to peak, to estimate consistency or reliability of processing is called stress testing. [i.e. whether the application keeps responding under load over a long time or not]

Storage Testing: it is also known as memory testing. During this test, testing people calculate the peak limit of storage handled by our build to store user data, in terms of bytes. Eg: MS Access technology builds support a 2 GB database as the maximum.

Data volume testing: during this test, testing people calculate the size of data, in terms of the number of records, handled by our build.

Parallel Testing: it is also known as competitive or comparative testing. During this test, test engineers compare our build with old versions of the same product or with similar products in the market to find competitiveness.

Installation testing: during this test, test engineers validate the correctness and completeness of the installation process of our build onto customer-expected configuration systems.


User Acceptance Testing: to collect feedback from real customers or model customers, our project management conducts user acceptance testing. There are two types of user acceptance testing: alpha testing and beta testing.


Alpha Testing:
1. At the development site
2. By a real customer
3. For a software application

Beta Testing:
1. At a model customer site
2. By a model customer
3. For a software product

Release and maintenance: after completion of user acceptance testing and its modifications, project management defines a release team with a few developers, a few testers and a few software engineers. This release team conducts port testing at the corresponding customer site.

During this port testing, release team members observe the below factors:
1. Compact installation
2. Overall functionality
3. Input device handling [eg: keyboard, mouse etc.]
4. Output device handling [eg: monitors]
5. Secondary storage device handling [floppy disk, CD-ROM]
6. O/S [operating system] error handling
7. Co-existence with other software in the customer site environment to share common resources

After completing the above observations, the release team provides training sessions to end users at the customer site.
During utilization of the software, customer site people send change requests to our company. To receive and perform those changes, project management defines a CCB (Change Control Board) with a few developers, a few testers and a few hardware engineers.

End of part1



Part II

Planned testing vs ad-hoc testing: in general, every testing team plans a complete system test to be applied on the build. Due to some risks, sometimes the testing team cannot conduct that planned system testing. Instead of planned system testing, the testing team conducts informal testing called ad-hoc testing. There are different types of ad-hoc testing:
a. Monkey testing: due to lack of time, the testing team covers testing of the main activities of the build only. It is also known as chimpanzee testing.

b. Buddy testing: due to lack of time, test engineers sit with programmers to conduct testing while the programs are being coded. In this style of testing, test engineers do not wait to receive the complete build. "Buddy" means a group of programmers and test engineers.

c. Pair testing: due to lack of knowledge of the domain [type of project], test engineers group together to conduct testing. Every pair of testers consists of a junior tester and a senior tester, so they can share their knowledge.

d. Exploratory testing: due to lack of documentation, test engineers apply testing on the build w.r.t. the available documents, through discussions, past experience, browsing similar products and internet surfing. This type of testing is called exploratory testing.

e. Bebugging: to estimate the efficiency of test engineers, programmers add known bugs to programs. This type of bug seeding is called bebugging.

Testing Terminology:
Test Strategy: it is a document that specifies the approach to be followed by the testing team.

Test planning: it is a document that specifies the schedule of testing, including work allocation. The test plan is an implemented form of the test strategy document.

Test Scenario or test case titles: it specifies a unique condition to be applied on the build and validated with the help of test cases.

Test Procedure: it is also a document; it specifies a step by step procedure to execute the corresponding test case on the build. This procedure is also known as a test script.

Test Log: it is also a document; it specifies the results of test cases after execution on the build. Eg: password failed.

Error, defect and bug: a mistake in coding is called an error. A mistake found by test engineers during testing is called a defect or issue. A defect or issue accepted by developers for resolution is called a bug.

Regression testing: the re-execution of selected tests on a modified build, to ensure that the bug fix works and that no side effects occur, is called regression testing. The selected tests include the failed tests that exposed the defect, the related passed tests, and a sample of the remaining tests. Each selected test is re-executed on the modified build and marked pass or fail; new failures are reported back to the developers as defects.
Retesting and iterative testing:
The re-execution of a test on the same application build with multiple test data is called retesting or data driven testing.

System testing process:

Modified V model (development phase → parallel testing activity):

Requirements Gathering [BRS]
Analysis [SRS]
Design → Test Initiation
Coding and Unit Testing → Test Planning
Integration Testing → Test Design
Initial Build → Test Execution → Test Closure
User Acceptance Testing
Release and Maintenance

In the above software development process model, a testing process is inserted with a separate testing team. This separate testing team conducts only system testing.
Test Initiation: in this stage, the project manager or quality analyst develops the test strategy document. This document defines the approach to be followed by the testing team.

Components in a test strategy:
1. Scope and objective: the importance of testing in that project.
2. Business issues: budget control for testing. Eg: out of 100% of project cost, roughly 64% goes to development and maintenance and 36% to testing.

3. Test approach: the selected list of test factors to be applied on the build. This factor selection depends on the project requirements, the scope of those requirements, and the risks involved for the company.

4. Roles and responsibilities: the names of the jobs in the testing team and their responsibilities.

5. Test deliverables: the names of the testing documents and their formats [IEEE 829].

6. Communication and status reporting: the required negotiation between every two jobs in the testing team.

7. Test automation and testing tools: the possibility of test automation and the availability of testing tools in the organization.

8. Defect reporting and tracking: the required negotiation between the development team and the testing team to resolve defects.

9. Risks and mitigation: the list of reasonable risks raised during testing and the solutions to overcome them.

10. Change and configuration management: this concept maintains the software development process deliverables w.r.t. changes in requirements. Eg: SRS, HLD's, LLD's, programs, test plans, test cases etc.

11. Testing measurements and metrics: the list of measurements and metrics to be used.

12. Training plan: the required number of training sessions for the testing people to understand the project / customer requirements.

Test Factors: every test factor defines a testing issue or testing topic.
In general, 15 testing topics are available in the market to define quality software. These 15 factors are not mandatory for every project's system testing. The selection of the required factors from those 15 depends on the requirements of the project, the scope of the requirements, the risks involved and future enhancements for the company.

Testing Topics:
1. Authorization: validity of users to connect to the application.
2. Access control: authorities of valid users to use specific services in the software.
3. Audit trail: internal data generation w.r.t. user operations.
4. Continuity of processing: integration of project execution without any deadlock.
5. Correctness: meeting customer requirements in terms of functionality.
6. Coupling: co-existence with other software applications to share common resources.
7. Data integrity: our software takes the correct type and size of inputs.
8. Ease of use: user-friendliness of the software.
9. Ease of operation: easy to install and uninstall.
10. Portable: runs on any platform or multiple platforms.
11. Performance: speed of processing.
12. Reliability: recovery from abnormal situations.
13. Service levels: order of functionality.
14. Maintenance: whether our software product is serviceable to customer site people for a long time or not.
15. Methodology: whether our testing team follows rules and standards while testing or not.

Test factors vs testing techniques:
1. Authorization → security testing
2. Access control → security testing
3. Audit trail → functionality testing
4. Continuity of processing → integration testing by developers
5. Correctness → functionality testing
6. Coupling → inter system testing
7. Data integrity → input domain testing
8. Ease of use → user interface testing and manual support testing
9. Ease of operation → installation testing
10. Portable → compatibility testing and configuration testing
11. Performance → load testing, stress testing, storage testing and data volume testing
12. Reliability → recovery testing and stress testing
13. Service level → functionality testing
14. Maintainable → compliance testing
15. Methodology → compliance testing [whether our organization follows complete standards and protocols while testing]

In this way, test factors define quality software, and test cases built from the corresponding testing techniques verify those factors.

Test Planning: after finalization of the test strategy for a project, the responsible test lead category people prepare the system test plan and unit test plans. The test plan document defines what to test, how to test, when to test and who will test. To develop these test plan documents, test lead category people follow the below approach:

Testing team formation → identification of tactical risks → prepare test plan documents → review test plan documents

Following the above approach, the test lead prepares the system test plan and then divides that plan into multiple module (unit) test plans.

Testing team formation: in general, the test planning process starts with testing team formation. In this stage the test lead depends on the below factors:

Project size
Availability of test engineers
Test duration
Availability of test environment resources

Typical test durations:
Client/server, web, ERP projects → 3 to 5 months of system testing
System software → 7 to 9 months of system testing
Machine-critical software (eg: robotic and satellite applications) → 12 to 15 months of system testing

Identification of tactical risks: after completing testing team formation, the test lead concentrates on risk analysis. Eg:

Risk 1: lack of knowledge of the domain
Risk 2: lack of time
Risk 3: lack of resources
Risk 4: delays in delivery
Risk 5: lack of communication between the testing team and the development team
Risk 6: lack of documentation
Risk 7: lack of a development process
Risk 8: laziness of developers

Prepare test plans:
After completing testing team formation and risk analysis, test lead category people develop the test plan document in the IEEE 829 format:

1. Test plan ID: a unique name or number to refer to this plan document in future.
2. Introduction: the summary of the project.
3. Features to be tested: the list of modules to be tested with new test cases.
4. Features not to be tested: the list of modules to be tested with existing test cases.
5. Test approach: the list of selected testing techniques to be applied, w.r.t. the test strategy.
6. Entry criteria: establish the test environment with the required hardware and software, prepare complete and correct test cases, and receive a complete build from the developers.
7. Suspension criteria: problems arise in the test environment, or the number of pending defects grows too large. Pending defects are also known as the quality gap.
8. Exit criteria: all selected modules tested and all high-severity bugs resolved.
i. All features tested
ii. All major defects resolved
iii. Test duration exceeded
9. Test deliverables: the list of test document names to be prepared by test engineers: test cases, test procedures, automation programs, test logs, defect reports and summary reports.
10. Test environment: the list of all required hardware and software.
11. Staff: the names of the selected test engineers.
12. Responsibilities: the mapping between the selected test engineers and the modules [who will test].
13. Schedules: dates and times [when to test].
14. Risks: the list of risks previously analyzed by the test lead.
15. Approvals: the signatures of the test lead and the project manager.

Review test plan: after the test plan document preparation is complete, test lead category people conduct a review meeting to estimate the completeness and correctness of that test plan document. In this review meeting, the test lead depends on 3 coverages:

Requirements coverage [what to test]
Testing techniques coverage [how to test]
Risks coverage [who will test and when to test]

After completing the test plan review and its modifications, the test lead conducts training sessions for the selected test engineers about the project. In this training, project management invites some subject or domain experts to share knowledge with the test engineers.

Test Design: after completing the required number of training sessions, every test engineer in the testing team prepares test cases. Every test case defines a unique test condition to be applied on the build. There are 3 methods to prepare complete test cases for system testing of our project:

1. Business logic based or requirements based test case design
2. Input domain based or design document based test case design
3. User interface based or prototype based test case design

1. Business logic based test case design:
In general, test engineers prepare the maximum number of test cases depending on the functional specifications in the SRS. In this method, test engineers follow the below approach:

Step 1: collect the required functional specifications from the SRS
Step 2: select one functional specification from that list

i. Identify the entry point [base state]
ii. Identify the inputs [test data]
iii. Identify the output [expected]
iv. Study the normal flow
v. Identify the exit point [end state]
vi. Identify the alternative flows and exceptions

Step 3: prepare test case titles or test scenarios
Step 4: review those test case titles for completeness and correctness
Step 5: prepare a document for every test case title with complete information
Step 6: go to step 2 until all functional specifications in the list are covered.

Functional Specification

1. A login process allows a user id and password to authorize users. The user id takes alphanumerics in lower case, 4 to 16 characters long. The password takes alphabets in lower case, 4 to 8 characters long. Prepare test case titles. Observe every line and find out the factors. Every test case title starts with the word "verify" or "check".

Test case 1:
Verify user ID
BVA [size]
Min → 4 chars → pass
Min-1 → 3 chars → fail
Min+1 → 5 chars → pass
Max → 16 chars → pass
Max-1 → 15 chars → pass
Max+1 → 17 chars → fail

ECP [type]
Valid: a–z, 0–9
Invalid: A–Z, special characters and blank spaces


Test case 2:
Verify password
BVA [size]
Min → 4 chars → pass
Min-1 → 3 chars → fail
Min+1 → 5 chars → pass
Max → 8 chars → pass
Max-1 → 7 chars → pass
Max+1 → 9 chars → fail

ECP [type]
Valid: a–z
Invalid: A–Z, 0–9, special characters and blank spaces
Test case 3:
Verify login operation

User ID | Password | Criteria
Valid | Valid | pass
Valid | Invalid | fail
Invalid | Valid | fail
Valid | blank | fail
blank | Valid | fail

This combination table is also called form validation.
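The combination table above can be turned directly into a data-driven check; a sketch in Python, where the validation function and its rules are hypothetical (lower-case alphanumeric user id of 4 to 16 characters, lower-case alphabetic password of 4 to 8 characters):

```python
# Hypothetical login validation matching the example specification.
def login_ok(user_id, password):
    return (user_id.isalnum() and user_id.islower()
            and 4 <= len(user_id) <= 16
            and password.isalpha() and password.islower()
            and 4 <= len(password) <= 8)

# One row per line of the form-validation table: (user id, password, expected).
test_data = [
    ("tester01", "secret", True),    # valid / valid   -> pass
    ("tester01", "SECRET", False),   # valid / invalid -> fail
    ("ab",       "secret", False),   # invalid / valid -> fail
    ("tester01", "",       False),   # valid / blank   -> fail
    ("",         "secret", False),   # blank / valid   -> fail
]
for user_id, password, expected in test_data:
    assert login_ok(user_id, password) is expected
```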
Functional specification: a shopping application allows purchase orders for different types of items. In a purchase order, the user selects an item number and enters a quantity up to 10. The purchase order returns one item's price and the total amount w.r.t. the given quantity. Prepare test case titles or test scenarios.

"Select" means the object is a list box; "enter" means the object is a text box. For selection objects, ECP and BVA are not required.

"Up to" means Min is 1 and Max is 10. When an object takes alphabets we write size; if it takes numerics, then range.

Test case 1:
Verify item number selected

Test case 2:
Verify quantity
BVA [range]
pass → 1  → Min
fail → 0  → Min-1
pass → 2  → Min+1
pass → 10 → Max
pass → 9  → Max-1
fail → 11 → Max+1

ECP [type]
Valid: 0-9 [numerics]
Invalid: a-z, A-Z, special characters and blank field
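The Min-1 / Min / Min+1 / Max-1 / Max / Max+1 pattern used above can be generated mechanically from any numeric range. `bva_points` is a hypothetical helper, sketched here to show the derivation for quantity 1..10:

```python
# Derive BVA test points for an inclusive numeric range [minimum, maximum].
def bva_points(minimum: int, maximum: int) -> dict:
    return {
        "fail": [minimum - 1, maximum + 1],                       # Min-1, Max+1
        "pass": [minimum, minimum + 1, maximum - 1, maximum],     # Min, Min+1, Max-1, Max
    }

points = bva_points(1, 10)  # quantity range from the specification above
assert points["fail"] == [0, 11]
assert points["pass"] == [1, 2, 9, 10]
```

The same helper applied to any other range (e.g., an amount field of 1500 to 10000) yields its six boundary points without rewriting the table by hand.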

Case 4: If the development team released a modified build due to sudden changes in customer requirements, then test engineers re-execute all P0, all P1, and carefully selected P2 test cases with respect to the changed requirements.
Test Reporting: During level 1 and level 2 test execution, test engineers report mismatches between the expected values specified in the test cases and the actual values observed in the build to the development team. For this, test engineers use the IEEE 829 defect report format.
Format :
1. Defect ID: a unique name or number.
2. Description: a summary of the defect.
3. Build version ID: the version number of the build in which test engineers detected this defect.
4. Feature: the name of the module or functionality in which the defect was detected (e.g., the mailing module of that build version).
5. Test case name: the name of the failed test case during whose execution the defect was detected.
6. Reproducible: Yes / No. Yes: the defect appears every time the test is executed. No: the defect appears only rarely.
7. If yes, attach the reproduction procedure.
8. If no, attach a snapshot and a strong reason.
9. Severity: the seriousness of the defect in terms of functionality.
a. High: without resolving this defect, test engineers are not able to continue the remaining testing. Compulsory and urgent to resolve.
b. Medium: test engineers can continue the remaining testing, but the defect must be resolved before releasing the software to the customer. Compulsory but not urgent to resolve.
c. Low: may or may not be resolved; the software can be released to the customer without resolving it.
10. Priority: the importance of the defect in terms of the customer [high / medium / low].
11. Status: New / Reopen. New: reported for the first time. Reopen: reported again.
12. Detected by: the name of the test engineer.
13. Detected on: the date of reporting.
14. Assigned to: the name of the responsible person at the development site who receives this defect report.
15. Suggested fix [optional]: a list of suggestions to resolve this defect.

Defect submission process: Test manager

Test Case 3:
Verify Calculation such as Total = price * given quantity
Functional specification: In an insurance application, users can apply for different types of policies. If a user selects Type-A insurance, the system asks the user's age. The age value should be greater than 16 years and less than 80 years. Prepare test case titles or test scenarios.
Test case 1:
Verify Type-A insurance selected
Test case 2:
Verify age appearance / focus after selection of Type-A insurance
Test case 3:
Verify age value
BVA [range]
pass → 17 → Min
fail → 16 → Min-1
pass → 18 → Min+1
pass → 79 → Max
pass → 78 → Max-1
fail → 80 → Max+1

ECP [type]
Valid: 0-9
Invalid: a-z, A-Z, special characters and blank field
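Because the specification says strictly greater than 16 and strictly less than 80, the valid boundaries are 17 and 79, as the table above shows. A minimal sketch, with `is_valid_age` as an assumed helper:

```python
# Age must be strictly greater than 16 and strictly less than 80,
# so the valid range is 17..79 inclusive.
def is_valid_age(age: int) -> bool:
    return 16 < age < 80

assert not is_valid_age(16)  # Min-1 -> fail (boundary excluded)
assert is_valid_age(17)      # Min   -> pass
assert is_valid_age(18)      # Min+1 -> pass
assert is_valid_age(78)      # Max-1 -> pass
assert is_valid_age(79)      # Max   -> pass
assert not is_valid_age(80)  # Max+1 -> fail (boundary excluded)
```

Strict inequalities are a common source of off-by-one defects, which is exactly why BVA probes both sides of each boundary.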
Functional specification: A door opens when a person comes in front of it and closes when the person goes inside. Prepare test cases.

Test case 1:
Verify door open

Person    Door     Criteria
Present   Opened   pass
Present   Closed   fail
Absent    Opened   fail
Absent    Closed   pass

Test Case 2:
Verify door closed

Person    Door     Criteria
Inside    Closed   pass
Inside    Opened   fail

Test case 3:
Verify the door when the person is standing in the middle of the doorway

Functional specification: A computer shut-down operation

Test case 1:
Verify shut-down selection using the Start menu
Test case 2:
Verify shut-down selection using Alt+F4
Test case 3:
Verify shut-down operation completion
Test case 4:
Verify shut-down operation when a process is running
Test case 5:
Verify shut-down operation when the power is off

Functional specification :
Washing machine operation

Test Case 1: Verify power supply
Test case 2: Verify door open
Test case 3: Verify water filling
Test case 4: Verify clothes filling
Test case 5: Verify door close
Test case 6: Verify door closing when clothes overflow
Test case 7: Verify washing settings
Test case 8: Verify washing operations
Test case 9: Verify washing operations with improper power supply
Test case 10: Verify washing operations with clothes overflow or overload
Test case 11: Verify washing operation with water leakage from the door
Test case 12: Verify washing operation when the door opens in the middle of the process
Test case 13: Verify washing operation with improper settings
Test case 14: Verify washing operation with a machinery problem
Test case 15: Verify washing operation with lack of water

Functional specification:
In the E-Banking application, users connect to the bank server through the internet. To connect to the corresponding bank server, users enter the below field values:

Prefix: 3-digit number, but does not start with 0 or 1
Suffix: 6-digit alphanumeric
Password: 6-digit number
Area code: 3-digit number, but optional
Commands: cheque deposit, bills pay, money transfer and mini statements

Prepare test case titles.
Test Case 1:
Verify prefix
BVA [range]
pass → 200  → Min
fail → 199  → Min-1
pass → 201  → Min+1
pass → 999  → Max
fail → 1000 → Max+1

ECP [type]
Valid: 0-9
Invalid: a-z, A-Z, special characters and blank fields

Test case 2:
Verify suffix
BVA [size]
pass → 6-character value → Min = Max
fail → 5-character value
fail → 7-character value

ECP [type]
Valid: 0-9, a-z, A-Z
Invalid: special characters and blank fields

Test case 3:
verify password BVA [range] Min = max = 6 digit number Min= max = 5 digit failànumber failàMin = max = 7 digit number
ECP [ type] Valid Invalid 0—9 a—z A—Z
Special characters and blank fields.

Test case 4:
Verify area code
BVA [range]
pass → 3-digit number → Min = Max
fail → 2-digit number
fail → 4-digit number

ECP [type]
Valid: 0-9, blank field [optional]
Invalid: a-z, A-Z, special characters
Test case 5:
Verify command selection,
such as cheque deposit, bills pay, money transfer, mini statements.
Test case 6:
Verify connection to bank server

All fields                 Criteria
All fields are valid       pass
Any one field is invalid   fail

Test Case 7:
Verify connection to bank server when area code is blank

Remaining fields       Area code     Criteria
All fields are valid   Blank         pass
All fields are valid   Valid value   pass
Any one is invalid     Blank         fail
Any one is blank       Blank         fail
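The field rules above can be sketched as one validation routine. `connect()` and its field names are assumptions for illustration; the regular expressions encode the prefix (200-999), suffix, password, and optional area code rules from the specification.

```python
import re

# Hedged sketch of the E-Banking field rules; connect() is hypothetical.
def connect(prefix: str, suffix: str, password: str, area_code: str) -> str:
    checks = [
        re.fullmatch(r"[2-9]\d{2}", prefix),      # 3 digits, not starting with 0 or 1
        re.fullmatch(r"[a-zA-Z0-9]{6}", suffix),  # 6-character alphanumeric
        re.fullmatch(r"\d{6}", password),         # 6-digit number
        area_code == "" or re.fullmatch(r"\d{3}", area_code),  # optional field
    ]
    return "pass" if all(checks) else "fail"

assert connect("200", "ab12cd", "123456", "") == "pass"     # blank area code accepted
assert connect("199", "ab12cd", "123456", "040") == "fail"  # prefix below Min
assert connect("999", "ab12cd", "123456", "040") == "pass"  # prefix at Max
```

Note how the optional area code is modeled: blank passes, any non-blank value must satisfy the full 3-digit rule, matching Test Case 7's table.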

Functional specification:
ATM withdrawal operation with all rules and regulations. Test cases:
Test case 1: Verify ATM machine working
Test case 2: Verify card insertion at a wrong angle
Test case 3: Verify a message like "Please enter your PIN" after card insertion
Test case 4: Verify PIN number entry

BVA [range]
pass → 4-digit number → Min = Max
fail → 3-digit number
fail → 5-digit number

ECP [type]
Valid: 0-9
Invalid: a-z, A-Z, special characters and blank fields
Test case 5: Verify language selection
Test case 6: Verify operation when a wrong PIN number is entered 3 times
Test case 7: Verify account type selection
Test case 8: Verify operation when the wrong account type is selected w.r.t. the inserted card
Test case 9: Verify withdrawal operation selection
Test case 10: Verify amount entry
Test case 11: Verify withdrawal operation with the correct amount and the right receipt
Test case 12: Verify withdrawal operation when the amount is specified in a wrong denomination, e.g., Rs 999
Test case 13: Verify withdrawal operation when the available balance is less than the withdrawal amount
Test case 14: Verify withdrawal operation when the amount is greater than the bank's day limit [1 transaction / multiple transactions on that day]
Test case 15: Verify withdrawal operation when the ATM has insufficient cash
Test case 16: Verify withdrawal operation when the current transaction number is greater than the number of transactions allowed per day
Test case 17: Verify operations with a network problem
Test case 18: Verify cancel after insertion of the card
Note: always try to prepare both positive and negative test cases.
Test case 19: Verify cancel after insertion of the card, entering the PIN, and its validation
Test case 20: Verify cancel after selecting the language
Test case 21: Verify cancel after card insertion, language selection, PIN validation, account type selection, and withdrawal selection
Test case 22: Verify cancel after card insertion, language selection, PIN validation, account type selection, withdrawal selection, and amount entry
Test case 23: Verify, with invalid data, the time delay between 2 consecutive transactions or operations


A test case should cover the entire functionality; the number of test cases is not important.

Test case document format [IEEE 829]: After selecting test case titles or test scenarios, test engineers document the test cases with complete information.

Format:
1: Test Case ID: a unique number or name to refer to the test case
2: Test Case Name: the previously selected test case name or title
3: Features to be tested: the name of the module, features or functions
4: Test Suite ID: the name of the test batch of which the current test case is a member
5: Priority: the importance of the test case w.r.t. functionality
   P0 → basic functionality
   P1 → general functionality [recovery, compatibility, security, load and stress…]
   P2 → cosmetic functionality [user interface]
6: Test Environment: the list of required hardware and software, including testing tools
7: Test Duration: the expected date and time to execute the test case on the build
8: Test Effort: person-hours [on average, 20 minutes are required to execute one test case]
9: Precondition / Test Setup: necessary tasks to do before starting test case execution on the build
10: Test Procedure / Data Matrix:

   Test procedure table:
   Step Number | Action | Input Required | Expected | Actual | Result | Comment
   [Step Number through Expected are filled during test design; Actual, Result and Comment during test execution]

   OR

   Data matrix table:
   Input Object Name | ECP [Type]: Valid / Invalid | BVA [Size]: Minimum / Maximum
   [Data matrices are prepared during test design]

11: Test case pass or fail criteria: when the test case is considered passed or failed

Note 1: In general, test engineers do not fill in all 11 fields every time; they fill in the mandatory fields among those 11.
Note 2: If a test case covers a single object, test engineers prepare a data matrix for that object.
If a test case covers an operation [more than one object participating], test engineers prepare a test procedure.
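The mandatory fields of the format above can be captured in a small record type. This is a hedged sketch: the field names are illustrative, not an official IEEE 829 schema.

```python
from dataclasses import dataclass, field

# Minimal IEEE-829-style test case record covering the commonly
# mandatory fields discussed above; names are illustrative only.
@dataclass
class TestCase:
    case_id: str                 # e.g., "TC_LOGIN_1"
    name: str                    # e.g., "Verify User ID"
    suite_id: str                # e.g., "TS_LOGIN"
    priority: str                # "P0", "P1" or "P2"
    precondition: str = ""
    steps: list = field(default_factory=list)  # (action, input, expected) tuples

tc = TestCase(
    case_id="TC_LOGIN_1",
    name="Verify User ID",
    suite_id="TS_LOGIN",
    priority="P0",
    precondition="User ID object takes values from the keyboard",
)
assert tc.priority == "P0"
assert tc.steps == []
```

Keeping the record structured makes it easy to group cases by suite ID or filter by priority when selecting level 0 vs. level 1 executions.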

Functional specification 9: A login process allows user ID and password to authorize users.
The user ID object takes alphanumerics in lower case, 4 to 16 characters long.
The password takes alphabets in lower case, 4 to 8 characters long.
Prepare test case documents.
Doc 1:
1. Test Case ID: TC_LOGIN_1 [all caps]
2. Test Case Name: Verify user ID
3. Features to be tested: Login [we want to test the login operation]
4. Test Suite ID: TS_LOGIN
5. Priority: P0
6. Precondition: the user ID object takes values from the keyboard
7. Data matrix:

Object Name: User ID
ECP [Type]: Valid: a-z, 0-9; Invalid: A-Z, special characters, blank spaces
BVA [Size]: Minimum: 4 chars; Maximum: 16 chars
Doc 2:
1. Test Case ID: TC_LOGIN_2
2. Test Case Name: Verify password
3. Features to be tested: [if the features are the same for different test cases, there is no need to specify them again]
4. Test Suite ID: TS_LOGIN
5. Priority: P0
6. Precondition: the password object takes values from the keyboard
7. Data matrix:

Object Name: Password
ECP [Type]: Valid: a-z; Invalid: A-Z, 0-9, special characters, blank spaces
BVA [Size]: Minimum: 4 chars; Maximum: 8 chars

Doc 3:
1. Test Case ID: TC_LOGIN_3
2. Test Case Name: Verify login operation
3. Features to be tested: Login
4. Test Suite ID: TS_LOGIN
5. Priority: P0
6. Precondition: a registered user ID and password are available in hand.
7. Test procedure:
Step   Action                       Input Required       Expected Value
1      Focus on login window        ---                  User ID focused
2      Enter user ID and password   User ID, Password    "OK" button enabled [ready to click]
3      Click "OK"                   User ID / Password:
                                    Valid / Valid        Next page
                                    Valid / Invalid      Error page
                                    Valid / Blank        Error page
                                    Blank / Valid        Error page
                                    Invalid / Valid      Error page

Functional specification:
In a bank application, valid bank employees create fixed deposits. The fixed deposit form takes the below fields as input:
Depositor Name: alphabets in lower case with init-cap [initial capital letter]
Amount: 1500 to 10000
Tenure: up to 12 months
Interest: a number with one decimal place, e.g., 10.6
As per the corresponding bank rules, when the tenure is greater than 10 months, the interest must also be greater than 10%. Prepare test case documents.
Doc 1:
1. Test Case ID: TC_FD_1
2. Test Case Name: Verify depositor name
3. Features to be tested: Fixed Deposit
4. Test Suite ID: TS_FD
5. Priority: P0
6. Precondition: the depositor name object takes values from the keyboard
7. Data matrix:

Object Name: Depositor Name
ECP [Type]: Valid: a-z with init-cap; Invalid: A-Z; a-z without init-cap; a-z with a capital in the middle; a-z with a capital at the end; 0-9; special characters; blank spaces
BVA [Size]: Minimum: 2 chars; Maximum: 256 chars

Doc 2:
1. Test Case ID: TC_FD_2
2. Test Case Name: Verify amount
3. Features to be tested: Fixed Deposit
4. Test Suite ID: TS_FD
5. Priority: P0
6. Precondition: the amount object takes values from the keyboard
7. Data matrix:

Object Name: Amount
ECP [Type]: Valid: 0-9; Invalid: a-z, A-Z, special characters, blank spaces
BVA [Range]: Minimum: 1500; Maximum: 10000

Doc 3:
1. Test Case ID: TC_FD_3
2. Test Case Name: Verify tenure [time for which the money is fixed]
3. Features to be tested: Fixed Deposit
4. Test Suite ID: TS_FD
5. Priority: P0
6. Precondition: the tenure object takes values from the keyboard
7. Data matrix:

Object Name: Tenure
ECP [Type]: Valid: 0-9; Invalid: a-z, A-Z, special characters, blank spaces
BVA [Range]: Minimum: 1 month; Maximum: 12 months

Doc 4:
1. Test Case ID: TC_FD_4
2. Test Case Title: Verify interest
3. Features to be tested: Fixed Deposit
4. Test Suite ID: TS_FD
5. Priority: P0
6. Precondition: the interest object takes values from the keyboard
7. Data matrix:

Object Name: Interest
ECP [Type]: Valid: 0-9 with one decimal place; Invalid: a-z, A-Z, special characters, blank spaces, 0-9 with more than one decimal place
BVA [Range]: Minimum: 0.1; Maximum: 100

Doc 5:
1. Test Case ID: TC_FD_5
2. Test Case Title: Verify fixed deposit operation
3. Features to be tested: Fixed Deposit
4. Test Suite ID: TS_FD
5. Priority: P0
6. Precondition: the above four inputs take valid values and the employee ID is valid
7. Test procedure:

Step   Action                                 Input Required       Expected Value
1      Log in as employee                     Valid employee ID    Menu appears
2      Select fixed deposit option            -                    Fixed deposit form opened
3      Enter values for fields, click "OK"    All are valid        Action [deposit accepted]
                                              Any one is invalid   Error
                                              Any one is blank     Error

Doc 6:
1. Test Case ID: TC_FD_6
2. Test Case Title: Verify fixed deposit operation along with the bank rule
3. Features to be tested: Fixed Deposit
4. Test Suite ID: TS_FD
5. Priority: P0
6. Precondition: the above input objects take valid values and the employee ID is also valid
7. Test procedure:

Step   Action                                 Input Required                            Expected Value
1      Log in as employee                     Valid employee ID                         Menu appears
2      Select fixed deposit option            -                                         Fixed deposit form opened
3      Enter values for fields, click "OK"    Valid depositor name, amount and
                                              tenure > 10, with interest > 10           Action
                                              Valid depositor name, amount and
                                              tenure > 10, with interest < 10           Error

Input domain based test case design: In general, test engineers prepare most test cases depending on the functional specifications in the SRS. But some functional specifications are not able to provide complete information; for this reason, test engineers also study the design documents. In this study, test engineers follow the below approach:
Step 1: Collect the required design documents
Step 2: Study those design documents to get information about the size, type and rules/constraints of the input objects
Step 3: Identify critical and non-critical inputs and prepare data matrices for every input

An input that participates in the internal logic more than once is called a critical input, and vice versa.
Eg: Account Number and Balance are critical inputs; Account Name and Account Address are non-critical inputs.

Note: Business logic based test case design is mandatory, but input domain based test case design is optional, because the functional specifications in the SRS may already provide complete information about the functionality / customer requirements.

User interface based test case design (or prototype based test case design): It is a mandatory level in test case design. Here, test engineers prepare test cases depending on the prototype [sample screens].

Eg:
Test Case 1: Verify the contrast of each object in every screen
Test case 2: Verify grouping of objects which are functionally related
Test case 3: Verify the border of those groups
Test case 4: Verify the alignment of those groups
Test case 5: Verify the title font, which is uniform throughout all screens
Test case 6: Verify the title size, which is uniform throughout all screens
Test case 7: Verify spacing, which is uniform throughout all screens
Test case 8: Verify the label or title of every object with init-cap
Test case 9: Verify the correctness of captions or tooltips
Test case 10: Verify the correctness of help for objects
Test case 11: Verify avoidance of information redundancy.
Eg: if DOB is already labeled with its format (dd/mm/yyyy), there is no need to specify the date format again for DOJ.
Test case 12: Verify abbreviations in capital letters
Test case 13: Verify the placement of list boxes, menus, table grids, ActiveX controls and data windows in the screen
Test case 14: Verify the default object in every screen
Test case 15: Verify the minimize, maximize and close operations of every screen
Test case 16: Verify shortcuts to menus, with uniqueness
Test case 17: Verify that every object is accessible with the keyboard
Test case 18: Verify the background color, which is uniform throughout all screens
Test case 19: Verify meaningful error messages
Test case 20: Verify help documents [manual support testing]

Note: the above test cases are applicable to any type of GUI application.

Review Test Cases: After completing all reasonable test case selection, the testing team, along with the test lead, conducts a review meeting to estimate the completeness and correctness of those test cases.

In this review meeting, the testing team depends on the below factors:
1. Functional specification based review [requirements based review]
2. Design documents based review [requirements based review]
3. Prototype based review [testing technique based review]
4. Test strategy based review [testing technique based review]
Common rules in test case selection: During test design, test engineers follow the below rules to prepare test cases:
1. Start every test case title with "verify" or "check"
2. Test cases must be designed to be reusable [cases are repeated for modified builds]
3. A test case should give consistent results independent of the tester
4. A test case must be specific to test one object or one operation
5. Every test case specifies the operation to perform and the expected response of the build
6. A long test case must be divided into different test cases to keep them short and avoid confusion
7. One test case contains a maximum of 10-15 steps
8. A test case must be uniquely identified for future reference, e.g., by Test Case ID
9. Every test case must be reviewed by the test lead

Case study:

SNO   Testing Technique        Test case design method                 Source documents
1     User interface testing   User interface based test case design   Prototype
2     Manual support testing   User interface based test case design   Prototype
3     Functionality testing    Business logic based test case design   Functional specifications in SRS
4     Input domain testing     Input domain based test case design     Design documents
5     Recovery testing         Business logic based test case design   Functional specifications in SRS
6     Inter-system testing     Business logic based test case design   Functional specifications in SRS


Test Execution: After completing test design and the reviews, the testing team receives the build from the development team and starts the different levels of test execution on that build, as follows:

Development team                 Testing team
Initial build            →       Level 0 [sanity / Tester Acceptance Testing (T.A.T) / Build Verification Testing (B.V.T)]
Stable build             →       Level 1 [comprehensive testing]
Bug fixing / resolving   ←       Defect report
Modified build           →       Level 2 [regression testing]

After completion of this loop:   Level 3 [post-mortem testing, re-regression testing or final regression testing]
Levels of test execution vs. test cases:
Level 0: P0 test cases [basic functionality]
Level 1: all P0, P1 and P2 test cases [entire functionality]
Level 2: selected P0, P1 and P2 test cases [modified functionality]
Level 3: selected P0, P1, P2 test cases w.r.t. high bug density

How does the testing team receive the build from the development team?
Development people post builds to a shared server, and testing people download those builds to conduct testing. To distinguish an old build from a modified build, development people assign unique version numbers to the builds. This task is called build version control.

Level 0 [sanity testing]:
After receiving the initial build from the development team, testing people conduct sanity testing to estimate the stability of that build. During this test, test engineers concentrate on the basic functionality to analyze the below factors:

1. Understandable
2. Operable
3. Observable
4. Controllable
5. Consistent
6. Simple
7. Maintainable
8. Automatable

The ability of a build to support testing is called testability, and a good build should exhibit all of these testability factors. Level 0 sanity testing is therefore also known as "tester acceptance testing", "build verification testing" or "testability testing".

If the build is not stable w.r.t. the above factors, test engineers reject that build without starting testing. If the build is stable, test engineers concentrate on level 1 testing, or real testing, to detect defects. Sometimes test engineers reject a build, with reasons, when the build is not working at all. This type of small shake-up within sanity testing is called smoke testing [it is optional, but level 0 testing is mandatory].

Level 1: After receiving a stable build from the development team, the testing team executes all their test cases on that build, either manually or with automation. During this test execution, test engineers prepare test log documents. These documents specify the result of every test case as passed, failed or blocked.

Passed: all expected values are equal to the actual values. Failed: any one expected value deviates from the actual value. Blocked: the corresponding test case execution is postponed to the future because the corresponding functionality is wrong.

Level 2: During level 1 comprehensive testing, test engineers report mismatches to the development team. After the accepted bugs are resolved, the testing team receives a modified build from the development team. The testing team then conducts regression testing to estimate the correctness of the bug fixes and the occurrence of side effects in the modified build.

Test cases re-executed on the modified build, by severity of the resolved bug:
High:   all P0, all P1, carefully selected P2 test cases
Medium: all P0, carefully selected P1, some P2 test cases
Low:    some P0, some P1, some P2 test cases

Case 1: If the development team resolved a bug whose severity is high, then test engineers re-execute all P0, all P1 and carefully selected P2 test cases on that modified build w.r.t. the modification in the coding.

Case 2: If the development team resolved a bug whose severity is medium, then test engineers re-execute all P0, carefully selected P1 and some of the P2 test cases on that modified build w.r.t. the modification in the coding.

Case 3: If the development team resolved a bug whose severity is low, then test engineers re-execute some of the P0, P1 and P2 test cases on that modified build w.r.t. the modifications in the coding.
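The three cases above amount to a lookup from bug severity to regression scope. A minimal sketch, with `regression_scope` as an assumed helper name:

```python
# Map resolved-bug severity to the regression scope described in
# Cases 1-3 above (which P0/P1/P2 test cases are re-executed).
def regression_scope(severity: str) -> dict:
    scope = {
        "high":   {"P0": "all",  "P1": "all",      "P2": "selected"},
        "medium": {"P0": "all",  "P1": "selected", "P2": "some"},
        "low":    {"P0": "some", "P1": "some",     "P2": "some"},
    }
    return scope[severity]

assert regression_scope("high")["P1"] == "all"
assert regression_scope("medium")["P1"] == "selected"
assert regression_scope("low")["P0"] == "some"
```

Encoding the policy as data rather than branching logic makes it easy to review against the written cases and to adjust if the team's policy changes.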

bank sites

http://www.onlinesbi.com/
http://www.karnatakabank.com/
http://www.icicibank.com/
http://www.axisbank.com/
http://www.hdfcbank.com/

For ticket reservation

http://www.redbus.in/
http://www.irctc.co.in/ (train)
http://www.indianrail.gov.in/
http://www.rail.gov.in/
http://www.erail.in/

testing sites

www.softwaretestingsucks.com/quality.htm
www.softwaretestinghelp.com/
http://forums.sureshkumar.net/
http://www.testingeducation.org/
http://www.geekinterview.com/
http://www.smartsoftwaretesting.com/
http://www.softwareqatest.com/
http://www.qatester.com/
http://www.testingtigirs.org/
http://www.softwaretestingadvice.com/
http://www.softwareqatestings.com/
www.kabinfo.net www.qaforums.com
www.sqa.fyicenter.com
http://softwaretestingguide.blogspot.com/
www.automation.org.uk/tutorials.htm

some mail links

http://www.yahoo.com/
http://www.gmail.com/
http://www.orkut.com/
http://www.google.com/