Software Testing Experience

Wednesday, April 9, 2008

Software Testing Guidelines




TABLE OF CONTENTS

1 Introduction
1.1 Purpose of the Document
1.2 References
1.3 List of Abbreviations
2 Testing Activities
2.1 Test Planning
2.2 Test Environment Preparation
2.3 Test Execution
2.4 Test Verification
2.5 Test Forms
3 Process Definition and Ability to Perform
3.1 Common Policy Statement
3.2 Common Commitment
3.3 Unit/Component Testing
3.4 Integration Testing
3.5 System Testing
3.6 Quality Assurance Testing
3.7 Acceptance Testing
3.8 Other Testing Levels
3.9 Roles and Responsibilities of Testing Group
3.10 Inputs
3.11 Entry Criteria
3.12 Activities
3.13 Exit Criteria
3.14 Output
3.15 Auditability
3.16 Measurements
3.17 Documented Procedure
3.18 Training
4 Templates
4.1 Code Inspection Check List
4.2 Unit Test Plan
4.3 Integration Test Plan
4.4 System Test Plan

1 Introduction


1.1 Purpose of the Document


The purpose of this Testing Guidelines document is to specify a standard approach to software testing. As with any product development process, test set development requires planning, requirements (test objectives), design, implementation and evaluation.


Testing is the process of analyzing a software item to detect the differences between existing and required conditions and to evaluate the features of the software item.


Testing can also be considered a verification mechanism to ensure sound software engineering practice.


Testing is performed at several points in the life cycle as the product is constructed, component by component (unit by unit), into a functioning system. These points in the life cycle can be viewed as levels of capability that need to be tested. The levels progress from the smallest single component through the combination or integration of single units into larger components or stages. These levels are:

1. Unit/Component Testing

2. Integration Testing

3. System Testing

4. Quality Assurance Testing

5. Acceptance Testing



This guideline document outlines the purpose and templates of testing at each level.

1.2 References


Document: CMM Practices – TR 25
Author: SEI
Version: 1.1


1.3 List of Abbreviations


PL - Project Leader

PM - Project Manager

PR - Peer Review

SRS - System Requirements Specification

SWP - Software Work Product

TPL - Technical Project Leader

2 Testing Activities


Testing is one of several complementary verification and validation activities. A code walkthrough or inspection can be considered part of the testing activity. The following activities can be considered part of testing; depending on the nature of the project, some of these activities may not be applicable:

1. Test Planning

2. Test Environment Preparation

3. Test Execution

4. Test Verification



The above activities are carried out at each and every level of Testing stated earlier. Apart from these, Installation Testing can also be carried out to ensure the creation of a proper product execution environment. At a macro level, these activities can also be classified as:

1. Test Preparation (Test Planning and Test Environment Preparation)

2. Test Conduct (Test Execution and Test Verification)

2.1 Test Planning


This activity consists of deciding on a testing strategy and preparing the test conditions, execution plan, test data and test scripts.


2.2 Test Environment Preparation


This activity consists of preparing the hardware and software environment for testing.

2.3 Test Execution


This activity consists of executing the test cycles and capturing the actual results for the actions performed.

2.4 Test Verification


This activity consists of verifying the actual test results obtained during test execution.

2.5 Test Forms


Testing will consist of the following forms:


1. Test Conditions

2. Execution Plan

3. Test Script

4. Input Data

5. Output Data


Testing should focus on executing a program with the intent of finding an error.
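To make these forms concrete, here is a minimal Python sketch, not part of the original guideline: the apply_discount unit and its data are invented for illustration of how a test condition, input data, expected output data and a one-step test script line up.

# A minimal sketch; all names here are hypothetical assumptions,
# not taken from the guideline itself.

def apply_discount(price, percent):
    """Unit under test: returns the price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

# Test Condition: a valid discount reduces the price proportionally.
# Input Data and expected Output Data for this condition:
input_data = {"price": 200.0, "percent": 25}
expected_output = 150.0

# Test Script (the single step of the Execution Plan): run the unit
# and compare the actual result with the expected result.
actual_output = apply_discount(**input_data)
assert actual_output == expected_output, (
    f"expected {expected_output}, got {actual_output}")
print("Test passed:", actual_output)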


3 Process Definition and Ability to Perform

3.1 Common Policy Statement


The policy is that each project identifies the various levels of testing that its software work products will undergo. The goal of Testing is to find undiscovered errors in the Software Work Products and remove them early in the Software Development Life Cycle.

3.2 Common Commitment


Testing at the following levels is mandatory:

1. Unit/Component Testing

2. Integration Testing

3. System Testing

Each project should also carry out the following Testing activities:

1. Quality Assurance Testing

2. Acceptance Testing



The Objective of the process is stated in the common commitment section.


Based on the needs of the project for testing its software work products, testing can be planned and resources allocated during project plan preparation. Each project will identify the work products that will undergo testing during the preparation of the project plan.

3.3 Unit/Component Testing


The activities at this level include a description of the general approach, features, inputs, tasks and outputs. There will be an integrated document comprising the Plan, Procedure, Test Cases and Test Results. This document is prepared by the person who develops the unit/component, and it is verified and executed by another peer member of the team, whose findings are recorded in the same document. This is over and above the execution and recording of results by the individual who developed the unit.


The tests that occur as part of unit tests are listed below; hedged sketches of automating some of these checks appear at the end of this section.


1. Local Data Structure


The local data structure is examined to assure that data stored temporarily maintain their integrity during all steps in an algorithm's execution.


2. Boundary Conditions


Boundary conditions are tested to assure that the unit runs properly at boundaries established to limit or restrict processing.


3. Independent Paths


All independent paths through the control structure are exercised to assure that all statements in a module have been executed at least once.


4. Error handling


All error-handling paths are tested.


5. Interface Testing


Tests of data flow across an interface are required before any other test is initiated. Following is a proposed checklist for interface testing:

- Number of input parameters equal to number of arguments?

- Parameter and argument attributes match?

- Parameter and argument units systems match?

- Number of arguments transmitted to called modules equal to number of parameters?

- Attributes of arguments transmitted to called modules equal to attributes of parameters?

- Units system of arguments transmitted to called modules equal to units of parameters?

- Number, attributes and order of arguments to built-in functions correct?

- Any references to parameters not associated with the current point of entry?

- Input-only arguments altered?

- Global variable definitions consistent across units?

- Constants passed as arguments?


When a unit performs external I/O, additional interface tests must be conducted as follows:

- File attributes correct?

- OPEN statement correct?

- Format specification matches I/O statement?

- Buffer size matches record size?

- Files opened before use?

- End-of-file conditions handled?

- I/O errors handled?

- Any textual errors in output information?
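As an illustration of automating two items from the checklists above, the following Python sketch checks that an argument count matches a callee's parameter count and exercises the basic external I/O conditions; the function and file names are invented assumptions, not project code.

# A hedged sketch of two checklist items, using only standard-library calls.
import inspect
import os
import tempfile

def called_module(a, b, c):
    """Hypothetical callee with three parameters."""
    return a + b + c

# "Number of arguments transmitted to called modules equal to number
# of parameters?" -- checked against the function's signature.
args_to_send = (1, 2, 3)
params = inspect.signature(called_module).parameters
assert len(args_to_send) == len(params), "argument/parameter count mismatch"

# External I/O checks: file opened before use, end-of-file handled,
# I/O errors handled.
with tempfile.NamedTemporaryFile("w", delete=False) as f:
    f.write("record-1\nrecord-2\n")
    name = f.name

with open(name) as f:                 # file opened before use
    records = f.read().splitlines()   # read() consumes up to end-of-file
assert records == ["record-1", "record-2"]
os.remove(name)

try:
    open("no-such-file.dat")
except FileNotFoundError:             # I/O error handled, not ignored
    print("missing-file error handled")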


Unit testing is carried out by the programming team. Responsibility for test data creation also lies with the programming team.
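The following Python sketch is a hedged illustration of unit tests covering boundary conditions, both independent paths and the error-handling path of a small unit; the grade function and its limits are invented for the example.

# A minimal unit-test sketch, assuming a hypothetical grade() unit.
import unittest

def grade(score):
    """Unit under test: classifies a score in the range 0..100."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    if score >= 50:        # independent path 1
        return "pass"
    return "fail"          # independent path 2

class GradeUnitTest(unittest.TestCase):
    def test_boundary_conditions(self):
        # Boundaries established to limit or restrict processing.
        self.assertEqual(grade(0), "fail")
        self.assertEqual(grade(49), "fail")
        self.assertEqual(grade(50), "pass")
        self.assertEqual(grade(100), "pass")

    def test_independent_paths(self):
        # Every statement is executed at least once across the two paths.
        self.assertEqual(grade(75), "pass")
        self.assertEqual(grade(10), "fail")

    def test_error_handling(self):
        # All error-handling paths are tested.
        self.assertRaises(ValueError, grade, -1)
        self.assertRaises(ValueError, grade, 101)

if __name__ == "__main__":
    unittest.main()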

3.4 Integration Testing


Integration Testing is an orderly progression of testing in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated.


A team member not involved in the design is assigned the responsibility of preparing the Test Plan, Test Design Specifications and Test Case Specifications, using the Design Specification. These are executed when the SWP is ready for Integration Testing.


The purpose of this testing is to ensure that the design objectives are met. This testing is conducted by the resource identified in the project plan, and a Test Log is prepared.
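As a hedged illustration, the following Python sketch combines two small, separately testable elements and verifies the data flow across the interface between them; both functions are invented for the example.

# A minimal integration-test sketch with two hypothetical elements.

def parse_order(line):
    """Element 1: parses an 'item,qty' line into a tuple."""
    item, qty = line.split(",")
    return item.strip(), int(qty)

def total_quantity(lines):
    """Element 2: aggregates quantities, calling parse_order."""
    return sum(qty for _, qty in map(parse_order, lines))

# Integration test: exercises the combined behaviour across the
# interface between the two elements, not each unit in isolation.
orders = ["widget, 2", "gadget, 3"]
assert total_quantity(orders) == 5
print("integration test passed")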

3.5 System Testing


System Testing is the process of testing an integrated hardware and software system to verify that the system meets its specified requirements.


A team member not involved in SRS preparation is assigned the responsibility of preparing the Test Plan, Test Design Specifications and Test Case Specifications, using the System Requirements Specification prepared earlier. These are executed when the SWP is ready for System Testing.


The purpose of this testing is to ensure that the software as a complete entity complies with its operational requirements.


This testing is conducted by the person identified in the project plan, and a Test Log is prepared.
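The following Python sketch is a hedged illustration of tracing system test cases back to the System Requirements Specification; the requirement identifiers and scenarios are invented for the example.

# A minimal traceability sketch; identifiers are hypothetical.
system_test_cases = [
    {"req_id": "SRS-4.1", "scenario": "login with valid credentials",
     "expected": "user reaches home screen"},
    {"req_id": "SRS-4.2", "scenario": "login with invalid password",
     "expected": "access denied message shown"},
]

# Traceability check: every planned requirement has at least one case.
planned_requirements = {"SRS-4.1", "SRS-4.2"}
covered = {case["req_id"] for case in system_test_cases}
assert planned_requirements <= covered, "untested requirements remain"
print("all planned requirements are covered by system test cases")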

3.6 Quality Assurance Testing


Quality Assurance Testing ensures that, before delivery of the SWP to the customer, the SWP is tested against its stated specifications.


The Unit, Integration and System test specifications prepared earlier are executed by a team member not involved in developing the SWP.


The purpose of this testing is to ensure that the stated specifications are met. This testing is conducted by the resource identified in the project plan, and a Test Log is prepared.

3.7 Acceptance Testing


Acceptance Testing is defined as the process of formal testing conducted to determine whether or not the system satisfies its acceptance criteria and to enable the customer to determine whether or not to accept the system.


Based on the System Requirements Specification, the acceptance criteria are determined in consultation with the Customer. The Customer prepares the Test Plan and Test Case Specifications and executes the defined test cases. The results are communicated to the project team, which can provide the necessary support during this testing.


The purpose of this testing is to ensure that the Customer's requirements and objectives are met and that all components are correctly included in the customer package.


This testing is conducted by the Customer with or without the support of the Project Team. The Test Log prepared is communicated to the project team.

3.8 Other Testing Levels


Installation Testing


Installation Testing ensures that a completed and tested SWP or application satisfies user requirements for using the system in the defined operational environment, and that a proper product execution environment is created during installation.



Regression Testing


Regression Testing ensures that the defined and incorporated basic functioning of a system is not affected by changes made. The changes can result from enhancements, bug fixes or further development of the product. Regression testing can be done by executing the test cases written earlier to test the features of the system, as sketched below.
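A minimal Python sketch of this idea follows; the discount function and its test cases are invented assumptions. The suite written earlier is retained and re-executed after a change to confirm that existing behaviour is unaffected.

# A hedged regression-test sketch with a hypothetical unit and suite.

def discount(price, percent):
    """Unit recently changed, e.g. by a bug fix."""
    return round(price * (1 - percent / 100), 2)

# Test cases written earlier for this feature, retained for re-use.
regression_suite = [
    ((100.0, 0), 100.0),
    ((100.0, 25), 75.0),
    ((19.90, 50), 9.95),
]

for args, expected in regression_suite:
    actual = discount(*args)
    assert actual == expected, f"regression in discount{args}: {actual}"
print("regression suite passed: existing behaviour unaffected")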

3.9 Roles and Responsibilities of Testing Group


Tester


The Tester is responsible for executing the Unit Test cases assigned to them and recording the results.


Testing Group


The Testing Group is responsible for conducting Integration and System Testing, recording the results and marking Pass/Fail for each test conducted. This team also prepares the Test Summary report. The team is constituted in such a way that a person always tests others' code and never their own.


Management


Project Team Management / General Management is responsible for acting on the Test Report in a timely manner.

3.10 Inputs


The following are the common inputs to the Testing process:

1. The Software Work Products to be tested

2. The Test Plans, Test Procedures and Test Cases prepared by the project team



3.11 Entry Criteria


Authorization


The need for testing a specific SWP, the testing level and the resources are defined in the project planning documents. Depending on rework, testing may be carried out once again; this decision comes from the PM/PL.


Initiating Testing


The individual responsible for the SWP indicates its readiness for Unit Testing to the TPL/PL. Integration and System Testing are executed as per the plan. For Acceptance Testing, the Client's readiness is obtained.


3.12 Activities


Planning


Testing is planned as part of the project management plan and the resources are allocated. The necessary environment, which may differ from the development environment, is created for testing purposes.


Overview


Testing is normally coordinated by the PM/PL. For Acceptance Testing, the Client's involvement is obtained. If testing of deliverables is involved, the Quality Assurance Group oversees the activities.


Preparation


The environment set-up and the SWP to be tested are identified.


Examination


During the execution of test cases, the actual results are compared with the expected results and a Pass/Fail verdict is recorded against each and every test case. Deviations from the test cases are also recorded and the findings reported.


Prior to unit testing, a code walkthrough or code inspection can be done as part of testing, using the standard checklist. If any rework is needed, it is reported at this stage.


After the test case execution, a Test Log report and a Test Summary report are generated.
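The following Python sketch is a hedged illustration of this recording: each executed case is compared with its expected result, a Test Log line is produced per case, and a Test Summary is derived at the end. The case names and results are invented.

# A minimal Test Log / Test Summary sketch with hypothetical cases.
executed_cases = [
    {"case": "TC-01", "expected": "order saved", "actual": "order saved"},
    {"case": "TC-02", "expected": "error shown", "actual": "silent failure"},
]

test_log = []
passed = 0
for case in executed_cases:
    # Pass/Fail recorded against each and every test case.
    verdict = "Pass" if case["actual"] == case["expected"] else "Fail"
    if verdict == "Pass":
        passed += 1
    test_log.append(f'{case["case"]}: {verdict} '
                    f'(expected "{case["expected"]}", got "{case["actual"]}")')

print("TEST LOG")
print("\n".join(test_log))
print(f"TEST SUMMARY: {passed} of {len(executed_cases)} cases passed")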


In case of defects, the defect report is passed on to the project team to examine and rectify. After rectification, testing is repeated.


In case of Integration and System Testing, the suspension and restart criteria for testing are defined.


3.13 Exit Criteria


Testing is complete when:


1. All issues identified in Testing have been addressed and re-tested.
2. The testing criteria are met.
3. The Test Log and Summary Report have been issued.


3.14 Output


The outputs from Testing are:

1. Test Log report

2. Test Summary report

3. Defect Log report




3.15 Auditability


The Test Log, Test Summary and Defect Log reports are auditable.


3.16 Measurements


The Defect Log Report serves as the measurement through the classification of defects by severity: the number of defects recorded in each severity class can be counted and reported.
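As a hedged illustration, the following Python sketch counts defects per severity class from a Defect Log; the class names used (critical/major/minor) are a common convention assumed here, not taken from the guideline.

# A minimal measurement sketch over a hypothetical Defect Log.
from collections import Counter

defect_log = [
    {"id": "D-101", "severity": "critical"},
    {"id": "D-102", "severity": "minor"},
    {"id": "D-103", "severity": "major"},
    {"id": "D-104", "severity": "minor"},
]

by_severity = Counter(d["severity"] for d in defect_log)
for severity, count in sorted(by_severity.items()):
    print(f"{severity}: {count} defect(s)")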



3.17 Documented Procedure


Testing will be based on the Test Plans, Test Procedures and Test cases supplied by the project team.


3.18 Training


Informal training is given to testers by maintaining a proper mix of experienced and new staff in the Testing team.



4 Templates

4.1 Code Inspection Check List


CODE INSPECTION REPORT


Project Name :
Client Name :
Developer Name :
References :

Module Name: <indicate the module name>
Function: <indicate the functionality of the program>
Program Names/Procedures: <list each and every program with version numbers>
Library Created: <purpose of this library>
Standards Adherence: Yes/No <give details, program-wise>
Indenting: Yes/No <give deviations>
Error Messages: <meaningful>
Help Messages: <meaningful>
Program Documentation: <meaningful>
Variable Definitions and Declarations: <uniform and methodical>
Parameter Passing: <consistent>
Error Checking – SQL: <consistent>
Exit and Returns: <consistent>
Program Structure: <readable>
Screen Handling and Checking: <consistent>
Data Type Usage: <consistent>


Reviewer's Remarks






Reviewed By ---------------------------------------------

Name, Signature and Date

4.2 Unit Test Plan


Project Name :

Client Name :

Developer Name:

References :

Scope :


Special Instructions:

Environment Setup: <Outline the OS version, Database Requirements etc.>

Data Set-up : <Outline the pre-requirements and Data Set-up needed for testing>

Validations : <For Entry Elements, indicate the field level validations also>

Computation Elements: <Give example and explain>


Test Suites, Test Cases and Results (filled in by the Developer)

Suite # | Case # | Test Scenario | Test Data | Expected Results | Actual Results


Developer's Signature with Date



Test Suites, Test Cases and Results (filled in by the Tester)

Suite # | Case # | Test Data | Actual Results | Pass/Fail | Rework Needs


Tester's Signature with Date :


Rework Verified by (if any), with Date:

4.3 Integration Test Plan


Project Name :

Client Name :

References :

Scope :



Introduction :


Special Instructions:

Environment Setup: <Outline the OS version, Database Requirements etc.>

Data Set-up : <Outline the pre-requirements and Data Set-up needed for testing>

Interface Testing: <Give the details for Integration Testing>


Suspension Criteria and Resumption Requirements: <Give details here>



<To be filled in by the Tester>

Suite # | Case # | Test Scenario | Test Data | Expected Results | Actual Results | Pass/Fail



Tester's Signature with Date:

4.4 System Test Plan


Project Name :

Client Name :

References :

Scope :



Introduction :


Special Instructions:

Environment Setup : <Outline the OS version, Database Requirements etc.>

Data Set-up : <Outline the pre-requirements and Data Set-up needed for testing>

System Testing : <Give the details for System Testing and also map with user requirements>


Suspension Criteria and Resumption Requirements: <Give details here>


<To be filled in by the Tester>

Suite # | Case # | Test Scenario | Test Data | Expected Results | Actual Results | Pass/Fail



Tester's Signature with Date :



posted by Balaji Visharaman at 10:06 PM
