
Generic Project

    Prepared for: Project Name

    Prepared by: Company Name

    Date: July 17, 2001

    © 2001, COMPANY NAME. All rights reserved. This documentation is the confidential and proprietary intellectual property of COMPANY NAME. Any unauthorized use, reproduction, preparation of derivative works, performance, or display of this document, or of the software represented by this document, without the express written permission of COMPANY NAME is strictly prohibited.

    COMPANY NAME and the COMPANY NAME logo design are trademarks and/or service marks of an affiliate of COMPANY NAME. All other trademarks, service marks, and trade names are owned by their respective companies.

    PROJECT NAME

    Automated Testing Detail Test Plan

DOCUMENT REVISION INFORMATION

    The following information is to be included with all versions of the document.

    Project Name:                     Project Number:
    Prepared by:                      Date Prepared:

    Revised by:                       Date Revised:
    Revision Reason:                  Revision Control No.:

    Revised by:                       Date Revised:
    Revision Reason:                  Revision Control No.:

    Revised by:                       Date Revised:
    Revision Reason:                  Revision Control No.:



DOCUMENT APPROVAL

    This signature page is to indicate approval from the COMPANY NAME sponsor and the Client sponsor for the attached PROJECT NAME Automated Testing Detail Test Plan. All parties have reviewed the attached document and agree with its contents.

    COMPANY NAME Project Manager: Name, Title: Project Manager, PROJECT NAME
    Date:

    CUSTOMER Project Manager: Name, Title:
    Date:

    COMPANY NAME/DEPARTMENT Sponsor: Name, Title:
    Date:

    COMPANY NAME Sponsor: Name, Title:
    Date:

    CUSTOMER NAME Sponsor: Name, Title:
    Date:

    COMPANY NAME Manager: Name, Title:
    Date:


Table of Contents

    1 Introduction
        1.1 Automated Testing DTP Overview

    2 Test Description
        2.1 Test Identification
        2.2 Test Purpose and Objectives
        2.3 Assumptions, Constraints, and Exclusions
        2.4 Entry Criteria
        2.5 Exit Criteria
        2.6 Pass/Fail Criteria

    3 Test Scope
        3.1 Items to be tested by Automation
        3.2 Items not to be tested by Automation

    4 Test Approach
        4.1 Description of Approach

    5 Test Definition
        5.1 Test Functionality Definition (Requirements Testing)
        5.2 Test Case Definition (Test Design)
        5.3 Test Data Requirements
        5.4 Automation Recording Standards
        5.5 WinRunner Menu Settings
        5.6 WinRunner Script Naming Conventions
        5.7 WinRunner GUI Map Naming Conventions
        5.8 WinRunner Result Naming Conventions
        5.9 WinRunner Report Naming Conventions
        5.10 WinRunner Script, Result, and Report Repository

    6 Test Preparation Specifications
        6.1 Test Environment
        6.2 Test Team Roles and Responsibilities
        6.3 Test Team Training Requirements
        6.4 Automation Test Preparation

    7 Test Issues and Risks
        7.1 Issues
        7.2 Risks

    8 Appendices
        8.1 Traceability Matrix
        8.2 Problem Report Format
        8.3 Definitions for Use in Manual Testing
            8.3.1 Test Requirement
            8.3.2 Test Case
            8.3.3 Test Procedure
        8.4 Test Cases
            8.4.1 Pond Test Cases
            8.4.2 River Test Cases
            8.4.3 Lake Test Cases
            8.4.4 Sea Test Cases
            8.4.5 Ocean Test Cases
            8.4.6 Miscellaneous Test Cases

    9 Project Glossary
        9.1 Glossary Reference
        9.2 Sample Addresses for Testing
        9.3 Test Credit Card Numbers


1 Introduction

1.1 Automated Testing DTP Overview

    This Automated Testing Detail Test Plan (ADTP) identifies the specific tests that are to be performed to ensure the quality of the delivered product. System/Integration Test ensures that the product functions as designed and that all parts work together. This ADTP covers automated testing during the System/Integration Phase of the project and maps to the specification or requirements documentation for the project. This mapping is done in conjunction with the Traceability Matrix document, which should be completed along with the ADTP and is referenced in this document.
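    For illustration, the requirement-to-test-case mapping that the Traceability Matrix captures can be sketched as a simple lookup structure. The following Python sketch uses hypothetical requirement and test case identifiers, not values from this plan.

        # Hypothetical mapping of requirement IDs to the automated test
        # cases that cover them (illustrative placeholders only).
        traceability_matrix = {
            "REQ-001": ["TC-PROD-001", "TC-PROD-002"],
            "REQ-002": ["TC-PROD-003"],
            "REQ-003": [],  # no coverage yet; the matrix makes this gap visible
        }

        # Any requirement with no covering test case is a coverage gap.
        uncovered = [req for req, cases in traceability_matrix.items() if not cases]
        print("Requirements without test coverage:", uncovered)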

    This ADTP refers to the specific portion of the product known as PRODUCT NAME. It provides clear entry and exit criteria and identifies the roles and responsibilities of the Automated Test Team so that the team can execute the test.

    The objectives of this ADTP are to:

    • Describe the tests to be executed.

    • Identify and assign a unique number to each specific test.

    • Describe the scope of the testing.

    • List what is and is not to be tested.

    • Describe the test approach, detailing methods, techniques, and tools.

    • Outline the Test Design, including:

        - Functionality to be tested.

        - Test Case Definition.

        - Test Data Requirements.

    • Identify all specifications for preparation.

    • Identify issues and risks.

    • Identify actual test cases.

    • Document the design point or requirement tested for each test case as it is developed.


2 Test Description

2.1 Test Identification

    This ADTP is intended to provide information for System/Integration Testing for the PRODUCT NAME module of PROJECT NAME. The test effort may be referred to by its PROJECT REQUEST (PR) number and its project title for tracking and monitoring of testing progress.

2.2 Test Purpose and Objectives

    Automated testing during the System/Integration Phase, as referenced in this document, is intended to ensure that the product functions as designed, directly from customer requirements. The goal of testing is to assess the quality of the application's structure, content, accuracy and consistency, response times and latency, and performance, as defined in the project documentation.
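    As an illustration of the kind of response time and latency check this phase can automate, the following Python sketch times a single request against a latency budget. The URL and threshold are assumed placeholders, not values from the project documentation.

        import time
        import urllib.request

        URL = "http://example.com/"  # hypothetical application endpoint
        MAX_SECONDS = 2.0            # assumed latency budget

        start = time.perf_counter()
        with urllib.request.urlopen(URL) as response:
            response.read()
        elapsed = time.perf_counter() - start

        # Treat the check like any other test case: compare actual to expected.
        status = "PASS" if elapsed <= MAX_SECONDS else "FAIL"
        print(f"{status}: fetched {URL} in {elapsed:.2f}s (budget {MAX_SECONDS}s)")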

    2.3 Assumptions, Constraints, and Exclusions

    Factors that may affect the automated testing effort, and may increase the risk associated with the success of the test, include:

    • Completion of development of front-end processes

    • Completion of design and construction of new processes

    • Completion of modifications to the local database

    • Movement or implementation of the solution to the appropriate testing or production environment

    • Stability of the testing or production environment

    • Load discipline

    • Maintaining recording standards and automated processes for the project

    • Completion of manual testing through all applicable paths, to ensure that reusable automated scripts are valid

    2.4 Entry Criteria

    The ADTP is complete, excluding actual test results. The ADTP has been signed off by the appropriate sponsor representatives, indicating approval of the plan for testing.

    The Problem Tracking and Reporting tool is ready for use. The Change Management and Configuration Management rules are in place.

    The environment for testing, including databases, application programs, and connectivity, has been defined, constructed, and verified.

2.5 Exit Criteria

    In establishing the exit/acceptance criteria for Automated Testing during the System/Integration Phase, the Project Completion Criteria defined in the Project Definition Document (PDD) provide a starting point. All automated test cases have been executed as documented, and the percentage of successfully executed test cases meets the defined criteria. Recommended criteria: no Critical or High severity problem logs remain open, all Medium problem logs have agreed-upon action plans, and the application has executed successfully to validate the accuracy of data, interfaces, and connectivity.
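    The pass-rate check implied above can be computed mechanically. In the following Python sketch the 95% threshold and the execution log are assumed examples, not figures from this plan.

        # Hypothetical execution log: test case ID -> final status.
        results = {"TC-001": "PASS", "TC-002": "PASS", "TC-003": "FAIL"}
        REQUIRED_PASS_RATE = 0.95  # assumed threshold; set per project criteria

        passed = sum(1 for status in results.values() if status == "PASS")
        pass_rate = passed / len(results)

        print(f"Pass rate: {pass_rate:.0%}")
        print("Exit criteria met" if pass_rate >= REQUIRED_PASS_RATE else "Exit criteria not met")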

2.6 Pass/Fail Criteria

    The results for each test must be compared to the pre-defined expected test results, as documented in the ADTP (and DTP where applicable). The actual results are logged in the Test Case detail within the Detail Test Plan if those results differ from the expected results. If the actual results match the expected results, the Test Case can be marked as a passed item, without logging the duplicated results.

    A test case passes if it produces the expected results as documented in the ADTP or Detail Test Plan (manual test plan). A test case fails if the actual results produced by its execution do not match the expected results. The source of failure may be the application under test, the test case, the expected results, or the data in the test environment. Test case failures must be logged regardless of the source of the failure.
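    The comparison rule above reduces to a small function: log actual results only on a mismatch. The record structure in this Python sketch is hypothetical and stands in for the Test Case detail in the Detail Test Plan.

        # Only failures get their actual results logged, mirroring the
        # rule described above; passing results are not duplicated.
        def evaluate(test_case_id, expected, actual, log):
            if actual == expected:
                return "PASS"
            log.append({"id": test_case_id, "expected": expected, "actual": actual})
            return "FAIL"

        failure_log = []
        print(evaluate("TC-001", expected="order accepted", actual="order accepted", log=failure_log))
        print(evaluate("TC-002", expected="error message shown", actual="no error shown", log=failure_log))
        print(failure_log)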

    Any bugs or problems will be logged in the DEFECT TRACKING TOOL.

    The responsible application resource corrects the problem and tests the repair. Once this is complete, the tester who generated the problem log is notified, and the item is re-tested. If the retest is successful, the status is updated and the problem log is closed.

    If the retest is unsuccessful, or if another problem has been identified, the problem log status is updated and the problem description is updated with the new findings. It is then returned to the responsible application personnel for correction and test.
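    The problem-log lifecycle described above amounts to a small state machine. The state names in this Python sketch are assumed for illustration; in practice, the statuses defined in the DEFECT TRACKING TOOL would be used.

        # Assumed problem-log states and legal transitions.
        TRANSITIONS = {
            "OPEN":     ["FIXED"],               # developer corrects and tests the repair
            "FIXED":    ["CLOSED", "REOPENED"],  # retest passes -> closed; fails -> reopened
            "REOPENED": ["FIXED"],               # returned for correction and test
            "CLOSED":   [],
        }

        def advance(state, new_state):
            if new_state not in TRANSITIONS[state]:
                raise ValueError(f"illegal transition {state} -> {new_state}")
            return new_state

        state = "OPEN"
        for step in ["FIXED", "REOPENED", "FIXED", "CLOSED"]:
            state = advance(state, step)
        print("final state:", state)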

    Severity Codes are used to prioritize work in the test phase. They are assigned by the test group and are not modifiable by any other group. The standard Severity Codes used to identify defects are:


Table 1: Severity Codes

    Code  Name      Description

    1     Critical  Automated tests cannot proceed further within the
                    applicable test case (no workaround).

    2     High      The test case or procedure can be completed, but produces
                    incorrect output when valid information is input.

    3     Medium    The test case or procedure can be completed and produces
                    correct output when valid information is input, but
                    produces incorrect output when invalid information is
                    input. (For example, if specifications disallow special
                    characters, but the system lets a user continue after
                    entering one, this is a medium severity.)

    4     Low       All test cases and procedures passed as written, but there
                    could be minor revisions, cosmetic changes, etc. These
                    defects do not impact functional execution of the system.
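    For automated reporting, the codes in Table 1 map naturally onto an enumeration. A minimal Python sketch follows; the defect IDs are placeholders.

        from enum import IntEnum

        class Severity(IntEnum):
            """Standard severity codes from Table 1 (lower number = more severe)."""
            CRITICAL = 1  # automated tests cannot proceed; no workaround
            HIGH = 2      # completes, but incorrect output for valid input
            MEDIUM = 3    # correct output for valid input, incorrect for invalid
            LOW = 4       # passes as written; minor or cosmetic issues only

        # Example: defects sorted so the most severe are worked first.
        defects = [("PL-102", Severity.LOW), ("PL-099", Severity.CRITICAL)]
        for log_id, sev in sorted(defects, key=lambda d: d[1]):
            print(log_id, sev.name)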

The use of the standard Severity Codes produces four major benefits:

    • Standard Severity Codes are objective and can be easily and accurately assigned by those executing the test. Time spent in discussion about the appropriate priority of a problem is minimized.

    • Standard Severity Code definitions allow an independent assessment of the risk to the on-schedule delivery of a product that functions as documented in the requirements and design documents.

    • Use of the standard Severity Codes works to ensure consistency in the requirements, design, and test documentation, with an appropriate level of detail throughout.

    • Use of the standard Severity Codes promotes effective escalation procedures.



3 Test Scope

    The scope of testing identifies the items which will be tested and the items which will not be tested within the System/Integration Phase of testing.

3.1 Items to be tested by Automation

    1. PRODUCT NAME

    2. PRODUCT NAME

    3. PRODUCT NAME

    4. PRODUCT NAME

    5. PRODUCT NAME

3.2 Items not to be tested by Automation

    1. PRODUCT NAME

    2. PRODUCT NAME

