1. Introduction
  2. Test Plan Objectives
    1.  Define the activities required to prepare for and conduct smoke, functional, system, interface, and web services testing.
    2.  Communicate the System Test strategy to all responsible parties.
    3.  Define deliverables and responsible parties.
    4.  Communicate the various dependencies and risks to all responsible parties.
  3. Scope
    1. In Scope
      1. The overall purpose of testing is to ensure that the new web-based system, GPS, meets all of its technical, functional, and business requirements. This document describes the overall test plan and strategy for testing GPS, and the approach described here provides the framework for all testing related to the application. Individual test cases will be written for each iteration based on the use cases provided by the Virtusa (vendor) team. The test plan document will also be updated as required for each release, along with the change request document, if there are any improvements or modifications.
    2. Out Of Scope
      1. Load Testing
      2. Automation Testing
  4. Test Environment
      1. Web Server – AEWEBSVCDEV4 (Qty: 1). Purpose: installation of the application. Software: OS – Win2008R2, IIS – 7.5, .NET – 4.5. Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD.
      2. App Server – AEWEBAPPIDDEV (Qty: 1). Purpose: managing application data. Software: OS – Win2008R2, IIS – 7.5, .NET – 4.5. Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD.
      3. DB Server – DEVSQLCL2A\DEVINST2A (Qty: 1). Purpose: database server to support the Content Management Tool. Software: OS – Win2008R2, MS SQL Server 2008 R2. Hardware: Intel(R) Xeon, 1 GB RAM, 100 GB HDD.
      4. QA Requirements – QA test environments: UT1 for GPS and other systems (PPL, Jaguar, MediaPulse and others); DB – ut1 for GPS and other systems (PPL, Jaguar, MediaPulse and others).
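Before test execution begins, the environment can be sanity-checked from the QA network. This is a minimal sketch, assuming the server names above resolve from the test network and that the default HTTP and SQL Server ports are in use; neither port is documented in this plan.

```python
# Minimal environment reachability check (assumptions: hostnames resolve from
# the QA network; port 80 for IIS and 1433 for SQL Server are defaults, not
# values documented in this plan).
import socket

CHECKS = [
    ("AEWEBSVCDEV4", 80),     # Web server (IIS 7.5)
    ("AEWEBAPPIDDEV", 80),    # App server (IIS 7.5)
    ("DEVSQLCL2A", 1433),     # DB server (MS SQL Server 2008 R2)
]

def is_reachable(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in CHECKS:
    status = "UP" if is_reachable(host, port) else "DOWN"
    print(f"{host}:{port} is {status}")
```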
  5. Test Strategy
    1. Entry Criteria
      1.  All business requirements are documented and approved by the business users
      2.  All design specifications, such as the BRD, use cases, and wireframes, have been reviewed and approved
      3.  Unit testing has been completed by the development team
      4.  All hardware needed for the test environment is available
      5.  The application delivered to the test environment is of reliable quality
      6.  Initial smoke test of the delivered functionality is approved by the testing team
      7.  Code changes made to the test site will go through a change control process
    2. Exit Criteria
      1.  All test scenarios and test cases have been completed successfully
      2.  All issues have been prioritized, and all priority 1 issues have been resolved
      3.  All outstanding defects are documented in a test summary with a priority and severity status
      4.  A go/no-go meeting is held to determine the acceptability of the product
    3. Test Execution
      1. Smoke Testing
        1. Smoke testing verifies that the released code is working properly, that the system is basically operational, and that the QA environment is properly configured.
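A minimal smoke-test sketch using pytest and requests is shown below; the base URL and paths are hypothetical placeholders for the real UT1 site, not addresses from this plan.

```python
# Smoke-test sketch (assumption: the GPS QA site is served over HTTP at the
# placeholder URL below; replace with the real UT1 address).
import requests

BASE_URL = "http://AEWEBSVCDEV4/gps"  # hypothetical QA URL

def test_home_page_is_up():
    """The released build should at least serve its landing page."""
    response = requests.get(BASE_URL, timeout=10)
    assert response.status_code == 200

def test_login_page_is_reachable():
    """A second basic check that core navigation works."""
    response = requests.get(f"{BASE_URL}/login", timeout=10)  # hypothetical path
    assert response.status_code == 200
```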
      2. Functional Testing
        1. Functional testing focuses on the functional requirements of the software. It is performed to confirm that the application operates accurately according to the documented specifications and requirements, and to ensure that interfaces to external systems are working properly.
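As an illustration only, a functional check verifies one documented behavior against its expected result. The endpoint, parameter, and field names below are hypothetical placeholders, not part of the GPS specification.

```python
# Functional-test sketch: verify one documented behavior against the spec.
# The /api/programs endpoint and its fields are illustrative assumptions.
import requests

BASE_URL = "http://AEWEBSVCDEV4/gps"  # hypothetical QA URL

def test_program_search_matches_spec():
    """Per the use case, searching by title should return only matching records."""
    response = requests.get(
        f"{BASE_URL}/api/programs", params={"title": "News"}, timeout=10
    )
    assert response.status_code == 200
    results = response.json()
    # Every returned record must satisfy the documented search rule.
    assert all("News" in item["title"] for item in results)
```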
      3. Interface Testing
        1. This testing follows a transaction through all of the product processes that interact with it and tests the product in its entirety. Interface testing is performed to ensure that the product actually works the way a typical user would interact with it. This includes web services testing.
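Given the IIS/.NET 4.5 stack listed in the environment section, the GPS web services may well be SOAP-based; the following is a hedged sketch under that assumption. The endpoint, SOAPAction, and envelope are placeholders; the real contract would come from the service's WSDL.

```python
# Web-services test sketch: post a SOAP envelope and check the reply.
# Endpoint, action, and body are hypothetical, not the real GPS contract.
import requests

ENDPOINT = "http://AEWEBAPPIDDEV/gps/Service.svc"            # hypothetical endpoint
SOAP_ACTION = "http://tempuri.org/IGpsService/GetStatus"     # hypothetical action

ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Body>
    <GetStatus xmlns="http://tempuri.org/" />
  </s:Body>
</s:Envelope>"""

def test_web_service_responds():
    response = requests.post(
        ENDPOINT,
        data=ENVELOPE,
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": SOAP_ACTION},
        timeout=15,
    )
    assert response.status_code == 200
    assert "Fault" not in response.text  # crude SOAP-fault check, adequate for a sketch
```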
      4. System Testing
        1. Testing the fully integrated application, including external peripherals, to check how components interact with one another and with the system as a whole; this is also called end-to-end testing. It verifies every input to the application against the desired output and tests the user's experience with the application.
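A sketch of one end-to-end transaction follows. All endpoints, form fields, and credentials are hypothetical; real flows would come from the GPS use cases.

```python
# End-to-end sketch: authenticate, create a record, and follow it through
# the system. Every URL and field name here is an illustrative assumption.
import requests

BASE_URL = "http://AEWEBSVCDEV4/gps"  # hypothetical QA URL

def test_end_to_end_create_and_retrieve():
    session = requests.Session()
    # Step 1: authenticate (hypothetical form fields and test account).
    login = session.post(
        f"{BASE_URL}/login", data={"user": "qa", "password": "qa"}, timeout=10
    )
    assert login.status_code == 200
    # Step 2: create a record and follow it through the system.
    created = session.post(
        f"{BASE_URL}/api/programs", json={"title": "E2E check"}, timeout=10
    )
    assert created.status_code in (200, 201)
    # Step 3: verify the record is visible where a user would look for it.
    listing = session.get(
        f"{BASE_URL}/api/programs", params={"title": "E2E check"}, timeout=10
    )
    assert any(item["title"] == "E2E check" for item in listing.json())
```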
      5. Cross Browser Testing
        1. The scope of browser testing is listed below. The target design resolution is 1280x800, and the application should work with any screen resolution. A parameterized cross-browser sketch follows the iteration list.
          1. Iteration 1
            1. IE 9 and IE 10; Safari 5.1.7 and latest version of Safari; Chrome 37 and latest version of Chrome; Mozilla 30 and latest version of Mozilla
            2. Latest version of Chrome
          2. Iteration 2
            1. Latest version of Chrome
            2. IE 9 and IE 10; Safari 5.1.7 and latest version of Safari; Chrome 37 and latest version of Chrome; Mozilla 30 and latest version of Mozilla
          3. Iteration 3
            1. IE 9 and IE 10; latest versions of Safari, Chrome, and Mozilla
            2. IE 9 and IE 10; Safari 5.1.7 and latest version of Safari; Chrome 37 and latest version of Chrome; Mozilla 30 and latest version of Mozilla
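A minimal cross-browser sketch with Selenium and pytest is shown below. It assumes matching WebDriver binaries are installed on the test machine and uses a placeholder URL; legacy targets such as IE 9 or Safari 5.1.7 would in practice require a Selenium Grid with nodes hosting those browsers.

```python
# Cross-browser sketch: one test run against each browser via a parameterized
# fixture. URL and title check are hypothetical placeholders.
import pytest
from selenium import webdriver

BASE_URL = "http://AEWEBSVCDEV4/gps"  # hypothetical QA URL

@pytest.fixture(params=["chrome", "firefox", "ie"])
def browser(request):
    drivers = {
        "chrome": webdriver.Chrome,
        "firefox": webdriver.Firefox,
        "ie": webdriver.Ie,
    }
    driver = drivers[request.param]()
    driver.set_window_size(1280, 800)  # target design resolution from this plan
    yield driver
    driver.quit()

def test_home_page_renders(browser):
    browser.get(BASE_URL)
    assert "GPS" in browser.title  # hypothetical title check
```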
      6. Regression Testing
        1. Regression testing shall be performed at the end of Iteration 3 to verify that previously tested features and functions have not had new defects introduced while other problems were corrected or other features were added or modified.
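One way to keep the Iteration 3 regression pass selectable (an assumption about tooling, not a mandated approach) is to tag previously passing tests with a pytest marker and run only those with `pytest -m regression`; the marker should be registered in pytest.ini to avoid warnings.

```python
# Regression-suite sketch: mark a test verified in an earlier iteration so it
# can be re-run selectively. URL and endpoint remain hypothetical.
import pytest
import requests

BASE_URL = "http://AEWEBSVCDEV4/gps"  # hypothetical QA URL

@pytest.mark.regression
def test_search_still_returns_results():
    """A behavior that passed in Iteration 1, re-checked after later changes."""
    response = requests.get(
        f"{BASE_URL}/api/programs", params={"title": "News"}, timeout=10
    )
    assert response.status_code == 200
    assert response.json()  # previously passing behavior must still hold
```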
      7. User acceptance testing
        1. User acceptance testing activities will be performed by the business users. The purpose of this testing is to ensure the application meets the users' expectations. It also focuses on usability, including appearance, consistency of controls, consistency of field naming, accuracy of drop-down field lists, spelling of all field names and data values, accuracy of default field values, tab sequence, and error/help messages.
    4. Test Organization
      1. Test Types
        1. Smoke Testing
        2. UI and Functional Testing (UI and Database testing)
        3. Interface Testing (Web Services Testing)
        4. System Testing
        5. User Acceptance Testing
      2. Constraints
        1. Frequent change in the requirements
        2. Unavailability of Test Data
        3. Unavailability of Test Environment
    5. Test Case Development
      1. Test Case ID
      2. Module
      3. Test Name
      4. Test Data
      5. Action
      6. Expected Result
      7. Test Result
      8. Comments
      9. Use Case
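For illustration, the fields above can be captured as a structured record. The values below are made-up examples, not actual GPS test data.

```python
# Test-case record sketch using the fields listed above; all values are
# illustrative placeholders.
from dataclasses import dataclass

@dataclass
class TestCaseRecord:
    test_case_id: str
    module: str
    test_name: str
    test_data: str
    action: str
    expected_result: str
    test_result: str   # e.g. Pass / Fail / Blocked / Not Run
    comments: str
    use_case: str      # links the case back to its source use case

example = TestCaseRecord(
    test_case_id="GPS-TC-001",
    module="Search",                      # hypothetical module
    test_name="Search by program title",
    test_data="title = 'News'",
    action="Enter the title and click Search",
    expected_result="Matching programs are listed",
    test_result="Not Run",
    comments="",
    use_case="UC-12",                     # hypothetical use case ID
)
```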
    6. Requirements Traceability Matrix
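A minimal sketch of how traceability can be checked mechanically is shown below; the requirement and test case IDs are hypothetical.

```python
# Traceability sketch: map each requirement to the test cases that cover it
# and flag any requirement with no coverage. IDs are illustrative only.
requirements = ["REQ-001", "REQ-002", "REQ-003"]   # hypothetical requirement IDs
coverage = {
    "REQ-001": ["GPS-TC-001", "GPS-TC-002"],
    "REQ-002": ["GPS-TC-003"],
}

for req in requirements:
    cases = coverage.get(req, [])
    status = ", ".join(cases) if cases else "NOT COVERED"
    print(f"{req}: {status}")
```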