  1. Reviews the QA deliverables; ensures the team meets the scheduled timelines for each Iteration; monitors the test status report for each Iteration and signs off the test summary report for each Iteration
  2. Establishes the test plan; conducts peer reviews; provides the Traceability Matrix; designs/develops test cases; prepares test data; executes test cases; evaluates test results and publishes the test report; logs defects; reports status and risks/issues to leads/PM
  3. Approvals
    1. Ada Chan
    2. 11/11/2015
    3. Tricia Riccio
    4. Brian Doherty
    5. Susan Tanamli
  4. Dependencies
    1. Personnel Dependencies
      1. The test team requires experienced testers to develop, perform, and validate tests
    2. Software Dependencies
      1. The source code must be unit tested and provided within the scheduled time outlined in the Project Schedule
    3. Hardware Dependencies
      1. The hardware described in the Environment Requirements section must be made available before testing begins. Any downtime will affect the test schedule
    4. Test Data & Database
      1. The following test data must be provided by the Virtusa team before QA test execution starts (see the seeding sketch below):
        1. The data in the dropdowns (GPS UI)
        2. The referential data (required for interface testing)
        3. The DB tables required for testing
      2. ITLUTI – to be provided on 10/15/14 for QA to start
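Where referential tables are not yet in place, a seeding step of this shape could be scripted against the QA database. This is a minimal sketch under assumptions: the database name, table, and values are invented placeholders, and the real tables and data are supplied by the Virtusa team as stated above; only the server name follows the DB server listed under Environment Requirements.

```python
# Hedged sketch: seed one referential table before QA execution starts.
# The database name (GPS_QA), table, and values are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=DEVSQLCL2A\\DEVINST2A;"
    "DATABASE=GPS_QA;Trusted_Connection=yes"   # DATABASE name is assumed
)
cur = conn.cursor()
# Create the referential table only if it does not already exist.
cur.execute("""
    IF OBJECT_ID('dbo.RefTerritory') IS NULL
        CREATE TABLE dbo.RefTerritory (Id INT PRIMARY KEY, Name VARCHAR(50))
""")
# Insert one example row of referential (dropdown) data.
cur.execute("INSERT INTO dbo.RefTerritory VALUES (?, ?)", 1, "EMEA")
conn.commit()
conn.close()
```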
  5. Resumption Criteria
    1. If testing is suspended, it will resume only when the problem(s) that caused the suspension have been resolved. When a critical defect caused the suspension, the fix must be verified by the test department before testing resumes
  6. Test Suspension Criteria
    1. If any defects are found that seriously impact test progress, the QA manager may choose to suspend testing. Criteria that justify test suspension are:
      1. Hardware/software is not available at the times indicated in the project schedule
      2. Source code contains one or more critical defects, which seriously prevents or limits testing progress.
      3. Assigned test resources are not available when needed by the test team
  7. Deliverables
    1. Iteration 1
      1. A&E QA Test Cases – Start: 8/25/14, Duration: 15 Days, End: 9/15/14
      2. A&E QA Test Execution – Start: 10/29/14, Duration: 10 Days, End: 11/14/14
      3. Test Results – Start: 11/14/14, Duration: 1 Day, End: 11/14/14
    2. Iteration 2
      1. A&E QA Test Cases – Start: 11/15/14, Duration: 15 Days, End: 11/29/14
      2. A&E QA Test Execution – Start: 11/30/14, Duration: 10 Days, End: 12/13/14
      3. Test Results – Start: 12/14/14, Duration: 1 Day, End: 12/14/14
    3. Iteration 3
  8. Control Procedures
    1. Reviews
      1. The project team will perform reviews for each Iteration (Test Case Review and Final Test Summary Review). A meeting notice, with related documents, will be emailed to each participant
    2. Bug Review meetings
      1. The project team will review open defects for each Iteration. A meeting notice, with related documents, will be emailed to each participant
    3. Change Request
      1. Once testing begins, changes to the GPS system are discouraged. If functional changes are required, the proposed changes will be discussed with the managers. The managers will determine the impact of the change and if/when it should be implemented. The added or modified functionality should be documented in a Change Request document
    4. Defect Reporting
      1. When defects are found, the testers will complete a defect report in the JIRA defect tracking system. JIRA is accessible to testers, developers, and all members of the project team. When a defect has been fixed or more information is needed, the developer will change the status of the defect to indicate its current state. Once a defect is fixed by the developers, the testers will re-test it and close the defect report if it passes. The diagram below explains the defect lifecycle (D: Developers, T: Testers, PM: Project Managers). A hedged sketch of logging a defect through the JIRA REST API follows this subsection
      2. Bug Lifecycle
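As an illustration of the workflow above, a defect could be logged programmatically through JIRA's standard REST API. This is a sketch only, not the project's actual setup: the JIRA URL, credentials, and the "GPS" project key are assumed placeholders.

```python
# Hedged sketch: create a Bug issue via the JIRA REST API (v2).
# JIRA_URL, AUTH, and the project key are placeholders, not real values.
import requests

JIRA_URL = "https://jira.example.com"   # placeholder JIRA instance
AUTH = ("qa.tester", "api-token")       # placeholder credentials

def log_defect(summary, description, priority="High"):
    """Create a Bug in the (assumed) GPS project and return its key."""
    payload = {
        "fields": {
            "project": {"key": "GPS"},      # assumed project key
            "issuetype": {"name": "Bug"},
            "summary": summary,
            "description": description,
            "priority": {"name": priority},
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue",
                         json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]   # e.g. "GPS-123"

if __name__ == "__main__":
    key = log_defect("Deal Offer screen: save fails",
                     "Steps to reproduce: ...")
    print("Logged defect", key)
```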
  9. Documentation
    1. Test Plan
    2. Requirements Traceability Matrix
    3. Defect reports
    4. Final Test Summary Report
  10. Risks
    1. Schedule: The schedule for each phase is very aggressive and could affect testing. A slip in one of the other phases could result in a subsequent slip in the test phase. Close project management is crucial to meeting the forecasted completion date
    2. Technical: This is a new web-based solution that interacts with other systems such as PPL, RDM, Jaguar, Media Pulse, Ceiton, and Platform. Any major bugs encountered during interface testing would delay the project timeline
    3. Management: Management support is required so that when the project falls behind, the test schedule does not get squeezed to make up for the delay. Management can reduce the risk of delays by supporting the test team throughout the testing phase and by assigning people with the required skill set to this project
    4. Personnel: Due to the aggressive schedule, it is very important to have experienced testers on this project. Unexpected turnover can impact the schedule. If attrition does happen, every effort must be made to replace the experienced individual
  11. Test Schedule (Dates only for representation)
    1. Iteration 1 – Start: 09/01/2015, End: 20/03/2015, Resources: Dhiren Shah, Robinson Batchu
    2. Iteration 2
    3. Iteration 3
  12. Introduction
  13. Test Plan Objectives
    1. Define the activities required to prepare for and conduct smoke, functional, system, interface, and web services testing
    2. Communicate to all responsible parties about the System Test strategy
    3. Define deliverables and responsible parties
    4. Communicate to all responsible parties the various Dependencies and Risks
  14. Scope
    1. In Scope
      1. The overall purpose of testing is to ensure that the new web-based system, GPS, meets all of its technical, functional, and business requirements. This document describes the overall test plan and strategy for testing GPS, and the approach described here provides the framework for all testing related to this application. Individual test cases will be written for each iteration based on the use cases provided by the Virtusa (vendor) team. The test plan document will also be updated as required for each release if there are improvements or modifications, along with the change request document
    2. Out Of Scope
      1. Load Testing
      2. Automation Testing
  15. Environment Requirements
    1. Web Server – Name: AEWEBSVCDEV4; Purpose: installation of the application; Software: .NET 4.5, IIS 7.5, Win2008R2; Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD; Quantity: 1
    2. App Server – Name: AEWEBAPPIDDEV; Purpose: managing application data; Software: .NET 4.5, IIS 7.5, Win2008R2; Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD; Quantity: 1
    3. DB Server – Name: DEVSQLCL2A\DEVINST2A; Purpose: database server to support the Content Management Tool; Software: Win2008R2, MS SQL Server 2008 R2; Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD; Quantity: 1
    4. QA Requirements – UT1 for GPS and other systems (PPL, Jaguar, MediaPulse and others); DB: ut1 for GPS and other systems (PPL, Jaguar, MediaPulse and others)
  16. Test Strategy
    1. Entry Criteria
      1. All business requirements are documented and approved by the business users
      2. All design specifications like BRD, Use Cases, Wireframes have been reviewed and approved
      3. Unit testing has been completed by the development team
      4. All hardware needed for the test environment is available
      5. The application delivered to the test environment is of reliable quality
      6. Initial smoke test of the delivered functionality is approved by the testing team
      7. Code changes made to the test site will go through a change control process
    2. Exit Criteria
      1. All test scenarios and test cases have been completed successfully
      2. All issues have been prioritized and all priority 1 issues resolved
      3. All outstanding defects are documented in a test summary with a priority and severity status
      4. Go/No-go meeting is held to determine acceptability of product
    3. Test Execution
      1. Smoke Testing
        1. Smoke testing verifies that the released code works properly, that the system is basically operational, and that the QA environment is properly configured (a minimal automated sketch appears after this subsection)
      2. Functional Testing
        1. Functional testing focuses on the functional requirements of the software and is performed to confirm that the application operates accurately according to the documented specifications and requirements, and to ensure that interfaces to external systems are properly working
      3. Interface Testing
        1. This testing follows a transaction through all of the product processes that interact with it and tests the product in its entirety. Interface testing shall be performed to ensure that the product works the way a typical user would interact with it. This includes web services testing (covered in the sketch after this subsection)
      4. System Testing
        1. Testing the fully integrated application, including external peripherals, to check how components interact with one another and with the system as a whole; this is also called end-to-end testing. It verifies that every input to the application produces the desired output and tests the user's experience with the application
      5. Cross Browser Testing
        1. The scope of browser testing is listed below. The target design resolution is 1280x800, and the application should work with any screen resolution (a hedged cross-browser sketch appears after this subsection)
          1. Iteration 1
            1. Latest Version of Chrome
            2. IE 9 and IE 10; Safari 5.1.7 and Latest Version of Safari; Chrome 37 and Latest Version of Chrome; Mozilla 30 and Latest Version of Mozilla
          2. Iteration 2
            1. Latest Version of Chrome
            2. IE 9 and IE 10; Safari 5.1.7 and Latest Version of Safari; Chrome 37 and Latest Version of Chrome; Mozilla 30 and Latest Version of Mozilla
          3. Iteration 3
            1. IE 9 and IE 10; Latest Version of Safari; Latest Version of Chrome; Latest Version of Mozilla
            2. IE 9 and IE 10; Safari 5.1.7 and Latest Version of Safari; Chrome 37 and Latest Version of Chrome; Mozilla 30 and Latest Version of Mozilla
      6. Regression Testing
        1. Regression testing shall be performed at the end of Iteration 3 to verify that previously tested features and functions do not have any new defects introduced, while correcting other problems or adding and modifying other features
      7. User acceptance testing
        1. User acceptance testing will be performed by the business users. The purpose of this testing is to ensure the application meets the users' expectations. It also focuses on usability and will include: appearance, consistency of controls, consistency of field naming, accuracy of drop-down field information lists, spelling of all field names/data values, accuracy of default field values, tab sequence, and error/help messages
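To illustrate the smoke and interface checks described above, a minimal automated pass could hit the key pages and one web service endpoint and assert that they respond. This is a sketch only: the base URL and the paths below are invented placeholders for the GPS QA environment, not documented endpoints.

```python
# Minimal smoke/interface sketch: confirm the deployed build and one
# web service endpoint respond before deeper testing begins.
# BASE and all paths are placeholders, not real GPS endpoints.
import requests

BASE = "http://aewebsvcdev4"          # placeholder: QA web server
CHECKS = [
    "/gps/",                          # assumed landing page
    "/gps/deals",                     # assumed Deals module page
    "/gpsservices/health",            # assumed web service endpoint
]

def smoke():
    for path in CHECKS:
        resp = requests.get(BASE + path, timeout=10)
        assert resp.status_code == 200, f"{path} -> {resp.status_code}"
        print("OK ", path)

if __name__ == "__main__":
    smoke()
```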
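For the cross-browser scope, the same check could be repeated per browser via Selenium WebDriver at the 1280x800 target resolution. A sketch under assumptions: locally installed Chrome and Firefox drivers stand in for the full IE/Safari/Chrome/Mozilla matrix listed above, and the page URL and title check are invented.

```python
# Hedged cross-browser sketch: run one check in each available browser.
# A Selenium Grid would normally host the full browser matrix in scope.
from selenium import webdriver

BASE = "http://aewebsvcdev4/gps"      # placeholder QA URL

def check_landing_page(driver):
    driver.set_window_size(1280, 800) # target design resolution
    driver.get(BASE)
    assert "GPS" in driver.title      # assumed page title

# Chrome and Firefox stand in for the IE/Safari/Chrome/Mozilla matrix.
for make_driver in (webdriver.Chrome, webdriver.Firefox):
    driver = make_driver()
    try:
        check_landing_page(driver)
        print(driver.name, "passed")
    finally:
        driver.quit()
```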
    4. Test Organization
      1. Test Types
        1. Smoke Testing
        2. UI and Functional Testing (UI and Database testing)
        3. Interface Testing (Web Services Testing)
        4. System Testing
        5. User Acceptance Testing
      2. Constraints
        1. Frequent change in the requirements
        2. Unavailability of Test Data
        3. Unavailability of Test Environment
    5. Test Case Development (each test case will capture the fields below; an illustrative example follows the list)
      1. Test Case ID
      2. Module
      3. Test Name
      4. Test Data
      5. Action
      6. Expected Result
      7. Test Result
      8. Comments
      9. Use Case
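Purely as an illustration of the template above, one test case could be recorded to a tracking sheet as follows; the module, data, steps, and use case ID are invented examples, not actual GPS test cases.

```python
# Illustrative only: one test case expressed with the fields listed above,
# written to a CSV tracking sheet. All values are invented examples.
import csv

FIELDS = ["Test Case ID", "Module", "Test Name", "Test Data", "Action",
          "Expected Result", "Test Result", "Comments", "Use Case"]

example = {
    "Test Case ID": "GPS-TC-001",
    "Module": "Deals",
    "Test Name": "Create a new deal",
    "Test Data": "Valid deal name, territory, licensee",
    "Action": "Fill the Deal form and click Save",
    "Expected Result": "Deal is saved and appears in the deals grid",
    "Test Result": "",           # filled in during execution
    "Comments": "",
    "Use Case": "UC-Deals-01",   # assumed use case ID
}

with open("test_cases.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(example)
```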
    6. Requirements Traceability Matrix
  17. Tools
    1. DB – SQL Server 2008
    2. Defect management – JIRA
    3. Test Management – Excel (Test Summary Report)
  18. Functions To Be Tested
    1. Iteration 1
      1. Deals
      2. Deal Offer
      3. Programming
      4. Tiered Pricing
      5. Orders
    2. Iteration 2
      1. Fulfillment
      2. PPL-GPS Integration
      3. Deals Offer – Mobile View
      4. Media Pulse-GPS Integration
      5. Images Media
      6. WIP
      7. NWCO-GPS Integration
      8. Short Form Deals
      9. RDM-GPS Integration
      10. Jaguar-GPS Integration
      11. Please refer to the link for the latest project plan/implementation plan for more details
    3. Iteration 3
      1. Production
      2. Referential Data Mgmt
      3. Reports
      4. Global Search Updates
      5. UCA Requirements
        1. Provide text search solution using data in the IDB (Priority: High)
        2. Provide faceted navigation of search results (Priority: High)
        3. Provide ability to export search results in MS Excel (Priority: Medium)
        4. Provide ability to export search results in PDF (Priority: Medium)
        5. Provide view of search result items using read-only international screens or links to other systems (Priority: Medium)
        6. Read-only episode information view from PPL (Priority: High)
        7. Automatically consume episode rights information from the Rights Data Mart, including off-net/on-net (Priority: High)
        8. Automatically retrieve and view information for international assets (Priority: High)
        9. Create/Update/View international asset information for episodes (Priority: High)
        10. Add/View fulfillment assets for episodes (Priority: High)
        11. Tag keywords for international episodes searchable from the website (Priority: Medium)
        12. View/Update rights information in the international master (Priority: High)
        13. View/Update attribute information from Jaguar with an expiration date for each region (Priority: High)
        14. Associate support documents from PPL/Debut to international episodes (Priority: High)
        15. Queue programs for Int'l production (Priority: High)
        16. Push information for the Int'l Master to Media Pulse (metadata and assets) (Priority: High)
        17. Identify vendor information based on business logic (Priority: High)
        18. Search Orders/Assignments/Tasks by episode metadata (Priority: Medium)
        19. Create/Update/View Assignments for production (Priority: Medium)
        20. Create/Update/View Tasks for Assignments with multiple reels (Priority: High)
        21. View production tech spec (Priority: Medium)
        22. View/Create/Update Worksheet with multiple reels for each episode (Priority: Medium)
        23. Search programming information based on latest quarter release (Priority: Medium)
        24. Add new digital assets available for episodes/programs (Priority: High)
        25. Update digital asset association to episodes/programs/channels to be viewed on the Int'l website (Priority: High)
        26. Maintain source information for royalty payments (Priority: High)
        27. Create XML from tech spec to match existing requirements for DMC (Priority: Medium)
        28. Responsive design for approximately 20% of sale forms (Priority: Low)
        29. Improve image workflow (Priority: Medium)
        30. Mobile entry – entry to be like transactional entry and not web entry (Priority: Low)
        31. Tiered pricing module (Priority: High)
        32. Integrate URL Reporting Dashboard (Priority: High)
        33. Integrate existing (UCA) contact administration for GPS (Priority: High)
  19. Ada Chan (QA Manager)
    1. Dhiren Shah (QA Lead)
      1. Robinson Batchu (QA Analyst)
  20. Please refer to the link below for the latest project plan/implementation schedule