- Reviews the QA deliverables
- Ensures the team meets the scheduled timelines for each Iteration
- Monitors the test status report for each Iteration and signs off the test summary report for each Iteration
- Establishes the test plan
- Perform peer reviews
- Provide the Traceability Matrix
- Design/develop test cases
- Prepare test data
- Execute test cases
- Evaluate test results and publish the test report
- Log defects
- Report status and risks/issues to leads/PM
-
Approvals
- Ada Chan – 11/11/2015
- Tricia Riccio
- Brian Doherty
- Susan Tanamli
-
Dependencies
-
Personnel Dependencies
- The test team requires experienced testers to develop, perform, and validate tests
-
Software Dependencies
- The source code must be unit tested and provided within the scheduled time outlined in the Project Schedule
-
Hardware Dependencies
- The requirements according to (Step 4: Environment Requirements) should be made available before testing begins. Any downtime will affect the test schedule
-
Test Data & Database
- The test data needs to be provided before the QA test execution starts
1) The data in the dropdowns (GPS UI)
2) The referential data (required for interface testing)
3) The required DB tables are to be created for testing
** All of the above data is to be provided by the Virtusa team before QA test execution starts
** ITLUTI – To be provided on 10/15/14 for QA to start
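As an illustration of the third item, a throwaway sketch of seeding the dropdown and referential test data into a scratch database. The table and column names are assumptions for illustration only; the actual environment uses SQL Server, with SQLite standing in here:

```python
# Illustrative only: seed sample dropdown/referential test data into an
# in-memory SQLite database. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dropdown_values (
                    field_name TEXT, value TEXT)""")       # GPS UI dropdown data
conn.execute("""CREATE TABLE referential_data (
                    system TEXT, key TEXT, value TEXT)""")  # interface-test data
conn.executemany("INSERT INTO dropdown_values VALUES (?, ?)",
                 [("region", "EMEA"), ("region", "APAC")])
rows = conn.execute("SELECT COUNT(*) FROM dropdown_values").fetchone()
print(rows[0])   # 2
```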
-
Resumption Criteria
-
If testing is suspended, resumption will only occur when the problem(s) that caused the suspension have been resolved. When a critical defect is the cause of the suspension, the “FIX” must be verified by the test department before testing is resumed
-
Test Suspension Criteria
-
If any defects are found which seriously impact test progress, the QA manager may choose to suspend testing. Criteria that will justify test suspension are:
- Hardware/software is not available at the times indicated in the project schedule
- Source code contains one or more critical defects that seriously prevent or limit testing progress
- Assigned test resources are not available when needed by the test team
-
Deliverables
-
Iteration 1
- A&E QA Test Cases – 15 Days (8/25/14 – 9/15/14)
- A&E QA Test Execution – 10 Days (10/29/14 – 11/14/14)
- Test Results – 1 Day (11/14/14 – 11/14/14)
-
Iteration 2
- A&E QA Test Cases – 15 Days (11/15/14 – 11/29/14)
- A&E QA Test Execution – 10 Days (11/30/14 – 12/13/14)
- Test Results – 1 Day (12/14/14 – 12/14/14)
-
Iteration 3
-
Control Procedures
-
Reviews
- The project team will perform reviews for each Iteration (Test Case Review and Final Test Summary Review). A meeting notice, with related documents, will be emailed to each participant
-
Bug Review meetings
- The project team will hold bug review meetings for each Iteration to review the status of open defects. A meeting notice, with related documents, will be emailed to each participant
-
Change Request
- Once testing begins, changes to the GPS system are discouraged. If functional changes are required, the proposed changes will be discussed with the managers. The managers will determine the impact of the change and if/when it should be implemented. Any additional or modified functionality should be documented in a Change Request document
-
Defect Reporting
- When defects are found, the testers will log a defect report in the JIRA defect tracking system. JIRA is accessible by testers, developers, and all members of the project team. When a defect has been fixed or more information is needed, the developer will change the status of the defect to indicate its current state. Once a defect is fixed by the developers, the testers will re-test it and close the defect report if it passes. The diagram below illustrates the defect lifecycle
D – Developers, T – Testers, PM – Project Managers
- Bug Lifecycle
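The defect-logging step can be sketched against JIRA's standard REST API (`POST /rest/api/2/issue`). The project key, summary, and credentials below are placeholders, not this project's actual configuration:

```python
# Hedged sketch of logging a defect via JIRA's REST API.
# Project key, field values, and auth are hypothetical placeholders.
import json
import urllib.request

def build_defect(project_key, summary, description, priority="High"):
    """Build the JSON body JIRA expects when creating a new Bug issue."""
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Bug"},
            "priority": {"name": priority},
        }
    }

def log_defect(base_url, payload, auth_header):
    """POST the defect to JIRA and return the created issue key."""
    req = urllib.request.Request(
        base_url + "/rest/api/2/issue",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": auth_header},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["key"]

# Usage (against a real JIRA instance):
#   key = log_defect("https://jira.example.com", build_defect("GPS", ...), auth)
```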
-
Documentation
- Test Plan
- Requirements Traceability Matrix
- Defect reports
- Final Test Summary Report
-
Risks
- Schedule
The schedule for each phase is very aggressive and could affect testing. A slip in the schedule in one of the other phases could result in a subsequent slip in the test phase. Close project management is crucial to meeting the forecasted completion date
- Technical
Since this is a new web-based solution that interacts with other systems such as PPL, RDM, Jaguar, Media Pulse, Ceiton, and Platform, any major bugs encountered during interface testing would delay the project timeline
- Management
Management support is required so that when the project falls behind, the test schedule does not get squeezed to make up for the delay. Management can reduce the risk of delays by supporting the test team throughout the testing phase and by assigning people with the required skill set to this project
- Personnel
Due to the aggressive schedule, it is very important to have experienced testers on this project. Unexpected turnovers can impact the schedule. If attrition does happen, all efforts must be made to replace the experienced individual
-
Test Schedule (dates are for representation only)
- Iteration 1: 09/01/2015 – 03/20/2015 (Resources: Dhiren Shah, Robinson Batchu)
- Iteration 2
- Iteration 3
- Introduction
-
Test Plan Objectives
- Define the activities required to prepare for and conduct smoke, functional, system, interface and webservices testing.
- Communicate to all responsible parties about the System Test strategy
- Define deliverables and responsible parties
- Communicate to all responsible parties the various Dependencies and Risks
-
Scope
-
In Scope
- The overall purpose of testing is to ensure that the new web-based system, GPS, meets all of its technical, functional, and business requirements. The purpose of this document is to describe the overall test plan and strategy for testing GPS. The approach described here provides the framework for all testing related to this application. Individual test cases will be written for each iteration based on the use cases provided by the Virtusa (vendor) team. The test plan document will also be updated as required for each release if there are any improvements or modifications, along with the Change Request document
-
Out Of Scope
- Load Testing
- Automation Testing
-
Environment Requirements
-
- 1. Web Server – AEWEBSVCDEV4
  Purpose: Installation of the application
  Software: .NET 4.5, IIS 7.5, OS: Win2008R2
  Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD
  Quantity: 1
- 2. App Server – AEWEBAPPIDDEV
  Purpose: Managing application data
  Software: .NET 4.5, IIS 7.5, OS: Win2008R2
  Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD
  Quantity: 1
- 3. DB Server – DEVSQLCL2A\DEVINST2A
  Purpose: Database server to support the Content Management Tool
  Software: OS: Win2008R2, MS SQL Server 2008 R2
  Hardware: Intel(R) Xeon, 4 GB RAM, 30 GB HDD
  Quantity: 1
- 4. QA Requirements – UT1 for GPS and other systems (PPL, Jaguar, MediaPulse, and others)
  DB: ut1 for GPS and other systems (PPL, Jaguar, MediaPulse, and others)
-
Test Strategy
-
Entry Criteria
- All business requirements are documented and approved by the business users
- All design specifications like BRD, Use Cases, Wireframes have been reviewed and approved
- Unit testing has been completed by the development team
- All hardware needed for the test environment is available
- The application delivered to the test environment is of reliable quality
- Initial smoke test of the delivered functionality is approved by the testing team
- Code changes made to the test site will go through a change control process
-
Exit Criteria
- All test scenarios and test cases have been completed successfully
- All issues prioritized and priority 1 issues resolved
- All outstanding defects are documented in a test summary with a priority and severity status
- Go/No-go meeting is held to determine acceptability of product
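A minimal sketch of how the exit-criteria gate above could be checked automatically before the go/no-go meeting; the status strings and priority values are illustrative, not field values from JIRA:

```python
# Sketch of an exit-criteria gate: every test case executed and no
# unresolved priority-1 defects. Status/priority values are illustrative.
def exit_criteria_met(test_results, open_defect_priorities):
    """test_results: list of 'Pass'/'Fail'/'Not Run' statuses.
    open_defect_priorities: priority numbers of still-open defects."""
    all_executed = all(r != "Not Run" for r in test_results)
    no_p1_open = all(p != 1 for p in open_defect_priorities)
    return all_executed and no_p1_open

print(exit_criteria_met(["Pass", "Pass"], [2, 3]))   # True
print(exit_criteria_met(["Pass", "Not Run"], []))    # False
```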
-
Test Execution
-
Smoke Testing
- Smoke testing verifies that the released code is working properly, that the system is basically operational, and that the QA environment is properly configured
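A minimal smoke-check sketch along these lines; the host name and endpoint paths are hypothetical placeholders, not the actual GPS QA URLs:

```python
# Hedged smoke-test sketch: hit a few key pages and require HTTP 200.
# BASE_URL and SMOKE_ENDPOINTS are hypothetical placeholders.
import urllib.request

BASE_URL = "http://aewebsvcdev4/gps"                # hypothetical QA host
SMOKE_ENDPOINTS = ["/login", "/deals", "/orders"]   # hypothetical pages

def check_endpoint(base, path, timeout=10):
    """Return True if the page responds with HTTP 200, False otherwise."""
    try:
        with urllib.request.urlopen(base + path, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

def run_smoke(base=BASE_URL, paths=SMOKE_ENDPOINTS):
    """Map each endpoint to pass/fail; all must pass to accept the build."""
    return {path: check_endpoint(base, path) for path in paths}

# Usage (against the real QA environment):
#   results = run_smoke()
```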
-
Functional Testing
- Functional testing focuses on the functional requirements of the software and is performed to confirm that the application operates accurately according to the documented specifications and requirements, and to ensure that interfaces to external systems are properly working
-
Interface Testing
- This testing follows a transaction through all of the product processes that interact with it and tests the product in its entirety. Interface testing shall be performed to ensure that the product actually works in the way a typical user would interact with it. This includes web services testing
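One way to sketch a web-service interface check is to parse a service response and flag any missing referential fields that downstream systems depend on; the field names below are assumptions for illustration:

```python
# Hedged interface-test helper: verify a (hypothetical) GPS web service
# response contains the referential fields the interfaces depend on.
import json

REQUIRED_FIELDS = {"episodeId", "title", "rights"}   # assumed field names

def validate_payload(raw_json, required=REQUIRED_FIELDS):
    """Return the sorted list of required fields missing from a response."""
    record = json.loads(raw_json)
    return sorted(required - record.keys())

sample = '{"episodeId": "E100", "title": "Pilot"}'
print(validate_payload(sample))   # ['rights']
```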
-
System Testing
- Testing the fully integrated application, including external peripherals, to check how components interact with one another and with the system as a whole. This is also called end-to-end testing. It verifies every input in the application against the desired outputs and tests the user's experience with the application
-
Cross Browser Testing
-
Please find below the scope of browser testing.
The target design resolution is 1280x800, and the application should work with any screen resolution
- Iteration 1
  • IE 9 and IE 10
  • Safari 5.1.7 and latest version of Safari
  • Chrome 37 and latest version of Chrome
  • Mozilla 30 and latest version of Mozilla
  • Latest version of Chrome
- Iteration 2
  • IE 9 and IE 10
  • Safari 5.1.7 and latest version of Safari
  • Chrome 37 and latest version of Chrome
  • Mozilla 30 and latest version of Mozilla
  • Latest version of Chrome
- Iteration 3
  • IE 9 and IE 10
  • Safari 5.1.7 and latest version of Safari
  • Chrome 37 and latest version of Chrome
  • Mozilla 30 and latest version of Mozilla
  • Latest versions of Safari, Chrome, and Mozilla
-
Regression Testing
- Regression testing shall be performed at the end of Iteration 3 to verify that previously tested features and functions have not had new defects introduced while other problems were corrected or other features were added or modified
-
User acceptance testing
- User acceptance testing activities will be performed by the business users. The purpose of this testing is to ensure the application meets the users’ expectations. It also focuses on usability, including appearance, consistency of controls, consistency of field naming, accuracy of drop-down field information lists, spelling of all field names/data values, accuracy of default field values, tab sequence, and error/help messages
-
Test Organization
-
Test Types
- Smoke Testing
- UI and Functional Testing (UI and Database testing)
- Interface Testing (Web Services Testing)
- System Testing
- User Acceptance Testing
-
Constraints
- Frequent change in the requirements
- Unavailability of Test Data
- Unavailability of Test Environment
-
Test Case Development
- Test Case ID
- Module
- Test Name
- Test Data
- Action
- Expected Result
- Test Result
- Comments
- Use Case
- Requirement Traceability Matrix
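The test case fields listed above, together with the traceability-matrix link to requirements, can be sketched as a simple data structure; the IDs, module names, and requirement keys below are illustrative:

```python
# Sketch of a test-case record built from the fields listed above, plus a
# requirement-traceability lookup. IDs and names are illustrative.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    test_case_id: str
    module: str
    test_name: str
    test_data: str = ""
    action: str = ""
    expected_result: str = ""
    test_result: str = "Not Run"
    comments: str = ""
    use_case: str = ""
    requirements: list = field(default_factory=list)   # RTM links

def traceability_matrix(cases):
    """Map each requirement ID to the test cases that cover it."""
    rtm = {}
    for case in cases:
        for req in case.requirements:
            rtm.setdefault(req, []).append(case.test_case_id)
    return rtm

cases = [
    TestCase("TC-001", "Deals", "Create deal", requirements=["REQ-1"]),
    TestCase("TC-002", "Orders", "Export orders", requirements=["REQ-1", "REQ-3"]),
]
print(traceability_matrix(cases))   # {'REQ-1': ['TC-001', 'TC-002'], 'REQ-3': ['TC-002']}
```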
-
Tools
-
- 1. DB – SQL Server 2008
- 2. Defect management – JIRA
- 3. Test management – Excel (Test Summary Report)
-
Functions To Be Tested
-
Iteration 1
- Deals
- Deal Offer
- Programming
- Tiered Pricing
- Orders
-
Iteration 2
- Fulfillment
- PPL-GPS Integration
- Deals Offer – Mobile View
- Media Pulse-GPS Integration
- Images Media
- WIP
- NWCO-GPS Integration
- Short Form Deals
- PPL-GPS Integration
- RDM-GPS Integration
- Jaguar-GPS Integration
- Please refer to the link for the latest project plan / implementation plan for more details
-
Iteration 3
- Production
- Referential Data Mgmt
- Reports
- Global Search Updates
- UCA Requirements
- 1. Provide text search solution using data in the IDB – High
- 2. Provide faceted navigation of search results – High
- 3. Provide ability to export search results in MS Excel – Medium
- 4. Provide ability to export search results in PDF – Medium
- 5. Provide view of search result items using read-only international screens or links to other systems – Medium
- 6. Read-only episode information view from PPL – High
- 7. Automatically consume episode rights information from the Rights Data Mart, including off-net/on-net – High
- 8. Automatically retrieve and view information for international assets – High
- 9. Create/Update/View international asset information for episodes – High
- 10. Add/View fulfillment assets for episodes – High
- 11. Tag keywords for international episodes searchable from the website – Medium
- 12. View/Update rights information in the international master – High
- 13. View/Update attribute information from Jaguar with expiration date for each region – High
- 14. Associate support documents from PPL/Debut to international episodes – High
- 15. Queue programs for Int'l production – High
- 16. Push information for the Int'l Master to Media Pulse (metadata and assets) – High
- 17. Identify vendor information based on business logic – High
- 18. Search Orders/Assignments/Tasks by episode metadata – Medium
- 19. Create/Update/View Assignments for production – Medium
- 20. Create/Update/View Tasks for Assignments for multiple reels – High
- 21. View production tech spec – Medium
- 22. View/Create/Update Worksheet with multiple reels for each episode – Medium
- 23. Search programming information based on latest quarter release – Medium
- 24. Add new digital assets available for episodes/programs – High
- 25. Update digital asset association to episodes/programs/channels to be viewed on the Int'l website – High
- 26. Maintain source information for royalty payments – High
- 27. Create XML from tech spec to match existing requirements for DMC – Medium
- 28. Improve image workflow – Medium
- 29. Responsive design for approximately 20% of Sale Forms – Low
- 30. Mobile entry – entry to be like transactional entry and not web entry – Low
- 31. Tiered pricing module – High
- 32. Integrate URL Reporting Dashboard – High
- 33. Integrate existing (UCA) contact administration for GPS – High
-
Ada Chan (QA Manager)
-
Dhiren Shah (QA Lead)
- Robinson Batchu (QA Analyst)
- Please refer to the link below for the latest project plan implementation schedule