The Living Balance Sheet
Test Strategy – Phase 4
Version 1.1
Revision History
Date | Version | Description | Author |
04/25/2007 | 1.0 | Initial Version | Kannan Chandrasekaran |
05/09/2007 | 1.1 | Review comments included | Kannan Chandrasekaran |
Table of Contents
1. Introduction
1.1 Purpose
1.2 Intended Audience
1.3 Document Terminology and Acronyms
1.4 References
2. Test Mission
2.1 Scope
2.2 Test Goals
2.3 Assumptions
3. Test Approach
3.1 Functional Tests
3.2 Non-Functional Tests
4. Test Environment Needs
4.1 Required System Hardware
4.2 Required Software
4.3 Required Productivity and Support Tools
4.4 Required Data Needs
5. Responsibilities, Staffing, and Training Needs
5.1 Test Roles
5.2 Training Needs
6. Risks
7. Management Process and Procedures
7.1 Problem Reporting, Escalation, and Issue Resolution
7.2 Measuring the Extent of Testing
7.3 Test Coverage
Test Strategy
1. Introduction
1.1 Purpose
- Provide a central artifact to govern the strategic approach to the test effort. It defines the general approach that will be employed to test the project deliverables and to evaluate the results of UAT testing. The strategy defined in this document will guide the planning effort of the subsequent testing work required on this project.
- Provide visibility to project stakeholders on the testing effort and communicate that adequate consideration has been given to various aspects of executing the testing effort.
1.2 Intended Audience
The audience for this Test Strategy document is:
- Project Steering Committee
- Project Team
1.3 Document Terminology and Acronyms
This subsection lists the acronyms, terms, and abbreviations used elsewhere within the Test Strategy, along with their definitions.
Acronym/Term | Definition |
BTS | Business Technology Systems |
LBS | Living Balance Sheet |
SSO | Single Sign On |
GOL | Guardian Online |
SLA | Service Level Agreement |
UAT | User Acceptance Testing |
1.4 References
This subsection provides a list of the documents referenced elsewhere within the Test Strategy.
Document | Version | Date |
LBS Phase I -- Functional Specifications | | |
LBS Phase II -- Functional Specifications | 0.5 | 11/22/05 |
LBS Phase III -- Functional Specifications | | |
2. Test Mission
The Test Mission clearly defines the scope, goals, and objectives of the Test Strategy. Each of the subsequent test plans in the lifecycle of this application will elaborate on how the scope and objectives will be met through the execution of tests.
2.1 Scope
Test Items Considered in Scope
The purpose of testing is to verify that the system, when fully integrated, performs as specified in the Business Requirements and without error. The testing phase will be divided into six Guardian test units, performed by three individual testers, according to the components described in the Business Requirements.
Unit and string testing performed by the eMoney build team is designed to ensure accurate processing of individual programs/units. The testing phase, however, will test the overall system, processes, and procedures that the individual programs have been designed to support, verifying that the execution and interaction of the processing units meet the business objectives. eMoney will perform unit and string testing for the LBS System. eMoney and Guardian will test GOL SSO functionality (a smoke-check sketch follows the list below).
Integration - The system interfaces successfully with the related application modules and legacy application systems.
System - Customizations such as data field modifications, functions, security, conversions, and reports all execute correctly. The intent of system testing is to exercise the system as it will be operated in the production environment.
Acceptance - The application system meets the business requirements and complies with the system acceptance criteria.
Stress / Performance - The hardware and application software are capable of operating and maintaining the system at sustainable volume levels to support the business requirements.
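As a minimal smoke-check sketch of the GOL-to-LBS SSO handoff (written in Python with the requests library), the snippet below logs in through GOL and verifies that the same session reaches LBS without a second login prompt. All URLs, form field names, and credentials are hypothetical placeholders for illustration, not the actual GOL or LBS endpoints.

import requests

GOL_LOGIN_URL = "https://gol.example.com/login"          # hypothetical endpoint
LBS_DASHBOARD_URL = "https://lbs.example.com/dashboard"  # hypothetical endpoint

def sso_smoke_check(username: str, password: str) -> bool:
    session = requests.Session()
    # Authenticate against GOL; the SSO handoff to LBS is expected to
    # ride on the cookies carried by this session.
    resp = session.post(GOL_LOGIN_URL,
                        data={"user": username, "password": password})
    if resp.status_code != 200:
        return False
    # A successful handoff lets the session reach the LBS dashboard
    # without being bounced back to a login page.
    resp = session.get(LBS_DASHBOARD_URL)
    return resp.status_code == 200 and "login" not in resp.url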
As per the Business Requirements, the following in-scope system functions must be validated in UAT, as these are crucial business requirements comprising this release:
Functional Testing | Comments |
Client Site | Ensure all required modifications are applied to the following features and/or modules, and are functioning properly: - Welcome section |
Financial Representative Site (formerly Advisor site) – Financial Representative Dashboard | Ensure all the required changes to the Financial Representative Dashboard are completed and functioning properly – specifically: - Add New Client (on green navigation bar) - Contact Us |
Financial Representative Site (formerly Advisor site) – Start | Ensure that all the required changes to the Financial Representative Start menu are completed and functioning properly – specifically: - View Client List - Add New Client |
Financial Representative Site (formerly Advisor Site) -- Tools | Ensure that all the required changes to the Financial Representative Tools menu are completed and functioning properly – specifically the addition of new features: - Document Center - Document Status |
Financial Representative Site (formerly Advisor Site) -- Workflow Wizard | The Workflow Wizard feature must be tested for accuracy and functionality, specifically these new features: - Section describing functionality - Letters |
Enterprise Configuration | Ensure all the new and updated features appear and are functioning properly, specifically: - Printed Fact Finder (updated) - Presentation Templates (updated) - Entitlements (new) |
Financial Representative Site (formerly Advisor Site) -- Fact Finder | - |
Newly added functionality in Phase 4 | |
Security | Test functional security as it applies to the Authentication and Authorization functions of the system. Pay particular attention to logins and system role privileges per function. Ensure that access to specific features is properly configured for Advisors and Clients, as specified in the Business Requirements. Test New Agent functionality. See the authorization sketch following this table. |
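Because the entitlement checks above lend themselves to mechanical verification, the following Python sketch probes a set of feature URLs per role and flags mismatches. The role names, feature paths, and expected status codes are illustrative assumptions; the authoritative feature-to-role mapping is defined in the Business Requirements.

import requests

# Illustrative entitlements only; the real mapping comes from the
# Business Requirements document.
FEATURES_BY_ROLE = {
    "advisor": ["/document-center", "/document-status", "/fact-finder"],
    "client":  ["/welcome"],
}
RESTRICTED_FOR_CLIENT = ["/document-center", "/enterprise-config"]

def check_role_access(base_url: str, session: requests.Session, role: str):
    """Return a list of (role, path, problem) tuples for any mismatch."""
    failures = []
    for path in FEATURES_BY_ROLE[role]:
        if session.get(base_url + path).status_code != 200:
            failures.append((role, path, "expected access, got denied"))
    if role == "client":
        for path in RESTRICTED_FOR_CLIENT:
            if session.get(base_url + path).status_code not in (401, 403):
                failures.append((role, path, "expected denial, got access"))
    return failures

An empty result list means the configured entitlements match expectations; each session passed in is assumed to be already authenticated for its role.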
The following technology-related areas are considered in scope for eMoney testing. To ensure that the underlying infrastructure of the system is functioning as intended, eMoney will perform the following testing:
Non Functional Testing | Comments |
Security (eMoney) | Access to flat files and all external data sources must be validated to ensure the system is allowing/disallowing access appropriately as per supporting requirements. Test new agents' web content with different access levels.
Stress and Performance (eMoney) | Ensure the system is handling the stress requirements properly. Ensure the system is performing according to the guidelines specified in the Supporting Requirements document. |
Backups/Restore (eMoney) | eMoney will ensure backup and restore procedures are working as planned
The system will be ready for UAT Testing after successful completion of all testing phases of the project by eMoney. As a prerequisite for the testing phase of the project, unit and string testing of all batch and on-line programs is assumed to be complete.
At the conclusion of the testing phase, the following level of testing will take place:
The majority of the testing effort must be completed by eMoney, the functional resources assigned to the project. This will allow them to experience the system first hand and provide acceptance of the functionality.
Test Items Considered Out of Scope
Functional Testing | Comments |
eMoney User Signup and Billing Systems | |
Testing of complete Manager’s functionalities | |
2.2 Test Goals
The goals of UAT testing activities include:
- Discover errors and report them to eMoney for correction prior to implementation;
- Validate site flow;
- Exercise all functional processing capabilities defined in the Business Requirements document;
- Find important problems, assess perceived quality risks;
- Determine if a program, tested as a unit, works effectively as a component of a progressively larger whole;
- Establish the validity of all program-to-program interfaces;
- Test the accuracy of the data entry, transaction processing and run-to-run controls;
- Establish the validity of all user procedures;
- Ensure that acceptance criteria are met;
- Review all printed reports for accuracy, including alignment, logos, etc.
2.3 Assumptions
Assumption | Impact of assumption being incorrect | Owner |
User Acceptance Testing environment will mirror the production environment so that the end-user community can properly approve system quality. | Without a production-like environment for end-user testing, defects may go unnoticed until they surface in the true production environment, introducing additional post-production break-fix costs and negatively impacting day-to-day business activities. | UAT testing by Guardian; defect fixing by eMoney |
3. Test Approach
The Test Approach presents the standard strategy for identifying the appropriate tests to conduct on the application by eMoney. UAT testing is categorized into Functional and Non-Functional tests. The UAT tests are required because they are critical to ensuring the quality of the functionality delivered by eMoney. The Non-Functional tests cover features of the eMoney platform itself.
Figure 1 - Standard Guardian UAT Test Methodology
3.1 Functional Tests
Functional tests focus on validating the business functions of the application or product, ensuring the application provides the required services per the functional requirements. These tests are implemented and executed in their respective environments, including the Unit/Development, Integration, System, and User Acceptance environments. The eMoney-run test levels share an exit criterion of a 100% pass rate; a sketch of that gate check follows the table below.
Test | Description | Required (Y/N) | Comments/Assumptions |
Unit Test (eMoney) | Unit testing is implemented against the smallest testable element (units) of the software, and involves testing the internal structure such as logic and data flow, and the unit's function and observable behavior. | Y | Simplistic test scenarios with two or three objects stringed together Exit criteria: Pass 100% of test cases, pass regression test Performed by eMoney |
Integration Test (eMoney) | Integration testing is performed to ensure that implemented system components operate properly when combined to execute business functionality. Integration testing will verify flow from GOL to the Living Balance Sheet. | Y | Add layers of complexity to test the end-to-end business process within the system without running interfaces Entrance criteria: Complete Unit Test 100% Exit criteria: Pass 100% of test cases, pass regression test Performed by eMoney |
System Test (eMoney) | System testing is done when the software is functioning as a whole. The target is the system's end-to-end functioning elements. System and user test scripts will be automated using SilkTest to perform subsequent regression testing. | Y | End to End process test with interfaces Entrance criteria: Complete Integration Test 100% Exit criteria: Pass 100% of test cases, pass regression test. Performed by eMoney |
User Acceptance Test (Guardian) | User acceptance testing is the final test action taken before deploying the software. The goal of acceptance testing is to verify that the software is ready and that end users can use it to perform the functions and tasks for which the software was built. | Y | Entrance criteria: Pass System Test Final test and signoff by key users and working steering committee. Key user community acceptance sign-off. |
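Since the eMoney-run levels each gate on a 100% pass rate before the next level begins, that check can be automated against an exported results file. The Python sketch below assumes results exported as a simple CSV with a "status" column; the file layout is an assumption for illustration and is not the Rational TestManager export format.

import csv

def exit_criteria_met(results_csv: str) -> bool:
    """True only if every recorded test case passed (the 100% gate)."""
    with open(results_csv, newline="") as fh:
        statuses = [row["status"].strip().lower()
                    for row in csv.DictReader(fh)]
    # An empty result set does not satisfy the exit criteria.
    return bool(statuses) and all(s == "pass" for s in statuses)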
3.2 Non-Functional Tests
These tests, performed by eMoney, focus on non-functional aspects of the application or product as defined by the supplemental requirements. They are implemented and executed as appropriate in their respective environments, including the Unit/Development, Integration, System, and User Acceptance environments.
Test | Description | Required (Y/N) | Comments/Assumptions |
Security Tests | Tests that focus on adherence to security protocols, confidentiality and privacy issues, and file input/output access. | Y | Test user roles and access authorization / entitlements |
Usability Tests | Tests that focus on behavioral aspects across the user interfaces of the system | N | Using existing eMoney user interfaces |
Reliability Tests | Tests that focus on how the system responds to adverse conditions in the execution environment. | | |
Stress Test | A type of reliability test that focuses on evaluating how the system responds under abnormal conditions. Stresses on the system could include extreme workloads, insufficient memory, unavailable services and hardware, or limited shared resources. These tests are often performed to gain a better understanding of how and in what areas the system will break, so that contingency plans and upgrade maintenance can be planned and budgeted for well in advance. | Y | |
Performance Tests | Tests that focus on the evaluation of performance-related characteristics of the target-of-test, such as the timing profiles, execution flow, response times, and operational reliability and limits. | | eMoney SLA’s call for monitoring and maintaining standards. (ASP) |
Application Performance Test | A test in which the target-of-test’s timing profile is monitored, including execution flow, data access, function and system calls to identify and address both potential performance bottlenecks and inefficient processes. | Y | Test system response-time parameters for transaction and processing |
Contention Test | Tests focused on validating the target-of-test's ability to acceptably handle multiple user demands on the same resource (data records, memory, and so on). | Y | This will be monitored in all phases. A response-time/contention probe sketch follows this table. |
Load Test | A type of performance test used to validate and assess acceptability of the operational limits of a system under varying workloads while the system-under-test remains constant. In some variants, the workload remains constant and the configuration of the system-under-test is varied. Measurements are usually taken based on the workload throughput and in-line transaction response time. The variations in workload usually include emulation of average and peak workloads that occur within normal operational tolerances. | N/A | |
Volume Test | Testing focused on verifying the target-of-test's ability to handle large amounts of data, either as input and output or resident within the database. Volume testing includes test strategies such as creating queries that would return the entire contents of the database, queries with so many restrictions that no data is returned, or data entry with the maximum amount of data in each field. | Y | Use the SILK Performance Tool
Supportability Tests | | | |
Compatibility Test | Tests focused on ensuring the target-of-test functions as intended on different hardware and software configurations (e.g., OS platforms, browser versions, browser platforms). | Y | Lab testing by GOL group.
Installation Test | Tests focused on ensuring the target-of-test installs as intended on different hardware and software configurations in different environments. The key is to ensure that every application build, before it is tested, has the right components of the application present in the testing environment. This may also be referred to as a "Smoke Test". This test is implemented and executed against applications and systems. | N/A | Web Access
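While the formal load runs use Segue Silk Performer, a lightweight probe of response time under concurrent demand can supplement them during UAT. The Python sketch below fires simultaneous requests at one resource and compares the worst response time against an SLA threshold; the target URL, user count, and threshold are assumptions for illustration, with the real limits set by the Supporting Requirements and eMoney SLAs.

import time
from concurrent.futures import ThreadPoolExecutor
import requests

TARGET_URL = "https://lbs.example.com/fact-finder"  # hypothetical target
SLA_SECONDS = 3.0                                   # assumed SLA threshold
CONCURRENT_USERS = 10                               # assumed contention level

def timed_request(_):
    """Issue one request and return its elapsed time in seconds."""
    start = time.perf_counter()
    requests.get(TARGET_URL, timeout=30)
    return time.perf_counter() - start

def run_probe() -> bool:
    # Hit the same resource from several workers at once to surface
    # contention on shared records or server resources.
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        timings = list(pool.map(timed_request, range(CONCURRENT_USERS)))
    worst = max(timings)
    print(f"worst response: {worst:.2f}s against an SLA of {SLA_SECONDS}s")
    return worst <= SLA_SECONDS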
4. Test Environment Needs
4.1 Required System Hardware
The following table sets forth the system resources for the test effort presented in this Test Strategy.
Hardware
Resource | Quantity | Name and Type |
Hosted by eMoney | N/A | N/A |
4.2 Required Software
The following base software elements are required in the test environment for this Test Strategy.
Software | Version | Type and Other Notes |
Internet Browsers – IE | 6.0+ | |
Rational ClearQuest for Defect Tracking | 2003 Enterprise Suite | |
Rational TestManager for Test management | 2003 Enterprise Suite | |
4.3 Required Productivity and Support Tools
The following tools will be employed by eMoney to support the test process for this Test Strategy.
Tool Category or Type | Tool Brand Name | Vendor or In-House | Version |
Performance Test Tool | Segue Silk Performer | eMoney Vendor | |
4.4 Required Data Needs
The system testing team will be responsible for identification of test data for each step in the plan. Much of this data will be generated as part of the test cases. User input and sample transactions will be gathered to satisfy requirements. Business process owners will be consulted for suggestions regarding sources of realistic test data.
To enhance the rigor of the system testing effort, the team will use actual business data from other client systems wherever possible. Incorporating actual production data into the system testing process supplements the use of test-specific data and provides a greater comfort level, since programs run against actual production data. The use of real-life data, generated from actual production operations, ensures a more complete test effort and allows greater coverage during system testing. Production data often generates conditions and event sequences not covered in test plans.
To completely test edits, validation, error handling, and miscellaneous other routines, the test data set often includes a combination of real data and contrived data. Contrived data may be created and used in combination with actual data to force certain conditions to occur. Contrived data will contain values that distinguish the "test specific" records from production data extracts; a generation sketch follows this paragraph.
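As a minimal sketch of how contrived records can be generated so they remain distinguishable from production extracts, the Python snippet below writes a CSV of client rows that carry a "TEST-" marker and force boundary conditions. The field names, marker convention, and boundary values are assumptions for illustration only.

import csv
import itertools

MARKER = "TEST-"  # assumed prefix that flags contrived records

def make_contrived_clients(count: int, out_path: str) -> None:
    counter = itertools.count(1)
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["client_id", "last_name", "balance"])
        for _ in range(count):
            n = next(counter)
            # Force conditions production data may never hit:
            # a maximum-length name and a zero balance.
            writer.writerow([f"{MARKER}{n:06d}", "X" * 35, 0.00])

make_contrived_clients(50, "contrived_clients.csv")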
The test plan provides a cross-reference of test data set to test condition. Test data sets may be either electronic or manual. Where possible, the cross-reference will be completed before execution, but, when test data is not created (or known) until immediately prior to execution, the cross-reference will be completed as the test condition is carried out. This cross-reference is an important component of test support.
Test Data Requirements
Test | Data Type | Data Source | Volume Required | Conditions or Constraints |
All | | | | |
Integration | | | | |
User Acceptance | | | | |
Conversion Testing | | | | |
Stress/Performance | | | | |
Security | | | | |
5. Responsibilities, Staffing, and Training Needs
5.1 Test Roles
This table shows the staffing assumptions for the test effort.
Role | Specific Responsibilities or Comments |
Test Manager (Kannan Chandrasekaran) | Provides management oversight on activities and deliverables - Present management reporting - Advocates the interests of test - Evaluate effectiveness of test effort |
UAT Test Analyst & Testers (Neha Bansal & Hari Reddy – Fulltime Rose Saldana, Donna Moyer, Barbara Dierolf – Part time) | Identifies and defines the specific tests to be conducted Responsibilities include: - Identify test ideas - Define test details - Determine test results - Document change requests - Evaluate product quality Implements and executes the tests. Responsibilities include: - Implement tests and test suites - Execute test suites - Log results - Analyze and recover from test failures - Document incidents |
5.2 Training Needs
This section outlines how to approach staffing and training the test roles for the project.
· Test Analyst & Testers will be walk-through the Phase 4 implementations
· Training for all test personnel on use of the Rational ClearQuest for defect tracking system.
· Training for all test personnel on use of Rational TestManager for Test execution ad test execution analysis.
6. Risks
Risk ID | Description | Mitigation |
1 | Inadequate Time | · Prioritize the testing strategy so that the most critical tests are performed first. |
2 | Inadequate Resources | Make efficient use of available resources. Use field resources to augment internal resources. |
3 | User Acceptance test environment may not be ready in time | Incorporate activities in the project plan anticipating all the required test environments for this project. |
4 | Frequent Change of Business Requirements | · Specifications have been finalized by the Business and have not changed. · If any changes occur, the UAT Manager will track all changes to the business specifications, in addition to the Business Specifications and Use Cases sent by the Project Manager. |
5 | Inadequate skills of UAT Team to execute automated test scripts | · Additional resources may be assigned to create automated test scripts. · Adequate training will also be provided to the testing team. |
8 | Shortage of human resources for preparation and execution of test cases | · Adequate human resources will be recruited based on the estimated efforts and available resources |
9 | Deployment of testers on multiple projects at the same time | · UAT Resources are dedicated to this project |
10 | Attrition of UAT Team Members | · Necessary commitment will be obtained from the contractor to retain the resources until testing is completed. |
7. Management Process and Procedures
7.1 Problem Reporting, Escalation, and Issue Resolution
Issues Tracking
When problems are identified in UAT testing, the person performing the test will complete an entry in the Rational ClearQuest Tracking System. An issue is raised (1) when a program or set of programs does not function as specified in the program design specification, or (2) when a problem occurs and it is uncertain why it happened. The issue will contain a description of the problem and the test step where the problem was encountered, and will identify the program or transaction causing the problem. Additionally, any relevant information regarding the problem (e.g., screen prints, reports) will be included with the issue. The team member assigned to fixing the problem, whether a construction, test, or technical resource, is responsible for resolving it and making whatever modifications are necessary. Once the test team member has satisfactorily retested the program or transaction, the issue will be closed.
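To make the required contents of an issue entry concrete, the sketch below models the fields this section calls for as a small Python structure. It is illustrative only and is not the Rational ClearQuest API or schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class UATIssue:
    description: str                 # what went wrong
    test_step: str                   # test step where the problem surfaced
    program_or_transaction: str      # program or transaction causing it
    attachments: List[str] = field(default_factory=list)  # screen prints, reports
    status: str = "Open"             # closed only after a successful retest

# Hypothetical example entry:
issue = UATIssue(
    description="Printed Fact Finder misaligns the logo",
    test_step="UAT-FF-012",
    program_or_transaction="Printed Fact Finder",
    attachments=["screenprint_ff012.png"],
)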
Guardian and eMoney Project Management will monitor Rational ClearQuest for open defects.
Issue Submission
When a team member encounters what appears to be a problem, a reasonable amount of research will be done to determine whether it is a software, configuration, or data problem, versus something that needs to be sent back to the construction team for fixing. Many problems stem from data that was set up incorrectly. Once the team member is convinced there is a software problem, the issue process will be initiated.
When an issue is identified, the team member who identifies the problem will complete a new issue entry in Rational ClearQuest. This will enable the project team to track the resolution of all issues. The team member will provide all relevant supporting documentation (e.g., screen prints, error messages) to the team leader for reference. All issues will be identified during testing and corrected in the test case in which the errors occurred.
Retesting Issues
The UAT team member responsible for resolving an issue will also fully retest the condition or series of test steps that caused the unexpected result. In addition, the tester will update the issue to reflect the resolution and the retest of the fix.
7.2 Measuring the Extent of Testing
Internal software issues/defects will be logged using Rational ClearQuest, the standard issue reporting system. Daily reviews will be held to monitor testing and defect-fix status.
7.3 Test Coverage
The application will be UAT tested for both the new Phase 4 functionality and functionality from previous phases. The UAT team will also track open defects from previous phases during Phase 4 UAT testing.