A good test strategy document may contain the following.
Table of Contents
- GLOSSARY OF TERMS
- PURPOSE
- GUIDING PRINCIPLES
- CONFORMANCE WITH ORGANISATIONAL TEST STRATEGY
- PROJECT BACKGROUND
- PROCESS OVERVIEW DIAGRAM
- PROJECT SCOPE
- IN SCOPE (HIGH LEVEL)
- OUT OF SCOPE (HIGH LEVEL)
- TEST APPROACH
- TEST DELIVERABLES
- TEST RESOURCES/SUPPORT
- TEST PHASES AND ENVIRONMENTS
- TEST ENVIRONMENTS
- SYSTEM TEST
- SYSTEM INTEGRATION TEST
- END TO END TEST
- REGRESSION TEST
- PERFORMANCE
- TEST PREPARATION
- TEST DATA
- TEST SETUP AND EXECUTION
- DEFECT MANAGEMENT
- ENTRY AND EXIT CRITERIA
- TEST EVIDENCE AND REPORTS
- TEST SCHEDULE
- TEST TOOLS
- RISKS, ISSUES, ASSUMPTIONS AND DEPENDENCIES
GLOSSARY OF TERMS
Define the abbreviations and terms that are useful in the context of your project, e.g.:
Abbreviation | Description |
--- | --- |
API | Application Programming Interface |
BigQuery | Google’s data warehouse, offering storage and super-fast SQL queries using the processing power of Google’s infrastructure. |
Unit | Small item of testable software code. Sometimes referred to as Module or Component. |
ETL | Extract, Transform and Load |
Defect | An unexpected result during testing: non-compliance with the requirements or an erroneous operation generated by the software. |
PURPOSE
The purpose of the Test Strategy is to create a shared understanding of the overall approach, tools, methodology and timing of test activities.
GUIDING PRINCIPLES
The test team will adhere to the following test principles during all test activities:
Principle | Description |
--- | --- |
Shared Responsibility | Every member of the scrum team is responsible for testing and quality assurance alongside delivering value to the business. |
Respect | Teamwork: share successes, share failures, help each other and share knowledge. |
Respond to change | Testers will add value to the team by embracing change and adapting to deliver maximum value in achieving the right product. |
Working software over comprehensive documentation | Our priority is the delivery of working software, and we will focus on ensuring that the product is fit for purpose. |
Test Automation | Where possible we will automate testing: unit, acceptance, system integration, regression, performance and end-to-end (a minimal example follows this table). |
Manual Testing | In certain situations manual testing may be the only option, especially during data quality checks (exploratory testing). |
Test Data | It is everyone’s responsibility to develop, maintain, manage and share test data for testing purposes. Where possible, tests should be as autonomous as possible: creating and/or adding test data for use during the test, followed by a clear-down procedure (if possible) once complete. |
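To illustrate the Test Automation principle, here is a minimal sketch of the kind of automated unit test we aim for, runnable under pytest. The `normalise_company` function and its fields are illustrative assumptions, not part of the project’s actual codebase.

```python
# Minimal unit-test sketch for a hypothetical ETL transform step.
# normalise_company() and its fields are illustrative assumptions only.
def normalise_company(record: dict) -> dict:
    """Trim whitespace and upper-case the company number."""
    return {
        "company_number": record["company_number"].strip().upper(),
        "company_name": record["company_name"].strip(),
    }

def test_normalise_company_strips_and_uppercases():
    raw = {"company_number": " ab123456 ", "company_name": " Acme Ltd "}
    assert normalise_company(raw) == {
        "company_number": "AB123456",
        "company_name": "Acme Ltd",
    }
```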
PROJECT BACKGROUND
e.g. saving Companies House’s basic data to Google BigQuery. This will be accomplished in two phases:
- Develop an ETL pipeline to extract data from Companies House and save it locally as Parquet files
- Develop an ETL pipeline to move the data to Google BigQuery (a sketch of both phases follows this list)
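As a rough illustration of the two phases, the sketch below assumes pandas (with pyarrow) for the local Parquet step and the google-cloud-bigquery client for the load; the source file, dataset and table names are placeholders, not the project’s real configuration.

```python
# Illustrative sketch of the two ETL phases; the source file and the
# BigQuery table id are placeholder assumptions.
import pandas as pd
from google.cloud import bigquery

# Phase 1: extract Companies House basic data, save locally as Parquet.
def extract_to_parquet(source_csv: str, parquet_path: str) -> None:
    df = pd.read_csv(source_csv)   # extract
    df.to_parquet(parquet_path)    # write local Parquet (requires pyarrow)

# Phase 2: move the Parquet data into Google BigQuery.
def load_to_bigquery(parquet_path: str, table_id: str) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET
    )
    with open(parquet_path, "rb") as f:
        job = client.load_table_from_file(f, table_id, job_config=job_config)
    job.result()  # wait for the load job to finish

# Example usage (placeholder names):
# extract_to_parquet("companies_house_basic.csv", "companies.parquet")
# load_to_bigquery("companies.parquet", "my-project.staging.companies")
```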
PROCESS OVERVIEW DIAGRAM
PROJECT SCOPE
The scope of the project is expressed as User Stories that describe the requirements approved by the business. The scope of each sprint will be agreed and documented at the sprint planning meeting and each sprint will have its own sprint test plan.
TEST APPROACH
The project is delivering products in an agile way using scrum development and all relevant ceremonies. We will work closely with the developers (DE), data architects (DA) and business analysts (BA) to ensure testing is carried out as early as possible and to minimise the risk of defects at a late stage. This test approach is outlined in the following sections.
Test Analysis/Preparation period
- We review the baselined/approved data dictionary (design document) to understand the requirements. While the design document is being created we will start to write test cases based on agreed user stories.
- We will ask the DEs, BAs and DAs to review our test cases to ensure that we have understood the requirements and that the test cases are written correctly.
- We will then prepare test data for (a reconciliation sketch follows this list):
- checking source and target data
- checking the structures of source and target
- carrying out sanity checks
- regression testing to ensure that new code and files have not impacted existing processes/data
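The sketch below illustrates the source-vs-target checks listed above, assuming both sides can be loaded into pandas DataFrames; the key column name is a hypothetical example.

```python
# Source-vs-target reconciliation sketch; how the DataFrames are loaded
# is assumed, only the check logic is illustrated.
import pandas as pd

def check_row_counts(source: pd.DataFrame, target: pd.DataFrame) -> None:
    # Source and target should contain the same number of rows.
    assert len(source) == len(target), (
        f"Row count mismatch: source={len(source)}, target={len(target)}"
    )

def check_structure(source: pd.DataFrame, target: pd.DataFrame) -> None:
    # Structural check: same columns in the same order.
    assert list(source.columns) == list(target.columns), (
        f"Column mismatch: {list(source.columns)} vs {list(target.columns)}"
    )

def sanity_check_keys(target: pd.DataFrame, key: str = "company_number") -> None:
    # Sanity checks: key column present, non-null and unique.
    assert key in target.columns, f"Missing key column {key!r}"
    assert target[key].notna().all(), "Null keys found in target"
    assert target[key].is_unique, "Duplicate keys found in target"
```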
Test Execution
- Production of sprint test plan
- Smoke test the environment to ensure software artefacts have been deployed correctly (see the sketch after this list)
- Test case execution to cover everything planned in the Test Analysis/Preparation phase, plus defect fixes
- Run Regression test
- Test Completion report produced
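A minimal smoke-test sketch, assuming the BigQuery target described in the project background; the table id is a placeholder.

```python
# Smoke-test sketch: verify the deployed artefacts are reachable before
# full test execution begins. The table id is a placeholder assumption.
from google.cloud import bigquery

def smoke_test_target_table(table_id: str = "my-project.staging.companies") -> None:
    client = bigquery.Client()
    table = client.get_table(table_id)  # raises NotFound if deployment failed
    assert table.num_rows >= 0          # table exists and is queryable
    print(f"Smoke test passed: {table_id} has {table.num_rows} rows")
```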
TEST DELIVERABLES
A number of test artefacts will be produced during the project and its respective sprints, as outlined below:
Deliverable | Format | Author(s) | Reviewers |
--- | --- | --- | --- |
User Stories | JIRA story | Scrum Team | Scrum Team |
Test Scripts | JIRA X-Ray | QAs (Testers) | Test Lead, Scrum Team |
Test Plan | Confluence or Word document | Test Manager or Lead QA | Test Lead, peer review, Scrum Team |
Project Test Strategy (this document) | Confluence or Word document | Test Manager or Lead QA | Test Lead, peer review, Scrum Team |
Manual Test Runs including Test Execution Evidence | Jira/X-Ray | QAs | Scrum Team |
Defect Log | JIRA | QAs | Scrum Team |
Sprint/Test Completion Report | Word or Confluence | Test Manager or Lead QA | Scrum Team, Business Analyst, Project Manager, Business and Stakeholders |
TEST RESOURCES/SUPPORT
Test resources along with any support required will be documented in the Test Plan.
TEST PHASES AND ENVIRONMENTS
Please refer to the Organisational Test Strategy for a breakdown of the test phases that can be utilised.
TEST ENVIRONMENTS
This is very specific to the project and organisational setup.
TEST PREPARATION
During the Sprint Planning meeting the QA team will work together to create User Stories that are ‘Ready’ and can be completed within the next sprint.
Developers (DEs) and Testers will meet regularly to refine the Acceptance Criteria for each test, ensuring the tests validate that the Acceptance Criteria of the overall User Story have been met, and to decide which testing type is suitable.
TEST DATA
For a given release we will create test data to smoke test the systems. We will then use a cut of live data to provide full test coverage against the requirements.
Analysis of the required test data will be undertaken when test scripts/cases are being created (a setup/clear-down sketch follows the table below).
Type of Data | Data Requirements |
--- | --- |
System Testing | |
Regression Testing | |
System Integration Testing | |
End to End Testing | |
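The pytest sketch below illustrates the autonomous test-data principle from the Guiding Principles: the fixture creates its own data and clears it down once the test completes. The dataset, table name and schema are placeholder assumptions.

```python
# Autonomous test-data sketch: create data for the test, clear it down
# afterwards. Table id and schema are placeholder assumptions.
import pytest
from google.cloud import bigquery

TEST_TABLE = "my-project.test_data.companies_smoke"  # placeholder

@pytest.fixture
def seeded_table():
    client = bigquery.Client()
    # Setup: create the table and insert one known row.
    client.query(
        f"CREATE TABLE `{TEST_TABLE}` "
        "(company_number STRING, company_name STRING)"
    ).result()
    client.query(
        f"INSERT INTO `{TEST_TABLE}` VALUES ('AB123456', 'Acme Ltd')"
    ).result()
    yield TEST_TABLE
    # Clear-down: remove the test data once the test completes.
    client.delete_table(TEST_TABLE, not_found_ok=True)

def test_seeded_row_is_queryable(seeded_table):
    client = bigquery.Client()
    rows = list(client.query(f"SELECT * FROM `{seeded_table}`").result())
    assert len(rows) == 1
```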
TEST SETUP AND EXECUTION
All manual tests will be recorded within Excel/Confluence.
Test cases should be recorded in Jira and linked to a specific Test Plan.
DEFECT MANAGEMENT
The defect log will be recorded in Jira, and defect management will adhere to this project Test Strategy.
ENTRY AND EXIT CRITERIA
We will follow the Organisational Test Strategy for entry and exit criteria.
TEST EVIDENCE AND REPORTS
We will use Confluence and Jira to store test evidence and reports.
TEST SCHEDULE
For the detailed test schedule and releases, please refer to the project Confluence and Jira pages.
TEST TOOLS
A number of tools will be utilised throughout the development of a product, but those relevant to testing are:
Tool | Use |
--- | --- |
Confluence | Test documentation |
JIRA | Task management: sprint management, User Stories, tasks, test plans, test cases, issues/bugs |
GIT | Code and other artefacts: release notes, unit tests, end-to-end tests, system tests |
Jenkins/other tool | CI/CD (build tool) |
Test Management Software – JIRA plugin (X-Ray) | Test management: acceptance criteria, test plans, test script definitions, test cases, test execution results, defect management |
RISKS, ISSUES, ASSUMPTIONS AND DEPENDENCIES
Risk | Level of Risk | Mitigation |
--- | --- | --- |
Further changes to the Data Dictionary or Design Document could increase the scope | Medium | Get the DAs/BAs to sign off the requirements |
No formal design has been agreed by developers | Medium | Developers are working on a proof of concept to ensure that their solutions are workable within the time frame |
No baseline documentation for testers to begin writing test cases, causing a potential loss of test preparation time | Medium | If testing timescales are reduced we will take a risk-based approach, checking high-priority test cases as agreed with the DAs and DEs |
Operational or live issues might delay the deadline | High | Escalate to a higher authority |
ASSUMPTIONS
We will have all the requirements before development begins.
DEPENDENCIES
We are dependent on the business to provide test data.