Test Strategy document

A good test strategy document may contain the following.

Table of Contents

  1. GLOSSARY OF TERMS
  2. PURPOSE
  3. GUIDING PRINCIPLES
    1. CONFORMANCE WITH ORGANISATIONAL TEST STRATEGY
  4. PROJECT BACKGROUND
  5. PROCESS OVERVIEW DIAGRAM
  6. PROJECT SCOPE
    1. IN SCOPE (HIGH LEVEL)
    2. OUT OF SCOPE (HIGH LEVEL)
  7. TEST APPROACH
  8. TEST DELIVERABLES
  9. TEST RESOURCES/SUPPORT
  10. TEST PHASES AND ENVIRONMENTS
  11. TEST ENVIRONMENTS
    1. SYSTEM TEST
    2. SYSTEM INTEGRATION TEST
    3. END TO END TEST
    4. REGRESSION TEST
    5. PERFORMANCE
  12. TEST PREPARATION
  13. TEST DATA
  14. TEST SETUP AND EXECUTION
  15. DEFECT MANAGEMENT
  16. ENTRY AND EXIT CRITERIA
  17. TEST EVIDENCE AND REPORTS
  18. TEST SCHEDULE
  19. TEST TOOLS
  20. RISKS, ISSUES, ASSUMPTIONS AND DEPENDENCIES

GLOSSARY OF TERMS

Define any terms and abbreviations that are useful in the context of your project, e.g.:

  • API: Application Programming Interface
  • BigQuery: Storage and super-fast processing of SQL queries using the processing power of Google’s infrastructure
  • Unit: A small item of testable software code; sometimes referred to as a Module or Component
  • ETL: Extract, Transform and Load
  • Defect: An unexpected result during testing, non-compliance with the requirements, or an erroneous operation generated by the software

PURPOSE

The purpose of the Test Strategy is to create a shared understanding of the overall approach, tools, methodology and timing of test activities.

GUIDING PRINCIPLES

The test team will adhere to the following test principles during all test activities:

  • Shared Responsibility: Every member of the scrum team is responsible for testing and quality assurance alongside delivering value to the business.
  • Respect: Work as a team, share successes, share failures, help each other and share knowledge.
  • Respond to change: Testers will add value to the team by embracing change and adapting to deliver maximum value in achieving the right product.
  • Working software over comprehensive documentation: Our priority is the delivery of working software, and we will focus on ensuring that the product is fit for purpose.
  • Test Automation: Where possible we will automate testing: unit, acceptance, system integration, regression, performance and end-to-end.
  • Manual Testing: In certain situations manual testing may be the only option, especially during data quality checks (exploratory testing).
  • Test Data: It is everyone’s responsibility to develop, maintain, manage and share test data for testing purposes. Where possible, tests should be as autonomous as possible, creating and/or adding the test data they need during the test, followed by a clear-down procedure (if possible) once complete. A sketch of this pattern follows the list.
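To make the Test Data principle concrete, here is a minimal pytest sketch of a test that creates its own data in a uniquely named scratch table and clears it down afterwards. The project and dataset names (my_project.test_scratch) are illustrative assumptions, not real project objects.

    # Sketch of an autonomous test: the fixture seeds its own data and
    # clears it down afterwards. Project/dataset names are assumptions.
    import uuid

    import pytest
    from google.cloud import bigquery


    @pytest.fixture
    def seeded_table():
        client = bigquery.Client()
        # Unique table name so parallel test runs cannot collide.
        table_id = f"my_project.test_scratch.companies_{uuid.uuid4().hex}"
        client.create_table(
            bigquery.Table(
                table_id,
                schema=[
                    bigquery.SchemaField("company_number", "STRING"),
                    bigquery.SchemaField("company_name", "STRING"),
                ],
            )
        )
        # Seed one row of test data via DML.
        client.query(
            f"INSERT INTO `{table_id}` (company_number, company_name) "
            "VALUES ('00000001', 'TEST LTD')"
        ).result()
        yield table_id
        # Clear-down: remove the scratch table once the test completes.
        client.delete_table(table_id, not_found_ok=True)


    def test_seed_row_is_queryable(seeded_table):
        client = bigquery.Client()
        rows = list(client.query(f"SELECT * FROM `{seeded_table}`").result())
        assert len(rows) == 1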

PROJECT BACKGROUND

i.e. Saving Companies House’s basic data to Google BigQuery. This will be accomplished in two phases:

  1. Develop ETL pipeline to extract data from Companies House and save them as parquet file locally
  2. Develop ETL pipeline to move data to Google BigQuery
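
To illustrate the two phases, a minimal sketch is shown below. The source URL, local file name and target table are assumptions made for the example, not the project’s real endpoints; the load step uses the google-cloud-bigquery client.

    # Illustrative two-phase ETL sketch. URL, file and table names are
    # assumptions for the example, not real project values.
    import pandas as pd
    import requests
    from google.cloud import bigquery

    # Phase 1: extract the basic company data and save it locally as Parquet.
    resp = requests.get("https://example.com/companies-house/basic-data.json")
    resp.raise_for_status()
    pd.DataFrame(resp.json()).to_parquet("companies_basic.parquet")  # needs pyarrow

    # Phase 2: load the Parquet file into BigQuery.
    client = bigquery.Client()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    with open("companies_basic.parquet", "rb") as f:
        job = client.load_table_from_file(
            f, "my_project.companies_house.basic_data", job_config=job_config
        )
    job.result()  # wait for the load job to complete
    print(f"Loaded {job.output_rows} rows")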

PROCESS OVERVIEW DIAGRAM

Insert a diagram here showing how the development and test activities fit into the overall delivery process.

PROJECT SCOPE

The scope of the project is expressed as User Stories that describe the requirements approved by the business. The scope of each sprint will be agreed and documented at the sprint planning meeting and each sprint will have its own sprint test plan.

TEST APPROACH

The project is delivering products in an agile way using Scrum, employing all the relevant ceremonies. We will work closely with the developers (DE), data architects (DA) and business analysts (BA) to ensure testing is carried out as early as possible and to minimise the risk of defects at a late stage. This test approach is outlined in the following sections.

Test Analysis/Preparation period

  • We will review the baselined/approved data dictionary (design document) to understand the requirements. While the design document is being created, we will start to write test cases based on the agreed user stories.
  • We will ask the DE, BA and DA to review the test cases to ensure that we have understood the requirements and that the test cases are written correctly.
  • We will then prepare test data for the following checks (a sketch of the first two appears after this list):
    • checking source and target data
    • checking the structures of the source and target
    • carrying out sanity checks
    • regression testing, to ensure that new code and files have not impacted existing processes/data
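
A minimal pytest sketch of the first two checks, reusing the illustrative file and table names from the background section:

    # Illustrative source-vs-target checks; file and table names are the
    # assumed examples used earlier, not real project values.
    import pandas as pd
    import pytest
    from google.cloud import bigquery

    SOURCE_FILE = "companies_basic.parquet"
    TARGET_TABLE = "my_project.companies_house.basic_data"


    @pytest.fixture(scope="module")
    def source_df():
        return pd.read_parquet(SOURCE_FILE)


    @pytest.fixture(scope="module")
    def target():
        return bigquery.Client().get_table(TARGET_TABLE)


    def test_row_counts_match(source_df, target):
        # Data check: every extracted row should have been loaded.
        assert len(source_df) == target.num_rows


    def test_structures_match(source_df, target):
        # Structure check: source and target column names should agree.
        assert set(source_df.columns) == {f.name for f in target.schema}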

Test Execution

  • Production of the sprint test plan
  • Smoke test the environment to ensure software artefacts have been deployed correctly (see the sketch after this list)
  • Test case execution, covering everything planned in the Test Analysis/Preparation phase plus defect fixes
  • Run the regression tests
  • Produce the Test Completion report
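
For example, a deployment smoke test can be as small as confirming that the target table exists and the environment accepts queries (table name assumed, as above):

    # Minimal smoke-test sketch: the deployed table exists and is queryable.
    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table("my_project.companies_house.basic_data")  # NotFound if absent
    client.query("SELECT 1").result()  # query engine reachable
    print(f"Smoke test passed: {table.table_id} is present and queryable")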

TEST DELIVERABLES

A number of test artefacts will be produced during the project and its respective sprints, as outlined below:

  • User Stories. Format: JIRA story. Author: Scrum Team. Reviewers: Scrum Team.
  • Test Scripts. Format: JIRA X-Ray. Authors: QAs (Testers). Reviewers: Test Lead, Scrum Team.
  • Test Plan. Format: Confluence or Word document. Author: Test Manager or Lead QA. Reviewers: Test Lead, peer review, Scrum Team.
  • Project Test Strategy (this document). Format: Confluence or Word document. Author: Test Manager or Lead QA. Reviewers: Test Lead, peer review, Scrum Team.
  • Manual Test Runs, including Test Execution Evidence. Format: Jira/X-Ray. Authors: QAs. Reviewers: Scrum Team.
  • Defect Log. Format: JIRA. Authors: QAs. Reviewers: Scrum Team.
  • Sprint/Test Completion Report. Format: Word or Confluence. Author: Test Manager or Lead QA. Reviewers: Scrum Team, Business Analyst, Project Manager, Business and Stakeholders.

TEST RESOURCES/SUPPORT

Test resources along with any support required will be documented in the Test Plan.

TEST PHASES AND ENVIRONMENTS

Please refer to the Organisational Test Strategy for a breakdown of the test phases that can be utilised.

TEST ENVIRONMENTS

This is very specific to the project and organisational setup; describe each environment (system test, system integration, end-to-end, regression, performance) as applicable.

TEST PREPARATION

During the Sprint Planning meeting the QA team will work together to create User Stories that are ‘Ready’ and can be completed within the next sprint.

Developers (DEs) and Testers will meet regularly to refine the Acceptance Criteria, agree the tests that will validate that the Acceptance Criteria of the overall User Story have been met, and decide which testing type is suitable.

TEST DATA

For a given release we will be creating test data to smoke test the systems. We will then be using a cut of live data to provide full test coverage against requirements.

Analysis for required Test Data will be undertaken when test scripts/cases are being created.

Type of Data                  Data Requirements
System Testing
Regression Testing
System Integration Testing
End to End Testing
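
As an illustration, synthetic smoke-test data might be generated along these lines, written as Parquet to match the pipeline’s input format; the field names are assumptions, not the real schema:

    # Sketch: generate a small synthetic data set for smoke testing.
    import pandas as pd

    def make_smoke_data(n: int = 10) -> pd.DataFrame:
        return pd.DataFrame({
            "company_number": [f"{i:08d}" for i in range(1, n + 1)],
            "company_name": [f"SMOKE TEST COMPANY {i} LTD" for i in range(1, n + 1)],
            "company_status": ["active"] * n,
        })

    if __name__ == "__main__":
        df = make_smoke_data()
        df.to_parquet("smoke_test_companies.parquet")  # requires pyarrow
        print(f"Wrote {len(df)} synthetic smoke-test rows")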

TEST SETUP AND EXECUTION

All manual tests will be recorded within Excel/Confluence.

Test cases should be recorded in Jira and linked to a specific Test Plan.

DEFECT MANAGEMENT

The defect log will be maintained in Jira, and defect management will adhere to this project Test Strategy.

ENTRY AND EXIT CRITERIA

We will follow the Organisational Test Strategy for entry and exit criteria.

TEST EVIDENCE AND REPORTS

We will be using Confluence and Jira to store test evidence and reports.

TEST SCHEDULE

For the detailed test schedule and releases, please refer to the project Confluence and Jira pages.

TEST TOOLS

A number of tools will be utilised throughout the development of a product, but those relevant to testing are:

  • Confluence: Test documentation
  • JIRA: Task management
    – Sprint management
    – User Stories
    – Tasks
    – Test plans
    – Test cases
    – Issues/Bugs
  • GIT: Code and other artefacts
    – Release Notes
    – Unit Tests
    – End to End Tests
    – System Tests
  • Jenkins/other tool: CI/CD (build tool)
  • Test Management Software, JIRA Plugin (X-Ray): Test management
    – Acceptance Criteria
    – Test Plan
    – Test Script definition
    – Test Cases
    – Test Execution results
    – Defect management

RISKS, ISSUES, ASSUMPTIONS AND DEPENDENCIES

  • Risk: Further changes to the Data Dictionary or Design Document could increase the scope. Level: Medium. Mitigation: get the requirements signed off by the DA/BA.
  • Risk: No formal design has been agreed by developers. Level: Medium. Mitigation: developers are working on a proof of concept to ensure that their solutions are workable within the time frame.
  • Risk: No baseline documentation for testers to begin writing test cases, causing potential loss of test preparation time. Level: Medium. Mitigation: if testing timescales are reduced, we will take a risk-based approach and check high-priority test cases as agreed with the DAs and DEs.
  • Risk: Operational or live issues might delay the deadline. Level: High. Mitigation: escalate to higher authority.

ASSUMPTION

We will have all the requirements before development begins.

DEPENDENCY

We are dependent on the business to provide test data.