Know and understand test strategies, including how to test each module, each function and the whole system

7. The Systems Life Cycle – Test Strategies

Objective

Students will know and understand test strategies, including how to test each module, each function and the whole system. They will also see how testing fits with the other stages of the Systems Life Cycle (analysis, design, development, implementation and maintenance) and how it links to the Cambridge assessment objectives:

  • AO1 – Knowledge and understanding: terminology such as functional/non‑functional requirements, unit testing, regression testing, etc.
  • AO2 – Application: selecting appropriate test techniques, preparing test data, choosing an implementation method.
  • AO3 – Analysis and evaluation: linking test results to requirements, evaluating the chosen implementation method and suggesting improvements.

7.1 Analysis

The analysis stage documents the current system, identifies user requirements and produces a requirements specification document (a compulsory artefact for AO3).

Types of requirements

  • Functional requirements – what the system must do (e.g. “record each pupil’s attendance”).
  • Non‑functional requirements – how the system must perform (e.g. “response time < 2 seconds”, “data stored securely”).

Typical research techniques (school‑attendance example)

Technique       | How it is used
Observation     | Watch teachers taking attendance on paper to see what data is captured.
Interview       | Ask the headteacher which reports are needed each week.
Questionnaire   | Survey pupils about preferred login methods.
Document review | Examine existing attendance registers and school policies.

Output of analysis

  • Requirements specification (functional & non‑functional).
  • Initial data‑flow diagram (high‑level).
  • List of constraints (hardware, legal, budget).

7.2 Design

Design converts the analysed requirements into concrete data structures, input/output formats, file structures and validation routines.

Data‑flow example – Student‑record system

[User] → (Enter Student Details) → [Validation] → (Store in Database) → (Generate Report) → [User]

Sample data‑structure table

Field      | Type         | Length            | Example
StudentID  | Numeric      | 6 digits          | 123456
FirstName  | Alphanumeric | 30 characters     | Emma
LastName   | Alphanumeric | 30 characters     | Brown
Grade      | Numeric      | 3 digits (0-100)  | 85
Attendance | Numeric      | 3 digits (0-365)  | 180
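
A minimal sketch of how one record from this table might be represented in code (Python is used here only for illustration; the class name StudentRecord and the attribute names are not part of the design):

from dataclasses import dataclass

@dataclass
class StudentRecord:
    """One student record, mirroring the data-structure table above."""
    student_id: int     # StudentID: numeric, 6 digits
    first_name: str     # FirstName: alphanumeric, up to 30 characters
    last_name: str      # LastName: alphanumeric, up to 30 characters
    grade: int          # Grade: numeric, 0-100
    attendance: int     # Attendance: numeric, 0-365

# Example record matching the sample values in the table
emma = StudentRecord(123456, "Emma", "Brown", 85, 180)
print(emma)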

Validation routine (range‑check for grades)

IF Grade >= 0 AND Grade <= 100 THEN
    Accept value
ELSE
    Display “Grade must be between 0 and 100”
END IF
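
The same range check can also be written as a short, runnable function. This is a sketch only (Python is used for illustration; the function name validate_grade is not taken from the design):

def validate_grade(grade: int) -> bool:
    """Range check: accept a grade only if it lies between 0 and 100 inclusive."""
    if 0 <= grade <= 100:
        return True                               # Accept value
    print("Grade must be between 0 and 100")      # reject with the error message
    return False

validate_grade(85)     # accepted
validate_grade(150)    # rejected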

Input / output format design (relevant to Sections 11‑16)

  • Screen‑form layout – field labels, tab order, mandatory indicators.
  • File format – e.g. a CSV file with the headings StudentID,FirstName,LastName,Grade,Attendance, saved with UTF‑8 encoding (see the sketch after this list).
  • Presentation output – report template (header, footer, page‑breaks) to be opened in a word‑processor.
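
A minimal sketch of producing that CSV file, assuming the illustrative file name students.csv:

import csv

# Write the student file in the CSV format described above:
# headings StudentID,FirstName,LastName,Grade,Attendance and UTF-8 encoding.
with open("students.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["StudentID", "FirstName", "LastName", "Grade", "Attendance"])
    writer.writerow([123456, "Emma", "Brown", 85, 180])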

7.3 Development & Testing

Test plan – one‑page template

Section               | What to include
Test objectives       | What the testing will prove (e.g. “all login functions work correctly”).
Scope                 | Modules and features to be tested; any exclusions.
Resources             | Testers, hardware, software, test‑data sources.
Schedule              | Dates for unit, integration, system and acceptance testing.
Test environment      | Details of test server, database copy, network settings.
Risks & contingencies | Potential problems (e.g. “live data not available”) and mitigation.
Deliverables          | Test case documents, test log, defect report, final test summary.

Test data – normal, abnormal and extreme cases (login form example)

Case type | Input                                                                       | Expected result
Normal    | Username: john, Password: Pass123                                           | Login successful.
Abnormal  | Username: john, Password: wrong                                             | Error “Invalid password”.
Extreme   | Username: john, Password: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa (32 characters)  | Error “Password exceeds maximum length”.
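
These cases can be run automatically. The check_login function below is only a stand-in for the real login routine, and the 20-character maximum password length is an assumption made so the extreme case triggers an error:

MAX_PASSWORD_LENGTH = 20   # assumed limit for this sketch

def check_login(username: str, password: str) -> str:
    """Stand-in for the real login routine, used only to exercise the test data."""
    if len(password) > MAX_PASSWORD_LENGTH:
        return "Password exceeds maximum length"
    if username == "john" and password == "Pass123":
        return "Login successful"
    return "Invalid password"

# Normal, abnormal and extreme cases from the table above
test_data = [
    ("john", "Pass123", "Login successful"),                # normal
    ("john", "wrong", "Invalid password"),                  # abnormal
    ("john", "a" * 32, "Password exceeds maximum length"),  # extreme (32 characters)
]

for username, password, expected in test_data:
    result = "PASS" if check_login(username, password) == expected else "FAIL"
    print(f"{username} / {password[:10]}...: {result}")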

Levels of testing

Module (unit) testing
  • Purpose: confirm each module performs its defined function correctly.
  • When performed: after a module is coded, before integration.
  • Typical tester: developer / programmer.
  • Key activities (a unit‑test sketch follows the acceptance‑testing level below):

  1. Identify test cases for every function/method.
  2. Prepare test data (valid, invalid, boundary).
  3. Execute tests and record results.
  4. Fix defects and re‑test.

Integration testing
  • Purpose: verify that combined modules interact correctly.
  • When performed: after a set of modules has passed unit testing.
  • Typical tester: developer or test engineer.
  • Key activities:

  1. Define integration scenarios (data flow, control flow).
  2. Use interface test cases to check data passing.
  3. Apply both black‑box and white‑box techniques.
  4. Document any interface defects.

System testing
  • Purpose: validate the whole system against functional and non‑functional requirements.
  • When performed: after all modules are integrated.
  • Typical tester: independent test team.
  • Key activities:

  1. Develop a test plan covering all requirements.
  2. Create test scripts for functional, performance, security, usability.
  3. Run tests in a test environment that mirrors production.
  4. Log defects and track resolution.

Acceptance testing
  • Purpose: confirm the system meets user needs and is ready for release.
  • When performed: after system‑testing sign‑off.
  • Typical tester: end‑users / client representatives.
  • Key activities:

  1. Prepare acceptance criteria and test cases.
  2. Conduct user‑scenario testing (e.g., “teacher generates weekly attendance report”).
  3. Obtain formal sign‑off if criteria are met.
  4. Record any remaining issues for future releases.
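
As noted under module (unit) testing above, unit tests are normally written with a test framework. A minimal sketch using Python's unittest module, re-using the grade range check from section 7.2 (repeated here so the example stands alone):

import unittest

def validate_grade(grade: int) -> bool:
    """Range check from section 7.2, repeated so this example is self-contained."""
    return 0 <= grade <= 100

class TestGradeValidation(unittest.TestCase):
    """Module (unit) tests using valid, invalid and boundary data."""

    def test_accepts_valid_grade(self):
        self.assertTrue(validate_grade(85))

    def test_accepts_boundary_values(self):
        self.assertTrue(validate_grade(0))
        self.assertTrue(validate_grade(100))

    def test_rejects_out_of_range_grades(self):
        self.assertFalse(validate_grade(-1))
        self.assertFalse(validate_grade(101))

if __name__ == "__main__":
    unittest.main()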

Designing effective test cases

Each test case should contain the following (a sketch of one such record follows this list):

  1. Test case ID
  2. Requirement reference (link to the specification document)
  3. Pre‑conditions
  4. Test steps
  5. Test data (including edge cases)
  6. Expected result
  7. Actual result (filled after execution)
  8. Pass/Fail status
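
One way to capture such a test case as a structured record is shown below (a sketch only; the requirement reference FR-03 and all other values are made up for illustration):

test_case = {
    "test_case_id":    "T003",
    "requirement":     "FR-03: record each pupil's attendance",   # link to the specification
    "preconditions":   "Teacher is logged in; class register is open",
    "steps":           ["Open the attendance screen",
                        "Mark pupil 123456 as present",
                        "Save the register"],
    "test_data":       {"StudentID": 123456, "Attendance": "Present"},
    "expected_result": "Attendance count for pupil 123456 increases by 1",
    "actual_result":   None,   # filled in after execution
    "status":          None,   # Pass / Fail after execution
}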

Common test techniques

  • Black‑box testing – focus on inputs and expected outputs; internal code is not examined.
  • White‑box (structural) testing – examine internal logic, paths, loops and conditions.
  • Boundary‑value analysis – test values at the edges of input ranges.
  • Equivalence partitioning – divide input data into equivalence classes so that one representative value can stand for each class, reducing the number of test cases (see the sketch after this list).
  • Decision‑table testing – useful for complex business rules.
  • Usability testing – assess ease of navigation and screen layout (relevant to Section 12).
  • File‑management testing – verify that a saved document opens correctly on another device, that data is not corrupted, and that file permissions are respected (Sections 11‑16).
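
As referenced above, boundary-value analysis and equivalence partitioning are ways of choosing test data rather than writing more tests. A sketch of how they might be applied to the Grade field (0-100); the chosen values follow the usual textbook pattern and are not taken from the specification:

# Equivalence partitioning: one representative value per class of the Grade field.
equivalence_cases = [
    (50,  "accept"),   # valid class: 0-100
    (-5,  "reject"),   # invalid class: below 0
    (150, "reject"),   # invalid class: above 100
]

# Boundary-value analysis: values on and either side of each edge of the valid range.
boundary_cases = [
    (-1, "reject"), (0, "accept"), (1, "accept"),
    (99, "accept"), (100, "accept"), (101, "reject"),
]

for grade, expected in equivalence_cases + boundary_cases:
    print(f"Grade {grade:>4}: expected result = {expected}")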

Recording and reporting test results

Test ID | Description                  | Status | Defect ID (if any) | Comments
T001    | Login with valid credentials | Pass   |                    |
T002    | Login with invalid password  | Fail   | D045               | System accepts incorrect password.

Regression testing (maintenance phase)

Whenever a change is made (e.g., a new field added to the student record), re‑run a selected set of previously‑passed test cases on a copy of the live database to ensure existing functionality has not been broken.
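
A sketch of how such a selected set might be re-run with Python's unittest; the module names tests.test_login and tests.test_attendance are hypothetical and would be replaced by the system's real test modules:

import unittest

# Previously passed test modules selected for the regression run (hypothetical names).
selected = [
    "tests.test_login",
    "tests.test_attendance",
]

suite = unittest.defaultTestLoader.loadTestsFromNames(selected)
unittest.TextTestRunner(verbosity=2).run(suite)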

7.4 Implementation

Four common implementation methods are used to move a tested system into live operation.

Direct change‑over
  • How it works: the old system is switched off and the new system is switched on at a single point in time.
  • Advantages: fast; no need to run two systems.
  • Disadvantages: high risk – if the new system fails, service is lost.
  • Typical school example: introducing a new online timetable at the start of a term.

Parallel running
  • How it works: old and new systems run side by side for a period; results are compared.
  • Advantages: low risk – fallback to the old system is possible.
  • Disadvantages: more costly (double staffing, double data entry).
  • Typical school example: running the new attendance‑recording system alongside the paper register for one month.

Pilot (trial) implementation
  • How it works: the new system is introduced to a small, representative group first.
  • Advantages: problems can be identified before full roll‑out.
  • Disadvantages: may not reveal issues that appear at scale.
  • Typical school example: deploying the new library catalogue in one year group before a school‑wide launch.

Phased (incremental) implementation
  • How it works: the system is introduced in stages (module by module).
  • Advantages: gradual learning curve; issues isolated to specific phases.
  • Disadvantages: longer overall implementation time.
  • Typical school example: rolling out student registration first, then grades, then reporting.

Post‑implementation activities

  • Training – brief sessions for teachers, pupils and administrative staff.
  • Documentation hand‑over – user manuals, help files, maintenance guides.
  • Support and maintenance plan – schedule for backups, security updates and future enhancements.
  • Review – collect user feedback and compare actual performance with the original non‑functional requirements.

7.5 Documentation

Documentation is a compulsory part of the SLC and is assessed under AO3.

Document type              | Purpose                                                      | Typical content (example)
Requirements specification | Record functional and non‑functional requirements.          | List of features, performance criteria, security constraints.
Design specification       | Show how requirements will be met.                           | Data‑flow diagrams, data‑structure tables, screen mock‑ups, validation rules.
Test plan & test cases     | Guide the testing process and provide evidence of coverage.  | Objectives, scope, schedule, test case IDs, steps, expected results.
User manual / help guide   | Assist end‑users in operating the system.                    | Step‑by‑step procedures, screenshots, FAQs.
Maintenance log            | Record changes made after deployment.                        | Date, description of change, reason, impact on other modules.

7.6 Evaluation

Evaluation links the whole project back to the original objectives and assesses its success.

Evaluation criterion       | What to consider                                                                  | Evidence to provide
Requirement fulfilment     | Did the system meet all functional and non‑functional requirements?              | Traceability matrix linking test results to each requirement (see the sketch after this table).
Testing effectiveness      | Coverage of test cases, defect detection rate, regression‑testing results.       | Summary of test logs, percentage of passed tests.
Implementation choice      | Was the selected implementation method appropriate for risk, cost and timetable? | Comparison of planned vs. actual downtime, user feedback.
User satisfaction          | Ease of use, speed and reliability as reported by teachers and pupils.           | Survey results, acceptance‑testing sign‑off.
Maintenance considerations | Is the system easy to update and support?                                        | Maintenance log, documentation completeness.
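
As referenced in the table above, a traceability matrix links each requirement to the tests that cover it. A minimal sketch (the requirement IDs, test IDs and statuses are made up for illustration):

traceability = {
    "FR-01 record each pupil's attendance": {"tests": ["T001", "T003"], "status": "Pass"},
    "FR-02 generate weekly report":         {"tests": ["T010"],         "status": "Pass"},
    "NFR-01 response time < 2 seconds":     {"tests": ["T020"],         "status": "Fail"},
}

met = sum(1 for entry in traceability.values() if entry["status"] == "Pass")
print(f"{met} of {len(traceability)} requirements currently met")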

Linking testing to the Systems Life Cycle

  1. Requirements analysis – define test criteria linked to each functional and non‑functional requirement.
  2. Design – create test cases, choose techniques, produce data‑flow diagrams and validation rules.
  3. Development – perform module (unit) testing as code is written.
  4. Integration – carry out integration testing after each build.
  5. System testing – execute full‑system functional, performance, security and usability tests.
  6. Acceptance testing – obtain formal sign‑off from end‑users before go‑live.
  7. Implementation – choose an implementation method; run final acceptance tests in the live environment.
  8. Maintenance – run regression tests whenever changes are made and update documentation accordingly.

Key take‑aways

  • Test early and often – unit testing catches defects before they propagate.
  • Use both black‑box and white‑box techniques for thorough coverage.
  • Document every test case, data set (normal, abnormal, extreme) and result; this provides evidence of quality and links back to the specification.
  • Choose an implementation method that balances risk, cost and the school’s timetable, and plan post‑implementation training and support.
  • Acceptance testing is the final checkpoint before the system goes live; a formal sign‑off is required.
  • Evaluation must demonstrate how the system meets the original requirements and how the testing and implementation choices contributed to success.