Students will be able to understand and apply test strategies, including how to test each module, each function and the whole system. They will also see how testing fits alongside the other stages of the Systems Life Cycle (analysis, design, development, implementation and maintenance) and how it links to the Cambridge assessment objectives.
The analysis stage records the current system, identifies user requirements and produces a requirements specification document (a compulsory artefact for AO3).
| Technique | How it is used |
|---|---|
| Observation | Watch teachers taking attendance on paper to see what data is captured. |
| Interview | Ask the headteacher which reports are needed each week. |
| Questionnaire | Survey pupils about preferred login methods. |
| Document review | Examine existing attendance registers and school policies. |
Design converts the analysed requirements into concrete data structures, input/output formats, file structures and validation routines.
[User] → (Enter Student Details) → [Validation] → (Store in Database) → (Generate Report) → [User]
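As an illustration only, the flow above can be sketched in Python; the function names are hypothetical and a plain list stands in for the database.

```python
# Sketch of the data flow above (assumed function names; a list replaces the database).
database = []

def enter_student_details():
    # User step: in a real system this data would come from a form or the keyboard.
    return {"StudentID": 123456, "FirstName": "Emma", "LastName": "Brown",
            "Grade": 85, "Attendance": 180}

def validate(record):
    # Validation step: range checks taken from the design (detailed rules below).
    return 0 <= record["Grade"] <= 100 and 0 <= record["Attendance"] <= 365

def store_in_database(record):
    # Storage step: keep accepted records only.
    database.append(record)

def generate_report():
    # Output step: a simple summary returned to the user.
    return [f"{r['FirstName']} {r['LastName']}: grade {r['Grade']}" for r in database]

record = enter_student_details()
if validate(record):
    store_in_database(record)
print(generate_report())
```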
| Field | Type | Length | Example |
|---|---|---|---|
| StudentID | Numeric | 6 digits | 123456 |
| FirstName | Alphanumeric | 30 | Emma |
| LastName | Alphanumeric | 30 | Brown |
| Grade | Numeric | 3 (0‑100) | 85 |
| Attendance | Numeric | 3 (0‑365) | 180 |
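The data structure table could be expressed as a simple record type in code. The sketch below is illustrative only; the class name and the choice of integer and string types are assumptions.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    # Field sizes and ranges follow the data structure table above.
    student_id: int   # 6 digits, e.g. 123456
    first_name: str   # up to 30 characters
    last_name: str    # up to 30 characters
    grade: int        # 0-100
    attendance: int   # 0-365

example = StudentRecord(123456, "Emma", "Brown", 85, 180)
print(example)
```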
```
IF Grade >= 0 AND Grade <= 100 THEN
   Accept value
ELSE
   Display “Grade must be between 0 and 100”
END IF
```
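A direct Python translation of this range check might look as follows; the function name is illustrative.

```python
def check_grade(grade):
    """Range check taken from the pseudocode above."""
    if 0 <= grade <= 100:
        return True          # accept value
    print("Grade must be between 0 and 100")
    return False

check_grade(85)    # accepted
check_grade(150)   # rejected, message displayed
```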
The student records are stored in a text file with one record per line in the format StudentID,FirstName,LastName,Grade,Attendance, using UTF‑8 encoding.

A test plan sets out how the testing will be organised and should cover the following sections:

| Section | What to include |
|---|---|
| Test objectives | What the testing will prove (e.g., “all login functions work correctly”). |
| Scope | Modules and features to be tested; any exclusions. |
| Resources | Testers, hardware, software, test‑data sources. |
| Schedule | Dates for unit, integration, system and acceptance testing. |
| Test environment | Details of test server, database copy, network settings. |
| Risks & contingencies | Potential problems (e.g., “live data not available”) and mitigation. |
| Deliverables | Test case documents, test log, defect report, final test summary. |
| Case type | Input | Expected result |
|---|---|---|
| Normal | Username: john, Password: Pass123 | Login successful. |
| Abnormal | Username: john, Password: wrong | Error “Invalid password”. |
| Extreme | Username: john, Password: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa (32 chars) | Error “Password exceeds maximum length”. |
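Test cases like these can also be written as automated unit tests. The sketch below assumes a hypothetical check_login function and an arbitrary 20‑character password limit purely so that the three case types can be demonstrated; the real system's rules may differ.

```python
import unittest

# Hypothetical function under test (assumptions for this sketch only).
REGISTERED = {"john": "Pass123"}
MAX_PASSWORD_LENGTH = 20   # assumed limit, used for the extreme case

def check_login(username, password):
    if len(password) > MAX_PASSWORD_LENGTH:
        return "Password exceeds maximum length"
    if REGISTERED.get(username) == password:
        return "Login successful"
    return "Invalid password"

class LoginTests(unittest.TestCase):
    def test_normal_data(self):
        self.assertEqual(check_login("john", "Pass123"), "Login successful")

    def test_abnormal_data(self):
        self.assertEqual(check_login("john", "wrong"), "Invalid password")

    def test_extreme_data(self):
        self.assertEqual(check_login("john", "a" * 32),
                         "Password exceeds maximum length")

if __name__ == "__main__":
    unittest.main()
```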
| Test level | Purpose | When performed | Typical tester | Key activities |
|---|---|---|---|---|
| Module (Unit) testing | Confirm each module performs its defined function correctly. | After a module is coded, before integration. | Developer / programmer | Test each module with normal, abnormal and extreme data; record results in the test log. |
| Integration testing | Verify that combined modules interact correctly. | After a set of modules has passed unit testing. | Developer or test engineer | Check the data passed between modules and the interfaces that connect them. |
| System testing | Validate the whole system against functional and non‑functional requirements. | After all modules are integrated. | Independent test team | Test the complete system with realistic data against the requirements specification. |
| Acceptance testing | Confirm the system meets user needs and is ready for release. | After system‑testing sign‑off. | End‑users / client representatives | Users try the system with real data and sign off that it meets their requirements. |
Test results are recorded in a test log; each entry should contain:
| Test ID | Description | Status | Defect ID (if any) | Comments |
|---|---|---|---|---|
| T001 | Login with valid credentials | Pass | – | – |
| T002 | Login with invalid password | Fail | D045 | System accepts incorrect password. |
Whenever a change is made (e.g., a new field added to the student record), re‑run a selected set of previously passed test cases on a copy of the live database to confirm that existing functionality has not been broken; this is known as regression testing.
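In an automated setting, regression testing simply means keeping the earlier tests and re‑running them after every change. The sketch below is illustrative: the functions and the new attendance check are assumptions used to show previously passed tests being re‑run alongside a test for the change.

```python
import unittest

def check_grade(grade):
    # Existing, previously tested validation rule.
    return 0 <= grade <= 100

# Change: a new attendance field is added, with its own check.
def check_attendance(days):
    return 0 <= days <= 365

class RegressionSuite(unittest.TestCase):
    # Previously passed test cases are kept and re-run after the change.
    def test_grade_still_accepts_valid_values(self):
        self.assertTrue(check_grade(85))

    def test_grade_still_rejects_invalid_values(self):
        self.assertFalse(check_grade(150))

    # New test covering the change itself.
    def test_new_attendance_check(self):
        self.assertTrue(check_attendance(180))

if __name__ == "__main__":
    unittest.main()
```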
Four common implementation methods are used to move a tested system into live operation.
| Method | How it works | Advantages | Disadvantages | Typical school example |
|---|---|---|---|---|
| Direct change‑over | Old system is switched off and the new system is switched on at a single point in time. | Fast; no need to run two systems. | High risk – if the new system fails, service is lost. | Introducing a new online timetable at the start of a term. |
| Parallel running | Old and new systems run side‑by‑side for a period; results are compared. | Low risk – fallback to the old system is possible. | More costly (double staffing, double data entry). | Running the new attendance‑recording system alongside the paper register for one month. |
| Pilot (trial) implementation | New system is introduced to a small, representative group first. | Problems can be identified before full roll‑out. | May not reveal issues that appear at scale. | Deploying the new library catalogue in one year group before school‑wide launch. |
| Phased (incremental) implementation | System is introduced in stages (module by module). | Gradual learning curve; issues isolated to specific phases. | Longer overall implementation time. | First roll‑out of student registration, then grades, then reporting. |
Documentation is a compulsory part of the SLC and is assessed under AO3.
| Document type | Purpose | Typical content (example) |
|---|---|---|
| Requirements specification | Record functional and non‑functional requirements. | List of features, performance criteria, security constraints. |
| Design specification | Show how requirements will be met. | Data‑flow diagrams, data‑structure tables, screen mock‑ups, validation rules. |
| Test plan & test cases | Guide the testing process and provide evidence of coverage. | Objectives, scope, schedule, test case ID, steps, expected results. |
| User manual / help guide | Assist end‑users in operating the system. | Step‑by‑step procedures, screenshots, FAQs. |
| Maintenance log | Record changes made after deployment. | Date, description of change, reason, impact on other modules. |
Evaluation links the whole project back to the original objectives and assesses its success.
| Evaluation criterion | What to consider | Evidence to provide |
|---|---|---|
| Requirement fulfilment | Did the system meet all functional and non‑functional requirements? | Traceability matrix linking test results to each requirement. |
| Testing effectiveness | Coverage of test cases, defect detection rate, regression testing results. | Summary of test logs, percentage of passed tests. |
| Implementation choice | Was the selected implementation method appropriate for risk, cost and timetable? | Comparison of planned vs. actual downtime, user feedback. |
| User satisfaction | Ease of use, speed, reliability as reported by teachers/pupils. | Survey results, acceptance‑testing sign‑off. |
| Maintenance considerations | Is the system easy to update and support? | Maintenance log, documentation completeness. |