Purpose of testing – to verify that the system meets the specified requirements, works reliably under expected conditions, and that all validation routines (data‑type checks, range checks, format checks, mandatory‑field checks, etc.) function correctly before the system is released for operational use.
Test designs are systematic ways of selecting inputs so that defects are most likely to be revealed. Each design is linked to a type of validation routine that appears in the IGCSE ICT syllabus.
**Black‑box (functional) testing** – Example: For a student‑record data‑entry form, enter a valid student name and ID and check that the record is saved; then enter a non‑numeric ID and verify that the system displays an “Invalid ID” error.
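The example above can be sketched in Python. The function name and the “save” behaviour here are hypothetical stand‑ins for the real form‑handling routine; the point is the pairing of normal (valid) and abnormal (invalid) test data.

```python
def save_student_record(name: str, student_id: str) -> str:
    """Hypothetical entry routine: mandatory-field and data-type checks
    run before a record is treated as saved."""
    if not name:                      # mandatory-field check
        return "Name required"
    if not student_id.isdigit():      # data-type check on the ID
        return "Invalid ID"
    return "Record saved"

# Normal (valid) data should be accepted.
assert save_student_record("Asha Mwangi", "10234") == "Record saved"
# Abnormal data (letter O instead of zero) should be rejected.
assert save_student_record("Asha Mwangi", "1O234") == "Invalid ID"
```

Black‑box testing needs no knowledge of the routine's internals: only the inputs and the specified outputs matter.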
**White‑box (structural) testing** – Example: Review the routine that calculates a student’s average grade to ensure every possible branch (division by zero, missing marks) is executed at least once.
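A minimal sketch of the branch‑coverage idea, assuming a simple averaging routine with the two guard branches the example mentions (the function itself is illustrative, not from the source):

```python
def average_grade(marks):
    """Average of a list of marks, with guards for the two failure cases."""
    if marks is None:                 # branch 1: marks missing entirely
        return None
    if len(marks) == 0:               # branch 2: guard against division by zero
        return None
    return sum(marks) / len(marks)    # branch 3: normal calculation

# White-box testing: one input per branch, so every path runs at least once.
assert average_grade(None) is None
assert average_grade([]) is None
assert average_grade([60, 70, 80]) == 70.0
```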
**Boundary value analysis** – Example: If the “Score” field accepts 0‑100, test with −1, 0, 1, 99, 100 and 101 so that values on, just inside and just outside both limits are covered.
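The boundary cases can be run mechanically. This sketch assumes a simple inclusive range check for the “Score” field:

```python
def score_is_valid(score: int) -> bool:
    """Range check for a Score field that accepts 0-100 inclusive."""
    return 0 <= score <= 100

# Values on and just either side of each boundary, with the expected result.
boundary_cases = {-1: False, 0: True, 1: True, 99: True, 100: True, 101: False}
for value, expected in boundary_cases.items():
    assert score_is_valid(value) == expected, f"boundary case failed: {value}"
```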
**Equivalence partitioning** – Example: For a “Year of Birth” field, create three partitions: (a) valid years (1900‑2025), (b) years too early (e.g., 1800), (c) years in the future (e.g., 2030). Test one value from each partition.
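A sketch of the three partitions from the example, with one representative test value per partition (the classifier function is illustrative):

```python
def classify_year(year: int) -> str:
    """Place a Year of Birth into one of the three partitions above."""
    if year < 1900:
        return "too early"
    if year <= 2025:
        return "valid"
    return "in the future"

# One representative value from each equivalence partition is sufficient.
for year, expected in [(1985, "valid"), (1800, "too early"), (2030, "in the future")]:
    assert classify_year(year) == expected
```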
**Decision‑table testing** – Example: A login routine that checks (i) username entered, (ii) password entered, (iii) account not locked. The decision table lists all combinations (e.g., username missing + password entered → error “Username required”).
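The decision table can be driven directly from data. The routine below is a hypothetical version of the login check; each row of the table becomes one test case:

```python
def login_check(username: str, password: str, locked: bool) -> str:
    """Evaluate the three login conditions in the order the table lists them."""
    if not username:
        return "Username required"
    if not password:
        return "Password required"
    if locked:
        return "Account locked"
    return "Login successful"

# One test case per decision-table rule: (username, password, locked, expected).
decision_table = [
    ("",      "pw123", False, "Username required"),
    ("admin", "",      False, "Password required"),
    ("admin", "pw123", True,  "Account locked"),
    ("admin", "pw123", False, "Login successful"),
]
for username, password, locked, expected in decision_table:
    assert login_check(username, password, locked) == expected
```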
A test strategy describes the overall approach and the sequence of testing activities. The table below links each strategy to the relevant Cambridge assessment objective (AO).
| Test Strategy | What is tested | Related AO |
|---|---|---|
| Unit testing | Individual modules, functions or procedures in isolation (e.g., the routine that validates a student ID) | AO2 – apply skills to develop and test |
| Integration testing | Interaction between combined modules (e.g., data passed from the entry form to the database update routine) | AO2 – apply skills to develop and test |
| System testing | Complete, integrated system against the full set of specifications, including overall input‑output validation | AO3 – evaluate the solution |
| Acceptance testing | Performed by end‑users or clients to confirm the system is ready for deployment and meets business needs | AO3 – evaluate the solution |
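Unit testing, the first strategy in the table, can be illustrated with Python's built‑in `unittest` module. The validator and the five‑digit rule here are assumptions for the sake of the example, not part of the syllabus:

```python
import unittest

def validate_student_id(student_id: str) -> bool:
    """Hypothetical unit under test: an ID must be exactly five digits."""
    return student_id.isdigit() and len(student_id) == 5

class TestStudentIdValidator(unittest.TestCase):
    """Unit tests exercise this one routine in isolation."""

    def test_valid_id(self):
        self.assertTrue(validate_student_id("10234"))

    def test_non_numeric_id(self):
        self.assertFalse(validate_student_id("1O234"))

    def test_wrong_length(self):
        self.assertFalse(validate_student_id("123"))

# Run the suite explicitly so the result can be inspected.
suite = unittest.TestLoader().loadTestsFromTestCase(TestStudentIdValidator)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Integration testing would then combine this validator with the routine that writes to the database, and system testing would exercise the whole form‑to‑database path.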
The test plan records every piece of information required to carry out the tests and to evaluate the results. It is a living document that is updated after each test cycle.
| Test Case ID | Test Data (Inputs) | Expected Outcome (E) | Actual Outcome (A) | Result (Pass/Fail) | Justification | Remedial Action |
|---|---|---|---|---|---|---|
| TC01 | Username: admin, Password: 1234 | Login successful; dashboard displayed | Login successful; dashboard displayed | Pass | System behaved as specified | None |
| TC02 | Username: admin, Password: wrong | Error message “Invalid password” displayed | No error displayed – system logged in | Fail | Authentication routine did not enforce validation | Log defect, fix routine, retest |
| TC03 | Input value: 0 (boundary of range 0‑100) | Value accepted and stored as 0 | Value accepted and stored as 0 | Pass | Boundary‑value check succeeded | None |
*The columns “Expected Outcome (E)” and “Actual Outcome (A)” use the syllabus terminology and correspond directly to the AO3 evaluation criteria.*
For each test case, the following information must be recorded:
- a unique test case ID;
- the test data (inputs) used;
- the expected outcome (E);
- the actual outcome (A);
- the result (pass/fail);
- a justification for the result;
- any remedial action required.
Note: AO3 carries an 8 % weighting in the IGCSE ICT exam. When answering exam questions, be sure to address at least three items from the checklist to demonstrate thorough evaluation.