Know and understand the whole system life cycle (SLC) – analysis, design, development & testing, implementation, documentation and evaluation – and be able to define, select and use normal, abnormal and extreme test data effectively.
The aim of analysis is to produce a clear, written description of the current system and the requirements for the new system.
| System specification item | Description |
|---|---|
| Purpose | What the system will achieve. |
| User requirements | Functions users need (e.g., calculate payroll). |
| Information requirements | Data that must be stored, processed and reported. |
| Inputs | All data entered by users or other systems. |
| Outputs | Reports, screens, printed documents. |
| Constraints | Hardware, software, legal or security limits. |
Design converts the specification into detailed artefacts that guide development and testing.
| Field | Type | Length | Valid Range / Format | Mandatory? |
|---|---|---|---|---|
| Employee ID | Alphanumeric | 6 | AA1234 | Yes |
| Salary | Numeric | 7 | 0 – 99 999 | Yes |
| Date of Birth | Date | 10 | DD/MM/YYYY | No |
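The validation rules in the data dictionary above can be turned directly into code. A minimal Python sketch is shown below; the function names are illustrative, and the format `AA1234` is assumed to mean two capital letters followed by four digits:

```python
import re
from datetime import datetime

def validate_employee_id(value):
    # Alphanumeric, length 6, format AA1234 (assumed: 2 letters + 4 digits).
    return bool(re.fullmatch(r"[A-Z]{2}\d{4}", value))

def validate_salary(value):
    # Numeric, valid range 0 - 99999 inclusive.
    try:
        n = float(value)
    except (TypeError, ValueError):
        return False
    return 0 <= n <= 99999

def validate_dob(value):
    # Optional field; if supplied it must be a real date in DD/MM/YYYY form.
    if value in (None, ""):
        return True
    try:
        datetime.strptime(value, "%d/%m/%Y")
        return True
    except ValueError:
        return False
```

Note that the date check rejects impossible dates such as 31/02/2000, not just badly formatted ones.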
```
+-----------+-----------+------------+----------+
| EmpID (6) | Name (30) | Salary (7) | DOB (10) |   <- Record 1 (Employee)
+-----------+-----------+------------+----------+
| EmpID (6) | Name (30) | Salary (7) | DOB (10) |   <- Record 2 (Employee)
+-----------+-----------+------------+----------+
```
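The fixed-length record layout above can be sketched as pack/unpack routines. The field widths come from the diagram; the function and field names are assumptions for illustration:

```python
# Field widths taken from the record layout diagram.
FIELDS = [("emp_id", 6), ("name", 30), ("salary", 7), ("dob", 10)]
RECORD_LENGTH = sum(width for _, width in FIELDS)  # 53 characters per record

def pack_record(emp_id, name, salary, dob):
    # Pad each value with spaces (or truncate) to its fixed width.
    values = [emp_id, name, salary, dob]
    return "".join(v.ljust(w)[:w] for v, (_, w) in zip(values, FIELDS))

def unpack_record(record):
    # Slice the fixed-width string back into named fields.
    result, pos = {}, 0
    for field, width in FIELDS:
        result[field] = record[pos:pos + width].strip()
        pos += width
    return result
```

Because every record is exactly the same length, record *n* in a file starts at offset `n * RECORD_LENGTH`, which is what makes direct (random) access possible.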
Write the program, create the database or configure the off‑the‑shelf package according to the design artefacts. Follow good coding practice – naming conventions, comments and modular structure – to make later testing easier.
| Item | What to include |
|---|---|
| Test objectives | What each test is trying to prove (e.g., “salary field accepts values up to the maximum”). |
| Test cases | Numbered description of the action to be performed. |
| Test data | Normal, abnormal and extreme values for each input. |
| Expected results | Exact output or error message that should appear. |
| Pass/Fail criteria | How the tester decides if the case succeeded. |
| Tester & date | Who performed the test and when. |
| Aspect | Normal (Typical) data | Abnormal (Invalid) data | Extreme (Boundary) data |
|---|---|---|---|
| Purpose | Confirm correct functionality under everyday conditions. | Verify error handling and validation. | Check limits and uncover off‑by‑one or overflow errors. |
| Typical values | Within the defined valid range. | Outside the range, wrong type, missing mandatory field. | Exactly at the minimum or maximum of the valid range (and just beyond). |
| Examples (Payroll system) | Salary = £2 500 | Salary = “twenty thousand” (text) or –£100 | Salary = £0, £99 999 (max), £100 000 (just above max) |
| Expected system response | Successful processing and correct output. | Clear error message; transaction rejected. | Accept if within limits; otherwise error message. |
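The three categories can be exercised against a simple validator. This Python sketch assumes a `validate_salary` function (illustrative, not part of the syllabus) enforcing the 0 to 99 999 range from the payroll examples above:

```python
def validate_salary(value):
    # Accept a numeric value in the range 0 - 99999 inclusive; reject anything else.
    try:
        n = float(value)
    except (TypeError, ValueError):
        return False
    return 0 <= n <= 99999

# Normal data: a typical value well inside the valid range.
assert validate_salary("2500")

# Abnormal data: wrong type or outside the range, so it must be rejected.
assert not validate_salary("twenty thousand")
assert not validate_salary("-100")

# Extreme data: exactly at each boundary, and just beyond it.
assert validate_salary("0")           # minimum, accepted
assert validate_salary("99999")       # maximum, accepted
assert not validate_salary("100000")  # just above maximum, rejected
```

Testing "just beyond" each boundary (here 100 000) is what catches off-by-one errors such as writing `< 99999` instead of `<= 99999`.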
Implementation moves the tested solution into the live environment. Four common strategies are used, each with advantages and disadvantages.
| Method | How it works | When it is best used |
|---|---|---|
| Direct changeover | Old system is switched off and the new system is switched on at the same moment. | Simple and cheap, with no duplication of work – but risky if the new system fails, so best when the old system is obsolete and the new one is thoroughly tested. |
| Parallel running | Both old and new systems run simultaneously for a set period; results are compared. | High confidence in correctness – ideal for critical systems (e.g., banking). |
| Pilot (trial) implementation | New system is introduced to a small, representative group first. | Useful when the organisation is large or the change is significant. |
| Phased (step‑by‑step) implementation | System is introduced in stages (module by module or department by department). | Reduces risk and allows learning – good for complex, integrated solutions. |
| Criteria | Direct | Parallel | Pilot | Phased |
|---|---|---|---|---|
| Cost | Low | High | Medium | Medium |
| Risk of data loss | High | Low | Medium | Low |
| User training needed | High | Medium | Low | Medium |
| Time to full operation | Fast | Slow | Medium | Slow |
Good documentation supports users, maintainers and future developers. Two families are required by the syllabus: user documentation (for the people who operate the system) and technical documentation (for those who maintain and extend it).
Tailor language, level of detail and visual aids to the intended audience – technical staff need precise terminology, whereas non‑technical staff benefit from screenshots and plain‑English instructions.
After implementation the solution is reviewed against the original requirements and the six evaluation criteria set out in the syllabus.
| Evaluation criterion | Guiding questions |
|---|---|
| Functional requirements | Does every required function work as specified? |
| Efficiency | Is the system fast enough? Does it use resources (CPU, storage) reasonably? |
| Usability | Is the interface clear for the intended users? Are error messages helpful? |
| Security & data protection | Are passwords, encryption and access controls adequate? Were test data handled safely? |
| Reliability & maintainability | Does the system recover gracefully from failures? Is the code/documentation easy to modify? |
| Legal & ethical compliance | Does the system meet data‑protection legislation and e‑safety guidelines? |
Document the findings, suggest improvements and, if necessary, plan a new iteration of the SLC (often called a “maintenance cycle”).