7 – The Systems Life Cycle (SLC)
7.1 Analysis – Understanding the current system
7.1.1 Why analyse the current system?
Before a new computer‑based solution can be designed, the existing (often manual) system must be fully understood. The analysis provides the foundation for:
- Identifying what works well and what does not.
- Determining the exact data that must be captured (inputs) and the results required (outputs).
- Describing the processing steps that transform inputs into outputs.
- Gathering the users’ expectations and the information that the new system must provide.
- Producing a clear system specification – a document that records the findings and forms the reference for later stages.
7.1.2 Current‑system documentation
| Component | Typical examples | Comments / Issues |
|---|---|---|
| Inputs | Paper forms, telephone calls, sensor readings, manual calculations | Data‑entry errors, missing fields, illegible handwriting |
| Processing | Sorting files, calculating totals by hand, applying business rules on paper | Time‑consuming, inconsistent application of rules, high error rate |
| Outputs | Printed reports, handwritten invoices, verbal feedback | Delayed distribution, limited accessibility, difficult to archive |
7.1.3 Problem categorisation (syllabus requirement)
Problems are usually grouped into three categories. For each problem, note its impact and a possible cause.
- Technical problems – data loss, lack of backup, poor data integrity.
- Organisational problems – duplicated work, bottlenecks, lack of standard procedures.
- Human problems – user fatigue, high error rates, resistance to change.
7.1.4 User requirements (AO2 – produce ICT solution)
Write requirements in plain, testable language. Use the template below:
| ID | Requirement | Priority (H/M/L) |
|---|---|---|
| UR‑01 | The system must allow data entry via keyboard and touchscreen. | H |
| UR‑02 | Users must be able to retrieve a report within 5 seconds of request. | H |
| UR‑03 | Only managers may delete records (access level control). | M |
| UR‑04 | The interface shall be intuitive, requiring no more than one hour of training. | M |
| UR‑05 | All mandatory fields must be highlighted and validated before a record can be saved. | H |
7.1.5 Information requirements (data‑type, format, quality)
- Data types: text, integer, decimal, date, Boolean.
- Data format: dates in YYYY‑MM‑DD; currency with two decimal places.
- Data quality:
- Mandatory fields – must be completed.
- Validation rules – e.g. 0 ≤ Quantity ≤ 10 000.
- Uniqueness – primary key values must be unique.
- Check‑digit or checksum for IDs.
- Reporting needs: summary totals, trend graphs, drill‑down detail screens, printable PDF version.
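The check‑digit idea above can be sketched in Python. The weighting scheme below is purely illustrative (the syllabus does not prescribe a particular algorithm), and the 8‑digit ID length follows the StudentID example used later in these notes:

```python
def has_valid_check_digit(student_id: str) -> bool:
    """Validate an 8-digit ID whose last digit is a check digit.

    Hypothetical scheme: weighted sum of the first 7 digits (weights
    8 down to 2) modulo 10 must equal the final digit.
    """
    if len(student_id) != 8 or not student_id.isdigit():
        return False
    weights = range(8, 1, -1)  # 8, 7, 6, 5, 4, 3, 2
    total = sum(int(d) * w for d, w in zip(student_id[:7], weights))
    return total % 10 == int(student_id[-1])
```

A transcription error (one wrong or swapped digit) changes the weighted sum, so the check digit no longer matches and the ID is rejected at data entry.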
7.1.6 Analysis checklist (syllabus checklist)
| Task | Completed? | Notes |
|---|---|---|
| Document all current inputs, processing steps and outputs | | |
| Identify and prioritise problems (technical/organisational/human) | | |
| Gather user requirements (interviews, questionnaires, observation) | | |
| Define information requirements (data types, validation, reporting) | | |
| Obtain sign‑off on the system specification from key stakeholders | | |
7.2 Design – Translating requirements into a solution
7.2.1 File‑structure design (syllabus expects primary/foreign keys)
For each logical file, list its fields and their attributes.
| File | Field name | Data type | Length | Key | Notes |
|---|---|---|---|---|---|
| Students | StudentID | CHAR | 8 | PK | Check‑digit algorithm |
| | FirstName | VARCHAR | 30 | | |
| | LastName | VARCHAR | 30 | | |
| | CourseID | CHAR | 5 | FK (Courses.CourseID) | |
| Courses | CourseID | CHAR | 5 | PK | |
| | CourseName | VARCHAR | 50 | | |
| | Credits | INTEGER | 2 | | Range 1‑10 |
7.2.2 Input‑screen design (validation checklist)
- Layout: logical order, clear labels, grouping related items, tab‑order consistent.
- Controls: drop‑down lists, radio buttons, check boxes, date pickers, numeric spin boxes.
- Validation taxonomy (syllabus requirement):
- Presence – mandatory fields cannot be left blank.
- Range – numeric values must fall within defined limits.
- Type – only numbers, dates, or text as specified.
- Length – maximum character count.
- Format – e.g. email must match user@domain.com.
- Check‑digit – for IDs such as StudentID.
- Example of a validation rule: IF (Quantity < 0 OR Quantity > 10000) THEN display “Quantity must be between 0 and 10 000”.
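The validation taxonomy can be demonstrated in one routine. The field names, the 30‑character limit, and the email pattern below are invented for the example; only the 0–10 000 quantity range comes from the notes themselves:

```python
import re

def validate_order(record: dict) -> list[str]:
    """Apply presence, length, type, range and format checks to one record.

    Returns a list of error messages; an empty list means the record may
    be saved. Field names and limits are illustrative only.
    """
    errors = []
    # Presence check: mandatory field cannot be blank.
    if not record.get("customer_name"):
        errors.append("Customer name is mandatory.")
    # Length check: maximum character count.
    elif len(record["customer_name"]) > 30:
        errors.append("Customer name must be at most 30 characters.")
    # Type check, then range check, on the quantity.
    qty = record.get("quantity")
    if not isinstance(qty, int):
        errors.append("Quantity must be a whole number.")
    elif not 0 <= qty <= 10_000:
        errors.append("Quantity must be between 0 and 10 000.")
    # Format check: a deliberately simple email pattern.
    email = record.get("email", "")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        errors.append("Email must match the user@domain.com format.")
    return errors
```

Note that the record is only saved when the returned list is empty, which mirrors UR‑05 (all mandatory fields validated before a record can be saved).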
7.2.3 Output design
- Screen output: headings, column alignment, colour coding for warnings, pagination.
- Printed reports: title, page number, column headings, totals, footers with date and author.
- Export formats: CSV for spreadsheets, PDF for official documents, XML for data exchange.
7.2.4 User‑interface (UI) layout & accessibility (AO3 – evaluate)
- Consistent navigation – menu bar, breadcrumb trail, “home” button.
- Icons with tool‑tips; text alternatives for screen‑reader users.
- High‑contrast colour scheme, scalable fonts, keyboard shortcuts (e.g., Alt+S to save).
- Compliance with WCAG Level A (text equivalents, focus order, error identification).
7.2.5 Security controls (syllabus expects a checklist)
| Control | Description | Implementation example |
|---|---|---|
| User authentication | Username + password, password policy (min 8 chars, mix of types) | Login screen with lock‑out after 3 failed attempts |
| Access levels | Define roles (e.g., Administrator, Manager, Clerk) | Only Manager role can delete records |
| Encryption | Data transmitted over network encrypted (TLS) | HTTPS for web‑based modules |
| Audit trail | Record who changed what and when | Log table with UserID, Action, Timestamp |
7.2.6 Backup & recovery plan (syllabus requirement)
- Backup frequency: full weekly backup, incremental daily backup.
- Media: external hard‑drive (offline) and cloud storage.
- Retention: keep backups for 6 months.
- Recovery procedure:
- Verify backup integrity.
- Restore latest full backup.
- Apply incremental backups in chronological order.
- Run test restore on a separate server.
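The restore ordering in the plan (latest full backup first, then incrementals in chronological order) can be expressed as a small function. The backup records here are hypothetical dictionaries, not the format of any real backup tool:

```python
from datetime import date

def restore_order(backups: list[dict]) -> list[dict]:
    """Return the restore sequence: the latest full backup, followed by
    every incremental taken after it, in chronological order."""
    fulls = [b for b in backups if b["kind"] == "full"]
    latest_full = max(fulls, key=lambda b: b["date"])
    incrementals = sorted(
        (b for b in backups
         if b["kind"] == "incremental" and b["date"] > latest_full["date"]),
        key=lambda b: b["date"])
    return [latest_full] + incrementals
```

Incrementals taken before the latest full backup are deliberately skipped: their changes are already contained in that full backup.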
7.2.7 Design checklist (covers all syllabus artefacts)
| Design element | Specified? | Comments |
|---|---|---|
| File structures (fields, types, lengths, keys) | | |
| Input screens (layout, controls, validation checklist) | | |
| Output specifications (screen, printed, export) | | |
| UI design (navigation, icons, accessibility) | | |
| Security controls (authentication, authorisation, audit) | | |
| Backup & recovery strategy | | |
| Documentation plan (technical & user manuals) | | |
7.3 Development & Testing
7.3.1 Test plan (AO2 – apply life‑cycle stages)
| Test case | Purpose | Input data | Expected result | Actual result | Pass/Fail | Remedial action |
|---|---|---|---|---|---|---|
| TC‑01 – Valid data entry | Check normal processing | All fields filled correctly | Record saved, confirmation message | | | |
| TC‑02 – Missing mandatory field | Validate presence check | Leave “Customer name” blank | Error message, record not saved | | | |
| TC‑03 – Out‑of‑range value | Validate range check | Quantity = –5 | Error message, record not saved | | | |
| TC‑04 – Extreme load | Performance test | Generate 10 000 records | Response ≤ 5 s per operation | | | |
| TC‑05 – Unauthorised delete | Check access control | Standard user attempts to delete a record | Access denied message | | | |
7.3.2 Testing types (syllabus checklist)
- Unit (module) testing – test individual routines or screens in isolation.
- Integration testing – verify that modules work together (e.g., data entry → database → report).
- System testing – end‑to‑end test of the whole solution against user requirements.
- Acceptance testing – carried out by the client to confirm the system meets the agreed specifications.
7.3.3 Testing checklist
| Item | Done? | Notes |
|---|---|---|
| All unit tests passed | | |
| Integration tests successful | | |
| System test meets all user requirements | | |
| Client acceptance signed off | | |
| Defect log cleared | | |
7.4 Implementation – Moving the new system into operation
7.4.1 Implementation methods (syllabus expects description of each)
| Method | How it works | Typical example |
|---|---|---|
| Direct change‑over | Old system is switched off and the new system is switched on at a predetermined time. | Launching a new online ordering portal at midnight. |
| Parallel running | Both old and new systems operate simultaneously for a period; results are compared. | A school runs the existing paper register alongside a new digital register for one term. |
| Pilot (trial) run | The new system is introduced to a limited group or site first. | Testing a new payroll system with the accounting department only. |
| Phased (step‑by‑step) implementation | Modules are introduced one after another. | First implementing inventory management, then sales, then reporting. |
7.4.2 Data migration plan (syllabus requirement)
- Extract data from the legacy system into a CSV file.
- Map old fields to the new file structure (e.g., OldCustID → Customer.CustomerID).
- Validate data against new constraints (mandatory, format, range).
- Load cleaned data into the new database using import scripts.
- Run reconciliation reports to verify record counts and totals.
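The mapping and validation steps of the migration plan might be sketched as follows. `FIELD_MAP` and the CustomerID presence check are illustrative; a real migration would carry the full set of constraints from the design stage:

```python
import csv
import io

# Illustrative mapping from legacy field names to the new file structure.
FIELD_MAP = {"OldCustID": "CustomerID", "Name": "CustomerName"}

def migrate(legacy_csv: str) -> tuple[list[dict], list[str]]:
    """Extract legacy rows, rename fields, and validate before loading.

    Returns (clean_rows, errors); the counts feed the reconciliation report.
    """
    clean, errors = [], []
    # start=2: line 1 of the CSV is the header row.
    for lineno, row in enumerate(csv.DictReader(io.StringIO(legacy_csv)), start=2):
        new_row = {FIELD_MAP.get(k, k): v.strip() for k, v in row.items()}
        if not new_row.get("CustomerID"):      # mandatory-field check
            errors.append(f"line {lineno}: missing CustomerID")
            continue
        clean.append(new_row)
    return clean, errors
```

Rejected rows are logged rather than silently dropped, so the reconciliation step can account for every legacy record: rows loaded plus rows rejected must equal rows extracted.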
7.4.3 Training & support (AO2 – produce solution)
- Training schedule – classroom sessions, e‑learning modules, on‑site practice.
- Training materials – user guide, quick‑reference cards, video tutorials.
- Support arrangements – help‑desk phone/email, on‑site technician for the first two weeks.
7.4.4 Implementation checklist
| Task | Completed? | Comments |
|---|---|---|
| Full backup of existing data | | |
| Data migration executed and verified | | |
| Training delivered to all user groups | | |
| Support desk operational | | |
| Post‑implementation review scheduled | | |
7.5 Documentation
7.5.1 Technical documentation (for developers/maintainers)
- System overview – purpose, scope, hardware/software requirements.
- Detailed file structure (field definitions, keys, indexes).
- Flowcharts or pseudocode for major processes.
- Data‑validation routines and algorithms.
- Security procedures (authentication, authorisation, audit).
- Backup, recovery and disaster‑recovery procedures.
- Test plan and test‑case results.
7.5.2 User documentation (for end‑users)
- Getting started – login, basic navigation.
- Step‑by‑step procedures (e.g., “Enter a new order”).
- Screenshots of each screen with labelled fields.
- Explanation of error messages and how to correct them.
- FAQ and troubleshooting guide.
7.5.3 Documentation checklist (syllabus expects both technical & user manuals)
| Document | Completed? | Reviewer |
|---|---|---|
| System specification (analysis) | | |
| Design specification (data, input, output, UI, security) | | |
| Technical manual (developers) | | |
| User guide (end‑users) | | |
| Test plan and test‑case results | | |
| Implementation & training plan | | |
| Backup & recovery procedures | | |
7.6 Evaluation – Did the new system meet its aims? (AO3 – evaluate)
7.6.1 Evaluation criteria (syllabus expects efficiency, effectiveness, usability, appropriateness, security)
- Efficiency – faster processing, reduced paperwork, lower operating cost.
- Effectiveness – accurate, complete outputs; meets all user requirements.
- Usability – meets training‑time target, low error rate, positive user feedback.
- Appropriateness – fits organisational procedures and future growth.
- Security – data protected, backups reliable, audit trail functional.
7.6.2 Evaluation checklist
| Criterion | Met? (Yes/No) | Evidence / Comments |
|---|---|---|
| All user requirements satisfied | | |
| Response time ≤ 5 seconds for standard reports | | |
| Training completed within one hour for 90 % of users | | |
| Backup restored successfully in test drill | | |
| Security audit shows no unauthorised access | | |
Review of Coverage Against the Cambridge IGCSE 0417 Syllabus
| Area | What the syllabus expects | What the notes currently provide | Suggested improvement (concise, actionable) |
|---|---|---|---|
| 1. Coverage of required topics | All 21 content sections (hardware, software, I/O devices, storage, networks, effects of IT, ICT applications, safety & security, file‑management, graphics, layout, styles, proof‑reading, graphs, document production, databases, presentations, spreadsheets, website authoring, etc.). | Notes only address Section 7 – The Systems Life Cycle. | • Add separate lecture modules for each missing content block (e.g., “1 – Types & components of computer systems”, “2 – Input & output devices”, … up to “21 – Website authoring”). • Map each module to the syllabus numbering so teachers can verify coverage at a glance. |
| 2. Depth & accuracy of SLC material | AO2 (60 %) – apply life‑cycle stages to produce ICT solutions; AO3 (8 %) – evaluate and justify design choices. Detailed artefacts required: file structures, primary/foreign keys, validation taxonomy, security‑control checklist, backup‑recovery plan, UI accessibility guidelines, test plan, implementation methods, documentation types. | Notes give a solid narrative and useful check‑lists but omit many detailed artefacts (e.g., explicit file‑structure table, validation rule taxonomy, security‑control checklist, backup‑recovery plan, UI accessibility guidelines). | • Insert explicit sub‑sections under each SLC stage that mirror the syllabus items (see sections 7.2.1‑7.2.6, 7.3.2, 7.4.2, 7.5.1‑7.5.2). • Provide template tables for file structures, validation rules, security controls and backup‑recovery plans. • Add a short UI‑accessibility checklist aligned with WCAG Level A. • Ensure the test plan includes unit, integration, system and acceptance testing as separate rows. |