Understand software development methods (agile, waterfall, RAD)

System Life Cycle – Software Development Methods (Cambridge A‑Level IT 9626)

Learning Objectives

  • Explain the purpose and structure of the system life cycle (SLC).
  • Identify the eight generic SLC phases and the artefacts produced in each.
  • Describe the key characteristics, activities and artefacts of the Waterfall, Agile (Scrum) and RAD models.
  • Compare the strengths, weaknesses and suitability of each model (AO3).
  • Apply the SLC to a realistic scenario, select an appropriate model (or hybrid) and justify the choice (AO2).
  • Link the SLC to core project‑management concepts (risk register, stakeholder analysis, Gantt/PERT, cost‑benefit analysis) as required by Topic 15.

1. Syllabus Mapping – How the Notes Meet the Specification

Each syllabus requirement (Topic 16) is mapped below to the section of these notes that covers it.

  • 1.1 Define the purpose of a system life cycle. → Section 2 – purpose, benefits and alignment with Topic 15.
  • 1.2 List and describe the eight generic phases. → Section 2.1 – full ordered list with a brief description of each phase.
  • 1.3 Describe at least three recognised development models and the activities/artefacts associated with each. → Sections 3.1–3.3 – Waterfall, Agile (Scrum) and RAD, with step‑by‑step activities and typical artefacts.
  • 1.4 Compare the strengths and weaknesses of the models and select an appropriate model for a given scenario. → Section 5 – comparative summary; Section 6 – decision checklist; Section 8 – mini‑case study with AO3 justification.
  • 1.5 Explain the role of risk management, stakeholder analysis and scheduling within the SLC (link to Topic 15). → Section 2.2 – detailed mapping of each SLC phase to risk register, stakeholder matrix and scheduling tools.
  • 1.6 Show how the SLC can be applied to other syllabus topics (e.g. spreadsheets, databases, expert systems). → Section 7 – cross‑topic applications.

2. Overview of the System Life Cycle

The SLC is a structured roadmap that guides a project from the initial idea through to retirement. It ensures that stakeholder needs are captured, development work is planned and controlled, quality is built in, and changes are managed.

2.1 The Eight Generic Phases (Cambridge specification)

  1. Feasibility / Initiation – business case, cost‑benefit analysis (a worked sketch follows this list), high‑level risk identification.
  2. Requirements Analysis – functional & non‑functional requirements, use‑case diagrams, stakeholder interviews.
  3. Design – logical & physical design, data models, UI mock‑ups, architecture diagrams.
  4. Implementation (Coding) – translation of design into source code, configuration of off‑the‑shelf components.
  5. Testing – unit, integration, system and acceptance testing; defect logging.
  6. Deployment – installation, data migration, user training, release notes.
  7. Maintenance & Support – corrective, adaptive and perfective changes; change‑request handling.
  8. Retirement – de‑commissioning, data archiving, migration to a replacement system, final hand‑over documentation.
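
The cost‑benefit analysis carried out at the Feasibility / Initiation phase is, at its simplest, a comparison of total costs with total benefits over an agreed period. A minimal Python sketch, using invented figures purely for illustration:

```python
# Minimal cost-benefit sketch for a feasibility study.
# All figures are illustrative assumptions, not taken from these notes.
development_cost = 40_000      # one-off cost to build the system
annual_running_cost = 5_000    # yearly support and hosting
annual_benefit = 22_000        # yearly savings / extra income
years = 3                      # evaluation period

total_cost = development_cost + annual_running_cost * years
total_benefit = annual_benefit * years
net_benefit = total_benefit - total_cost
payback_years = development_cost / (annual_benefit - annual_running_cost)

print(f"Total cost over {years} years: {total_cost:,}")
print(f"Total benefit over {years} years: {total_benefit:,}")
print(f"Net benefit: {net_benefit:,}")
print(f"Payback period: {payback_years:.1f} years")
```

A positive net benefit and a payback period shorter than the system's expected lifetime support the decision to proceed; the same figures are revisited at later phase gates (see Section 2.2).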

2.2 Project‑Management Overlay (Topic 15)

Each SLC phase can be linked to core project‑management artefacts. The typical mapping is shown below.

  • Feasibility / Initiation – Risk register: identify high‑level risks (scope, technology, finance) and assign likelihood/impact. Stakeholders: initial stakeholder matrix (interest vs. influence). Scheduling: high‑level Gantt chart showing major milestones. Financial review: produce a cost‑benefit analysis to justify the project.
  • Requirements Analysis – Risk register: update with requirements‑related risks (e.g. unclear user needs). Stakeholders: refine the stakeholder list; capture communication needs. Scheduling: detailed Gantt chart with analysis tasks and dependencies. Financial review: re‑evaluate ROI based on the refined requirements.
  • Design – Risk register: assess design‑technology risks (integration, performance). Stakeholders: confirm design‑review participants. Scheduling: PERT diagram for critical‑path activities (design reviews, prototyping). Financial review: update the budget for design tools and licences.
  • Implementation – Risk register: log coding risks (skill gaps, tool availability). Stakeholders: maintain communication with developers and testers. Scheduling: Gantt chart with coding sprints or work packages. Financial review: track actual vs. planned expenditure.
  • Testing – Risk register: record testing risks (test‑data quality, environment stability). Stakeholders: identify test‑owner stakeholders. Scheduling: PERT for test cycles and defect‑fix turnaround. Financial review: budget for test tools and external validation.
  • Deployment – Risk register: identify deployment risks (downtime, data loss). Stakeholders: engage end‑user trainers and support staff. Scheduling: Gantt chart for cut‑over activities. Financial review: final cost reconciliation.
  • Maintenance & Support – Risk register: monitor operational risks (security patches, performance degradation). Stakeholders: update the stakeholder register for support teams. Scheduling: ongoing maintenance schedule (monthly, quarterly). Financial review: cost‑benefit of continued support vs. replacement.
  • Retirement – Risk register: identify retirement risks (data loss, compliance). Stakeholders: notify all affected stakeholders; plan the hand‑over. Scheduling: Gantt chart for de‑commissioning tasks. Financial review: final financial closure and ROI calculation.
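
The PERT entries in the mapping above depend on identifying the critical path: the chain of dependent tasks whose combined duration fixes the minimum project duration. A minimal Python sketch, with task names, durations and dependencies invented purely for illustration:

```python
# Critical-path (PERT-style) sketch. Task names, durations (days) and
# dependencies are illustrative assumptions, not taken from these notes.
from functools import cache

tasks = {
    "Requirements": (10, []),
    "Design":       (15, ["Requirements"]),
    "Coding":       (20, ["Design"]),
    "Test plan":    (5,  ["Requirements"]),
    "Testing":      (10, ["Coding", "Test plan"]),
    "Deployment":   (3,  ["Testing"]),
}

@cache
def earliest_finish(name):
    """Earliest finish = task duration plus the latest finish of its predecessors."""
    duration, predecessors = tasks[name]
    return duration + max((earliest_finish(p) for p in predecessors), default=0)

def critical_path(name):
    """Walk back through the predecessor that determines each finish time."""
    _, predecessors = tasks[name]
    if not predecessors:
        return [name]
    return critical_path(max(predecessors, key=earliest_finish)) + [name]

final_task = max(tasks, key=earliest_finish)
print("Minimum project duration:", earliest_finish(final_task), "days")
print("Critical path:", " -> ".join(critical_path(final_task)))
```

Any delay to a task on the printed path delays the whole project; tasks off the path (here, "Test plan") have slack, which a Gantt chart would show as float.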

3. Development Models

3.1 Waterfall Model

Linear and sequential – each phase must be completed and formally signed off before the next begins.

Key Activities & Artefacts (by SLC phase)
  1. Feasibility Study – feasibility report, high‑level risk register.
  2. Requirements Specification – SRS document, use‑case diagrams, requirement traceability matrix.
  3. System Design – High‑Level Design (HLD), Low‑Level Design (LLD), data model, UI wireframes.
  4. Implementation – source code, version‑control log, build scripts.
  5. Testing – test plan, test cases, defect log.
  6. Deployment – installation guide, user manual, training material.
  7. Maintenance – change‑request forms, maintenance log.
  8. Retirement – de‑commission checklist, data‑archive plan, final hand‑over report.
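
The requirement traceability matrix produced in phase 2 is simply a cross‑reference from each requirement to the design elements and test cases that cover it. A minimal Python sketch, with all IDs invented for illustration:

```python
# Requirement traceability matrix sketch: requirement IDs, design elements
# and test-case IDs are illustrative assumptions.
traceability = {
    "REQ-01 Student registration": {"design": ["HLD-3.1", "LLD-7"], "tests": ["TC-11", "TC-12"]},
    "REQ-02 Email confirmation":   {"design": ["LLD-9"],            "tests": ["TC-15"]},
    "REQ-03 Prerequisite check":   {"design": [],                   "tests": []},
}

# Quick audit: any requirement with no linked test case has no planned coverage.
for requirement, links in traceability.items():
    if not links["tests"]:
        print("No test coverage:", requirement)
```

Running this audit before sign‑off shows immediately which requirements (here REQ‑03) would reach the Testing phase untested.
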
Strengths
  • Clear milestones and documentation – easy to track progress and audit.
  • Predictable budget and schedule when requirements are stable.
  • Well‑suited to contracts with fixed scope and regulatory compliance.
Weaknesses
  • Inflexible – costly to accommodate changes after sign‑off.
  • Late delivery of usable software (first release only after testing).
  • Risk of misunderstanding requirements until the testing phase.

3.2 Agile Model (Scrum framework)

Iterative, incremental delivery in short time‑boxes called sprints (1–4 weeks).

Key Activities & Artefacts per Sprint
  1. Sprint Planning – product backlog, sprint goal, sprint backlog (user stories, story points).
  2. Daily Stand‑up – task board, burndown chart.
  3. Development & Testing – incremental code, automated unit tests, continuous integration.
  4. Sprint Review – demo of potentially shippable increment, stakeholder feedback.
  5. Sprint Retrospective – process‑improvement actions.
Supporting Artefacts (outside sprints)
  • Product Vision & Roadmap.
  • Definition of Done (DoD).
  • Release Plan (grouping of sprints).
  • Risk burndown (updated each sprint).
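
The burndown charts mentioned above (task burndown at the daily stand‑up, risk burndown each sprint) are simply the total outstanding work re‑plotted every day. A minimal Python sketch, with the sprint length, story points and daily figures invented for illustration:

```python
# Sprint burndown sketch: story points remaining at the end of each day.
# Sprint length, total points and daily progress are illustrative assumptions.
sprint_days = 10
total_points = 30
remaining = [30, 27, 24, 24, 20, 16, 13, 13, 8, 3, 0]   # day 0 .. day 10

for day, left in enumerate(remaining):
    ideal = total_points * (1 - day / sprint_days)       # straight-line "ideal" burndown
    status = "behind" if left > ideal else "on track"
    print(f"Day {day:2d}: {left:2d} points remaining (ideal {ideal:4.1f}) - {status}")
```

Plotting the "remaining" and "ideal" series against the day number gives the familiar burndown graph reviewed at each stand‑up.
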
Strengths
  • High flexibility – changes welcomed each sprint.
  • Early and frequent delivery of usable software.
  • Continuous stakeholder involvement improves quality.
  • Risks are exposed early and mitigated iteratively.
Weaknesses
  • Requires disciplined, self‑organising team.
  • Less formal documentation may be problematic for regulated environments.
  • Scope creep possible without a strong product owner.

3.3 Rapid Application Development (RAD)

Emphasises rapid prototyping, user‑centred design and the use of CASE tools to shorten development time.

Phases (time‑boxed)
  1. Requirements Planning – workshops, high‑level requirements, feasibility.
  2. User Design (Prototyping) – interactive mock‑ups, UI prototypes, rapid‑tool generated screens.
  3. Construction – iterative building of functional modules, reuse of components.
  4. Cut‑over – final testing, data conversion, user training, deployment.
Typical Artefacts
  • Prototype specifications (screens, flowcharts).
  • Component library catalogue.
  • User Acceptance Test (UAT) scripts linked to prototypes.
Strengths
  • Very fast time‑to‑market – prototype can become the final product.
  • Strong user involvement ensures the UI meets expectations.
  • Reusable components reduce coding effort.
Weaknesses
  • Less suitable for large, complex systems with heavy back‑end processing.
  • Quality of the final system depends heavily on the quality of prototypes.
  • Documentation may be insufficient for maintenance or audit.

3.4 Hybrid / Multi‑Method Approaches

The syllabus recognises that many real‑world projects combine elements of two or more models (e.g., a Waterfall‑style feasibility study followed by Agile sprints, or a Waterfall backbone with RAD prototyping for the UI). Hybrid approaches allow teams to benefit from the documentation and risk control of Waterfall while retaining the flexibility of Agile or the speed of RAD.

4. Sample Artefacts (Exam‑style AO2 examples)

4.1 Brief SRS excerpt (Waterfall)

1.1 Functional Requirement – Student Registration
The system shall allow a student to register for a course by selecting the course code, confirming prerequisites, and clicking “Register”. The system shall generate a confirmation number and send an email receipt.

4.2 Sprint Backlog excerpt (Agile)

Sprint 3 – Goal: “Enable course search and registration”
US‑12: As a student, I want to search courses by keyword (5 points)
US‑13: As a student, I want to view course prerequisites (3 points)
US‑14: As a student, I want to add a course to my basket (8 points)
Tasks:
- DB query for keyword search (2h)
- UI mock‑up for results page (3h)
- Unit tests for search service (2h)
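
At sprint planning the team also checks that the committed story points fit within their recent velocity. Using the points from the excerpt above and an assumed velocity of 18 points per sprint (the velocity figure is an invented assumption):

```python
# Sprint scope check: story points come from the backlog excerpt above;
# the team velocity of 18 points per sprint is an illustrative assumption.
sprint_backlog = {"US-12": 5, "US-13": 3, "US-14": 8}
velocity = 18

committed = sum(sprint_backlog.values())
print(f"Committed: {committed} points, velocity: {velocity} points")
print("Scope looks achievable" if committed <= velocity
      else "Over-committed - move a story back to the product backlog")
```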

4.3 Risk Register entry (applicable to any model)

Risk ID: R07
Description: Third‑party payment gateway may not meet SLA.
Likelihood: Medium
Impact: High (delays launch, loss of revenue)
Mitigation: Conduct early API compatibility test; negotiate backup gateway contract.
Owner: Project Manager
Review Date: 12 Oct 2025
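
A risk register such as the entry above is usually held as structured data so it can be scored, sorted and reviewed at each phase gate. A minimal Python sketch, assuming a simple 1–3 scoring scheme (the scoring scheme and the second entry are invented for illustration):

```python
# Risk register sketch: R07 mirrors the entry in section 4.3; the 1-3
# scoring scheme and the R08 entry are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Risk:
    risk_id: str
    description: str
    likelihood: int   # 1 = low, 2 = medium, 3 = high
    impact: int       # 1 = low, 2 = medium, 3 = high
    mitigation: str
    owner: str
    review_date: str

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("R07", "Third-party payment gateway may not meet SLA", 2, 3,
         "Early API compatibility test; backup gateway contract",
         "Project Manager", "2025-10-12"),
    Risk("R08", "Key developer unavailable during testing", 1, 2,
         "Cross-train a second developer", "Team Leader", "2025-11-01"),
]

# Review the register highest score first, as at each phase gate.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.risk_id} (score {risk.score}): {risk.description}")
```

Sorting by likelihood × impact mirrors the likelihood/impact assessment described in Section 2.2.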

5. Comparative Summary

For each aspect, the three models compare as follows.

  • Approach – Waterfall: linear, sequential. Agile: iterative, incremental (sprints). RAD: iterative with heavy prototyping.
  • Flexibility to change – Waterfall: low; costly after sign‑off. Agile: high; change embraced each sprint. RAD: medium; accommodated via prototype revisions.
  • Typical documentation – Waterfall: extensive (SRS, design docs, test plan). Agile: just enough (product backlog, sprint backlog, DoD). RAD: moderate (prototype specs, component catalogue).
  • Customer involvement – Waterfall: limited after the requirements phase. Agile: continuous, via sprint reviews and daily feedback. RAD: frequent, during prototype design and UAT.
  • Risk handling – Waterfall: early risk analysis, but issues may surface late. Agile: risks exposed and mitigated each sprint. RAD: early user feedback reduces functional risk.
  • Project size suitability – Waterfall: large, well‑defined projects. Agile: small‑to‑medium projects with evolving scope. RAD: medium, UI‑centric applications.
  • Time to market – Waterfall: long; full lifecycle before release. Agile: short; incremental releases every 1–4 weeks. RAD: very short; the prototype may become the final product.
  • Regulatory / compliance fit – Waterfall: strong, thanks to heavy documentation. Agile: variable; may need extra artefacts for audit. RAD: weak; documentation is often minimal.

6. Decision Checklist (AO3)

  1. Stability of requirements – Fixed? → Waterfall. Likely to change? → Agile.
  2. Time pressure – Must launch quickly? → RAD or Agile.
  3. Regulatory / documentation needs – High? → Waterfall (or V‑Model).
  4. Project complexity & size – Very large, critical infrastructure? → Waterfall/Spiral.
  5. Stakeholder availability – Can users attend daily stand‑ups? → Agile.
  6. Team expertise – Experienced with iterative tools? → Agile/RAD; otherwise Waterfall.
  7. Risk profile – High technical risk? → Spiral or Agile (early risk mitigation).
  8. Need for rapid UI design – Emphasise prototypes? → RAD (or Agile with UI sprints).
  9. Hybrid suitability – Combine phases to meet mixed requirements (e.g., Waterfall feasibility + Agile development).

7. Cross‑Topic Application (Linking SLC to Other Syllabus Topics)

For each topic below, the notes show how the SLC is used and give a concrete example artefact for every phase.

Spreadsheets – Budgeting System: apply the eight phases to design a spreadsheet solution that calculates departmental budgets.
  • Feasibility: cost‑benefit analysis of Excel vs. specialised software.
  • Requirements: list of required formulas and input sheets.
  • Design: flowchart of data flow between sheets.
  • Implementation: prototype workbook.
  • Testing: test cases with sample data.
  • Deployment: user guide and training session.
  • Maintenance: change‑request log for formula updates.
  • Retirement: archiving old fiscal‑year workbooks.
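
For the spreadsheet example, the Implementation artefact (a prototype workbook) could even be generated from Python. A minimal sketch using the openpyxl library; the sheet layout, departments and figures are invented for illustration:

```python
# Prototype budget workbook sketch (requires: pip install openpyxl).
# Sheet layout, departments and figures are illustrative assumptions.
from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws.title = "Departmental Budget"

ws.append(["Department", "Staff", "Equipment", "Total"])
ws.append(["Sales",      12000,   3000,        "=B2+C2"])
ws.append(["IT",          9000,   7500,        "=B3+C3"])
ws.append(["Grand total", "=SUM(B2:B3)", "=SUM(C2:C3)", "=SUM(D2:D3)"])

wb.save("budget_prototype.xlsx")   # open in a spreadsheet program to test the formulas
```
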
Databases – Student Records: use the SLC to plan, develop and maintain a relational database for storing student information.
  • Feasibility: analysis of data volume and security requirements.
  • Requirements: entity‑relationship diagram (ERD) and data dictionary.
  • Design: logical schema and normalization steps.
  • Implementation: SQL scripts to create tables and constraints.
  • Testing: test cases for CRUD operations.
  • Deployment: migration script from the legacy system.
  • Maintenance: log of schema change requests.
  • Retirement: data export to CSV and a de‑commissioning plan.
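
For the database example, the "SQL scripts to create tables and constraints" can be prototyped and tested with Python's built‑in sqlite3 module. The schema below is an invented simplification of a student‑records design:

```python
# Student-records table sketch using the standard-library sqlite3 module.
# The schema is an illustrative simplification, not a full design.
import sqlite3

conn = sqlite3.connect("student_records.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS student (
        student_id    INTEGER PRIMARY KEY,
        surname       TEXT NOT NULL,
        forename      TEXT NOT NULL,
        date_of_birth TEXT NOT NULL,
        email         TEXT UNIQUE        -- constraint: no duplicate emails
    )
""")
conn.execute("INSERT OR IGNORE INTO student VALUES "
             "(1, 'Khan', 'Aisha', '2008-03-14', 'a.khan@example.com')")
conn.commit()

# A quick CRUD check, as listed under the Testing phase.
for row in conn.execute("SELECT student_id, surname, forename FROM student"):
    print(row)
conn.close()
```
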
Expert Systems – Diagnostic Tool: guide the knowledge‑engineering process through the SLC phases.
  • Feasibility: assessment of rule‑base size and inference‑engine options.
  • Requirements: list of decision rules and confidence levels.
  • Design: flowchart of the inference process.
  • Implementation: rule files in CLIPS or Prolog.
  • Testing: test scenarios with expected diagnoses.
  • Deployment: user‑interface mock‑up and training.
  • Maintenance: knowledge‑base update log.
  • Retirement: transfer of knowledge to a new AI platform.
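
For the expert‑system example, the notes suggest rule files in CLIPS or Prolog; the same idea of decision rules with confidence levels can be sketched in Python as a simple rule‑matching function. The symptoms, rules and confidence values are invented for illustration:

```python
# Tiny rule-based diagnosis sketch (a Python stand-in for CLIPS/Prolog rules).
# The symptoms, rules and confidence levels are illustrative assumptions.
rules = [
    ({"fever", "cough"},               ("flu",     0.7)),
    ({"fever", "rash"},                ("measles", 0.6)),
    ({"cough", "shortness_of_breath"}, ("asthma",  0.5)),
]

def diagnose(symptoms):
    """Return every diagnosis whose conditions are all present, best first."""
    matches = [(conclusion, confidence)
               for conditions, (conclusion, confidence) in rules
               if conditions <= symptoms]            # subset test
    return sorted(matches, key=lambda m: m[1], reverse=True)

print(diagnose({"fever", "cough", "rash"}))
# [('flu', 0.7), ('measles', 0.6)]
```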

8. Mini‑Case Study (AO3 – Model Selection & Justification)

Scenario

A regional health board needs a patient‑appointment system. Core functionality (booking, cancellation, reminder emails) is well understood, but the UI must be adapted for both elderly patients and busy clinicians. The board requires a formal hand‑over package for the IT support team, and the project must be delivered within six months to meet a new NHS reporting deadline.

Task for Students

  1. Identify the most appropriate development model (or hybrid) for this project.
  2. Justify your choice using at least three criteria from the decision checklist.
  3. List two artefacts you would produce in the chosen approach.

Answer Key (excerpt)

  1. Recommended approach: A hybrid model – Waterfall feasibility and high‑level design followed by Agile (Scrum) sprints for UI development, with rapid prototyping (RAD) used in the first two sprints to gather elderly‑user feedback.
  2. Justification:
    • Stability of core requirements – booking logic is fixed → Waterfall documentation ensures compliance.
    • Time pressure – six‑month deadline → Agile sprints give early usable increments.
    • Stakeholder availability – clinicians can attend sprint reviews, but elderly users can only provide feedback during prototype sessions → RAD prototyping fits.
  3. Two key artefacts:
    • Feasibility report & risk register (Waterfall phase).
    • Sprint backlog with user stories for UI screens (Agile phase).

9. Summary

  • The System Life Cycle provides a structured framework that aligns with Topic 15 project‑management tools and with the broader Cambridge IT syllabus.
  • Waterfall delivers predictability and thorough documentation – ideal for stable, high‑risk or regulated projects.
  • Agile (Scrum) offers flexibility, early delivery and continuous risk mitigation – best for evolving requirements.
  • RAD accelerates delivery through rapid prototyping – suited to UI‑centric, time‑critical applications.
  • Hybrid approaches allow you to combine the strengths of two or more models to meet mixed requirements (e.g., documentation + speed).
  • Choosing a method requires weighing requirement stability, time constraints, regulatory needs, stakeholder availability and team capability, and then mapping the choice to the appropriate SLC artefacts and project‑management processes.
