Show understanding of the analysis, design, coding, testing and maintenance stages in the program development life cycle

Program Development Life Cycle (PDLC) – Cambridge International AS & A Level Computer Science (9618)

The PDLC is the focus of syllabus topic 12.1 (Program Development Life Cycle). It provides a systematic, repeatable approach to producing reliable software and links directly to the assessment objectives:

  • AO1 – Knowledge: define each stage and its artefacts.
  • AO2 – Application: analyse problems, choose appropriate techniques and justify decisions.
  • AO3 – Design & Evaluation: design solutions, implement, test, evaluate and maintain them.

1. Syllabus Context (AS & A‑Level)

AS – core units

  1. Data representation
  2. Communication and networking
  3. Hardware and processor fundamentals
  4. System software
  5. Security, ethics and legal issues
  6. Databases
  7. Algorithms and problem solving
  8. Data structures
  9. Programming concepts
  10. Software development (PDLC) – this note

A‑Level – extension units (all AS units plus deeper treatment of algorithms, data structures and programming)

  13. User‑defined data types
  14. File organisation and access methods
  15. Floating‑point representation
  16. Communication protocols
  17. Parallel processing
  18. Virtual machines and operating systems
  19. Encryption and cryptography
  20. Artificial intelligence, recursion and exception handling
  21. Advanced programming paradigms

2. The Five Stages of the PDLC

2.1 Analysis

  • Primary goal: Understand the problem and agree what the software must do.
  • Key activities

    • Stakeholder interviews / questionnaires.
    • Identify functional & non‑functional requirements.
    • Feasibility study (time, cost, technology, risk).
    • Prioritise requirements (e.g., MoSCoW).
    • Produce a clear Requirements Specification.

  • Typical artefacts

    • Requirements Specification (text + tables).
    • Use‑case or activity diagrams.
    • Feasibility report.

  • Exam tip (AO1/AO2): Always show traceability – each requirement should be numbered so you can refer to it in design, code and test cases.

2.2 Design

  • Primary goal: Produce a detailed blueprint that can be turned into code.
  • Key activities

    • Select the programming paradigm (procedural, OOP, functional).
    • High‑level design: Data‑flow diagrams (DFDs), Entity‑relationship diagrams (ERDs), class diagrams.
    • Algorithm design: flowcharts, pseudocode, structured English.
    • Define module decomposition, interfaces, data structures and storage.

  • Typical artefacts

    • Design Specification (document).
    • DFDs, ERDs, class diagrams.
    • Flowcharts and pseudocode for each module.

  • Exam tip (AO3): Keep naming consistent with the analysis stage – this demonstrates traceability and earns marks for clear design.
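
To make these artefacts concrete, here is a minimal design‑stage sketch in Java (one of the languages named in section 5) for a hypothetical “Student Record Management” system, the same example used in the test plan of section 4. It shows how module decomposition, interfaces and data structures might be recorded as a skeleton for the coding stage; all class, method and requirement identifiers (Student, StudentStore, R3/R5/R7) are invented for illustration.

    import java.util.List;

    // Data structure chosen at design time: one record per student.
    record Student(String id, String name, int age) { }

    // Module interface agreed at design time. Each operation is annotated with
    // the (invented) requirement number it satisfies, giving traceability from
    // analysis through design to code and tests.
    interface StudentStore {
        void addStudent(Student s);        // e.g. requirement R3
        Student searchStudent(String id);  // e.g. requirement R5
        List<Student> allStudents();       // e.g. requirement R7
    }

Annotating each operation with its requirement number is exactly the traceability the exam tip above refers to.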

2.3 Coding (Implementation)

  • Primary goal: Translate the design into executable source code.
  • Key activities

    • Write code following language‑specific conventions (indentation, naming).
    • Comment liberally – purpose, algorithm, assumptions, complexity.
    • Use version‑control (Git, SVN) with regular commits.
    • Compile / interpret and create build scripts or makefiles.

  • Typical artefacts

    • Source‑code files (.java, .py, .vb, etc.).
    • Build scripts (makefile, Ant, Maven).
    • Version‑control log.

  • Exam tip (AO1): Show at least one well‑commented code fragment and explain how it implements a specific design element.
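
The kind of well‑commented fragment the exam tip asks for might look like the sketch below (Java). It implements the StudentStore interface from the design sketch in section 2.2; the class name, storage choice and validation range are assumptions made for this example, with the age rule chosen to match boundary test TC‑005 in the test plan of section 4.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative implementation of the StudentStore interface (a sketch, not
    // an official or required solution).
    class ArrayStudentStore implements StudentStore {

        // Purpose: in-memory storage; a real system might use a file or database.
        private final List<Student> students = new ArrayList<>();

        // Adds a student after validating the age field.
        // Assumption: only ages 1..120 are accepted, so age 0 is rejected
        // (cf. boundary test TC-005 in the test plan).
        @Override
        public void addStudent(Student s) {
            if (s.age() <= 0 || s.age() > 120) {
                throw new IllegalArgumentException("Invalid age: " + s.age());
            }
            students.add(s);
        }

        // Linear search with case-insensitive matching on the id (cf. TC-012).
        // Design decision: acceptable for small data sets; returns null if absent.
        @Override
        public Student searchStudent(String id) {
            for (Student s : students) {
                if (s.id().equalsIgnoreCase(id)) {
                    return s;
                }
            }
            return null;
        }

        @Override
        public List<Student> allStudents() {
            return List.copyOf(students);   // defensive copy
        }
    }

Note how the comments record purpose, assumptions and design decisions, as listed under Key activities above.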

2.4 Testing

  • Primary goal: Verify that the program meets its requirements and find as many defects as possible (testing can show the presence of defects, but never prove their absence).
  • Key activities

    • Develop a test plan (objectives, resources, schedule, pass/fail criteria).
    • Create test cases derived directly from requirements (include edge & boundary values).
    • Execute tests at four levels:

      1. Unit testing – single modules/functions.
      2. Integration testing – module interactions.
      3. System testing – whole application.
      4. Acceptance testing – client validation.

    • Apply test techniques:

      • Black‑box (equivalence partitioning, boundary testing).
      • White‑box (statement/branch coverage, path testing).
      • Alpha / beta testing for user‑focused feedback.

    • Record defects, classify severity (Critical, Major, Minor) and feed back to coding.

  • Typical artefacts

    • Test plan document.
    • Test case table (ID, input, expected output, result).
    • Defect/bug log.
    • Test report (coverage statistics, pass rate).

  • Exam tip (AO3): Show a traceability matrix linking requirements → design → test cases. This is a high‑scoring AO3 feature.
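
As an illustration of black‑box boundary testing at unit level, the following JUnit 5 sketch (JUnit 5 is named as a resource in the test plan of section 4) exercises the addStudent fragment from the coding stage. The test IDs mirror the style of that plan; TC‑006 is invented here to cover the valid side of the boundary.

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.*;

    // Unit tests for the ArrayStudentStore fragment shown in section 2.3.
    class ArrayStudentStoreTest {

        @Test   // TC-005: boundary value - age 0 must produce a validation error
        void addStudentRejectsAgeZero() {
            StudentStore store = new ArrayStudentStore();
            assertThrows(IllegalArgumentException.class,
                         () -> store.addStudent(new Student("S001", "Ada", 0)));
        }

        @Test   // TC-006 (invented): smallest valid age, 1, is accepted
        void addStudentAcceptsAgeOne() {
            StudentStore store = new ArrayStudentStore();
            store.addStudent(new Student("S002", "Bo", 1));
            // Case-insensitive retrieval also exercises TC-012.
            assertEquals("Bo", store.searchStudent("s002").name());
        }
    }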

2.5 Maintenance

  • Primary goal: Keep the software operational, efficient and relevant after deployment.
  • Types of maintenance

    • Corrective – fix bugs discovered in use.
    • Adaptive – modify for new hardware, OS, regulations or standards.
    • Perfective – improve performance, usability or add minor features.
    • Retirement – plan migration to a replacement system.

  • Key activities

    • Analyse change request (link back to analysis).
    • Update design documentation and source code.
    • Re‑run relevant test cases (regression testing).
    • Release patches or new versions, update user manuals.

  • Typical artefacts

    • Patch release files.
    • Maintenance log (date, change description, author).
    • Updated user documentation.
    • Decommission / migration plan (for retirement).

  • Exam tip (AO2/AO3): Mention that maintenance typically consumes 60‑80 % of the total life‑cycle cost – a point frequently tested in AO2 questions.
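
A short sketch of how corrective maintenance and regression testing fit together, continuing the same example. The defect (“duplicate student IDs are accepted”), the change‑request number CR‑014 and the fix are all invented for illustration.

    // Fix applied inside ArrayStudentStore.addStudent for (hypothetical) change
    // request CR-014, recorded in the maintenance log:
    //     if (searchStudent(s.id()) != null) {
    //         throw new IllegalArgumentException("Duplicate id: " + s.id());
    //     }

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.*;

    // Regression test added alongside the fix; it is re-run, together with the
    // existing suite, on every subsequent release.
    class DuplicateIdRegressionTest {

        @Test
        void addingTheSameIdTwiceIsRejected() {
            StudentStore store = new ArrayStudentStore();
            store.addStudent(new Student("S010", "Caz", 17));
            assertThrows(IllegalArgumentException.class,
                         () -> store.addStudent(new Student("S010", "Caz", 17)));
        }
    }

Because the change touches addStudent, the earlier unit tests for that routine are re‑run as well – that is regression testing in practice.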

3. Comparison of Common Life‑Cycle Models (exam‑relevant)

Waterfall (classic PDLC)

  • Structure: linear, sequential stages.
  • Typical use‑case: well‑defined, stable requirements (e.g., a payroll system).
  • Advantages: clear documentation; easy to manage milestones.
  • Disadvantages: inflexible to change; defects discovered late are costly.

Iterative / Incremental

  • Structure: the PDLC is repeated for each increment.
  • Typical use‑case: projects where requirements evolve (e.g., a mobile app).
  • Advantages: early delivery of useful functionality; feedback can be incorporated.
  • Disadvantages: more documentation overhead; requires strong project control.

Agile (Scrum)

  • Structure: sprints (2–4 weeks), each containing analysis, design, coding and testing.
  • Typical use‑case: rapidly changing environments; small, cross‑functional teams.
  • Advantages: high customer involvement; continuous improvement.
  • Disadvantages: less formal documentation; can be hard to scale to large systems.

Spiral

  • Structure: risk‑driven cycles of planning, engineering and evaluation.
  • Typical use‑case: high‑risk, large‑scale systems (e.g., aerospace control).
  • Advantages: explicit risk analysis in each cycle; early prototyping.
  • Disadvantages: complex to manage; expensive in time and resources.

4. Example Test Plan (A‑Level style)

Test Plan – “Student Record Management” System

-------------------------------------------------

1. Objective: Verify that all functional and non‑functional requirements are met.

2. Scope: Modules – Login, AddStudent, SearchStudent, GenerateReport.

3. Resources: Laptop (Windows 10), Java 17, JUnit 5, Selenium (UI testing).

4. Schedule:

• Unit testing – 4 h

• Integration testing – 3 h

• System testing – 4 h

• Acceptance testing – 2 h

5. Test Cases (excerpt):

• TC‑001 (Login – valid credentials) – Expected: Main menu displayed.

• TC‑005 (AddStudent – boundary age 0) – Expected: Validation error.

• TC‑012 (SearchStudent – case‑insensitive) – Expected: Correct record returned.

6. Test Techniques:

– Black‑box equivalence partitioning for numeric fields.

– White‑box statement coverage for the ‘calculateGPA()’ routine (see the sketch after this plan).

7. Pass/Fail Criteria: ≥ 95 % of test cases must pass; no critical defects.

8. Defect Reporting: JIRA ticket with severity (Critical, Major, Minor).

9. Acceptance: End‑user (teacher) signs off after successful acceptance testing.
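
The plan asks for white‑box statement coverage of the calculateGPA() routine, but the routine itself is not defined in these notes, so the version below is invented purely to illustrate the technique. The two JUnit 5 tests together execute every statement in the routine at least once.

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.*;

    class GpaCalculator {
        // Invented scale: mark >= 70 -> 4 points, >= 50 -> 3 points, otherwise 1 point.
        static double calculateGPA(int[] marks) {
            if (marks == null || marks.length == 0) {
                return 0.0;                        // statement A: no marks supplied
            }
            double total = 0;
            for (int mark : marks) {
                if (mark >= 70)      total += 4;   // statement B
                else if (mark >= 50) total += 3;   // statement C
                else                 total += 1;   // statement D
            }
            return total / marks.length;           // statement E
        }
    }

    class GpaCalculatorTest {

        @Test   // executes statements B, C, D and E
        void mixedMarksAreAveraged() {
            assertEquals(8.0 / 3, GpaCalculator.calculateGPA(new int[] {75, 55, 30}), 1e-9);
        }

        @Test   // executes statement A
        void emptyInputGivesZero() {
            assertEquals(0.0, GpaCalculator.calculateGPA(new int[] {}), 1e-9);
        }
    }

Note that neither test passes null: statement coverage can reach 100 % while some input conditions remain untested, a known limitation of coverage measures.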

5. Practical Component (Paper 4) – What to Remember

  • The PDLC will be demonstrated using pseudocode or a real language (Java, Python, Visual Basic). You must be able to:

    • Write clear, indented pseudocode that mirrors the design.
    • Apply appropriate naming conventions and comments.
    • Identify where unit tests would be written (e.g., JUnit test methods).

  • Typical exam tasks:

    • Given a requirements specification, produce a design artefact (flowchart, class diagram or pseudocode).
    • Write a short piece of code and then a set of test cases for it.
    • Analyse a faulty program, locate the bug and suggest a corrective‑maintenance action.

  • Time‑saving tip: reuse the same variable names and module titles across analysis, design and code – this demonstrates traceability (AO3).

6. Key Points for Examination

  1. Why each stage matters: Omit analysis → unclear requirements → re‑work; omit testing → undiscovered defects; omit maintenance → system becomes unusable.
  2. Feedback loops: Defects found in testing → return to coding; user feedback after acceptance → adaptive maintenance; major change → back to analysis.
  3. Cost distribution: In most real‑world projects, maintenance accounts for 60‑80 % of total life‑cycle cost – a point often asked in AO2.
  4. Model comparison: Be ready to list at least two advantages and two disadvantages of Waterfall vs. Agile (or Spiral) and justify which is more suitable for a given scenario.
  5. Evaluation language: Use terms such as “traceability”, “defect density”, “risk”, “iteration”, “stakeholder satisfaction”, “regression testing” when answering AO3 questions.

7. Revision Diagram (Suggested)

A circular flowchart showing the five PDLC stages connected by arrows, with feedback loops:

  • Analysis → Design → Coding → Testing → Maintenance → (back to Analysis if major changes are required).
  • Additional short arrows:

    • Testing → Coding (bug fixing).
    • Maintenance → Analysis (new requirements or adaptive changes).

Label each arrow with the artefact that moves forward, e.g., “Requirements Specification”, “Design Specification”, “Source Code”, “Test Report”, “Patch Release”.

8. Audit Checklist – Aligning Your Notes with the Cambridge Syllabus

For each syllabus unit below, check the required sub‑topics, mark whether your notes cover them (✓ / ✗ / ≈) and record any missing concepts.

12.1 Software development (PDLC)

  • Sub‑topics (required):

    • Analysis – requirements, feasibility, stakeholder identification.
    • Design – DFDs, ERDs, class diagrams, flowcharts, pseudocode.
    • Coding – conventions, comments, version control, build scripts.
    • Testing – test plan, test cases, unit/integration/system/acceptance testing, black‑box & white‑box techniques, defect log.
    • Maintenance – corrective, adaptive, perfective, retirement, cost distribution.

  • Covered? / Comments: All core elements are present; ensure examples use the same variable names throughout for traceability.

12.2 Alternative life‑cycle models

  • Sub‑topics (required): Waterfall, Iterative/Incremental, Agile (Scrum), Spiral – structure, typical use‑case, pros & cons.
  • Covered? / Comments: The table in section 3 includes all four models with exam‑relevant points.

12.3 Testing techniques

  • Sub‑topics (required): Equivalence partitioning, boundary testing, statement/branch coverage, alpha/beta testing, regression testing.
  • Covered? / Comments: Regression testing mentioned implicitly; add a short bullet “Run previously‑passed test cases after any change”.

12.4 Evaluation of software

  • Sub‑topics (required): Defect density, maintainability, usability, performance, stakeholder satisfaction, cost‑benefit analysis.
  • Covered? / Comments: Include a brief evaluation checklist (e.g., “Is the system reliable? Is it maintainable? Does it meet non‑functional requirements?”).

Use this checklist to verify that every required sub‑topic appears in your revision material. Mark each row, add notes, and then fill any gaps before the exam.