Describe the principles, benefits and drawbacks of each type of life cycle

Software Development Life‑Cycle (SDLC) – Cambridge AS & A‑Level Overview

This note covers the full range of life‑cycle models required by the Cambridge 9618 syllabus, links each model to the five standard PDLC stages, and connects the concepts to the other major syllabus topics (hardware, data representation, communication, security, databases, AI and programming). Use the tables and “link‑back” boxes to see how life‑cycle decisions affect algorithm design, testing, documentation and evaluation (AO1‑AO3).


1. The Five Standard PDLC Stages

  1. Analysis (requirements gathering) – identify the problem, users and functional & non‑functional requirements.
  2. Design – produce high‑level architecture (system components, data flow) and detailed designs (algorithms, data structures, pseudo‑code).
  3. Coding (implementation) – translate the detailed design into a working program using a chosen language/paradigm.
  4. Testing – verify correctness (unit, integration, system, acceptance) and evaluate security, performance and usability.
  5. Maintenance – correct faults, improve performance, add features or adapt to new hardware/software environments after deployment.

Why the PDLC stages matter for the syllabus

  • AO1 (knowledge): you must define each stage and its purpose.
  • AO2 (application): you will be asked to apply the stages when solving a programming problem or designing a system.
  • AO3 (evaluation): you must evaluate the suitability of a life‑cycle model for a given context.

2. Choosing a Life‑Cycle Model

Projects differ in size, risk, time‑to‑market and stability of requirements. Selecting an appropriate model helps to:

  • Minimise re‑work and cost.
  • Control risk (especially for safety‑critical or high‑cost systems).
  • Match stakeholder involvement and resource availability.

3. Main Life‑Cycle Models – Principles, Benefits & Drawbacks

For each model, the key principles are listed first, followed by the benefits (AO2) and drawbacks (AO3).

Waterfall
  Principles:
  • Linear, sequential flow – each PDLC stage is completed before the next begins.
  • Heavy documentation at the end of each stage.
  • Changes are discouraged once a stage is signed off.
  Benefits (AO2):
  • Clear milestones; easy to track progress and budget.
  • Well‑suited to projects with stable, well‑understood requirements (e.g., payroll system).
  Drawbacks (AO3):
  • Inflexible to requirement changes – costly re‑work.
  • Errors often discovered late (during testing).

Iterative
  Principles:
  • Repeats the five PDLC stages in cycles (iterations) that each produce a working partial system.
  • Feedback from one iteration informs the next.
  Benefits (AO2):
  • Early delivery of usable functionality.
  • Allows requirements to evolve.
  Drawbacks (AO3):
  • Needs a solid overall architecture to avoid re‑work.
  • More planning effort than pure Waterfall.

Incremental
  Principles:
  • System split into independent modules; each module follows a mini‑Waterfall.
  • Modules delivered one after another.
  Benefits (AO2):
  • Risk spread across increments; early user feedback.
  • Stakeholders can start using parts of the system early.
  Drawbacks (AO3):
  • Integration testing can become complex.
  • Requires careful overall design to keep modules compatible.

Prototyping
  Principles:
  • Build a quick, simplified version of the system to explore requirements.
  • Iteratively refine the prototype based on user feedback.
  Benefits (AO2):
  • Clarifies user expectations early.
  • Reduces risk of costly re‑work later.
  Drawbacks (AO3):
  • Stakeholders may mistake the prototype for the final product.
  • Documentation can be neglected.

Rapid Application Development (RAD)
  Principles:
  • Emphasises fast construction using reusable components and visual development tools.
  • Short development cycles with continuous user involvement.
  Benefits (AO2):
  • Very short time‑to‑market.
  • High user satisfaction because they see working software early.
  Drawbacks (AO3):
  • Requires skilled developers and a strong component library.
  • Not ideal for large, safety‑critical systems where thorough testing is mandatory.

Agile – Scrum
  Principles:
  • Work divided into 2‑4 week sprints; each sprint delivers a potentially shippable increment.
  • Cross‑functional, self‑organising teams.
  • Product backlog continuously prioritised by the Product Owner.
  Benefits (AO2):
  • Highly responsive to changing requirements.
  • Frequent delivery keeps stakeholders engaged.
  • Promotes collaboration and team morale.
  Drawbacks (AO3):
  • Less emphasis on upfront documentation – may be a concern for exam traceability questions.
  • Success depends on disciplined teams and committed stakeholders.

4. Mapping Each Model to the Five PDLC Stages

How the PDLC stages are applied in each model:

  • Waterfall – Analysis → Design → Coding → Testing → Maintenance (each stage completed once before moving on).
  • Iterative – Each iteration repeats all five stages for a subset of functionality; the final product emerges after several cycles.
  • Incremental – Each increment passes through the five stages; later increments build on earlier ones.
  • Prototyping – Rapid analysis → quick design → fast coding (prototype) → informal testing → refinement; the final system may be built from the refined prototype.
  • RAD – Very short analysis & design cycles, heavy use of reusable components during coding, continuous testing, and rapid maintenance updates.
  • Agile (Scrum) – Each sprint performs a mini‑analysis, design, coding, testing, and a brief maintenance/feedback step.

5. Linking Life‑Cycle Models to Other Syllabus Topics

Iterative ↔ Algorithm Design & Testing
Each iteration forces you to design a small, testable algorithm, run unit tests, and then refine the algorithm in the next cycle – a perfect illustration of AO2 (applying algorithmic techniques) and AO3 (evaluating test results).
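A minimal Python sketch of this idea (the mean() routine and the test names are made up for illustration, not prescribed by the syllabus): an early iteration implements a small routine, and the same unit tests are re-run after every later refinement.

```python
import unittest

def mean(values):
    """Return the arithmetic mean of a non-empty list of numbers."""
    if not values:
        raise ValueError("mean() requires at least one value")
    return sum(values) / len(values)

class TestMean(unittest.TestCase):
    def test_typical_values(self):
        # Verifies the core calculation from the first iteration.
        self.assertEqual(mean([2, 4, 6]), 4)

    def test_empty_list_rejected(self):
        # A later iteration added input validation; the test documents it.
        with self.assertRaises(ValueError):
            mean([])

if __name__ == "__main__":
    unittest.main()
```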
Agile ↔ Recursion & Exception Handling
Agile’s short sprints encourage developers to use clean, reusable code. Recursive functions and structured exception handling are easier to maintain across many incremental releases.
Prototyping ↔ User Interface (UI) Design & Human‑Computer Interaction
Rapid prototypes give users a tangible UI early, allowing designers to evaluate ergonomics, accessibility and feedback – linking to the syllabus requirement on HCI.

6. Mini‑Modules for Missing Syllabus Areas (concise, exam‑style)

6.1 Data Representation

  • Binary & Hexadecimal – 8‑bit byte, two’s complement for signed integers.
  • Floating‑point (IEEE 754) – sign, exponent, mantissa; rounding errors affect algorithm accuracy.
  • BCD & Character Encoding (ASCII, Unicode).

Link‑back: During the Testing stage, you must verify that numeric calculations give correct results despite floating‑point rounding.
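For example, an exact-equality check on floating-point results can wrongly report a defect; a tolerance-based comparison (Python's math.isclose) is safer. A minimal illustration:

```python
import math

a = 0.1 + 0.2
print(a)               # 0.30000000000000004 – binary floating point cannot store 0.1 exactly
print(a == 0.3)        # False: an exact-equality test would wrongly flag a defect
print(math.isclose(a, 0.3, rel_tol=1e-9))  # True: compare with a tolerance instead
```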

6.2 Computer Architecture & Processor Fundamentals

  • CPU components: ALU, control unit, registers, cache.
  • Instruction set basics – machine code vs. assembly.
  • Fetch‑decode‑execute cycle; impact on algorithm efficiency (e.g., loop unrolling).

Link‑back: In the Design stage you may choose an algorithm that minimises cache misses for large data sets.
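A rough illustration of access-pattern locality (the effect is small in interpreted Python, but in compiled languages column-order traversal of a large row-stored array causes many more cache misses):

```python
import time

N = 1000
matrix = [[1] * N for _ in range(N)]   # N x N table stored row by row

def sum_row_major(m):
    # Visits elements in the order they are stored: good spatial locality.
    total = 0
    for row in range(N):
        for col in range(N):
            total += m[row][col]
    return total

def sum_column_major(m):
    # Jumps to a different row on every access: poor locality for large data sets.
    total = 0
    for col in range(N):
        for row in range(N):
            total += m[row][col]
    return total

for f in (sum_row_major, sum_column_major):
    start = time.perf_counter()
    f(matrix)
    print(f.__name__, round(time.perf_counter() - start, 3), "seconds")
```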

6.3 System Software

  • Operating system functions – process management, memory management, file systems.
  • Utility software (compilers, debuggers, version control).

Link‑back: The Coding stage relies on a compiler; the Testing stage often uses a debugger.

6.4 Communication & Networks

  • TCP/IP model – layers, protocols (HTTP, SMTP, FTP).
  • Network topologies (star, bus, mesh) and hardware (router, switch).
  • Data transmission concepts – bandwidth, latency, error detection (parity, CRC).

Link‑back: When designing a client‑server application, the Analysis stage must specify required network protocols; the Testing stage must include reliability testing over a network.
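A minimal sketch of even-parity error detection (CRC follows the same recompute-and-compare idea, but uses polynomial division rather than a single parity bit):

```python
def add_even_parity(bits):
    """Append a parity bit so that the total number of 1s is even."""
    parity = sum(bits) % 2           # 1 if the count of 1s is currently odd
    return bits + [parity]

def check_even_parity(bits):
    """Return True if the received block still has an even number of 1s."""
    return sum(bits) % 2 == 0

sent = add_even_parity([1, 0, 1, 1, 0, 1, 0])   # 7 data bits + 1 parity bit
print(sent, check_even_parity(sent))            # ... True – block accepted

corrupted = sent.copy()
corrupted[2] ^= 1                               # flip one bit "in transit"
print(corrupted, check_even_parity(corrupted))  # ... False – single-bit error detected
```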

6.5 Security & Privacy

  • Symmetric encryption (AES, DES) vs. asymmetric (RSA, ECC).
  • Hashing (SHA‑256) and digital signatures.
  • Authentication, authorisation, confidentiality, integrity.

Link‑back: Security requirements are part of Analysis; secure coding practices are checked during Testing (e.g., penetration testing).
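For instance, SHA‑256 hashing can be demonstrated with Python's standard hashlib module (the message text below is made up for illustration):

```python
import hashlib

message = b"Submit assignment 3"
digest = hashlib.sha256(message).hexdigest()
print(digest)            # 64 hexadecimal characters = 256 bits

# Even a one-character change produces a completely different digest,
# which is why hashes are used to check message integrity.
tampered = hashlib.sha256(b"Submit assignment 4").hexdigest()
print(digest == tampered)   # False
```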

6.6 Databases

  • Relational model – tables, primary/foreign keys.
  • Normalisation (1NF, 2NF, 3NF) to avoid redundancy.
  • SQL – DDL (CREATE, ALTER), DML (SELECT, INSERT, UPDATE, DELETE).

Link‑back: The Design stage includes an ER diagram; the Coding stage implements SQL queries; the Testing stage validates data integrity.
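A small illustration using Python's built-in sqlite3 module (the Student and Submission tables are invented purely for this example), showing DDL, DML and a primary/foreign-key join:

```python
import sqlite3

conn = sqlite3.connect(":memory:")          # throw-away in-memory database
cur = conn.cursor()

# DDL: create tables with primary and foreign keys
cur.execute("""CREATE TABLE Student (
                   StudentID INTEGER PRIMARY KEY,
                   Name      TEXT NOT NULL)""")
cur.execute("""CREATE TABLE Submission (
                   SubmissionID INTEGER PRIMARY KEY,
                   StudentID    INTEGER NOT NULL,
                   FileName     TEXT,
                   FOREIGN KEY (StudentID) REFERENCES Student(StudentID))""")

# DML: insert and query data
cur.execute("INSERT INTO Student VALUES (1, 'Amira')")
cur.execute("INSERT INTO Submission VALUES (10, 1, 'exam1.pdf')")
cur.execute("""SELECT Student.Name, Submission.FileName
               FROM Student JOIN Submission
               ON Student.StudentID = Submission.StudentID""")
print(cur.fetchall())                        # [('Amira', 'exam1.pdf')]
conn.close()
```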

6.7 Artificial Intelligence (AI)

  • Search algorithms – BFS, DFS, A* (heuristic).
  • Machine learning basics – supervised vs. unsupervised, neural networks.
  • Ethical considerations – bias, privacy.

Link‑back: An AI component would be prototyped early (Prototyping model) and iteratively refined (Iterative model).
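A minimal breadth-first search sketch in Python (the graph is a made-up example); BFS explores nodes level by level, so the first path that reaches the goal uses the fewest edges:

```python
from collections import deque

def bfs(graph, start, goal):
    """Breadth-first search: return the path with the fewest edges, or None."""
    queue = deque([[start]])     # queue of partial paths, starting from the start node
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(bfs(graph, "A", "E"))   # ['A', 'B', 'D', 'E']
```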

6.8 Programming Paradigms & Recursion

  • Procedural, object‑oriented, functional.
  • Recursion – base case, recursive case, stack depth.
  • Exception handling – try/catch, custom exceptions.

Link‑back: Choice of paradigm influences the Design (e.g., class diagrams for OOP) and the Testing strategy (unit tests for recursive functions).
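A short Python sketch combining both ideas: a recursive function with an explicit base case, plus exception handling for invalid input (the function name is illustrative):

```python
def factorial(n):
    """Recursive factorial with a base case; invalid input raises an exception."""
    if not isinstance(n, int) or n < 0:
        raise ValueError("factorial() is only defined for non-negative integers")
    if n == 0:                        # base case – stops the recursion
        return 1
    return n * factorial(n - 1)       # recursive case – each call adds a stack frame

try:
    print(factorial(5))    # 120
    print(factorial(-3))   # raises ValueError before any recursion happens
except ValueError as err:
    print("Handled:", err)
```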


7. Worked Example – Selecting a Life‑Cycle Model (AO2)

Problem: A university wants an online exam‑submission system that must (a) authenticate students, (b) accept large file uploads, (c) run plagiarism detection, and (d) generate real‑time analytics for staff.

  1. Analyse requirements – high security, variable load, need for frequent updates (e.g., new plagiarism algorithms).
  2. Choose a model – Agile (Scrum) is most appropriate because:
    • Stakeholders (students, staff) can give continuous feedback.
    • Security patches and algorithm updates can be released in short sprints.
    • Rapid prototyping of the UI can be done early.
  3. Justify (AO3) – Agile’s flexibility outweighs the need for heavy documentation; however, the safety‑critical authentication component will still require thorough security testing (a drawback that must be mitigated by adding a dedicated “security sprint”).

8. Sample Exam Questions (AO1‑AO3)

  • AS (AO1): Define the five stages of the PDLC and give one purpose for each.
    Key points for marking: accurate definitions + one correct purpose per stage (5 marks).
  • A (AO2): Explain why an iterative model would be suitable for developing a mobile game that will be updated with new levels every six months.
    Key points for marking: reference to evolving requirements, early delivery, need for repeated testing, and risk reduction (6‑8 marks).
  • A (AO3): Compare the Waterfall and Agile models in terms of handling changing security requirements for a banking application.
    Key points for marking: balanced comparison – Waterfall: good documentation, hard to change; Agile: flexible but needs disciplined security testing; conclude with a justified recommendation (8‑10 marks).
  • AS (AO2): Given a simple pseudo‑code algorithm, identify a possible defect and suggest a test case that would reveal it.
    Key points for marking: identify the defect (e.g., off‑by‑one error), design an appropriate test case, explain why it would fail (6 marks).
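For the final AS (AO2) question, a minimal (made-up) illustration of an off-by-one defect and a test case that exposes it:

```python
def total_marks(marks):
    """Intended to add up every mark in the list, but contains an off-by-one defect:
    range(len(marks) - 1) stops one position too early and skips the last mark."""
    total = 0
    for i in range(len(marks) - 1):   # defect: should be range(len(marks))
        total += marks[i]
    return total

# Test case that reveals the defect: expected 6, but the function returns 3.
expected = 6
actual = total_marks([1, 2, 3])
print("expected", expected, "got", actual)   # expected 6 got 3 – the last mark is never added
```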

9. Evaluation Checklist – Deciding Which Model to Use

  • Project size & complexity – Are there many interacting components or a simple, single‑function system?
  • Stability of requirements – Will requirements change significantly after the analysis phase?
  • Time‑to‑market pressure – Do stakeholders need a usable product quickly?
  • Risk level (safety‑critical, financial, data‑privacy) – Is extensive verification and documentation mandatory?
  • Stakeholder involvement – Can users give regular feedback throughout development?
  • Team expertise & resources – Do we have reusable components, skilled developers, and suitable tools (e.g., CASE, version control)?

When answering exam questions, use this checklist to justify your model choice and to discuss at least one advantage and one disadvantage.


10. Syllabus Coverage Map – Quick Reference

  • Software Development Life‑Cycle – Sections 1‑4 and 7‑9 – full PDLC description (already covered).
  • Data Representation – Mini‑module 6.1 – binary, floating‑point, BCD, ASCII/Unicode.
  • Computer Architecture & Processor – Mini‑module 6.2 – CPU components, instruction set, cache.
  • System Software – Mini‑module 6.3 – OS functions, compilers, debuggers.
  • Communication & Networks – Mini‑module 6.4 – TCP/IP, protocols, topologies, error detection.
  • Security & Privacy – Mini‑module 6.5 – encryption, hashing, authentication.
  • Databases – Mini‑module 6.6 – relational model, normalisation, SQL.
  • Artificial Intelligence – Mini‑module 6.7 – search algorithms, machine learning basics.
  • Programming Paradigms & Recursion – Mini‑module 6.8 – procedural, OOP, functional, exception handling.
  • Evaluation (AO3) – Sections 5, 8 and 9 – checklist, comparative tables, exam‑style evaluation.

Use the map to ensure you have addressed every required topic when revising for the Cambridge AS & A‑Level Computer Science exam.
