7 The Systems Life Cycle (SLC)

Objective

Know and understand the whole SLC – analysis, design, development & testing, implementation, documentation and evaluation – and be able to define, select and use normal, abnormal and extreme test data effectively.


7.1 Analysis

The aim of analysis is to produce a clear, written description of the current system and the requirements for the new system.

  • Techniques for gathering information

    • Observation – watch users perform tasks.
    • Interviews – ask open‑ended questions.
    • Questionnaires – collect data from many users quickly.
    • Document review – study existing forms, reports, manuals.

  • System specification template (syllabus requirement)

    System Specification
    • Purpose – what the system will achieve.
    • User requirements – functions users need (e.g., calculate payroll).
    • Information requirements – data that must be stored, processed and reported.
    • Inputs – all data entered by users or other systems.
    • Outputs – reports, screens, printed documents.
    • Constraints – hardware, software, legal or security limits.


7.2 Design

Design converts the specification into detailed artefacts that guide development and testing.

  • Data & file structures

    • Data dictionary – defines each field (name, type, length, allowed values, mandatory). See the example in the next box.
    • File‑layout diagram – shows the physical record format and the relationship between files. (A simple block diagram is sufficient for IGCSE.)

  • Input & output formats

    • Screen/form mock‑ups with labelled fields.
    • Report templates (column headings, totals, footers).

  • Validation routines (the basis for test data)

    • Range checks – value must be between a minimum and maximum.
    • Type checks – numeric, alphabetic, date, etc.
    • Format checks – e.g., “DD/MM/YYYY”.
    • Presence checks – mandatory fields cannot be blank.
    • Check‑digit or checksum rules for IDs.
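The validation routines above can be sketched as small Python functions. This is an illustrative sketch, not a prescribed implementation: the date check tests format only (not calendar validity), and the check-digit rule shown (digit sum divisible by 10) is a simplified stand-in for real weighted schemes.

```python
import re

def range_check(value, minimum, maximum):
    """Range check: value must lie between minimum and maximum inclusive."""
    return minimum <= value <= maximum

def type_check_numeric(text):
    """Type check: the text must represent a whole number."""
    return text.lstrip("-").isdigit()

def format_check_date(text):
    """Format check: text must match DD/MM/YYYY (digits only, not a calendar check)."""
    return re.fullmatch(r"\d{2}/\d{2}/\d{4}", text) is not None

def presence_check(text):
    """Presence check: a mandatory field cannot be blank."""
    return text.strip() != ""

def check_digit_sum10(digits):
    """Simplified check-digit rule (illustrative): digit sum must divide by 10."""
    return sum(int(d) for d in digits) % 10 == 0
```

Each function returns True for valid input, so a record passes validation only when every applicable check returns True.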

Example: Data Dictionary for a Payroll System

Field         | Type         | Length | Valid Range / Format | Mandatory?
Employee ID   | Alphanumeric | 6      | AA1234               | Yes
Salary        | Numeric      | 7      | 0 – 99 999           | Yes
Date of Birth | Date         | 10     | DD/MM/YYYY           | No

Simple File‑Layout Diagram (illustrative)

+-----------------------------------------------+
| Record 1 (Employee)                           |
+-----------------------------------------------+
| EmpID (6) | Name (30) | Salary (7) | DOB (10) |
+-----------------------------------------------+
| Record 2 (Employee) ...                       |
+-----------------------------------------------+
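A record laid out as in the diagram can be read back by slicing at the field widths from the data dictionary. A minimal sketch, assuming the widths above (EmpID 6, Name 30, Salary 7, DOB 10) and Python-style lowercase field names chosen for illustration:

```python
# Field names and widths taken from the file-layout diagram above.
FIELDS = [("emp_id", 6), ("name", 30), ("salary", 7), ("dob", 10)]

def parse_record(line):
    """Split one fixed-width record into a dict of trimmed field values."""
    record, pos = {}, 0
    for name, width in FIELDS:
        record[name] = line[pos:pos + width].strip()
        pos += width
    return record

# Build a sample 53-character record and parse it back.
raw = "AA1234" + "Ada Lovelace".ljust(30) + "0025000" + "10/12/1815"
print(parse_record(raw))
```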


7.3 Development & Testing

7.3.1 Development

Write the program, create the database or configure the off‑the‑shelf package according to the design artefacts. Follow good coding practice – naming conventions, comments and modular structure – to make later testing easier.

7.3.2 Testing strategy

  • Module (unit) testing – test individual routines or screens.
  • Integration testing – test interaction between modules (e.g., data entry → calculation).
  • System testing – test the complete solution against the original specification.

7.3.3 Test‑Plan Checklist (AO3 requirement)

Item               | What to include
Test objectives    | What each test is trying to prove (e.g., “salary field accepts values up to the maximum”).
Test cases         | Numbered description of the action to be performed.
Test data          | Normal, abnormal and extreme values for each input.
Expected results   | Exact output or error message that should appear.
Pass/Fail criteria | How the tester decides if the case succeeded.
Tester & date      | Who performed the test and when.
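The checklist above maps naturally onto a runnable structure: each test case records its objective, data and expected result, and a loop decides pass/fail. A minimal sketch, using a hypothetical `validate_salary` function for the payroll example (valid range 0 – 99 999):

```python
def validate_salary(value):
    """Hypothetical validator: whole-number salary within 0 – 99 999."""
    return isinstance(value, int) and 0 <= value <= 99_999

# Each entry mirrors a row of the test-plan checklist.
test_plan = [
    {"case": 1, "objective": "normal salary accepted",   "data": 2_500,   "expected": True},
    {"case": 2, "objective": "text salary rejected",     "data": "20k",   "expected": False},
    {"case": 3, "objective": "maximum salary accepted",  "data": 99_999,  "expected": True},
    {"case": 4, "objective": "just above max rejected",  "data": 100_000, "expected": False},
]

results = {t["case"]: ("PASS" if validate_salary(t["data"]) == t["expected"] else "FAIL")
           for t in test_plan}
print(results)  # every case should PASS
```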

7.3.4 Test Data – Normal, Abnormal & Extreme

Normal (typical) data
  • Purpose: confirm correct functionality under everyday conditions.
  • Typical values: within the defined valid range.
  • Example (payroll system): Salary = £2 500.
  • Expected system response: successful processing and correct output.

Abnormal (invalid) data
  • Purpose: verify error handling and validation.
  • Typical values: outside the range, wrong type, or missing mandatory field.
  • Examples (payroll system): Salary = “twenty thousand” (text) or Salary = –£100.
  • Expected system response: clear error message; transaction rejected.

Extreme (boundary) data
  • Purpose: check limits and uncover off‑by‑one or overflow errors.
  • Typical values: exactly at the minimum or maximum of the valid range (and just beyond).
  • Examples (payroll system): Salary = £0, £99 999 (max), £100 000 (just above max).
  • Expected system response: accept if within limits; otherwise error message.

Why each type is needed (AO3 analysis)

  • Normal data – demonstrate that the system does what it should for the majority of users.
  • Abnormal data – test the robustness of validation routines, which is essential for security and data integrity.
  • Extreme data – apply the principle of boundary‑value analysis, which uncovers the highest proportion of input‑related bugs.
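Boundary-value analysis can be mechanised: given a valid range, generate the values at each limit and one step either side. A short illustrative sketch:

```python
def boundary_values(minimum, maximum):
    """Boundary-value analysis: test at each limit and one step either side."""
    return [minimum - 1, minimum, minimum + 1,
            maximum - 1, maximum, maximum + 1]

# For the payroll salary range 0 – 99 999:
print(boundary_values(0, 99_999))
# → [-1, 0, 1, 99998, 99999, 100000]
```

The values at and inside the limits should be accepted; the values just beyond should trigger the validation error message.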

Security & confidentiality of test data

  • Mask or anonymise personal data (use fictitious names, dummy account numbers).
  • Store test data in a protected folder with read‑only rights for testers.
  • Delete or overwrite test files after the testing phase to avoid accidental release of sensitive information.
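Masking can be done programmatically. The sketch below is one illustrative scheme (not a prescribed technique): it replaces the name with a fictitious one and derives a stable dummy account number from a hash, so that links between test files still line up after anonymisation.

```python
import hashlib

def anonymise(record):
    """Return a copy of the record with personal identifiers masked.

    Assumes 'name' and 'account' keys; the masking scheme (fixed dummy
    name, hash-derived account code) is illustrative only.
    """
    masked = dict(record)
    masked["name"] = "Test User"
    # Same input always yields the same dummy code, preserving cross-file links.
    digest = hashlib.sha256(record["account"].encode()).hexdigest()
    masked["account"] = "ACC" + digest[:6].upper()
    return masked
```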


7.4 Implementation

Implementation moves the tested solution into the live environment. Four common strategies are used, each with advantages and disadvantages.

  • Direct changeover – the old system is switched off and the new system switched on at the same moment. Simple and low cost, but there is no fallback if the new system fails; best suited when the old system is obsolete.
  • Parallel running – both old and new systems run simultaneously for a set period and results are compared. Gives high confidence in correctness – ideal for critical systems (e.g., banking).
  • Pilot (trial) implementation – the new system is introduced to a small, representative group first. Useful when the organisation is large or the change is significant.
  • Phased (step‑by‑step) implementation – the system is introduced in stages (module by module or department by department). Reduces risk and allows learning – good for complex, integrated solutions.

Simple decision‑matrix (example)

Criteria               | Direct | Parallel | Pilot  | Phased
Cost                   | Low    | High     | Medium | Medium
Risk of data loss      | High   | Low      | Medium | Low
User training needed   | High   | Medium   | Low    | Medium
Time to full operation | Fast   | Slow     | Medium | Slow
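A decision matrix like this can be turned into a weighted score. The scores (3 = favourable, 1 = unfavourable on each criterion) and the weights below are assumptions chosen for illustration; a real project would agree them with stakeholders.

```python
# Higher score = better on that criterion (assumed values, for illustration).
SCORES = {
    "Direct":   {"cost": 3, "risk": 1, "training": 1, "speed": 3},
    "Parallel": {"cost": 1, "risk": 3, "training": 2, "speed": 1},
    "Pilot":    {"cost": 2, "risk": 2, "training": 3, "speed": 2},
    "Phased":   {"cost": 2, "risk": 3, "training": 2, "speed": 1},
}
WEIGHTS = {"cost": 1, "risk": 3, "training": 1, "speed": 1}  # risk weighted highest

def total(method):
    """Weighted total for one implementation method."""
    return sum(SCORES[method][c] * w for c, w in WEIGHTS.items())

best = max(SCORES, key=total)
print(best, total(best))
```

With risk weighted highest, the low-risk strategies come out ahead; changing the weights changes the recommendation, which is exactly the judgement the matrix is meant to make explicit.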


7.5 Documentation

Good documentation supports users, maintainers and future developers. Two families are required by the syllabus.

Technical documentation (for developers/maintainers)

  • System overview and architecture diagram.
  • Data dictionary and file‑layout diagram.
  • Source‑code listings (or module descriptions) with comments.
  • Installation and configuration instructions.
  • Test‑plan and test‑case results (useful for regression testing).

User documentation (for end‑users)

  • Purpose of the system and any prerequisites.
  • Step‑by‑step procedures (e.g., “How to enter a new employee”).
  • Screen mock‑ups with labelled fields.
  • Examples of correct and incorrect input.
  • Troubleshooting guide and contact details for support.

Audience awareness

Tailor language, level of detail and visual aids to the intended audience – technical staff need precise terminology, whereas non‑technical staff benefit from screenshots and plain‑English instructions.


7.6 Evaluation

After implementation the solution is reviewed against the original requirements and the six evaluation criteria set out in the syllabus.

Evaluation criterion          | Guiding questions
Functional requirements       | Does every required function work as specified?
Efficiency                    | Is the system fast enough? Does it use resources (CPU, storage) reasonably?
Usability                     | Is the interface clear for the intended users? Are error messages helpful?
Security & data protection    | Are passwords, encryption and access controls adequate? Were test data handled safely?
Reliability & maintainability | Does the system recover gracefully from failures? Is the code/documentation easy to modify?
Legal & ethical compliance    | Does the system meet data‑protection legislation and e‑safety guidelines?

Document the findings, suggest improvements and, if necessary, plan a new iteration of the SLC (often called a “maintenance cycle”).


Glossary (AO1 – recall)

  • Normal test data – values that lie comfortably within the defined valid range; used to confirm that the system works as intended under typical conditions.
  • Abnormal test data – values that violate one or more validation rules (wrong type, out‑of‑range, missing mandatory field); used to check error handling.
  • Extreme test data – values that sit on or just beyond the boundaries of the valid range; used for boundary‑value analysis.
  • Boundary‑value analysis – a testing technique that focuses on the limits of input domains because many defects occur at the edges.
  • Data dictionary – a table that defines each data item (field name, type, length, valid values, mandatory status).
  • File‑layout diagram – a visual representation of the physical structure of a file or database table.


Key Points to Remember

  • The SLC consists of Analysis → Design → Development & Testing → Implementation → Documentation → Evaluation.
  • Good test data are relevant, representative, traceable, repeatable and clearly documented.
  • Use normal, abnormal and extreme data to verify functionality, error handling and boundary conditions.
  • Prepare a concise test‑plan checklist before execution – it links test objectives, cases, data, expected results and pass/fail criteria.
  • Choose an implementation method that balances cost, risk and training needs.
  • Produce both technical and user documentation, keeping the audience in mind.
  • Evaluate the final system against functional, efficiency, usability, security & data protection, reliability & maintainability, and legal & ethical criteria; record improvements for future cycles.