Know and understand test designs including the testing of data structures, file structures, input formats, output formats and validation routines

Cambridge IGCSE ICT – Test Designs and Related Syllabus Topics

1. Syllabus Coverage Map

| Syllabus Block | Topic | Coverage in These Notes |
| --- | --- | --- |
| 1 – Computer hardware | CPU, RAM, ROM, storage media, input & output devices | |
| 2 – Input & output | Keyboards, mice, scanners, printers, displays | |
| 3 – Storage | Primary vs secondary, magnetic, optical, flash, cloud | |
| 4 – Networks | LAN, WAN, Internet, protocols, wireless, security basics | |
| 5 – Effects of ICT | Social, economic, environmental, health & safety impacts | |
| 6 – ICT applications | Communication, modelling, banking, medicine, retail, expert systems, etc. | |
| 7 – Systems life‑cycle (SLC) | Analysis, design, development, implementation, documentation, evaluation | ✔ (expanded) |
| 8 – Safety & security | Physical safety, e‑safety, data protection, threats, encryption, authentication | ✔ (detailed) |
| 9 – Audience & copyright | Target users, usability, copyright law, licences, ethical use of test data | ✔ (expanded) |
| 10 – Communication | Emails, instant messaging, video‑conferencing, netiquette | ✔ (quick checklist) |
| 11 – File management | Folders, naming conventions, backup, compression | ✔ (quick checklist) |
| 12 – Images | Raster vs vector, resolution, colour depth, editing basics | ✔ (quick checklist) |
| 13 – Layout & styles | Page setup, headings, fonts, paragraph styles, templates | ✔ (quick checklist) |
| 14 – Proofreading & publishing | Spelling/grammar tools, version control, printing, PDF creation | ✔ (quick checklist) |
| 15 – Graphs & charts | Bar, line, pie, scatter, appropriate use, axis labelling | ✔ (quick checklist) |
| 16 – Document production | Word processors, mail merge, mailing labels, collaborative editing | ✔ (quick checklist) |
| 17 – Databases | Tables, fields, records, primary/foreign keys, queries, forms, reports | ✔ (quick checklist) |
| 18 – Presentations | Slides, animation, multimedia, presenter notes, delivery tips | ✔ (quick checklist) |
| 19 – Spreadsheets | Formulas, functions, charts, data validation, pivot tables | ✔ (quick checklist) |
| 20 – Website authoring | HTML, CSS, multimedia embedding, navigation, accessibility | ✔ (quick checklist) |
| 21 – Assessment objectives & weighting | AO1 (knowledge), AO2 (application), AO3 (evaluation) – 32 % / 60 % / 8 % | |

2. The Systems Life‑Cycle (SLC) – Full Overview

  1. Analysis – Gather user requirements (questionnaires, interviews), produce a requirements specification, identify constraints (budget, time, legal).
  2. Design – Create data models (ER diagrams), file structures, UI mock‑ups, and algorithms (flowcharts or pseudocode).
  3. Development & Testing – Write code, build databases, configure hardware, and carry out systematic testing (see Sections 3 and 4).
  4. Implementation – Install hardware/software, migrate data, train users, and roll‑out the system.
  5. Documentation – Produce user manuals, technical guides, maintenance procedures, and online help.
  6. Evaluation – Compare actual performance with the original specifications, conduct cost‑benefit analysis, collect user feedback, and review safety, security, and ethical issues.

Testing (Section 3) links directly to AO2 (apply knowledge) and AO3 (evaluate). The heavy weighting of these objectives explains why the exam places strong emphasis on test design.

3. Development & Testing – Core Test‑Design Concepts

3.1 Test Strategy

  • Define the why – what you intend to prove (e.g., “All validation routines reject invalid input”).
  • Choose testing levels (see 3.6) and techniques:

    • Black‑box (functionality, user‑view)
    • White‑box (code structure, path coverage)

  • Decide on automation vs. manual execution.
  • Set pass/fail criteria and exit points for each level.

3.2 Test Plan

A written document that guides the whole testing effort.

| Component | What to Include |
| --- | --- |
| Objectives | Specific aims (e.g., “Validate numeric fields for range and type”). |
| Scope | Modules, features, interfaces covered. |
| Resources | People, hardware, software, test‑data sets. |
| Schedule | Timeline for unit, integration, system and acceptance testing. |
| Responsibilities | Who designs, executes, records, and reviews each test. |
| Risk & Contingency | Potential problems (missing data, environment failures) and mitigation. |

3.3 Test Data

  • Valid data – conforms to all format, type and range rules.
  • Invalid data – breaks one or more rules (e.g., letters in a numeric field).
  • Boundary values – just inside and just outside the allowed limits.
  • Equivalence partitions – one representative value from each class of valid/invalid data (see the sketch after this list).
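A minimal Python sketch of how these kinds of test data could be laid out for a hypothetical age field (the 0–120 range and the example values are assumptions for illustration):

```python
# Boundary values and equivalence partitions for a hypothetical
# age field that must be an integer in the range 0-120.

LOWER, UPPER = 0, 120

# Boundary-value analysis: just inside and just outside each limit.
boundary_cases = [LOWER - 1, LOWER, UPPER, UPPER + 1]   # -1, 0, 120, 121

# Equivalence partitions: one representative value per class.
partitions = {
    "valid": 30,            # any value inside 0-120
    "invalid_low": -5,      # below the minimum
    "invalid_high": 200,    # above the maximum
    "invalid_type": "ten",  # wrong data type (letters in a numeric field)
}

for case in boundary_cases:
    print("Boundary test input:", case)
for name, value in partitions.items():
    print(f"Partition '{name}': test input {value!r}")
```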

3.4 Test Case Structure

Record each test case in a consistent table.

| Test ID | Description | Pre‑conditions | Input | Expected Result | Actual Result | Status | Remedial Action |
| --- | --- | --- | --- | --- | --- | --- | --- |
| T01 | Age field accepts only numbers 0‑120 | Form opened | Age = -5 | Error “Age must be between 0 and 120” | – | Pass/Fail | Correct validation routine |
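The same structure can also be kept in code. A minimal sketch using a Python dataclass whose fields simply mirror the table columns (the class and its values are illustrative, not a prescribed format):

```python
from dataclasses import dataclass

# One test-case record; field names mirror the table columns above.
@dataclass
class TestCase:
    test_id: str
    description: str
    preconditions: str
    test_input: str
    expected_result: str
    actual_result: str = ""    # filled in after execution
    status: str = "Not run"    # becomes "Pass" or "Fail"
    remedial_action: str = ""

t01 = TestCase(
    test_id="T01",
    description="Age field accepts only numbers 0-120",
    preconditions="Form opened",
    test_input="Age = -5",
    expected_result='Error "Age must be between 0 and 120"',
)
print(t01)
```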

3.5 Expected vs. Actual Results & Remedial Action

  1. Write the expected outcome before running the test.
  2. Execute the test and record the actual outcome.
  3. If they differ, classify the defect (logic error, data‑type error, UI error) and suggest a remedial action.
  4. Apply the fix, retest, and update the status to “Pass” (a minimal sketch of steps 2–4 follows this list).
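A minimal Python sketch of the expected/actual comparison, assuming a hypothetical defective run:

```python
# Compare the expected outcome (written before the test) with the
# actual outcome (recorded after the run); values are illustrative.
expected = 'Error "Age must be between 0 and 120"'
actual = "Value accepted"   # hypothetical defective run

status = "Pass" if actual == expected else "Fail"
print("Status:", status)
if status == "Fail":
    # Classify the defect and suggest a remedial action.
    print("Defect: logic error - lower-bound check missing")
    print("Remedial action: add the range check, then retest")
```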

3.6 Testing Levels (Cambridge expects four)

  • Unit (module) testing – test individual procedures/functions in isolation (sketched in code after this list).
  • Integration testing – test interaction between two or more modules (e.g., form → database).
  • System testing – test the complete, integrated system against the original specifications.
  • User‑acceptance testing (UAT) – end‑users verify that the system meets their needs and is ready for deployment.
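A minimal unit-test sketch in Python, assuming a hypothetical validate_age routine as the module under test:

```python
import unittest

def validate_age(age):
    """Hypothetical routine under test: integers 0-120 only."""
    return isinstance(age, int) and 0 <= age <= 120

class TestValidateAge(unittest.TestCase):
    def test_boundaries(self):
        self.assertFalse(validate_age(-1))    # just outside lower limit
        self.assertTrue(validate_age(0))      # lower boundary
        self.assertTrue(validate_age(120))    # upper boundary
        self.assertFalse(validate_age(121))   # just outside upper limit

    def test_wrong_type(self):
        self.assertFalse(validate_age("ten"))  # letters in a numeric field

if __name__ == "__main__":
    unittest.main()
```

Integration, system and acceptance testing follow the same expected-vs-actual pattern, but at progressively wider scope.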

4. Testing Specific Aspects

4.1 Data Structures

  • Check data‑type correctness (integer, string, date, Boolean).
  • Apply boundary‑value analysis (minimum, maximum, just‑outside).
  • Validate referential integrity (primary‑key/foreign‑key relationships) – see the sketch after the example table below.
  • Test handling of NULL or missing values.

Example – Age field (integer 0‑120)

| Input | Expected Result |
| --- | --- |
| -1 | Error – age cannot be negative |
| 0 | Accepted |
| 120 | Accepted |
| 121 | Error – exceeds upper limit |
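A minimal sketch of the referential-integrity check, using two hypothetical tables held as plain Python structures:

```python
# Every foreign key must match an existing primary key.
customers = {1: "Aisha", 2: "Ben", 3: "Carla"}   # primary keys 1-3

orders = [
    {"order_id": 101, "customer_id": 2},   # valid reference
    {"order_id": 102, "customer_id": 5},   # orphan foreign key
]

for order in orders:
    if order["customer_id"] not in customers:
        print(f"Integrity error: order {order['order_id']} "
              f"refers to missing customer {order['customer_id']}")
```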

4.2 File Structures

  • Sequential files – test correct ordering, end‑of‑file handling, and record addition.
  • Indexed files – verify index creation, update, and search performance.
  • Random‑access files – test direct retrieval using a key and correct handling of non‑existent keys.

Key test points:

  • Record layout (field order, delimiters, fixed/variable length).
  • Add/modify/delete operations without corruption.
  • Search speed for indexed files (basic performance check).
  • Integrity after a batch of operations (no lost or duplicated records) – see the sequential‑file sketch below.
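A minimal Python sketch of a sequential-file test (the file name and comma-separated record layout are illustrative assumptions):

```python
# Write records to a sequential text file, then read them back to
# confirm ordering and clean end-of-file handling.
records = ["001,Aisha,34", "002,Ben,29", "003,Carla,41"]

with open("test_students.txt", "w") as f:
    for record in records:
        f.write(record + "\n")

with open("test_students.txt") as f:    # reading stops cleanly at EOF
    read_back = [line.rstrip("\n") for line in f]

# The records read back should match those written: same order,
# nothing lost, duplicated or corrupted.
print("Pass" if read_back == records else "Fail")
```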

4.3 Input Formats

  • All required fields present.
  • Format constraints (e.g., date as DD/MM/YYYY, email pattern user@domain.com).
  • Special‑character handling (quotes, commas, line‑breaks).
  • Basic injection tests (e.g., ' OR 1=1 --) to ensure validation blocks malicious input (see the pattern‑check sketch below).
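A minimal Python sketch of format checks using regular expressions; the patterns are deliberately simple assumptions (they check layout only, not real calendar dates):

```python
import re

date_pattern = re.compile(r"^\d{2}/\d{2}/\d{4}$")          # DD/MM/YYYY layout
email_pattern = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # user@domain.com shape

print(bool(date_pattern.match("25/12/2024")))        # True  - correct layout
print(bool(date_pattern.match("2024-12-25")))        # False - wrong order/separator
print(bool(email_pattern.match("user@domain.com")))  # True
print(bool(email_pattern.match("' OR 1=1 --")))      # False - injection string rejected
```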

4.4 Output Formats

  • Correct number of decimal places for monetary values (e.g., £12.34) – formatted in the sketch after this list.
  • Consistent headings, column alignment, and grid lines in tabular reports.
  • Uniform units and symbols (kg, cm, $).
  • Locale‑specific formatting (date, time, currency) matches user settings.
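A minimal Python sketch of two of these checks: two-decimal-place currency and aligned report columns (the item names and prices are invented):

```python
# Monetary values to two decimal places, columns right-aligned.
items = [("Keyboard", 12.5), ("Mouse", 7.999), ("Monitor", 129)]

print(f"{'Item':<10}{'Price':>10}")
for name, price in items:
    # format(..., '.2f') pads and rounds: 12.5 -> 12.50, 7.999 -> 8.00
    print(f"{name:<10}{'£' + format(price, '.2f'):>10}")
```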

4.5 Validation Routines

Code that checks input before processing.

  1. Data‑type verification (numeric, alphabetic, date).
  2. Range checking (minimum/maximum values).
  3. Length checking (minimum/maximum characters).
  4. Pattern matching with regular expressions.
  5. Cross‑field validation (e.g., start date must precede end date).

Pseudocode example – Age validation

```
IF NOT isNumeric(age) THEN
    DISPLAY "Age must be a number."
ELSE IF age < 0 OR age > 120 THEN
    DISPLAY "Age must be between 0 and 120."
ELSE
    CONTINUE processing
END IF
```
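The same routine as a runnable Python sketch (the int() conversion stands in for the pseudocode's isNumeric check):

```python
def validate_age(raw_input):
    try:
        age = int(raw_input)        # stands in for isNumeric()
    except ValueError:
        return "Age must be a number."
    if age < 0 or age > 120:
        return "Age must be between 0 and 120."
    return "OK - continue processing"

# Valid, boundary and invalid inputs from the earlier test data.
for test_input in ["-5", "0", "120", "121", "abc"]:
    print(test_input, "->", validate_age(test_input))
```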

5. Safety & Security (Syllabus Block 8)

  • Physical safety – cable management, proper ventilation, earthing, safe workstation ergonomics.
  • E‑safety – phishing, smishing, vishing awareness; safe browsing; strong password policies.
  • Data protection – encryption of stored data, secure backup, GDPR‑style principles (consent, limited access).
  • Threats – viruses, malware, ransomware, unauthorised access; test lock‑out after a set number of failed logins (sketched after this list).
  • Encryption & authentication – verify HTTPS, SSL/TLS certificates, password masking, role‑based access control.
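A minimal Python sketch of a lock-out test; the three-attempt limit and the credentials are illustrative assumptions:

```python
MAX_ATTEMPTS = 3
CORRECT_PASSWORD = "s3cret!"   # hypothetical stored credential

def attempt_logins(passwords):
    failures = 0
    for pw in passwords:
        if pw == CORRECT_PASSWORD:
            return "Logged in"
        failures += 1
        if failures >= MAX_ATTEMPTS:
            return "Account locked"   # expected after the third failure
    return "Still able to retry"

# Expected result: three wrong passwords lock the account.
print(attempt_logins(["guess1", "guess2", "guess3"]))   # Account locked
```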

6. Audience & Copyright (Syllabus Block 9)

  • Audience analysis – create personas, consider age, ICT skill level, language, and accessibility needs.
  • Usability testing – observe real users performing key tasks; note errors and satisfaction scores.
  • Copyright legislation – differentiate between copyright, licence, fair‑use, and public‑domain material.
  • Ethical test data – never use real personal data in a test environment; generate synthetic data or anonymise it (see the sketch after this list).
  • Licensing – recognise free‑software licences (GPL, MIT) vs commercial licences; understand that using pirated software is illegal.
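A minimal Python sketch of synthetic test-data generation; every name and range here is invented, so no real person's data is involved:

```python
import random

first_names = ["Aisha", "Ben", "Carla", "Dev", "Elena"]
surnames = ["Khan", "Smith", "Rossi", "Patel", "Jones"]

def synthetic_record(record_id):
    return {
        "id": record_id,
        "name": f"{random.choice(first_names)} {random.choice(surnames)}",
        "age": random.randint(0, 120),   # stays inside the valid range
    }

for i in range(3):
    print(synthetic_record(i + 1))
```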

7. Quick Checklist of the Remaining ICT Syllabus (Blocks 10‑20)

| Block | Key Practical Skills to Practise |
| --- | --- |
| 10 – Communication | Email etiquette, attaching files, using instant messaging, video‑conference tools. |
| 11 – File Management | Folder hierarchy, naming conventions, compression (ZIP), backup (external drive/cloud). |
| 12 – Images | Resize, crop, convert raster ↔ vector, colour depth, basic editing (brightness/contrast). |
| 13 – Layout & Styles | Page orientation, margins, heading styles, bullet/numbered lists, templates. |
| 14 – Proofreading & Publishing | Spell‑check, grammar tools, version control, PDF export, printing settings. |
| 15 – Graphs & Charts | Select appropriate chart type, label axes, add legends, format data series. |
| 16 – Document Production | Mail merge, mailing labels, collaborative editing (cloud), track changes. |
| 17 – Databases | Create tables, define fields, set primary/foreign keys, write simple SELECT queries, design forms/reports. |
| 18 – Presentations | Slide layout, transition effects, embed video/audio, presenter notes, rehearse delivery. |
| 19 – Spreadsheets | Formulas (SUM, AVERAGE), functions (VLOOKUP, IF), chart creation, data validation, pivot tables. |
| 20 – Website Authoring | Basic HTML tags, CSS styling, insert images/video, create navigation menus, test accessibility. |

8. Assessment Objectives (AO) & Weighting

  • AO1 – Knowledge & Understanding – 32 % of total marks. Recall terminology, describe concepts.
  • AO2 – Application – 60 % of total marks. Design, develop, test, and evaluate ICT solutions.
  • AO3 – Evaluation – 8 % of total marks. Assess the effectiveness, safety, security, and suitability of a solution.

Because AO2 and AO3 together account for 68 % of the exam, a solid grasp of test design, validation, safety, security and audience considerations is essential for high marks.

9. Suggested Diagram – Test Process Flow

Test Process: Test Planning → Test Case Design → Test Data Preparation → Test Execution → Record Expected & Actual Results → Analyse Defects → Remedial Action → Retest → Sign‑off.

10. Key Points to Remember for the Exam (AO2 & AO3)

  • Design both valid and invalid test cases; use boundary‑value analysis to minimise the number of cases while maximising coverage.
  • State the expected result *before* execution – examiners look for this in AO3.
  • Include a test plan with objectives, scope, resources, schedule and risk assessment.
  • Identify the appropriate testing level for each case and justify your choice.
  • When a test fails, describe a concise remedial action and indicate how you would re‑test.
  • Link every test to a safety, security, audience or copyright consideration where relevant – this demonstrates holistic evaluation (AO3).
  • Use the Syllabus Map (Section 1) to check that you have covered every required block before the exam.