Syllabus objective: know and understand how to design file/data structures, input formats, output formats and validation routines.

1 Computer Systems – Hardware, Software and Operating Systems

1.1 Hardware – Main components

Component | Function | Typical examples
--- | --- | ---
CPU (Central Processing Unit) | Executes instructions, performs calculations | Intel Core i5, AMD Ryzen 5
RAM (Random‑Access Memory) | Temporary storage for data and programs while they are running | 4 GB, 8 GB DDR4
ROM (Read‑Only Memory) / BIOS | Stores firmware that starts the computer | BIOS, UEFI
Motherboard | Connects all components, provides buses and slots | ATX, Micro‑ATX
Power Supply Unit (PSU) | Converts mains AC to DC for the computer | 450 W, 650 W
Graphics Processing Unit (GPU) | Renders images, video and 3‑D graphics | NVIDIA GTX 1660, integrated Intel UHD
Storage devices | Long‑term storage of programs and data – see §2 | HDD, SSD, optical disc, cloud

1.2 Software – System vs. Application

  • System software – controls hardware, provides platform for applications.
    Examples: operating systems (Windows, macOS, Linux), device drivers, utility programs.
  • Application software – designed to perform specific tasks for the user.
    Examples: word processors, spreadsheets, databases, web browsers, learning‑management systems.

1.3 Operating Systems (OS)

  • Manage hardware resources (CPU scheduling, memory allocation).
  • Provide a user interface – graphical (GUI) or command‑line (CLI).
  • Common IGCSE examples: Windows 10, macOS, Chrome OS, Linux (Ubuntu).
  • Key OS functions: file management, security, multitasking, networking.

2 Data Storage – Media, Devices and Management

2.1 Primary vs. Secondary storage

  • Primary (volatile) storage – RAM; data is lost when power is removed.
  • Secondary (non‑volatile) storage – retains data without power.

2.2 Secondary storage media

Media type | Technology | Typical capacity | Advantages | Disadvantages
--- | --- | --- | --- | ---
Magnetic hard‑disk drive (HDD) | Spinning platters, magnetic heads | 500 GB – 4 TB | Large capacity, low cost per GB | Mechanical wear, slower than SSD
Solid‑state drive (SSD) | Flash memory chips | 120 GB – 2 TB | Fast access, no moving parts | Higher cost per GB
Optical disc (CD/DVD/BD) | Laser‑etched pits | 700 MB – 100 GB | Portable, inexpensive for distribution | Limited rewrite cycles, slower access
USB flash drive | Flash memory, USB interface | 4 GB – 256 GB | Very portable, plug‑and‑play | Easy to lose, limited lifespan
Cloud storage | Remote servers accessed via the Internet | Variable (pay‑as‑you‑go) | Accessible from any device, automatic backup | Requires Internet access, ongoing cost

2.3 File organisation

  • Sequential – records stored one after another; good for batch processing.
  • Random (direct) access – fixed‑length records allow immediate retrieval via a key.
  • Indexed – a separate index file maps keys to record locations, combining speed of random access with flexibility of variable‑length records.
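
Illustrative Python sketch – direct (random) access versus sequential access to a fixed‑length record file. The file name students.dat and the 61‑byte record length are assumptions for the example, not part of the syllabus:

RECORD_LENGTH = 61   # assumed fixed length of every record, in bytes

def read_record(record_number, path="students.dat"):
    # Direct (random) access: jump straight to the wanted record by its position.
    with open(path, "rb") as f:
        f.seek(record_number * RECORD_LENGTH)
        return f.read(RECORD_LENGTH)

def read_all(path="students.dat"):
    # Sequential access: read every record in stored order, one after another.
    with open(path, "rb") as f:
        while True:
            record = f.read(RECORD_LENGTH)
            if not record:
                break
            yield record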

3 Input and Output (I/O) Devices

3.1 Input devices – characteristics

Device | Data type | Advantages | Disadvantages
--- | --- | --- | ---
Keyboard | Alphanumeric, control keys | Fast, accurate for text | Requires typing skill
Mouse / Touchpad | Pointing, clicking | Intuitive for GUI navigation | Not suitable for text entry
Scanner | Images, printed text (OCR) | Digitises paper documents | Requires software to interpret
Digital camera / webcam | Images, video | Capture visual information | Large file sizes
Microphone | Audio | Voice input, recordings | Background noise issues
Barcode reader | Alphanumeric (encoded) | Fast data entry, reduces errors | Limited to bar‑coded items
Sensor (e.g., temperature, motion) | Numeric, binary | Automates data collection | Requires calibration

3.2 Output devices – characteristics

Device | Output type | Advantages | Disadvantages
--- | --- | --- | ---
Monitor (LCD/LED) | Visual – text, graphics, video | Immediate feedback, high resolution | Eye strain with prolonged use
Printer (laser/inkjet) | Hard copy – text, images | Permanent records, useful for reports | Consumables cost, slower than on‑screen output
Speaker / headphones | Audio – speech, music | Useful for alerts, multimedia | Can be disruptive in shared spaces
Plotter | Large‑scale graphics (architectural, engineering) | High‑quality line drawings | Specialised, expensive

4 Networks – Fundamentals and Types

4.1 Basic concepts

  • Network – collection of devices (nodes) connected to share resources.
  • Router – directs data between different networks (e.g., LAN to Internet).
  • Switch – connects devices within a LAN, forwards frames based on MAC addresses.
  • Network Interface Card (NIC) – hardware that provides a device with network connectivity.
  • IP address – unique numeric identifier for a device on a TCP/IP network.

4.2 Network topologies

Topology | Layout | Pros | Cons
--- | --- | --- | ---
Star | All nodes connect to a central switch/router | Easy to manage; a single cable failure affects only one node | Failure of the central device disables the whole network
Bus | All nodes share a single backbone cable | Simple, cheap cabling | Hard to isolate faults, limited bandwidth
Ring | Each node connects to two neighbours, forming a circle | Predictable performance | Failure of one node breaks the ring (unless a dual ring is used)
Mesh | Multiple redundant paths between nodes | High reliability | Expensive, complex cabling

4.3 Common network types

  • LAN (Local Area Network) – covers a single building or campus; typical speeds 100 Mbps – 10 Gbps.
  • WAN (Wide Area Network) – spans cities, countries; uses leased lines, satellite, or the Internet.
  • WLAN (Wireless LAN) – Wi‑Fi (IEEE 802.11); provides mobility but may be less secure.
  • MAN (Metropolitan Area Network) – connects multiple LANs within a city.

5 Effects of Information Technology

5.1 Positive effects

  • Improved communication (email, video‑conferencing).
  • Increased productivity – automation of repetitive tasks.
  • Access to information – internet research, e‑learning.
  • Support for decision‑making – data analysis, visualisation.
  • Economic growth – new jobs, e‑commerce.

5.2 Negative effects

  • Health issues – eye strain, repetitive‑strain injuries.
  • Environmental impact – e‑waste, energy consumption.
  • Social concerns – reduced face‑to‑face interaction, digital divide.
  • Security risks – data loss, cyber‑crime.
  • Dependence on technology – loss of manual skills.

6 ICT Applications – Real‑world Examples

Domain | Application example | Key ICT features used
--- | --- | ---
Business & finance | Online banking, accounting software | Databases, secure transactions, e‑payment gateways
Education | Virtual learning environment (VLE) | Web‑based content, forums, assessment tools
Healthcare | Electronic patient records | Data security, searchable databases, reporting
Retail | E‑commerce website | Shopping cart, online payment, inventory database
Manufacturing | Computer‑controlled CNC machines | Automation, sensors, real‑time monitoring
Transportation | GPS tracking and routing | Geographic information systems, real‑time data
Media & entertainment | Streaming services | Compression, bandwidth management, user profiles

7 The Systems Life Cycle (SLC)

7.1 Analysis – Recording the Current System & Specifying Requirements

  • Goal: Understand what the user needs and how the existing system works.
  • Techniques:

    • Observation – watch users perform tasks.
    • Interviews – open‑ended questions.
    • Questionnaires – structured data from many users.
    • Document review – existing forms, reports, manuals.

  • Output: Functional & non‑functional requirements, problem list, opportunity list.

Example – Questionnaire for a school exam‑record system

Question | Purpose
--- | ---
How do you currently record exam scores? | Identify existing method (paper, spreadsheet, etc.)
Which users need to view the results? | Define user groups (teachers, pupils, parents, admin)
What reports are required? | Specify output formats (class list, summary chart)
How often are results updated? | Determine frequency and validation needs

Specification table (derived from analysis)

Requirement | Type | Priority
--- | --- | ---
Store each pupil’s exam score | Functional | High
Generate a printable PDF report per class | Functional | Medium
System must run on existing school PCs | Non‑functional | High
Response time for a class report ≤ 2 seconds | Non‑functional | Low

7.2 Design – File/Data Structures, Input & Output Formats, Validation Routines

7.2.1 Designing File / Data Structures

  • Choose appropriate data types: numeric, alphanumeric, date, Boolean.
  • Determine field lengths – long enough to avoid truncation but not wasteful.
  • Decide on record layout:

    • Fixed‑length – easy random access.
    • Variable‑length – flexible, needs an index.

  • Define keys:

    • Primary key – uniquely identifies a record.
    • Foreign key – links to another table (relational design).

Example – Student exam record (fixed‑length)

Field Name | Data Type | Length (bytes) | Example | Validation Rule
--- | --- | --- | --- | ---
StudentID | Alphanumeric | 8 | AB123456 | Presence; unique; pattern ^[A-Z]{2}\d{6}$
FirstName | Alphanumeric | 20 | Emma | Presence; length ≤ 20
LastName | Alphanumeric | 20 | Brown | Presence; length ≤ 20
DateOfBirth | Date (YYYY‑MM‑DD) | 10 | 2005-04-12 | Presence; valid date; not in the future
Score | Numeric (integer) | 3 | 85 | Presence; numeric; range 0–100

Total record length \(L = \sum_{i=1}^{n} l_i\), where \(l_i\) is the length of field \(i\).
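
The same layout as a small Python sketch (the field names and lengths come from the table above; the dictionary itself is only illustrative):

FIELD_LENGTHS = {
    "StudentID": 8,
    "FirstName": 20,
    "LastName": 20,
    "DateOfBirth": 10,
    "Score": 3,
}

# Total record length L = sum of the individual field lengths l_i
RECORD_LENGTH = sum(FIELD_LENGTHS.values())   # 61 bytes for this layout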

7.2.2 Input Formats

  • Sources: keyboard, file upload, scanner, sensor, external database.
  • Structure options:

    • Delimited – CSV, TSV.
    • Fixed‑width – each field occupies a predefined number of characters.
    • Hierarchical – XML, JSON.

  • Encoding: ASCII, UTF‑8, Unicode – must match the source.
  • Field order: either identical to the file layout or mapped during import.

Chosen format for the case study: CSV file with columns in the order shown in the table above. CSV can be created in spreadsheet software and parsed easily.
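
Illustrative Python sketch – reading the chosen CSV format. It assumes a header row using the column names from the field table and a file called exam_results.csv (both assumptions, not part of the design):

import csv

with open("exam_results.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Each row is a dictionary keyed by column name, e.g. row["Score"]
        print(row["StudentID"], row["Score"])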

7.2.3 Output Formats

  • Screen display: HTML tables, dashboards, forms.
  • Printed reports: PDF (fixed layout), plain‑text, formatted Word document.
  • Data exchange: CSV for spreadsheets, XML/JSON for web services.
  • Graphical output: bar charts, pie charts, line graphs (e.g., average score per class).

Required outputs for the school system:

  1. On‑screen class list (HTML table).
  2. Printable PDF report for each class.
  3. JSON file for the school’s web portal (integration with other services).
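
Illustrative Python sketch – producing output 3, the JSON file for the web portal. The exact structure the portal expects is an assumption; only the field names come from the design:

import json

class_results = {
    "class": "Y5A",
    "results": [
        {"StudentID": "AB123456", "FirstName": "Emma", "LastName": "Brown", "Score": 85},
    ],
}

with open("Y5A_results.json", "w", encoding="utf-8") as f:
    json.dump(class_results, f, indent=2)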

7.2.4 Validation Routines

Validation enforces the rules defined in the design stage.

Field | Rule | Reason
--- | --- | ---
StudentID | Presence, pattern, uniqueness | Identifies the record uniquely
FirstName / LastName | Presence, length ≤ 20 | Prevents truncation and blank names
DateOfBirth | Presence, valid date, not in the future | Ensures realistic ages
Score | Presence, numeric, range 0–100 | Score must be a valid exam mark

Pseudo‑code (server‑side)

IF StudentID = "" THEN
    DISPLAY "Student ID is required"
ELSIF NOT MATCH(StudentID, "^[A-Z]{2}\d{6}$") THEN
    DISPLAY "Student ID must be 2 letters followed by 6 digits"
ELSIF EXISTS(StudentID) THEN
    DISPLAY "Student ID already exists"
END IF

IF LEN(FirstName) = 0 THEN
    DISPLAY "First name is required"
ELSIF LEN(FirstName) > 20 THEN
    DISPLAY "First name cannot exceed 20 characters"
END IF

IF NOT ISDATE(DateOfBirth) OR DateOfBirth > TODAY() THEN
    DISPLAY "Enter a valid date of birth"
END IF

IF NOT ISNUMERIC(Score) THEN
    DISPLAY "Score must be a number"
ELSIF Score < 0 OR Score > 100 THEN
    DISPLAY "Score must be between 0 and 100"
END IF
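
The same checks as a Python sketch (the function name validate_record and the existing_ids set are illustrative, not part of the specification):

import re
from datetime import date, datetime

def validate_record(student_id, first_name, date_of_birth, score, existing_ids):
    # Returns a list of error messages; an empty list means the record is valid.
    errors = []

    if student_id == "":
        errors.append("Student ID is required")
    elif not re.fullmatch(r"[A-Z]{2}\d{6}", student_id):
        errors.append("Student ID must be 2 letters followed by 6 digits")
    elif student_id in existing_ids:
        errors.append("Student ID already exists")

    if len(first_name) == 0:
        errors.append("First name is required")
    elif len(first_name) > 20:
        errors.append("First name cannot exceed 20 characters")

    try:
        dob = datetime.strptime(date_of_birth, "%Y-%m-%d").date()
    except (TypeError, ValueError):
        errors.append("Enter a valid date of birth")
    else:
        if dob > date.today():
            errors.append("Enter a valid date of birth")

    try:
        score_value = int(score)
    except (TypeError, ValueError):
        errors.append("Score must be a number")
    else:
        if not 0 <= score_value <= 100:
            errors.append("Score must be between 0 and 100")

    return errors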

Best practice

  • Validate as early as possible – client‑side HTML5 attributes (required, pattern, min/max) and server‑side checks.
  • Show clear, specific error messages.
  • Log every validation failure for audit and debugging.

7.3 Development & Testing – Test Design, Test Data, Expected vs. Actual Outcomes

7.3.1 Test Plan Template

Test ID | Test Description | Input Data | Expected Result | Actual Result | Pass/Fail | Remarks
--- | --- | --- | --- | --- | --- | ---
T01 | Valid record import | CSV row with correct data | Record stored, no error | | |
T02 | Score out of range | Score = 105 | Validation error “Score must be between 0 and 100” | | |
T03 | Missing StudentID | StudentID = "" | Validation error “Student ID is required” | | |
T04 | Boundary test – Score = 0 | Score = 0 | Record accepted | | |
T05 | Boundary test – Score = 100 | Score = 100 | Record accepted | | |

7.3.2 Types of Test Data

  • Normal data – satisfies all validation rules (e.g., Score = 85).
  • Boundary data – values at the limits of a range (Score = 0 or 100).
  • Abnormal / extreme data – missing fields, wrong data type, overly long text, future dates.
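
Illustrative Python test sketch – exercising the validate_record function sketched in 7.2.4 with each kind of test data (the specific IDs and names are invented):

existing = {"AB123456"}   # IDs already stored in the system

# Normal data – satisfies every rule (T01)
assert validate_record("CD654321", "Emma", "2005-04-12", "85", existing) == []

# Boundary data – scores at the limits of the range (T04, T05)
assert validate_record("EF111111", "Liam", "2005-04-12", "0", existing) == []
assert validate_record("GH222222", "Ava", "2005-04-12", "100", existing) == []

# Abnormal data – out-of-range score and missing Student ID (T02, T03)
assert "Score must be between 0 and 100" in validate_record("IJ333333", "Noah", "2005-04-12", "105", existing)
assert "Student ID is required" in validate_record("", "Mia", "2005-04-12", "70", existing)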

7.3.3 Recording Results

For each test case, fill in the “Actual Result” column, compare it with the “Expected Result”, and note any remedial action (e.g., fix the validation logic, adjust a field length).

7.4 Implementation – Methods of Introducing a New System

Method | How it works | Advantages | Disadvantages
--- | --- | --- | ---
Direct changeover | Old system is switched off and the new one starts immediately | Fast, low cost | High risk – if the new system fails there is no fallback
Parallel | Both old and new systems run together for a period | Low risk – errors can be caught by comparing results | More expensive; double data entry may be needed
Pilot | New system is introduced to a small group (e.g., one year‑group) | Allows real‑world testing and user feedback | May delay full roll‑out
Phased | System is introduced module by module (e.g., first input, then reporting) | Gradual learning curve; issues isolated to a module | Longer overall implementation time

Recommended approach for the school exam‑record system: a pilot in one year‑group, followed by a phased roll‑out to the remaining years. This limits disruption and provides feedback before full deployment.

7.5 Documentation – Technical & User Documentation

7.5.1 Technical Documentation (for developers/maintainers)

  • System overview and architecture diagram.
  • Hardware & software requirements.
  • Database schema (tables, fields, primary/foreign keys).
  • File specifications (input CSV layout, output formats).
  • Validation rules table (as shown in 7.2.4).
  • Installation & configuration steps.
  • API/end‑point definitions (if any).
  • Testing summary (test plan, results, known issues).

7.5.2 User Documentation (for end‑users)

  • Purpose of the system and key benefits.
  • Step‑by‑step guide for data entry (screen captures of the input form).
  • Instructions for uploading a CSV file (required column order, accepted file size).
  • How to generate and download reports (PDF, JSON).
  • Explanation of error messages and how to correct them.
  • Contact details for support.

Example user‑guide excerpt

Figure 1 – Upload exam scores (CSV) – “Choose File” button, “Upload” action, validation summary.

7.6 Evaluation – Judging the Implemented Solution

  • Functional criteria – Does the system store all required fields, generate the required reports, and accept the defined input format?
  • Efficiency – Average processing time for a class report (target ≤ 2 seconds); storage used versus original estimate.
  • Ease of use – User feedback on the interface, number of validation errors encountered during the pilot.
  • Reliability – Frequency of system crashes or data‑loss incidents.
  • Maintainability – Clarity of the technical documentation, ease of adding a new subject or exam type.

Evaluation matrix

Criterion | Specification | Actual Result | Judgement
--- | --- | --- | ---
All required fields captured | Yes | Yes | Pass
Report generation time | ≤ 2 s | 1.8 s (average) | Pass
User‑error rate (validation failures) | ≤ 5 % of entries | 3 % | Pass
System uptime during pilot | ≥ 99 % | 99.5 % | Pass
Documentation completeness | All sections present, clear language | Met | Pass

8 Safety & Security

8.1 Physical safety

  • Ergonomic workstation setup – chair height, monitor position.
  • Safe handling of equipment – avoid static discharge, unplug before cleaning.
  • Fire safety – keep fire extinguishers, know evacuation routes.

8.2 E‑safety (online safety)

  • Protect personal information – never share passwords, use strong passwords.
  • Recognise phishing – check sender address, avoid clicking unknown links.
  • Appropriate use – no illegal downloads, respect copyright.
  • Cyber‑bullying – report abusive messages, use school’s reporting system.

8.3 Data protection & privacy

  • Store personal data securely – encrypted files, access‑controlled folders.
  • Back‑up regularly – external drive or cloud service, follow 3‑2‑1 rule.
  • Comply with data‑protection legislation (e.g., UK GDPR and the Data Protection Act 2018 for UK schools).

8.4 Security measures

Measure | Purpose | Example
--- | --- | ---
Passwords & authentication | Prevent unauthorised access | Minimum 8 characters, mixed case, numbers, symbols
Firewalls | Block unwanted network traffic | Windows Defender Firewall, router ACLs
Antivirus / anti‑malware | Detect and remove malicious software | Windows Defender, Avast
Two‑factor authentication (2FA) | Add a second verification step | SMS code, authenticator app
Encryption | Protect data in transit and at rest | HTTPS for web traffic, AES‑256 for file storage

9 Audience, Purpose & Copyright

9.1 Audience analysis

  • Identify who will use the system (teachers, pupils, administrators).
  • Consider skill levels – provide appropriate help and training.
  • Determine accessibility needs (large fonts, screen‑reader compatibility).

9.2 Purpose of the system

State clearly what the system is intended to achieve – e.g., “to record, store and report exam results efficiently, reducing manual paperwork and errors”.

9.3 Copyright & software licensing

  • Only use software with a valid licence – avoid illegal copies.
  • Respect copyright when using images, music, text – obtain permission or use royalty‑free resources.
  • Give proper attribution for any third‑party code or libraries.

10 Communication – Email, Internet Use & Netiquette

10.1 Email etiquette

  • Use a clear subject line.
  • Start with a greeting, end with a signature.
  • Keep messages concise; use paragraphs and bullet points.
  • Check spelling and tone before sending.
  • Attach files only when necessary; compress large files.

10.2 Internet use & searching

  • Use reputable search engines; apply keywords effectively.
  • Evaluate sources – author credibility, date, purpose, bias.
  • Bookmark useful sites; avoid clicking unknown links.
  • Know the school’s acceptable‑use policy.

10.3 Netiquette

  • Be respectful – no offensive language or personal attacks.
  • Avoid “caps‑lock shouting”.
  • Do not share personal information of yourself or others.
  • Give credit when quoting or paraphrasing online content.

11 File Management

11.1 Folder structures & naming conventions

  • Use a hierarchical folder system – e.g., School/Year5/Exams/2025/.
  • Adopt consistent naming: YYYYMMDD_Subject_Class.ext (e.g., 20250412_Math_Y5A.csv).
  • Avoid spaces and special characters; use underscores or hyphens.
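
A short Python sketch of the naming convention (the subject and class values are placeholders):

from datetime import date

subject, class_name = "Math", "Y5A"
file_name = f"{date.today():%Y%m%d}_{subject}_{class_name}.csv"   # e.g. 20250412_Math_Y5A.csv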

11.2 File formats – generic vs. application