Know and understand how to design file/data structures, input formats, output formats and validation routines
1 Computer Systems – Hardware, Software and Operating Systems
1.1 Hardware – Main components
| Component | Function | Typical examples |
|---|---|---|
| CPU (Central Processing Unit) | Executes instructions, performs calculations | Intel Core i5, AMD Ryzen 5 |
| RAM (Random‑Access Memory) | Temporary storage for data and programs while they are running | 4 GB, 8 GB DDR4 |
| ROM (Read‑Only Memory) / BIOS | Stores firmware that starts the computer | BIOS, UEFI |
| Motherboard | Connects all components, provides buses and slots | ATX, Micro‑ATX |
| Power Supply Unit (PSU) | Converts mains AC to DC for the computer | 450 W, 650 W |
| Graphics Processing Unit (GPU) | Renders images, video and 3‑D graphics | NVIDIA GTX 1660, integrated Intel UHD |
| Storage devices (see §2) | | |
1.2 Software – System vs. Application
System software – controls hardware, provides platform for applications. Examples: operating systems (Windows, macOS, Linux), device drivers, utility programmes.
Application software – designed to perform specific tasks for the user. Examples: word processors, spreadsheets, databases, web browsers, learning‑management systems.
Fixed‑width – each field occupies a predefined number of characters.
Hierarchical – XML, JSON.
Encoding: ASCII, UTF‑8, Unicode – must match the source.
Field order: either identical to the file layout or mapped during import.
Chosen format for the case study: CSV file with columns in the order shown in the table above. CSV can be created in spreadsheet software and parsed easily.
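The chosen CSV layout can be read with Python's standard `csv` module. A minimal sketch, assuming the column order shown in the table (`StudentID, FirstName, LastName, DateOfBirth, Score`) and a header row; the sample data is hypothetical:

```python
import csv
import io

# Hypothetical CSV content in the assumed column order, with a header row.
sample = io.StringIO(
    "StudentID,FirstName,LastName,DateOfBirth,Score\n"
    "AB123456,Amina,Khan,2008-04-12,85\n"
)

# DictReader maps each row to a dict keyed by the header names,
# so the field order is handled by the header rather than by position.
reader = csv.DictReader(sample)
records = list(reader)
print(records[0]["StudentID"])
```

In a real import the `io.StringIO` stand-in would be replaced by `open("results.csv", encoding="utf-8")`, matching the encoding of the source file as noted above.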
7.2.3 Output Formats
Screen display: HTML tables, dashboards, forms.
Printed reports: PDF (fixed layout), plain‑text, formatted Word document.
Data exchange: CSV for spreadsheets, XML/JSON for web services.
Graphical output: bar charts, pie charts, line graphs (e.g., average score per class).
Required outputs for the school system:
On‑screen class list (HTML table).
Printable PDF report for each class.
JSON file for the school’s web portal (integration with other services).
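The JSON feed for the web portal can be produced with Python's standard `json` module. A minimal sketch with hypothetical field names and sample data following the record layout used earlier:

```python
import json

# Hypothetical class list; field names follow the record layout used earlier.
class_list = [
    {"StudentID": "AB123456", "FirstName": "Amina", "LastName": "Khan", "Score": 85},
    {"StudentID": "CD654321", "FirstName": "Ben", "LastName": "Ortiz", "Score": 72},
]

# ensure_ascii=False keeps accented names readable in UTF-8 output;
# indent=2 makes the file easy to inspect during testing.
portal_feed = json.dumps(
    {"class": "10A", "results": class_list},
    indent=2,
    ensure_ascii=False,
)
print(portal_feed)
```

The resulting string would be written to a `.json` file for the portal to consume.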
7.2.4 Validation Routines
Validation enforces the rules defined in the design stage.
| Field | Rule | Reason |
|---|---|---|
| StudentID | Presence, pattern, uniqueness | Identifies the record uniquely. |
| FirstName / LastName | Presence, length ≤ 20 | Prevents truncation and blank names. |
| DateOfBirth | Presence, valid date, not future | Ensures realistic ages. |
| Score | Presence, numeric, range 0‑100 | Score must be a valid exam mark. |
Pseudo‑code (server‑side)
IF StudentID = "" THEN
    DISPLAY "Student ID is required"
ELSIF NOT MATCH(StudentID, "^[A-Z]{2}\d{6}$") THEN
    DISPLAY "Student ID must be 2 letters followed by 6 digits"
ELSIF EXISTS(StudentID) THEN
    DISPLAY "Student ID already exists"
END IF

IF LEN(FirstName) = 0 THEN
    DISPLAY "First name is required"
ELSIF LEN(FirstName) > 20 THEN
    DISPLAY "First name cannot exceed 20 characters"
END IF

IF NOT ISDATE(DateOfBirth) OR DateOfBirth > TODAY() THEN
    DISPLAY "Enter a valid date of birth"
END IF

IF NOT ISNUMERIC(Score) THEN
    DISPLAY "Score must be a number"
ELSIF Score < 0 OR Score > 100 THEN
    DISPLAY "Score must be between 0 and 100"
END IF
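The same server-side checks can be sketched as a runnable Python routine. The function name, the ISO date format and the `EXISTING_IDS` stand-in for the database uniqueness check are assumptions for illustration:

```python
import re
from datetime import date

# Stand-in for a database uniqueness check (assumption for illustration).
EXISTING_IDS = {"AB123456"}

def validate_record(student_id, first_name, dob, score):
    """Return a list of error messages; an empty list means the record is valid."""
    errors = []

    # StudentID: presence, pattern (2 letters + 6 digits), uniqueness
    if student_id == "":
        errors.append("Student ID is required")
    elif not re.fullmatch(r"[A-Z]{2}\d{6}", student_id):
        errors.append("Student ID must be 2 letters followed by 6 digits")
    elif student_id in EXISTING_IDS:
        errors.append("Student ID already exists")

    # FirstName: presence, length <= 20
    if len(first_name) == 0:
        errors.append("First name is required")
    elif len(first_name) > 20:
        errors.append("First name cannot exceed 20 characters")

    # DateOfBirth: valid ISO date, not in the future
    try:
        if date.fromisoformat(dob) > date.today():
            errors.append("Enter a valid date of birth")
    except ValueError:
        errors.append("Enter a valid date of birth")

    # Score: numeric, range 0-100
    try:
        value = float(score)
        if value < 0 or value > 100:
            errors.append("Score must be between 0 and 100")
    except ValueError:
        errors.append("Score must be a number")

    return errors
```

For example, `validate_record("XY123456", "Amina", "2008-04-12", "85")` returns an empty list, while an out-of-range score or a duplicate ID adds the corresponding message.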
Best practice
Validate as early as possible – use client‑side HTML5 attributes (required, pattern, min/max) for immediate feedback, and always repeat the checks server‑side.
Show clear, specific error messages.
Log every validation failure for audit and debugging.
7.3 Development & Testing – Test Design, Test Data, Expected vs. Actual Outcomes
7.3.1 Test Plan Template
| Test ID | Test Description | Input Data | Expected Result | Actual Result | Pass/Fail | Remarks |
|---|---|---|---|---|---|---|
| T01 | Valid record import | CSV row with correct data | Record stored, no error | | | |
| T02 | Score out of range | Score = 105 | Validation error “Score must be between 0 and 100” | | | |
| T03 | Missing StudentID | StudentID = "" | Validation error “Student ID is required” | | | |
| T04 | Boundary test – Score = 0 | Score = 0 | Record accepted | | | |
| T05 | Boundary test – Score = 100 | Score = 100 | Record accepted | | | |
7.3.2 Types of Test Data
Normal data – satisfies all validation rules (e.g., Score = 85).
Boundary data – values at the limits of a range (Score = 0 or 100).
Abnormal / extreme data – missing fields, wrong data type, overly long text, future dates.
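The three categories can be exercised directly in code. A minimal sketch for the Score field only; the function name is an assumption for illustration:

```python
def score_is_valid(raw):
    """Accept a string from the input file; True only for numbers in 0-100."""
    try:
        value = float(raw)
    except ValueError:
        return False            # abnormal data: wrong data type, e.g. "abc"
    return 0 <= value <= 100

# Normal, boundary and abnormal data for the Score field
assert score_is_valid("85")         # normal
assert score_is_valid("0")          # boundary (lower limit)
assert score_is_valid("100")        # boundary (upper limit)
assert not score_is_valid("105")    # abnormal: out of range
assert not score_is_valid("abc")    # abnormal: wrong data type
```

The same pattern (one normal value, both boundary values, at least one abnormal value) applies to every validated field in the test plan.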
7.3.3 Recording Results
For each test case, fill in the “Actual Result” column, compare it with the “Expected Result”, and note any remedial action (e.g., fix validation logic, adjust field length).
7.4 Implementation – Methods of Introducing a New System
| Method | How it works | Advantages | Disadvantages |
|---|---|---|---|
| Direct changeover | Old system is switched off and the new one starts immediately. | Fast, low cost. | High risk – if the new system fails there is no fallback. |
| Parallel | Both old and new systems run together for a period. | Low risk – errors can be caught by comparing results. | More expensive; double data entry may be needed. |
| Pilot | New system is introduced to a small group (e.g., one year‑group). | Allows real‑world testing and user feedback. | May delay full roll‑out. |
| Phased | System is introduced module by module (e.g., first input, then reporting). | Gradual learning curve; issues isolated to a module. | Longer overall implementation time. |
Recommended approach for the school exam‑record system: a pilot in one year‑group, followed by a phased roll‑out to the remaining years. This limits disruption and provides feedback before full deployment.
7.5 Documentation – Technical & User Documentation
State clearly what the system is intended to achieve – e.g., “to record, store and report exam results efficiently, reducing manual paperwork and errors”.
9.3 Copyright & software licensing
Only use software with a valid licence – avoid illegal copies.
Respect copyright when using images, music, text – obtain permission or use royalty‑free resources.
Give proper attribution for any third‑party code or libraries.
10 Communication – Email, Internet Use & Netiquette
10.1 Email etiquette
Use a clear subject line.
Start with a greeting, end with a signature.
Keep messages concise; use paragraphs and bullet points.
Check spelling and tone before sending.
Attach files only when necessary; compress large files.
10.2 Internet use & searching
Use reputable search engines; apply keywords effectively.