Know and understand how to identify and justify suitable hardware and software for the new system

ICT 0417 – Systems Life Cycle: Identifying and Justifying Suitable Hardware & Software

1. Why this topic matters (Cambridge syllabus links)

  • 1–5: Knowledge of computer hardware, software, I/O devices, storage media and networks – required to justify choices.
  • 6: Health, safety and e-safety considerations when selecting equipment and applications.
  • 7–9: Understanding of the Systems Life Cycle (SLC), audience analysis and communication planning.
  • 10–16: File management, document layout, styles, proof-reading, graphs and charts – all used when producing the project report.
  • 17–19: Document production, simple databases and presentations – useful for supporting evidence in Paper 2/3.

2. The Systems Life Cycle (SLC) – hardware & software focus

| Phase | Key Activities (hardware/software) | Audience & Communication |
| --- | --- | --- |
| Planning | Define scope, budget, high-level objectives; outline initial hardware/software ideas. | Identify stakeholders (client, end-users, IT support); draft a brief communication plan. |
| Analysis | Gather detailed functional & non-functional requirements – performance, storage, security, ergonomics. | Analyse audience needs (e.g., cashiers, managers, maintenance staff); produce a requirements-specification document. |
| Design | Translate requirements into technical specifications; select hardware & software; create a weighted decision matrix. | Prepare design brief for managers and technical team; decide on documentation style and naming conventions. |
| Development & Testing | Configure/assemble hardware, install software, develop custom code; run test plans. | Provide test report to project sponsor; schedule training sessions. |
| Implementation | Deploy the solution using a change-over method; conduct user training; set up rollback plan. | Communicate go-live date, support contacts and help-desk procedures to all users. |
| Documentation | Produce technical specifications, user manuals, installation guides, maintenance logs. | Apply consistent file-naming, folder structure and style sheets; proof-read all documents. |
| Evaluation | Measure system against original criteria (efficiency, usability, reliability, security, cost). | Present findings to stakeholders using graphs, charts and a short presentation. |
| Maintenance | Ongoing updates, performance monitoring, backup and security patches. | Maintain a support log and update the user guide as required. |

3. When hardware & software decisions are made

  • Analysis – first identification of requirements (performance, capacity, security, ergonomics).
  • Design – specification of exact components, creation of justification (decision matrix).
  • Testing – validation that the chosen hardware/software meet the requirements.
  • Implementation – final confirmation and documentation of the selected solution.

4. Core ICT concepts – quick recap (link to decision‑matrix criteria)

| Concept | What it is | Relevant decision-matrix criterion |
| --- | --- | --- |
| CPU (processor) | Central processing unit – speed (GHz), cores, 32/64-bit architecture. | Performance |
| RAM (memory) | Volatile storage for active data – size in GB. | Performance / Scalability |
| Storage media | HDD, SSD, external drives, RAID; capacity and read/write speed. | Initial Cost, Performance, Scalability |
| Input/Output devices | Keyboards, mice, touch-screens, scanners, barcode readers, printers. | Compatibility, User-friendliness |
| Operating System (OS) | Software layer that manages hardware – Windows, macOS, Linux, Android, iOS. | Compatibility, Support & Maintenance |
| Application software | Programs that fulfil functional needs – POS, accounting, database, graphics. | Functionality, Cost |
| Database Management System (DBMS) | Software for storing, retrieving and managing data – relational (e.g., MySQL) or NoSQL. | Scalability, Security |
| Network devices | NIC, router, switch, Wi-Fi access point; bandwidth and VPN support. | Network, Compatibility |
| Cloud services | Remote storage or SaaS accessed via the internet. | Cost, Scalability, Security |

5. Health, Safety & e‑Safety checklist (syllabus 6)

  • Ergonomic design – adjustable monitor height, keyboard placement, anti‑glare screens.
  • Power safety – UPS, surge protector, proper cabling.
  • Noise & heat – ensure ventilation, low‑decibel fans.
  • e‑Safety – data‑protection legislation, GDPR compliance, password policy, encryption.
  • Software licensing – avoid illegal copies, respect copyright.
  • Disposal – recycle e‑waste according to local regulations.

6. Identifying hardware requirements

  • Processing power – CPU speed, core count, 64‑bit support.
  • Memory (RAM) – minimum for simultaneous tasks; consider future growth.
  • Storage – SSD vs HDD, capacity (GB/TB), read/write speed, RAID level if required.
  • Input/Output devices – keyboards, mice, touch‑screens, scanners, barcode readers, printers.
  • Network – Ethernet (Cat‑5e/6), Wi‑Fi standards (802.11ac/ax), required bandwidth, VPN.
  • Reliability & durability – operating temperature, dust‑proofing, UPS need.
  • Scalability – ability to add RAM, extra drives, additional nodes.
  • Physical constraints – desk space, portability, power consumption.

7. Identifying software requirements

  • Operating system (OS) – compatibility with hardware, licensing model, support lifespan.
  • Application software – must satisfy functional requirements (e.g., POS, inventory, reporting).
  • Database Management System (DBMS) – relational vs NoSQL, backup facilities, scalability.
  • Security software – antivirus, firewall, encryption, authentication mechanisms.
  • Development tools – IDEs, version‑control (Git), testing frameworks, deployment scripts.
  • Licensing & cost – open‑source, commercial per‑seat, subscription, maintenance fees.
  • Support & maintenance contracts – vendor SLA, community support, update policy.

8. Audience analysis (syllabus 7‑9)

Identify who will use or be affected by the system and tailor communication accordingly.

  • End‑users – cashiers, shop floor staff – need simple, intuitive UI and quick training.
  • Managers – require reporting features, dashboards and data‑export capability.
  • IT support / maintenance team – need detailed technical specifications and maintenance logs.
  • External auditors / regulators – may require evidence of data‑protection compliance.

9. Structured justification – Weighted Decision Matrix (AO2)

Compare alternatives against weighted criteria. Create the matrix in a spreadsheet, then export as PDF for the project report.

| Criteria | Weight (1–5) | Option A (Desktop + Commercial POS) | Option B (All-in-One + Open-source POS) | Option C (Tablet + Cloud POS) |
| --- | --- | --- | --- | --- |
| Initial Cost | 5 | 6 | 8 | 7 |
| Performance (CPU/RAM) | 4 | 7 | 9 | 6 |
| Scalability | 3 | 6 | 8 | 7 |
| Compatibility with Existing Systems | 4 | 9 | 5 | 8 |
| Support & Maintenance | 2 | 7 | 8 | 6 |
| Total Weighted Score | – | 6×5 + 7×4 + 6×3 + 9×4 + 7×2 = 126 | 8×5 + 9×4 + 8×3 + 5×4 + 8×2 = 136 | 7×5 + 6×4 + 7×3 + 8×4 + 6×2 = 124 |

Interpretation (AO3): Option B scores highest because it balances cost, performance and future‑proofing, satisfying the budget (< $2 500) and functional needs.
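The weighted-total calculation can be automated in a spreadsheet (e.g., SUMPRODUCT) or in a few lines of code. A minimal Python sketch of the technique; the criterion names, weights and scores below are hypothetical values chosen for illustration, not prescribed by the syllabus:

```python
# Weighted decision matrix: total = sum(score x weight) over all criteria.
# Weights and scores are hypothetical, for illustration only.
weights = {"Initial Cost": 5, "Performance": 4, "Scalability": 3,
           "Compatibility": 4, "Support": 2}

options = {
    "A": {"Initial Cost": 6, "Performance": 7, "Scalability": 6,
          "Compatibility": 9, "Support": 7},
    "B": {"Initial Cost": 8, "Performance": 9, "Scalability": 8,
          "Compatibility": 5, "Support": 8},
    "C": {"Initial Cost": 7, "Performance": 6, "Scalability": 7,
          "Compatibility": 8, "Support": 6},
}

def weighted_total(scores: dict, weights: dict) -> int:
    """Sum of score x weight for every criterion."""
    return sum(scores[c] * weights[c] for c in weights)

totals = {name: weighted_total(scores, weights) for name, scores in options.items()}
best = max(totals, key=totals.get)
print(totals, "-> best option:", best)
```

The same structure scales to any number of options or criteria: changing a weight immediately re-ranks the alternatives, which is useful when discussing sensitivity with stakeholders.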

10. Testing – Development & Testing Phase

  • Test plan – list of features, test environment, responsibilities, schedule.
  • Test data – realistic transactions, boundary values, invalid inputs.
  • Test types

    • Normal case – typical user actions.
    • Abnormal case – wrong data entry, network loss.
    • Extreme case – maximum load, simultaneous users.
    • Security test – password strength, encryption verification.

  • Recording results – pass/fail, defect description, severity, screenshot, retest date.
  • Reporting – summary table, overall pass rate, recommendations for fixes before implementation.
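The normal/abnormal/extreme classes of test data can be made concrete with a small example. A sketch assuming a hypothetical POS validation rule (not from the syllabus): a sale-quantity field that must be a whole number from 1 to 100:

```python
def valid_quantity(value) -> bool:
    """Accept whole numbers from 1 to 100 inclusive (hypothetical POS rule)."""
    # Exclude bool explicitly: isinstance(True, int) is True in Python.
    return isinstance(value, int) and not isinstance(value, bool) and 1 <= value <= 100

# Normal case: typical values well inside the accepted range.
assert valid_quantity(5) and valid_quantity(42)
# Extreme (boundary) case: the edges of the accepted range.
assert valid_quantity(1) and valid_quantity(100)
# Abnormal case: out-of-range or wrong-type data -- must be rejected.
assert not valid_quantity(0)
assert not valid_quantity(101)
assert not valid_quantity("ten")
print("all test cases passed")
```

Each assertion corresponds to one row of a test plan: test data, expected result, actual result, pass/fail.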

11. Implementation – change‑over methods & communication plan

  • Direct (Big‑Bang) – old system switched off, new system activated instantly.
  • Parallel run – both systems operate together for a defined period; discrepancies reconciled.
  • Pilot – new system introduced to a single department or location first.
  • Phased roll‑out – functionality introduced in stages (e.g., sales first, then inventory).

Communication plan (AO2)

  • Pre‑go‑live announcement (email & notice board) – date, expected downtime, support contacts.
  • Training schedule – hands‑on sessions, quick‑reference guides, video tutorials.
  • Help‑desk procedure – ticket system, escalation matrix, backup contact.
  • Rollback plan – retain original system until pilot success confirmed; document steps to revert.

12. Documentation – production, file‑naming & proof‑reading (syllabus 10‑16)

| Document type | Purpose | Typical contents |
| --- | --- | --- |
| Technical Specification | Guide developers & maintainers | Hardware list, OS, software versions, network diagram, data-flow diagrams, ERD, API specs. |
| User Manual | Assist end-users | Step-by-step procedures, screenshots, FAQs, troubleshooting tips. |
| Installation & Configuration Guide | Support deployment team | Prerequisites, installation steps, configuration parameters, rollback instructions. |
| Maintenance Log | Track changes & issues | Date, description of change/issue, person responsible, resolution, next review date. |

File‑naming & folder structure (AO2)

ProjectPOS/
├── 01_Analysis/
│   ├── Requirements.docx
│   └── AudienceAnalysis.docx
├── 02_Design/
│   ├── DecisionMatrix.xlsx
│   └── TechnicalSpec.docx
├── 03_Testing/
│   ├── TestPlan.docx
│   └── TestResults.xlsx
├── 04_Implementation/
│   ├── TrainingSchedule.docx
│   └── RollbackPlan.docx
├── 05_Documentation/
│   ├── UserManual.docx
│   └── MaintenanceLog.xlsx
└── 06_Evaluation/
    ├── EvaluationReport.docx
    └── PerformanceCharts.pdf
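A skeleton like this can be generated by a short script instead of being clicked together by hand. A sketch using Python's standard pathlib (the folder names mirror the structure shown; the function name is our own):

```python
from pathlib import Path

# Numbered SLC stage folders, mirroring the ProjectPOS structure shown above.
FOLDERS = [
    "01_Analysis", "02_Design", "03_Testing",
    "04_Implementation", "05_Documentation", "06_Evaluation",
]

def create_skeleton(root: str) -> list:
    """Create the project folder tree under `root`; return the Paths made."""
    base = Path(root)
    made = []
    for name in FOLDERS:
        folder = base / name
        folder.mkdir(parents=True, exist_ok=True)  # idempotent: safe to re-run
        made.append(folder)
    return made

if __name__ == "__main__":
    for p in create_skeleton("ProjectPOS"):
        print(p)
```

Because the numbered prefixes (01_, 02_, …) sort alphabetically, the folders always appear in SLC order in any file manager.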

Use styles for headings, sub‑headings and body text; run spell‑check and have a peer proof‑read before submission.

13. Evaluation – measuring success (AO3)

  • Evaluation criteria

    • Efficiency – transaction time, CPU utilisation.
    • Effectiveness – all functional requirements met?
    • Usability – learning curve, user‑satisfaction survey.
    • Reliability – downtime, error rate.
    • Security – data protection, audit logs.
    • Cost‑effectiveness – total cost of ownership vs budget.

  • Method – compare measured values (e.g., average sale time = 1.8 s) with targets (≤ 2 s); present results in bar or line graphs created in a spreadsheet.
  • Judgement – state whether the solution is “acceptable”, “needs improvement” or “unsuitable”, and justify with evidence.
  • Recommendations – suggest upgrades, further training, or alternative software if gaps remain.
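Comparing measured values against targets is mechanical once the targets are written down. A small sketch of the idea; the metric names, figures and the pass/fail rule are illustrative (echoing the examples above), not a required format:

```python
# Each metric maps to (measured_value, target, comparison) -- figures are illustrative.
# "<=" means the measured value must not exceed the target; ">=" the reverse.
metrics = {
    "avg_sale_time_s":  (1.8, 2.0, "<="),
    "uptime_pct":       (99.8, 99.5, ">="),
    "satisfaction_1_5": (4.2, 4.0, ">="),
}

def evaluate(metrics: dict) -> dict:
    """Return a pass/fail result for each metric."""
    results = {}
    for name, (measured, target, op) in metrics.items():
        results[name] = measured <= target if op == "<=" else measured >= target
    return results

results = evaluate(metrics)
verdict = "acceptable" if all(results.values()) else "needs improvement"
print(results, "->", verdict)
```

The per-metric booleans feed directly into the evaluation report: a bar chart of measured vs target values and a one-line judgement for each criterion.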

14. Expanded case study – Small Retail POS

  1. Requirements (Analysis)

    • Transaction ≤ 2 seconds.
    • Touch‑screen interface.
    • Secure local storage + daily backup.
    • Generate sales & inventory reports.
    • Budget ≤ $2 500.
    • Ergonomic workstation for cashiers.

  2. Design – hardware & software selection

    Used the weighted decision matrix (section 9). Option B (All‑in‑One + Open‑source POS) scored highest.

  3. Development & Testing

    • Test plan covered normal sales, voided sales, network loss and 10 simultaneous cashiers (extreme case).
    • All test cases passed; two minor UI glitches were corrected before rollout.

  4. Implementation

    • Chosen method: Pilot – rolled out to one checkout lane for two weeks.
    • Training: 2‑hour hands‑on session + printed quick‑reference guide.
    • Rollback: original desktop POS retained until pilot success confirmed.
    • Communication: email announcement, on‑site signage, help‑desk number displayed.

  5. Documentation

    • Technical spec – hardware list, OS Windows 10 Pro, Open‑source POS v2.3, MySQL DB.
    • User manual – step‑by‑step screenshots of the touch‑screen workflow.
    • Installation guide – SSD cloning, network configuration, backup schedule.
    • Maintenance log – initial entries for software updates and hardware checks.

  6. Evaluation

    • Average transaction time: 1.7 s (target met).
    • User survey (5‑point Likert): 4.2 average satisfaction.
    • Total cost: $2 380 (within budget).
    • Reliability: 99.8 % uptime over pilot period.
    • Recommendation: Extend rollout to all lanes; add cloud backup for redundancy.

15. Checklist for hardware/software selection (AO2)

  • Meets minimum performance specs and scalability needs?
  • Compatible with existing OS and other systems?
  • All hidden costs (licences, maintenance, upgrades) accounted for?
  • Supports future growth (more users, larger data volume)?
  • Reliable vendor or community support available?
  • Complies with health, safety, e‑safety and data‑protection policies?
  • Testing procedures defined and documented?
  • Clear implementation plan (change‑over method, training, rollback)?
  • Technical and user documentation prepared with proper file‑naming and proof‑reading?
  • Evaluation method established (metrics, graphs, presentation)?

16. Linking to related ICT skills (syllabus 10‑19)

  • File management – store all artefacts using the folder structure shown in section 12; use consistent naming (e.g., 02_Design/DecisionMatrix.xlsx).
  • Layout & styles – apply heading styles, bullet lists and tables in Word/Google Docs for a professional look.
  • Proofing – run spell‑check, grammar check and peer review before final submission.
  • Graphs & charts – create bar/line charts of performance data in Excel; insert them into the evaluation report.
  • Document production – export the decision matrix, evaluation charts and final report as PDF (exam size limit).
  • Simple database – maintain a hardware inventory table (ID, type, specs, cost) in a spreadsheet or Access file to demonstrate database concepts.
  • Presentation – prepare a 5‑slide slide‑deck summarising the justification, testing results and recommendations; use consistent slide layout and speaker notes.
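The hardware-inventory idea can equally be demonstrated with a tiny relational table. A sketch using Python's built-in sqlite3 module in place of a spreadsheet or Access file; the field names match the bullet above, but the sample rows are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a spreadsheet or Access file.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE inventory (
    id    INTEGER PRIMARY KEY,
    type  TEXT NOT NULL,
    specs TEXT,
    cost  REAL
)""")

# Sample rows -- invented for illustration.
rows = [
    (1, "All-in-One PC",   "i5, 8 GB RAM, 256 GB SSD", 650.0),
    (2, "Receipt printer", "thermal, USB",             120.0),
    (3, "Barcode scanner", "1D/2D, USB",                85.0),
]
conn.executemany("INSERT INTO inventory VALUES (?, ?, ?, ?)", rows)

# A simple aggregate query -- the kind of evidence Paper 2/3 asks for.
total_cost = conn.execute("SELECT SUM(cost) FROM inventory").fetchone()[0]
print("items:", len(rows), "total cost:", total_cost)
conn.close()
```

The same four fields (ID, type, specs, cost) work unchanged as spreadsheet columns if a database tool is not available.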

17. Summary

Identifying and justifying suitable hardware and software is a core component of the Analysis and Design phases of the Systems Life Cycle. By:

  1. Gathering clear, weighted requirements (including health, safety and e‑safety).
  2. Using a weighted decision matrix to compare alternatives.
  3. Planning thorough testing and choosing an appropriate change‑over method.
  4. Producing well‑structured technical and user documentation with proper file‑management and proof‑reading.
  5. Evaluating the live system against original criteria and presenting findings with graphs and a short presentation,

students can produce ICT solutions that are effective, economical and fully aligned with the Systems Life Cycle topic of the Cambridge IGCSE ICT 0417 syllabus.

Suggested diagram: Flow of the Systems Life Cycle highlighting the points where hardware & software selection, testing, implementation, documentation and evaluation occur.