Compare alternatives against weighted criteria. Create the matrix in a spreadsheet, then export as PDF for the project report.
Criteria | Weight (1‑5) | Option A (Desktop + Commercial POS) | Option B (All‑in‑One + Open‑source POS) | Option C (Tablet + Cloud POS)
Initial Cost | 5 | 8 | 6 | 7
Performance (CPU/RAM) | 4 | 7 | 9 | 6
Scalability | 3 | 6 | 8 | 7
Compatibility with Existing Systems | 4 | 9 | 5 | 8
Support & Maintenance | 2 | 7 | 8 | 6
Total Weighted Score | – | 8×5 + 7×4 + 6×3 + 9×4 + 7×2 = 151 | 6×5 + 9×4 + 8×3 + 5×4 + 8×2 = 166 | 7×5 + 6×4 + 7×3 + 8×4 + 6×2 = 155
Interpretation (AO3): Option B scores highest because it balances cost, performance and future‑proofing, satisfying the budget (< $2 500) and functional needs.
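The same arithmetic can also be scripted as a quick cross-check on the spreadsheet. The sketch below is a minimal Python illustration of the weighted-score method: the criterion weights are taken from the matrix above, but the two option score sets are purely hypothetical, invented only to show the calculation pattern (score × weight, summed per option), not to reproduce the figures in the table.

```python
# Minimal sketch of a weighted decision matrix calculation.
# Weights follow the criteria above; the option scores are hypothetical,
# invented only to show the method (score x weight, summed per option).

weights = {
    "Initial Cost": 5,
    "Performance (CPU/RAM)": 4,
    "Scalability": 3,
    "Compatibility with Existing Systems": 4,
    "Support & Maintenance": 2,
}

# Hypothetical 1-10 scores for two example options (not the ones tabulated above).
options = {
    "Example Option X": {"Initial Cost": 7, "Performance (CPU/RAM)": 8,
                         "Scalability": 6, "Compatibility with Existing Systems": 9,
                         "Support & Maintenance": 7},
    "Example Option Y": {"Initial Cost": 9, "Performance (CPU/RAM)": 6,
                         "Scalability": 7, "Compatibility with Existing Systems": 7,
                         "Support & Maintenance": 8},
}

for name, scores in options.items():
    total = sum(scores[criterion] * weight for criterion, weight in weights.items())
    print(f"{name}: total weighted score = {total}")
```

The option with the highest total becomes the provisional recommendation, which is then justified in the interpretation as above.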
10. Testing – Development & Testing Phase
Test plan – list of features, test environment, responsibilities, schedule.
Test data – realistic transactions, boundary values, invalid inputs.
Test types
Normal case – typical user actions.
Abnormal case – wrong data entry, network loss.
Extreme case – maximum load, simultaneous users.
Security test – password strength, encryption verification.
Cost‑effectiveness – total cost of ownership vs budget.
Method – compare measured values (e.g., average sale time = 1.8 s) with targets (≤ 2 s); present results in bar or line graphs created in a spreadsheet. A short code sketch of this kind of check appears after this list.
Judgement – state whether the solution is “acceptable”, “needs improvement” or “unsuitable”, and justify with evidence.
Recommendations – suggest upgrades, further training, or alternative software if gaps remain.
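Both the test types and the measured-versus-target method can be illustrated with a short script. The sketch below is not the project's actual test harness: validate_sale() and every figure in it are hypothetical, chosen only to show the pattern of normal, abnormal and extreme cases followed by a pass/fail judgement against the ≤ 2 s target mentioned above.

```python
# Minimal testing sketch: a hypothetical validation routine plus a
# measured-vs-target check. All names and figures are illustrative.

def validate_sale(quantity, unit_price):
    """Return True if the sale data is acceptable, False otherwise."""
    if not isinstance(quantity, int) or quantity <= 0:
        return False          # abnormal case: wrong data entry
    if quantity > 999:
        return False          # extreme case: beyond the till's assumed line limit
    if unit_price <= 0:
        return False          # abnormal case: invalid price
    return True

# Normal, abnormal and extreme test cases (expected outcome in the second field).
test_cases = [
    ((2, 1.50), True),      # normal: typical purchase
    ((0, 1.50), False),     # abnormal: zero quantity keyed in
    ((-1, 1.50), False),    # abnormal: negative quantity
    ((1000, 0.99), False),  # extreme: boundary value exceeded
]

for args, expected in test_cases:
    result = validate_sale(*args)
    status = "PASS" if result == expected else "FAIL"
    print(f"validate_sale{args}: expected {expected}, got {result} -> {status}")

# Measured-vs-target check (illustrative timings; target from the notes: <= 2 s).
measured_sale_times = [1.9, 1.7, 1.8, 2.0, 1.6]   # seconds, hypothetical sample
average = sum(measured_sale_times) / len(measured_sale_times)
target = 2.0
verdict = "acceptable" if average <= target else "needs improvement"
print(f"Average sale time {average:.2f} s -> {verdict}")
```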
14. Expanded case study – Small Retail POS
Requirements (Analysis)
Transaction time ≤ 2 seconds.
Touch‑screen interface.
Secure local storage + daily backup.
Generate sales & inventory reports.
Budget ≤ $2 500.
Ergonomic workstation for cashiers.
Design – hardware & software selection
Used the weighted decision matrix (section 9). Option B (All‑in‑One + Open‑source POS) scored highest.
Development & Testing
Test plan covered normal sales, voided sales, network loss and 10 simultaneous cashiers (extreme case).
All test cases passed; two minor UI glitches were corrected before rollout.
Implementation
Chosen method: Pilot – rolled out to one checkout lane for two weeks.
Maintenance log – initial entries for software updates and hardware checks.
Evaluation
Average transaction time: 1.7 s (target met).
User survey (5‑point Likert): 4.2 average satisfaction.
Total cost: $2 380 (within budget).
Reliability: 99.8 % uptime over pilot period.
Recommendation: Extend rollout to all lanes; add cloud backup for redundancy.
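Purely as an illustration of the evaluation step, the pilot figures above can also be checked programmatically. The sketch below hard-codes the results reported in this case study; the acceptance thresholds for satisfaction and uptime are assumptions, since the original requirements only fix the transaction-time and budget targets.

```python
# Sketch: compare the pilot results reported above with the original targets.
# Result figures come from the case study; the satisfaction and uptime
# thresholds are assumptions added for illustration.

results = {
    "average_transaction_time_s": 1.7,
    "user_satisfaction_5pt": 4.2,
    "total_cost_usd": 2380,
    "uptime_percent": 99.8,
}

targets = {
    "average_transaction_time_s": ("<=", 2.0),   # requirement: transaction time <= 2 s
    "user_satisfaction_5pt": (">=", 4.0),        # assumed acceptance threshold
    "total_cost_usd": ("<=", 2500),              # budget <= $2 500
    "uptime_percent": (">=", 99.5),              # assumed reliability threshold
}

for metric, (op, target) in targets.items():
    value = results[metric]
    met = value <= target if op == "<=" else value >= target
    print(f"{metric}: {value} (target {op} {target}) -> {'met' if met else 'not met'}")
```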
15. Checklist for hardware/software selection (AO2)
Meets minimum performance specs and scalability needs?
Compatible with existing OS and other systems?
All hidden costs (licences, maintenance, upgrades) accounted for?
Supports future growth (more users, larger data volume)?
Reliable vendor or community support available?
Complies with health, safety, e‑safety and data‑protection policies?
Testing procedures defined and documented?
Clear implementation plan (change‑over method, training, rollback)?
Technical and user documentation prepared with proper file‑naming and proof‑reading?
Evaluation method established (metrics, graphs, presentation)?
16. Linking to related ICT skills (syllabus 10‑19)
File management – store all artefacts using the folder structure shown in section 12; use consistent naming (e.g., 02_Design/DecisionMatrix.xlsx).
Layout & styles – apply heading styles, bullet lists and tables in Word/Google Docs for a professional look.
Proofing – run spell‑check, grammar check and peer review before final submission.
Graphs & charts – create bar/line charts of performance data in Excel; insert them into the evaluation report (an alternative code-based chart sketch follows this list).
Document production – export the decision matrix, evaluation charts and final report as PDF (to stay within the exam file‑size limit).
Simple database – maintain a hardware inventory table (ID, type, specs, cost) in a spreadsheet or Access file to demonstrate database concepts (a minimal sqlite3 sketch follows this list).
Presentation – prepare a five‑slide deck summarising the justification, testing results and recommendations; use a consistent slide layout and speaker notes.
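The performance charts mentioned in the list above would normally be produced in Excel or another spreadsheet. Purely as an alternative illustration, the sketch below draws a comparable bar chart with Python's matplotlib library; the lane names and timing figures are hypothetical sample data, and the saved PNG stands in for the chart pasted into the evaluation report.

```python
# Alternative to a spreadsheet chart: a simple bar chart with matplotlib.
# The lane labels and timing figures are hypothetical sample data.
import matplotlib.pyplot as plt

lanes = ["Lane 1", "Lane 2", "Lane 3"]
avg_transaction_time_s = [1.7, 1.9, 1.8]   # illustrative values

plt.bar(lanes, avg_transaction_time_s)
plt.axhline(2.0, linestyle="--", label="Target (2 s)")  # the <= 2 s target from the notes
plt.ylabel("Average transaction time (s)")
plt.title("Pilot performance by checkout lane")
plt.legend()
plt.savefig("evaluation_chart.png")   # insert the PNG into the evaluation report
```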
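Similarly, the hardware inventory table from the list above could be kept in a spreadsheet or an Access file; as a code-based alternative, the sketch below uses Python's built-in sqlite3 module. The column names mirror the ID/type/specs/cost fields listed, and the sample rows are hypothetical.

```python
# Minimal hardware-inventory table using Python's built-in sqlite3 module.
# Column names mirror the fields listed above; sample rows are hypothetical.
import sqlite3

conn = sqlite3.connect("hardware_inventory.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS hardware (
        id       INTEGER PRIMARY KEY,
        type     TEXT NOT NULL,        -- e.g. 'All-in-One PC', 'Receipt printer'
        specs    TEXT,                 -- free-text specification summary
        cost_usd REAL
    )
""")
conn.executemany(
    "INSERT INTO hardware (type, specs, cost_usd) VALUES (?, ?, ?)",
    [
        ("All-in-One PC", "i5, 8 GB RAM, 256 GB SSD, touch screen", 850.00),
        ("Receipt printer", "Thermal, USB", 120.00),
    ],
)
conn.commit()

# Simple report: total spend recorded so far.
total = conn.execute("SELECT SUM(cost_usd) FROM hardware").fetchone()[0]
print(f"Total hardware cost: ${total:,.2f}")
conn.close()
```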
17. Summary
Identifying and justifying suitable hardware and software is a core component of the Analysis and Design phases of the Systems Life Cycle. By:
Gathering clear, weighted requirements (including health, safety and e‑safety).
Using a weighted decision matrix to compare alternatives.
Planning thorough testing and choosing an appropriate change‑over method.
Producing well‑structured technical and user documentation with proper file‑management and proof‑reading.
Evaluating the live system against original criteria and presenting findings with graphs and a short presentation,
students can produce ICT solutions that are effective, economical and fully aligned with the Systems Life Cycle topic of the Cambridge IGCSE ICT (0417) syllabus.
Suggested diagram: Flow of the Systems Life Cycle highlighting the points where hardware & software selection, testing, implementation, documentation and evaluation occur.