Know and understand how to evaluate a solution, including the efficiency of the solution, its ease of use and its appropriateness.
Test plan – objectives, resources, schedule, responsibilities.
Test data – normal (valid) data, abnormal (invalid) data, boundary values.
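The three categories of test data can be sketched in code. This is a minimal example using a hypothetical `validate_mark` function for a field that accepts integer marks from 0 to 100 (the function and range are assumptions for illustration, not from the syllabus):

```python
def validate_mark(mark):
    """Accept only an integer mark between 0 and 100 inclusive."""
    return isinstance(mark, int) and 0 <= mark <= 100

# Normal (valid) data - values well inside the accepted range
assert validate_mark(50)

# Boundary values - the extremes of the range, and just outside them
assert validate_mark(0) and validate_mark(100)
assert not validate_mark(-1) and not validate_mark(101)

# Abnormal (invalid) data - wrong type or impossible value
assert not validate_mark("fifty")
assert not validate_mark(999)
```

A good test plan includes all three categories: a solution that only ever sees normal data has not really been tested.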
Testing strategies
Unit testing – individual modules.
Integration testing – interaction between modules.
System testing – whole system against requirements.
User‑acceptance testing (UAT) – real users validate the solution.
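Unit testing of an individual module can be sketched with Python's standard `unittest` framework. The `discount` function here is a hypothetical module under test, chosen only to illustrate normal, boundary and abnormal cases:

```python
import unittest

def discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (100 - percent) / 100

class TestDiscount(unittest.TestCase):
    def test_normal_value(self):
        self.assertEqual(discount(200, 25), 150.0)

    def test_boundary_values(self):
        self.assertEqual(discount(100, 0), 100.0)
        self.assertEqual(discount(100, 100), 0.0)

    def test_abnormal_value(self):
        # Invalid input should be rejected, not silently accepted
        with self.assertRaises(ValueError):
            discount(100, 150)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Once each module passes its unit tests, integration testing checks that the modules work correctly together.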
Implementation types – direct (big‑bang), parallel, phased and pilot change‑over.
7.4 Implementation (Change‑over)
Direct (big‑bang) – switch off old system and switch on new at once.
Parallel – run old and new systems together for a period.
Phased – introduce new system in stages (e.g., by department).
Pilot – trial in a limited area before full roll‑out.
Roll‑out planning – training schedule, data migration, cut‑over timetable.
7.5 Documentation
Technical documentation – system architecture, source‑code comments, database schema, maintenance procedures, version control, change log.
User documentation – user manual, quick‑start guide, FAQ, online help, video tutorials, troubleshooting guide.
7.6 Evaluation (AO3 – Analyse, evaluate, make reasoned judgements)
The evaluation stage measures how well the new solution meets the original requirements and stakeholder expectations.
Key Evaluation Criteria
Efficiency – speed and resource use.
Ease of Use (Usability) – learnability, error rate, satisfaction.
Appropriateness – fit for purpose, compliance, scalability, maintainability.
1. Efficiency of the Solution
| Metric | What it measures | Typical method of measurement |
| --- | --- | --- |
| Response time | Time taken to complete a user request | Stop‑watch, automated timing tools, server logs |
| Throughput | Number of transactions processed per unit time | Transactions per second/minute from system logs |
| Resource utilisation | CPU, memory, disk, network usage while operating | Performance monitor, profiling software |
| Cost per transaction | Financial cost divided by number of transactions | \(\text{Cost per transaction}= \dfrac{\text{Total cost of system}}{\text{Number of transactions}}\) |
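The cost-per-transaction formula and throughput metric can be applied directly. The figures below are invented for illustration; in practice they would come from accounting records and system logs:

```python
# Hypothetical figures for one reporting period (invented for illustration)
total_cost = 12_000.00   # total cost of the system over the period
transactions = 48_000    # transactions processed in the same period
hours_in_period = 160    # operating hours covered by the logs

# Cost per transaction = total cost of system / number of transactions
cost_per_transaction = total_cost / transactions
print(f"Cost per transaction: {cost_per_transaction:.2f}")   # 0.25

# Throughput = transactions processed per unit time
throughput = transactions / hours_in_period
print(f"Throughput: {throughput:.0f} transactions/hour")     # 300
```

Comparing these figures against the values promised in the requirements specification is the core of the efficiency evaluation.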
2. Ease of Use (Usability)
| Indicator | What it measures | How to obtain data |
| --- | --- | --- |
| Learnability | Time for a new user to perform basic tasks | Observation, timed tasks during a usability test |
| Efficiency (experienced users) | Speed of task completion after learning | Time‑on‑task measurements in a repeat test |
| Memorability | Ability to recall how to use the system after a break | Delayed test – users return after days/weeks |
| Error rate | Frequency of user errors and ease of recovery | Count errors during test sessions; note recovery steps |
| Satisfaction | User’s subjective opinion of the system | Questionnaire using a Likert scale (1–5) |
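Summarising the raw usability data is straightforward. This sketch computes a mean Likert satisfaction score and an error rate from invented sample figures (real data would come from the test sessions and questionnaires):

```python
from statistics import mean

# Hypothetical usability-test data (invented for illustration)
likert_responses = [4, 5, 3, 4, 5, 4, 2, 4]   # 1-5 satisfaction scores
tasks_attempted = 120                          # total tasks across all users
errors_observed = 6                            # errors counted during sessions

# Mean satisfaction on the 1-5 Likert scale
satisfaction = mean(likert_responses)
print(f"Mean satisfaction: {satisfaction:.2f} / 5")   # 3.88

# Error rate as a percentage of tasks attempted
error_rate = errors_observed / tasks_attempted * 100
print(f"Error rate: {error_rate:.1f} %")
```

These summary figures feed directly into the evaluation report and can be compared against the usability targets set during analysis.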
3. Appropriateness of the Solution
Alignment with business goals and user requirements.
Compliance with legal, regulatory and security standards.
Scalability – ability to handle future growth.
Compatibility with existing hardware, software and network infrastructure.
Maintainability – ease of updating, fixing bugs and providing support.
Evaluation Process (AO3 focus)
Compare the original task requirements (from analysis) with the actual performance measured using the three criteria above.
Produce an evaluation report that includes:
Summary of findings – tables/graphs.
Judgement on whether success criteria have been met.
Identification of any shortcomings.
Recommendations for improvement or further development.
Reflect on the whole life‑cycle – could a different change‑over method, design choice or testing strategy have produced a better outcome?
Evaluation Rubric (5‑point scale)

| Criterion | Excellent (5) | Good (4) | Fair (3) | Poor (2) | Unsatisfactory (1) |
| --- | --- | --- | --- | --- | --- |
| Efficiency | Response time < 1 s; CPU & memory utilisation < 5 % | Response time 1–2 s; 5–10 % utilisation | Response time 2–5 s; 10–20 % utilisation | Response time > 5 s; > 20 % utilisation | System crashes or is unusable |
| Ease of Use | Learnable < 5 min; error rate < 1 % | Learnable 5–10 min; error rate 1–3 % | Learnable 10–20 min; error rate 3–5 % | Learnable > 20 min; error rate > 5 % | User cannot complete core tasks |
| Appropriateness | Fully meets all functional & non‑functional requirements; compliant; scalable | Meets most requirements; minor gaps | Meets some requirements; several gaps | Meets few requirements; major gaps | Fails to meet any requirement |
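The response-time bands of the efficiency row can be expressed as a simple lookup. This sketch scores only the response-time component of the rubric (the band edges follow the table; treating each band as a half-open interval is an assumption, since the table leaves the edges ambiguous):

```python
def efficiency_score(response_time_s, crashed=False):
    """Return a 1-5 rubric score from the response time in seconds."""
    if crashed:
        return 1          # Unsatisfactory: system crashes or is unusable
    if response_time_s < 1:
        return 5          # Excellent: response time < 1 s
    if response_time_s <= 2:
        return 4          # Good: 1-2 s
    if response_time_s <= 5:
        return 3          # Fair: 2-5 s
    return 2              # Poor: > 5 s

print(efficiency_score(0.8))              # 5
print(efficiency_score(3))                # 3
print(efficiency_score(10))               # 2
print(efficiency_score(0, crashed=True))  # 1
```

A full evaluation would combine such scores across all three criteria, with a written justification for each mark.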
Sample Assessment Questions (AO3)
Identify two efficiency metrics that would be relevant for a web‑based e‑commerce platform and explain how you would measure each.
Describe a usability test you would conduct to evaluate the learnability of a new software application.
List three factors that determine the appropriateness of a solution for a small non‑profit organisation.
Using the rubric above, evaluate the following scenario: a mobile app processes 500 transactions per hour, has an average response time of 3 s and an error rate of 4 %. Provide a score (1‑5) for each criterion and justify your marks.
Suggested Diagram (Insert into teaching material)
Flow of the Evaluation stage within the Systems Life Cycle. Inputs are the documented requirements, design artefacts and prototype; outputs are the evaluation report, user feedback and recommendations for further development.
Systems Life Cycle – full walkthrough (analysis → evaluation)
Practical project – students work on a mini‑SLC project (choose a problem, develop a solution, evaluate).
Revision & mock exam practice (focus on AO1, AO2, AO3).
Practical Lab Ideas (AO2)
Lab 1 – Word‑Processing – create a formatted newsletter using styles and tables.
Lab 2 – Spreadsheet – build a budgeting model with IF and SUMIF functions.
Lab 3 – Database – design a simple student‑records database, enter data, run SELECT queries.
Lab 4 – Web Authoring – produce a 3‑page site with HTML and CSS, publish via FTP.
Lab 5 – Evaluation – conduct a usability test on the site created in Lab 4, record metrics, write a brief evaluation report.
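For Lab 5, the recorded metrics can be kept in a simple CSV file so they can be tabulated and graphed for the evaluation report. The filename, tasks and figures below are invented placeholders:

```python
import csv

# Hypothetical usability-test observations for the Lab 4 website
results = [
    # participant, task, time taken (s), errors made
    ("P1", "find contact page", 12.4, 0),
    ("P2", "find contact page", 30.1, 2),
    ("P1", "use navigation menu", 8.0, 0),
]

# Write the observations with a header row for later analysis
with open("usability_results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["participant", "task", "time_s", "errors"])
    writer.writerows(results)
```

The CSV can then be opened in the spreadsheet software from Lab 2 to produce the tables and graphs for the report.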
Key Take‑aways for Students
Know the terminology for each hardware/software component (AO1).
Be able to create, edit and format documents, spreadsheets, databases, presentations and web pages (AO2).
Apply the Systems Life Cycle to a real problem, collect data, and evaluate the final solution against efficiency, usability and appropriateness (AO3).
Always consider safety, legal and ethical issues when using ICT.