Know and understand: comparing the solution with the original task requirements; identifying any limitations and necessary improvements to the system; evaluating the users' responses to the results of testing the system.

Published by Patrick Mutisya

ICT 0417 – The Systems Life Cycle: Evaluation Phase


1. Comparing the Solution with the Original Task Requirements

After a system has been developed and tested, the next step is to verify that it meets the original specifications that were set out at the start of the project. This involves:

  • Reviewing the task requirements document (functional and non‑functional).
  • Checking each requirement against the actual features of the completed system.
  • Recording any discrepancies, omissions or enhancements (see the sketch after this list).
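
In programming terms, this comparison is like ticking each documented requirement off against what the delivered system actually does. A minimal Python sketch of the idea, using hypothetical requirement IDs and test outcomes (a real project would take these from the requirements document and the test records):

```python
# Hypothetical functional requirements from the original task document.
requirements = {
    "R1": "User login",
    "R2": "Record customer orders",
    "R3": "Print monthly sales report",
    "R4": "Export data to spreadsheet",
}

# Status of each requirement as observed in the completed system
# (hypothetical test results).
implemented = {"R1": "met", "R2": "met", "R3": "partially met"}

# Any requirement that is not fully met is recorded as a discrepancy.
for req_id, description in requirements.items():
    status = implemented.get(req_id, "missing")
    if status != "met":
        print(f"{req_id} ({description}): {status} - record as a discrepancy")
```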

Suggested diagram: Flowchart showing “Requirements → Design → Development → Testing → Evaluation → Maintenance”.

2. Identifying Limitations and Necessary Improvements

During evaluation, limitations are identified and a plan for improvement is created. Common areas to examine include:

  1. Functionality gaps – features that were required but are missing or incomplete.
  2. Performance issues – slow response times, high resource usage, or scalability problems.
  3. Usability problems – confusing navigation, poor error messages, or accessibility barriers.
  4. Security weaknesses – inadequate authentication, data encryption, or audit trails.
  5. Compatibility concerns – the system does not work on all intended devices or operating systems.

For each limitation, a recommended improvement should be recorded, together with an estimate of the effort required.
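
One simple way to keep such a record is a structured log pairing each limitation with its recommended improvement and an effort estimate. A minimal Python sketch with hypothetical entries (the area names mirror the five categories above):

```python
# Hypothetical limitations log produced during evaluation.
limitations = [
    {"area": "Performance", "limitation": "Slow report generation at peak load",
     "improvement": "Optimise database queries", "effort_days": 5},
    {"area": "Usability", "limitation": "Confusing navigation menu",
     "improvement": "Redesign menu and add tooltips", "effort_days": 3},
]

# Sort by effort so the quickest improvements can be scheduled first.
for item in sorted(limitations, key=lambda x: x["effort_days"]):
    print(f"[{item['area']}] {item['limitation']} -> "
          f"{item['improvement']} (~{item['effort_days']} days)")
```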

3. Evaluating Users' Responses to Testing Results

Users provide valuable feedback after they have interacted with the system during testing. Their responses are evaluated by:

  • Collecting qualitative data (comments, interviews, questionnaires).
  • Collecting quantitative data (error rates, task completion times, satisfaction scores).
  • Analysing trends to see whether the system meets user expectations.
  • Prioritising issues based on impact and frequency (see the sketch after this list).
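
A minimal Python sketch of the quantitative side, using hypothetical survey scores and issue reports, and ranking issues by impact × frequency as described above:

```python
# Hypothetical satisfaction scores out of 5, converted to a percentage.
satisfaction_scores = [4, 5, 3, 4, 2, 5, 4]
mean_pct = sum(satisfaction_scores) / len(satisfaction_scores) / 5 * 100
print(f"Mean satisfaction: {mean_pct:.0f}% (target: 80%)")

# Issues reported during testing: (description, impact 1-5, frequency).
issues = [
    ("Error message unclear on save", 3, 12),
    ("Crash when exporting report", 5, 2),
    ("Tooltip missing on search box", 1, 7),
]

# Prioritise by impact x frequency, highest first.
for desc, impact, freq in sorted(issues, key=lambda i: i[1] * i[2], reverse=True):
    print(f"priority {impact * freq:3d}: {desc}")
```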

4. Summary Table – Evaluation Checklist

Evaluation Area      | Criteria                               | Findings                           | Recommended Action
---------------------|----------------------------------------|------------------------------------|---------------------------------------------
Requirement Coverage | All functional requirements met?       | 5 of 12 requirements partially met | Develop missing modules; retest
Performance          | Average response time ≤ 2 s            | Average 3.4 s during peak load     | Optimise database queries; consider caching
Usability            | User satisfaction ≥ 80 %               | Survey result 72 %                 | Redesign navigation menu; add tooltips
Security             | No critical vulnerabilities            | Two medium‑risk XSS issues found   | Implement input sanitisation; retest
Compatibility        | Works on Windows, macOS, Android, iOS  | Fails on iOS 13 browsers           | Update CSS for older WebKit versions
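
Checks like those in the table can be expressed as pass/fail tests against measurable criteria. A minimal Python sketch, with thresholds and measurements taken from the hypothetical findings above:

```python
# Each check: (area, metric, measured value, pass criterion).
checks = [
    ("Performance", "avg response time (s)", 3.4, lambda v: v <= 2.0),
    ("Usability", "satisfaction (%)", 72, lambda v: v >= 80),
    ("Security", "critical vulnerabilities", 0, lambda v: v == 0),
]

# Report each area's verdict against its criterion.
for area, metric, measured, criterion in checks:
    verdict = "PASS" if criterion(measured) else "FAIL - needs action"
    print(f"{area}: {metric} = {measured} -> {verdict}")
```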

5. Steps to Document the Evaluation

  1. Compile an Evaluation Report that includes the comparison table, user feedback analysis, and a risk assessment.
  2. Present the report to stakeholders for approval of the improvement plan.
  3. Update the Project Plan with new tasks, timelines, and resource allocations.
  4. Proceed to the Implementation of Improvements phase, then repeat testing as required.

6. Key Points to Remember for the IGCSE Exam

  • Always link back to the original task requirements when evaluating a solution.
  • Distinguish between limitations (current shortcomings) and improvements (planned actions).
  • Use both qualitative and quantitative data to assess user responses.
  • Present findings in a clear, organised table – examiners look for structured answers.
  • Explain how the evaluation informs the next stage of the systems life cycle (maintenance or further development).