ICT 0417 – The Systems Life Cycle: Evaluation Phase
1. Comparing the Solution with the Original Task Requirements
After a system has been developed and tested, the next step is to verify that it meets the original specifications that were set out at the start of the project. This involves:
Reviewing the task requirements document (functional and non‑functional).
Checking each requirement against the actual features of the completed system (a simple traceability sketch follows below).
Recording any discrepancies, omissions or enhancements.
Suggested diagram: Flowchart showing “Requirements → Design → Development → Testing → Evaluation → Maintenance”.
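A requirement-by-requirement comparison can be kept as a simple traceability record. The sketch below is a minimal Python illustration of the idea; the requirement names and statuses are invented examples, not taken from any real project.

```python
# Minimal requirements-traceability sketch (illustrative data only).
# Each requirement is checked against the delivered system and any
# discrepancy is recorded for the evaluation report.

requirements = {
    "Record customer orders": "met",
    "Print monthly sales report": "partially met",
    "Email order confirmations": "not met",
}

# Anything that is not fully met is a discrepancy to report.
discrepancies = [
    (name, status)
    for name, status in requirements.items()
    if status != "met"
]

print("Requirements checked:", len(requirements))
for name, status in discrepancies:
    print(f"Discrepancy: '{name}' is {status}")
```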
2. Identifying Limitations and Necessary Improvements
During evaluation, limitations are identified and a plan for improvement is created. Common areas to examine include:
Functionality gaps – features that were required but are missing or incomplete.
Performance issues – slow response times, high resource usage, or scalability problems.
Usability problems – confusing navigation, poor error messages, or accessibility barriers.
Security weaknesses – inadequate authentication, data encryption, or audit trails.
Compatibility concerns – the system does not work on all intended devices or operating systems.
For each limitation, a recommended improvement should be recorded, together with an estimate of the effort required; a simple way of logging this is sketched below.
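As a hedged illustration only, the limitations log could be a small structured list that pairs each limitation with its recommended improvement and a rough effort estimate. Every entry in the sketch below is an invented example.

```python
# Hypothetical limitations log: each entry pairs a limitation with a
# recommended improvement and a rough effort estimate in person-days.

limitations = [
    {"area": "Performance", "limitation": "Slow report generation at peak load",
     "improvement": "Add database indexes and caching", "effort_days": 5},
    {"area": "Usability", "limitation": "Error messages are unclear",
     "improvement": "Rewrite messages in plain language", "effort_days": 2},
]

# Sort by effort so the quickest wins appear first in the improvement plan.
for entry in sorted(limitations, key=lambda e: e["effort_days"]):
    print(f"{entry['area']}: {entry['limitation']} -> "
          f"{entry['improvement']} ({entry['effort_days']} days)")
```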
3. Evaluating Users' Responses to Testing Results
Users provide valuable feedback after they have interacted with the system during testing. Their responses are evaluated by:
Collecting qualitative data (comments, interviews, questionnaires).
Collecting quantitative data (error rates, task completion times, satisfaction scores).
Analysing trends to see whether the system meets user expectations.
Prioritising issues based on impact and frequency (see the sketch after this list).
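To show how this analysis might be carried out, the following sketch averages invented satisfaction scores and ranks hypothetical issues by a simple impact × frequency priority score. The figures are illustrative, not real survey data.

```python
# Illustrative user-feedback analysis: summarise satisfaction scores and
# rank reported issues by a simple impact * frequency priority score.

satisfaction_scores = [4, 5, 3, 4, 2, 5, 4]   # e.g. 1-5 questionnaire ratings
average = sum(satisfaction_scores) / len(satisfaction_scores)
print(f"Average satisfaction: {average:.1f} / 5")

issues = [
    {"issue": "Navigation menu is confusing", "impact": 3, "frequency": 12},
    {"issue": "Login occasionally times out", "impact": 5, "frequency": 2},
]

# Higher impact x frequency = higher priority in the improvement plan.
for issue in sorted(issues, key=lambda i: i["impact"] * i["frequency"],
                    reverse=True):
    print(issue["issue"], "- priority:", issue["impact"] * issue["frequency"])
```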
4. Summary Table – Evaluation Checklist
| Evaluation Area | Criteria | Findings | Recommended Action |
| --- | --- | --- | --- |
| Requirement coverage | All functional requirements met? | 5 of 12 requirements partially met | Develop missing modules; retest |
| Performance | Average response time ≤ 2 s | Average 3.4 s during peak load | Optimise database queries; consider caching |
| Usability | User satisfaction ≥ 80 % | Survey result 72 % | Redesign navigation menu; add tooltips |
| Security | No critical vulnerabilities | Two medium-risk XSS issues found | Implement input sanitisation; retest |
| Compatibility | Works on Windows, macOS, Android, iOS | Fails on iOS 13 browsers | Update CSS for older WebKit versions |
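The security row above recommends input sanitisation to deal with the XSS findings. As a minimal sketch (not the project's actual fix), user-supplied text can be escaped before it is written into a web page, here using Python's standard html module; the example comment string is invented.

```python
import html

# Minimal XSS-mitigation sketch: escape user-supplied text before it is
# embedded in an HTML page, so tags like <script> are rendered as text.

user_comment = '<script>alert("attack")</script>'
safe_comment = html.escape(user_comment)

print(safe_comment)
# &lt;script&gt;alert(&quot;attack&quot;)&lt;/script&gt;
```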
5. Steps to Document the Evaluation
Compile an Evaluation Report that includes the comparison table, user feedback analysis, and a risk assessment.
Present the report to stakeholders for approval of the improvement plan.
Update the Project Plan with new tasks, timelines, and resource allocations.
Proceed to the Implementation of Improvements phase, then repeat testing as required.
6. Key Points to Remember for the IGCSE Exam
Always link back to the original task requirements when evaluating a solution.
Distinguish between limitations (current shortcomings) and improvements (planned actions).
Use both qualitative and quantitative data to assess user responses.
Present findings in a clear, organised table – examiners look for structured answers.
Explain how the evaluation informs the next stage of the systems life cycle (maintenance or further development).