Know and understand the characteristics, uses, advantages and disadvantages of the four methods of implementation: direct changeover, parallel running, pilot running and phased implementation.

7.4 Implementation (life‑cycle stage) – AO2

AO2 (Apply knowledge): You must be able to select and apply an appropriate implementation method, produce the required documentation and justify your choice in an exam‑style answer.

Prerequisite recap (stages 7.1‑7.3, 11‑16)

Before implementation the following artefacts should already exist:

  • Analysis outputs: feasibility study, requirements specification, stakeholder matrix.
  • Design outputs: data‑flow diagrams, entity‑relationship diagrams, system architecture diagram, user‑interface mock‑ups.
  • Testing outputs: test plan, test cases, test report showing pass/fail results, defect log.
  • Documentation outputs: draft user guide, technical specifications, data‑migration strategy.

These documents feed directly into the implementation plan and are referenced when producing the final documentation (see below).

Documentation needed before go‑live (AO2)

All documentation must be clear, fit for purpose and ready for the chosen implementation method.

  • Technical documentation

    • System architecture diagram
    • Data‑migration scripts (with comments)
    • Integration points & API specifications
    • Backup & recovery procedures

  • User documentation

    • User guide (example excerpt below)
    • Quick‑reference sheets
    • Training material (slides, hands‑on exercises)

  • Change‑over plan

    • Timetable of activities
    • Roles & responsibilities
    • Contingency actions (fallback procedures)

  • Risk & e‑safety register (example template below)

Sample user‑guide excerpt (school timetable system)

1. Logging in

• Enter your staff ID and password.

• Click “Sign In”. If you forget your password, click “Reset”.

2. Creating a new class

• Go to “Timetable → Add Class”.

• Select Subject, Teacher, Room, Day and Period.

• Click “Save”. A confirmation message will appear.

Sample risk & e‑safety register (template)

| Risk | Likelihood | Impact | Mitigation | Owner |
| --- | --- | --- | --- | --- |
| Data loss during migration | Medium | High | Run full backup; verify checksum after migration | IT Manager |
| Unauthorised access to new system | Low | Medium | Enforce strong passwords; audit logs weekly | Security Officer |
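The mitigation "verify checksum after migration" can be automated. The sketch below is one possible approach (the function names and file paths are illustrative, not part of the syllabus): it hashes the source export and the migrated copy and confirms they are byte-for-byte identical.

```python
import hashlib

def file_checksum(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 8 KB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_migration(source_export: str, migrated_copy: str) -> bool:
    """True if the migrated copy is byte-for-byte identical to the source export."""
    return file_checksum(source_export) == file_checksum(migrated_copy)
```

A matching checksum proves the copy step was faithful; it does not prove the exported data itself was correct, which is why parallel verification is still recommended for critical systems.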

Four methods of implementation (AO2)

1. Direct Change‑over (Big‑Bang)

  • Characteristics: All users switch from the old system to the new system at a single, pre‑planned moment.
  • Typical uses: Small organisations, low‑risk applications, or when the legacy system is no longer functional.
  • Advantages

    • Very fast – no period of dual operation.
    • Low cost – no need to maintain two systems.
    • Clear cut‑over point simplifies communication.

  • Disadvantages

    • High risk – any failure leaves the business without a working system.
    • Little opportunity to correct problems after go‑live.
    • Staff may be unprepared for a sudden change.
    • Data‑migration errors are harder to detect without parallel verification.

  • Case‑study example: A small private school replaces its paper‑based attendance register with a simple spreadsheet. The switch is made on the first day of the new term – all teachers start using the spreadsheet immediately.

2. Parallel Running

  • Characteristics: Old and new systems run side‑by‑side for a defined period; data is entered into both.
  • Typical uses: Critical systems (payroll, finance, student records) where downtime would be disastrous.
  • Advantages

    • Safety net – the old system can take over if the new system fails.
    • Real‑time testing with live data identifies hidden errors.
    • Provides ample time for staff training and confidence building.
    • Data‑security can be checked by comparing outputs from both systems.

  • Disadvantages

    • Higher cost – two systems must be maintained and supported.
    • Risk of data entry errors and inconsistencies between the two databases.
    • Longer implementation period and more complex coordination.

  • Case‑study example: A secondary school introduces a new electronic grading system. For one term both the old paper‑based grade book and the new system are used; grades entered in the new system are cross‑checked against the paper records before the old system is retired.
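The cross-checking in this case study could be scripted rather than done by hand. A minimal sketch, assuming each system's grades can be exported as a dictionary keyed by student ID (the data layout is an assumption for illustration):

```python
def cross_check(old_records: dict, new_records: dict) -> list:
    """Report discrepancies between the old and new grading systems.

    Both inputs map a student ID to a grade. Returns a sorted list of
    (student_id, old_grade, new_grade) tuples for every mismatch,
    including records present in only one system (the missing side is None).
    """
    mismatches = []
    for student_id in sorted(old_records.keys() | new_records.keys()):
        old_grade = old_records.get(student_id)
        new_grade = new_records.get(student_id)
        if old_grade != new_grade:
            mismatches.append((student_id, old_grade, new_grade))
    return mismatches
```

An empty result list is the evidence needed before the old system is retired; any tuples returned point staff at the exact records to investigate.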

3. Pilot Running (Trial Implementation)

  • Characteristics: The new system is introduced to a limited area, department or user group first; feedback is used to refine the system.
  • Typical uses: Large organisations, systems with varied user requirements, or when the impact of the new system is uncertain.
  • Advantages

    • Problems are identified early in a low‑risk environment.
    • Customisation can be based on real user feedback.
    • Disruption to the wider organisation is limited.
    • Allows a focused e‑safety check on data migration for the pilot group.

  • Disadvantages

    • Full benefits are delayed until the pilot phase is completed.
    • Success depends on selecting a representative pilot group.
    • Risk of “pilot‑itis” – endless tweaking before wider release.

  • Case‑study example: The school’s senior leadership team pilots the new timetable system for the sixth‑form only. After two weeks they report minor UI issues, which are fixed before the system is rolled out to the whole school.

4. Phased Implementation (Staged Roll‑out)

  • Characteristics: The new system is introduced in stages, either by function (e.g., admissions first) or by geography (e.g., one campus at a time).
  • Typical uses: Large, complex systems; organisations with multiple sites or departments.
  • Advantages

    • Costs and resource demands are spread over time.
    • Users adapt gradually, reducing resistance.
    • Problems can be isolated to a specific phase and corrected before the next roll‑out.
    • Data‑security checks can be performed phase by phase, limiting exposure.

  • Disadvantages

    • Overall implementation period is longer.
    • Requires careful integration between old and new modules during overlap.
    • Inconsistent processes may arise if phases are not well coordinated.

  • Case‑study example: The school implements the new timetable system in three phases: (1) Primary school, (2) Middle school, (3) Senior school. Each phase runs for one term, allowing staff to become familiar before the next group switches.

Decision‑making tip (AO2)

When selecting an implementation method ask yourself:

  1. What is the risk level of the system (critical vs. non‑critical)?
  2. What are the budget and resource constraints?
  3. How ready are the users (training needs, resistance to change)?
  4. What e‑safety / data‑security issues are involved in migration?
  5. Does the organisation need a quick benefit (direct change‑over) or can it tolerate a slower roll‑out (parallel, pilot, phased)?
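These questions can be summarised as a rough rule of thumb. The helper below is only a sketch of the decision logic, not an official mark-scheme rule; in an exam answer you would weigh the factors together and justify the choice in prose.

```python
def suggest_method(critical: bool, tight_budget: bool,
                   users_ready: bool, multi_site: bool) -> str:
    """Rough heuristic mapping the decision questions to one of the four methods.

    The ordering of the rules is illustrative: real selections weigh
    risk, budget, user readiness and scale against each other.
    """
    if critical and not tight_budget:
        return "parallel running"       # safety net justifies running two systems
    if multi_site:
        return "phased implementation"  # roll out one site or function at a time
    if not users_ready:
        return "pilot running"          # trial with a representative group first
    return "direct changeover"          # small, low-risk: switch in one go
```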

Comparison of implementation methods

Key factors for the four implementation methods
| Method | Risk level | Cost implication | Implementation speed | Typical use cases | e‑safety / data‑security issues |
| --- | --- | --- | --- | --- | --- |
| Direct change‑over | High | Low | Very fast | Small, low‑risk systems; legacy system no longer functional | Data‑migration errors hard to detect; no parallel verification |
| Parallel running | Low | High | Slow | Critical, high‑risk systems (payroll, finance, student records) | Allows cross‑checking of data between old and new systems |
| Pilot running | Medium | Medium | Moderate | Large organisations; uncertain impact; varied user requirements | Data‑security tested on a limited group before full roll‑out |
| Phased implementation | Medium | Medium | Gradual | Complex, multi‑site or multi‑function systems | Security checks performed phase‑by‑phase, limiting exposure |

7.6 Evaluation (post‑implementation) – AO3

AO3 (Evaluate): You must be able to assess the success of the implemented solution against the original requirements, using evidence and making justified recommendations.

Post‑implementation activities

  • Provide on‑site user training and a help‑desk during the initial weeks.
  • Monitor system performance; compare actual output with the functional and non‑functional requirements.
  • Update technical and user documentation to reflect any changes made during go‑live.
  • Conduct a formal evaluation (see checklist below).
  • Establish a maintenance schedule, backup routine and disaster‑recovery plan.

Evaluation checklist (AO3 – three criteria)

  1. Efficiency: Does the system process data faster or use fewer resources than the old system? Evidence: processing times, CPU/memory usage, cost savings.
  2. Ease of use: Is the system intuitive for the intended users? Evidence: user‑survey results, number of support tickets, training time required.
  3. Appropriateness: Does the system meet the functional requirements and solve the original problem? Evidence: requirement‑traceability matrix, error‑rate comparison, stakeholder feedback.

Example evaluation report (≈150 words)

Evaluation of the new school timetable system

The system meets 95% of the functional requirements; only the “automatic clash‑resolution” feature was omitted due to time constraints. Processing a full timetable now takes 3 seconds compared with 45 seconds in the legacy spreadsheet (efficiency gain of 93%). User surveys (n=30) show a satisfaction rating of 4.2/5, with most teachers finding the drag‑and‑drop interface intuitive (ease of use). Data integrity checks reveal a 0.1% discrepancy between old and new records, which has been rectified. Overall, the system is appropriate for the school’s needs, delivers significant efficiency improvements and is well‑received by staff. Recommended actions: develop the missing clash‑resolution module and schedule a refresher training session before the next term.
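The efficiency figure quoted in the report follows directly from the two timings. As a quick check of the arithmetic (45 s legacy vs 3 s new):

```python
def efficiency_gain(old_seconds: float, new_seconds: float) -> int:
    """Percentage reduction in processing time, rounded to the nearest whole percent."""
    return round((old_seconds - new_seconds) / old_seconds * 100)

# Figures from the evaluation report: 45 s (legacy spreadsheet) vs 3 s (new system)
# (45 - 3) / 45 * 100 = 93.3…, which rounds to the 93% quoted above.
```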

Steps to produce the evaluation (exam guidance)

  1. Briefly restate the original objectives and success criteria.
  2. Present evidence for each AO3 criterion (use tables or bullet points).
  3. Identify any shortcomings and explain why they occurred.
  4. Make realistic, justified recommendations for improvement.
  5. Conclude with a short statement on overall success.

Link to AO2 and AO3 in the exam

When answering a question on implementation you may be asked to:

  • AO2 – Choose the most suitable method, justify the choice, and produce a change‑over plan or risk register.
  • AO3 – Evaluate the implementation using the three criteria above and suggest further actions.