Describe control technologies (smart homes, traffic lights, autonomous vehicles)

3 Monitoring and Control

Objective

Describe the control technologies used in:

  • Smart homes
  • Traffic‑light systems
  • Autonomous vehicles

1. Introduction to Control Systems

A control system continuously monitors a process, compares the measured output with a desired reference, and applies a corrective action. The basic feedback loop is expressed as:

$$e(t)=r(t)-y(t)$$

  • e(t) – error signal (difference between reference and output)
  • r(t) – reference (desired) value
  • y(t) – measured output
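
A minimal Python sketch of this loop is shown below. It uses a simple proportional controller and two hypothetical placeholder functions, read_output() and apply_correction(), standing in for a real sensor and actuator.

import time

KP = 0.8             # proportional gain (illustrative value)
REFERENCE = 21.0     # desired value r(t), e.g. a target room temperature in °C

def read_output():
    """Placeholder for a real sensor reading y(t)."""
    return 19.5

def apply_correction(u):
    """Placeholder for driving an actuator with control signal u."""
    print(f"control signal: {u:.2f}")

while True:
    y = read_output()      # measured output y(t)
    e = REFERENCE - y      # error e(t) = r(t) - y(t)
    u = KP * e             # proportional corrective action
    apply_correction(u)
    time.sleep(1)          # re-measure once per second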

2. Sensor Families – Characteristics, Calibration & Pros/Cons

| Sensor Family | Physical Quantity | Typical Range & Accuracy | Common Calibration Method | Key Advantages / Disadvantages |
|---|---|---|---|---|
| Light / UV | Illuminance, UV intensity (lux, µW cm⁻²) | 0‑100 000 lux; ±5 % of reading | One‑point (reference lamp) or two‑point (dark & bright) | + Simple, low cost / – Sensitive to ageing, temperature drift |
| Temperature | Thermal temperature (°C/°F) | ‑40 °C to +125 °C; ±0.5 °C (thermistor) or ±0.1 °C (RTD) | Two‑point (ice‑water & boiling‑water) or multi‑point bath | + Wide range, inexpensive / – Self‑heating errors in high‑precision devices |
| Pressure | Gas or liquid pressure (Pa, bar) | 0‑10 bar; ±0.25 % of span (piezo‑resistive) | Zero‑point & span calibration with known pressures | + Direct measurement / – Temperature compensation often required |
| Humidity | Relative humidity (% RH) | 0‑100 % RH; ±2 % RH | Two‑point (dry & saturated salt solutions) | + Useful for HVAC / – Slow response at low temperatures |
| pH | Acidity/alkalinity (pH units) | 0‑14 pH; ±0.01 pH | Two‑point (pH 4 & pH 7 buffer solutions) | + High resolution / – Electrode drift, requires frequent cleaning |
| Gas (CO₂, CO, CH₄, etc.) | Concentration (ppm, %) | 0‑10 000 ppm; ±1 % of reading | Zero‑gas & span calibration with certified gas cylinders | + Selective sensors available / – Cross‑sensitivity to other gases |
| Sound | Acoustic pressure level (dB) | 30‑130 dB; ±1 dB | Reference sound source (e.g., 94 dB calibrator) | + Simple, inexpensive / – Highly dependent on microphone placement |
| Infrared (IR) | Heat radiation, proximity (µW cm⁻², distance) | 0‑10 m; ±2 % distance error | Black‑body source or calibrated distance target | + Non‑contact / – Affected by emissivity of target surfaces |
| Touch / Proximity | Contact or near‑field presence (binary) | ≤ 10 mm; response < 5 ms | Physical actuation of the sensor surface | + Fast response / – May be triggered by dust or moisture |
| Magnetic‑field | Magnetic flux density (µT, gauss) | 0‑500 µT; ±1 % of reading | Reference magnet with known field strength | + Works through non‑magnetic barriers / – Sensitive to nearby ferrous objects |
| Ultrasonic / Radar | Distance, speed via Doppler (m, m s⁻¹) | 0.02‑5 m; ±1 % distance, ±0.1 m s⁻¹ speed | Measured distance to a flat target at known positions | + Good for short‑range detection / – Echo clutter in noisy environments |
| LiDAR | Laser‑based distance & 3‑D mapping (m, mm) | 0.1‑200 m; ±2 cm typical | Target boards at calibrated distances | + High resolution & range / – Expensive, affected by rain/fog |

Calibration Checklist (All Sensor Families)

| Sensor Family | Zero / Reference Point | Span / Full‑Scale Point | Frequency of Re‑calibration (Typical) |
|---|---|---|---|
| Light / UV | Dark (0 lux) | Standard lamp (e.g., 10 000 lux) | Annually or after firmware update |
| Temperature | Ice‑water (0 °C) | Boiling water (100 °C at 1 atm) | Every 6 months |
| Pressure | Atmospheric (0 bar gauge) | Known pressure source (e.g., 5 bar) | Yearly |
| Humidity | Dry salt (0 % RH) | Saturated salt (≈75 % RH) | Yearly |
| pH | pH 4 buffer | pH 7 buffer | Quarterly |
| Gas | Zero‑gas (N₂ or clean air) | Certified span gas (e.g., 1000 ppm CO₂) | Before each field deployment |
| Sound | Silence (0 dB) | 94 dB calibrator | Yearly |
| IR | Black‑body at 0 °C | Black‑body at 100 °C | Every 12 months |
| Touch / Proximity | No contact | Known distance (e.g., 5 mm) | Every 6 months |
| Magnetic‑field | Zero field (shielded area) | Reference magnet (e.g., 100 µT) | Yearly |
| Ultrasonic / Radar | Zero distance (sensor face) | Target at full‑scale distance | Every 12 months |
| LiDAR | Zero distance (sensor face) | Target at calibrated distance (e.g., 100 m) | Bi‑annual |
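
Most of the calibration methods above are two‑point (zero and span) calibrations, which fit a straight line through two known reference readings. The Python sketch below illustrates the idea for a temperature sensor calibrated at the ice‑water and boiling‑water points; the raw count values are made up for illustration.

RAW_AT_ZERO = 1020       # hypothetical raw reading in ice water (0 °C)
RAW_AT_SPAN = 30870      # hypothetical raw reading in boiling water (100 °C)
TRUE_ZERO = 0.0          # °C
TRUE_SPAN = 100.0        # °C

def calibrate(raw):
    """Convert a raw reading to °C using the two calibration points."""
    slope = (TRUE_SPAN - TRUE_ZERO) / (RAW_AT_SPAN - RAW_AT_ZERO)
    return TRUE_ZERO + slope * (raw - RAW_AT_ZERO)

print(calibrate(1020))    # 0.0 °C   (zero point)
print(calibrate(30870))   # 100.0 °C (span point)
print(calibrate(15945))   # 50.0 °C  (midpoint of the raw range)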

3. Actuator Families – Types, Movements & Case‑Study Mapping

| Actuator Type | Movement / Energy Form | Typical Example | Mapped to Case Study |
|---|---|---|---|
| Linear (electric, pneumatic, hydraulic) | Straight‑line motion | Electric linear actuator – motorised window blind | Smart‑home (blind control) & traffic‑light barrier (hydraulic lift) |
| Rotary (electric motor, stepper, servo) | Rotational motion | Servo motor – thermostat valve | Smart‑home (valve), autonomous vehicle (steering motor) |
| Soft (electro‑active polymer, shape‑memory alloy) | Flexible deformation | Soft gripper on a robotic arm | Future smart‑home adaptive shading; prototype autonomous‑vehicle interior adjustments |
| Hydraulic | Fluid‑power generated force | Hydraulic ram – car‑park barrier | Traffic‑light system (barrier) & autonomous‑vehicle active suspension (concept) |
| Pneumatic | Compressed‑air generated force | Air‑powered door closer | Traffic‑light pedestrian push‑button mechanism |
| Electric (solenoid, heating element) | Electromagnetic pull/push or resistive heating | Solenoid dead‑bolt lock | Smart‑home security; traffic‑light signal head actuation (electromechanical) |
| Thermal (thermostatic valve) | Temperature‑controlled flow | Radiator valve in central heating | Smart‑home heating control |
| Magnetic (maglev, magnetic clutch) | Magnetic force or torque | Maglev train levitation | Autonomous‑vehicle magnetic‑brake prototype |
| Mechanical (gears, cams, levers) | Pure mechanical transmission | Cam‑operated valve | Traffic‑light timing cam; smart‑home mechanical timer |

4. Smart‑Home Control Technologies

  • Sensors: temperature, humidity, light/UV, motion (PIR), door/window contacts, CO₂/gas, sound, IR proximity.
  • Actuators: smart thermostat (rotary valve), motorised blinds (linear), LED drivers (electrical), electronic door locks (solenoid), smart plugs (relay).
  • Communication protocols: Wi‑Fi, Zigbee, Z‑Wave, Thread, Bluetooth Low Energy (BLE).
  • Control platforms: cloud‑based assistants (Amazon Alexa, Google Assistant), local hubs (Home Assistant, Samsung SmartThings).
  • Control logic (a small code sketch follows this list):
    • Rule‑based schedules (e.g., lights on at sunset).
    • Occupancy‑based optimisation using motion sensors.
    • AI‑driven energy‑saving algorithms that learn daily habits.
  • Safety & security features: fire‑alarm integration, door‑contact alerts, remote lock control, tamper detection.
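
As a concrete illustration of the rule‑based and occupancy‑based logic above, the Python sketch below switches a light on when motion is detected in a dim room and off after a period with no motion. read_motion(), read_lux() and set_light() are hypothetical device interfaces, and the thresholds are illustrative.

import time

LUX_THRESHOLD = 50     # only switch on when the room is dim
HOLD_TIME_S = 300      # keep the light on for 5 minutes after the last motion

def read_motion():     # hypothetical PIR-sensor interface
    return False

def read_lux():        # hypothetical light-sensor interface
    return 120

def set_light(on):     # hypothetical smart-plug / LED-driver interface
    print("light on" if on else "light off")

last_motion = 0.0
light_on = False

while True:
    now = time.time()
    if read_motion():
        last_motion = now
        if not light_on and read_lux() < LUX_THRESHOLD:
            light_on = True
            set_light(True)            # occupancy + low light: switch on
    if light_on and now - last_motion > HOLD_TIME_S:
        light_on = False
        set_light(False)               # no motion for HOLD_TIME_S: switch off
    time.sleep(1)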

Performance & Latency Box – Smart Home

Typical latency requirement: seconds to minutes (e.g., thermostat response < 5 s, lighting < 200 ms).
Bandwidth: < 1 Mbps per node (most devices send small status packets).
Reliability target: 99.9 % uptime for security‑related devices.

Sample Pseudocode – Smart‑Home Heating Control

LOOP every 5 seconds
    IF (Time = 06:00) THEN
        targetTemp = 21°C                                      // morning set‑point
    END IF

    READ currentTemp FROM TempSensor
    error = targetTemp - currentTemp

    IF (error > 0.5) THEN
        ACTIVATE heatingValve (open proportionally to error)   // PWM to rotary actuator
    ELSE IF (error < -0.5) THEN
        DEACTIVATE heatingValve
    END IF
END LOOP                                                       // feedback loop repeats every 5 seconds
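
The same logic as a runnable Python sketch, assuming hypothetical read_temperature() and set_valve() device functions and a ±0.5 °C deadband around the set‑point:

import datetime
import time

DEADBAND = 0.5                 # °C hysteresis around the set-point
target = 18.0                  # °C default set-point outside the morning schedule

def read_temperature():        # hypothetical temperature-sensor interface
    return 19.2

def set_valve(fraction):       # hypothetical rotary-valve interface (0.0 closed, 1.0 fully open)
    print(f"valve opening: {fraction:.0%}")

while True:
    if datetime.datetime.now().hour >= 6:   # simple schedule: 21 °C from 06:00
        target = 21.0
    error = target - read_temperature()
    if error > DEADBAND:
        set_valve(min(error / 5.0, 1.0))    # open proportionally to the error
    elif error < -DEADBAND:
        set_valve(0.0)                      # too warm: close the valve
    time.sleep(5)                           # feedback loop repeats every 5 s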

5. Traffic‑Light Control Systems

  • Detection methods: inductive‑loop detectors, video image processing, radar, infrared, acoustic sensors.
  • Actuators / Outputs: electromechanical signal heads (red, amber, green), pedestrian crossing displays, audible beepers.
  • Control logic levels:
    1. Fixed‑time control – pre‑programmed cycle lengths.
    2. Actuated control – phase extensions based on real‑time detector data.
    3. Adaptive control – network‑wide optimisation (e.g., SCOOT, SCATS, InSync).
  • Communication: Dedicated Short‑Range Communications (DSRC), cellular IoT (4G/5G), fibre‑optic links to central traffic‑management centres.
  • Safety extensions: emergency‑vehicle pre‑emption, pedestrian push‑buttons, flashing/amber‑caution modes.

Performance & Latency Box – Traffic Lights

Latency requirement: ≤ 1 s for detector‑to‑signal change; ≤ 3 s for adaptive network re‑optimisation.
Bandwidth: 10‑100 kbps per intersection (mostly status & control messages).
Reliability target: 99.99 % (failure leads to fail‑safe flashing mode).

Sample Flowchart – Actuated Traffic‑Light Cycle

[Flowchart placeholder] Detect vehicle → extend green → check pedestrian request → change phase.
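
A simplified Python sketch of this actuated cycle is given below. vehicle_detected(), pedestrian_requested() and set_phase() are hypothetical stand‑ins for the detector inputs and signal‑head outputs, and the timing values are illustrative.

import time

MIN_GREEN_S = 10     # minimum green time for the active phase
MAX_GREEN_S = 60     # cap on green extensions
EXTENSION_S = 3      # extra green granted per detected vehicle

def vehicle_detected():        # hypothetical inductive-loop / radar input
    return False

def pedestrian_requested():    # hypothetical push-button flag
    return False

def set_phase(phase):          # hypothetical signal-head output
    print(f"phase -> {phase}")

while True:
    set_phase("GREEN")
    green_time = MIN_GREEN_S
    elapsed = 0
    while elapsed < green_time:
        time.sleep(1)
        elapsed += 1
        # Actuated control: extend green while vehicles keep arriving,
        # but never beyond MAX_GREEN_S and never past a pedestrian request.
        if vehicle_detected() and not pedestrian_requested():
            green_time = min(green_time + EXTENSION_S, MAX_GREEN_S)
    set_phase("AMBER")
    time.sleep(3)
    set_phase("RED")
    time.sleep(5)    # time for the conflicting phase / pedestrian crossing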

6. Autonomous Vehicle Control Technologies

  • Sensing suite: LiDAR, radar, stereo/mono cameras, ultrasonic sensors, GNSS (GPS/GLONASS) + Inertial Navigation System (INS), infrared night‑vision.
  • Perception algorithms: sensor fusion, object detection and tracking, lane detection for lane keeping, semantic segmentation, localisation (SLAM).
  • Decision‑making layers:
    1. Strategic layer – route planning using digital maps.
    2. Tactical layer – manoeuvre selection (lane change, overtaking, intersection handling).
    3. Operational layer – low‑level control of steering, throttle, braking, signalling (typically via CAN‑bus).
  • Control logic: hybrid of rule‑based traffic‑law compliance and machine‑learning models for prediction and risk assessment.
  • Actuators: electric power‑steering motor (rotary), drive‑by‑wire throttle (linear/electric), electronic braking system (hydraulic‑electric), indicator lights (electrical).
  • Communication: Vehicle‑to‑Vehicle (V2V), Vehicle‑to‑Infrastructure (V2I), Vehicle‑to‑Network (V2N) via 5G/Cellular‑V2X, plus internal CAN/LIN buses.
  • Safety‑critical aspects: redundancy, fail‑safe states, real‑time latency constraints (≤ 50 ms for emergency braking, ≤ 100 ms for steering decisions).
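
A quick worked example shows why the 50 ms budget matters: at 100 km/h (about 27.8 m s⁻¹) the vehicle travels

$$d = v \cdot t_{\text{latency}} = 27.8\ \text{m s}^{-1} \times 0.05\ \text{s} \approx 1.4\ \text{m}$$

before the brakes even begin to act, in addition to the normal braking distance.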

Performance & Latency Box – Autonomous Vehicles

Latency requirement: ≤ 50 ms for braking, ≤ 100 ms for steering, ≤ 200 ms for lane‑keep assistance.
Bandwidth: 10‑100 Mbps per vehicle (high‑definition camera streams, LiDAR point clouds).
Reliability target: 99.999 % for safety‑critical functions (ISO 26262 ASIL D).

Sample Pseudocode – Operational Layer (Emergency Braking)

LOOP every 20 ms
    sensorData = fuse(LiDAR, radar, camera)
    obstacles = detectObjects(sensorData)
    FOR EACH obstacle IN obstacles
        IF (obstacle.distance < 2 m AND obstacle.relativeSpeed > 5 m/s) THEN
            brakeCommand = computeBrakingForce(obstacle)
            SEND brakeCommand TO electronicBrakeActuator   // CAN‑bus message
        END IF
    END FOR
END LOOP
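
The same operational‑layer loop as a runnable Python sketch. fuse_sensors(), detect_objects() and send_brake_command() are hypothetical stand‑ins for the real perception stack and CAN‑bus interface; the 2 m and 5 m/s thresholds are taken from the pseudocode above.

import time
from dataclasses import dataclass

DISTANCE_THRESHOLD_M = 2.0     # brake if an obstacle is closer than this
CLOSING_SPEED_MPS = 5.0        # ...and closing faster than this
CYCLE_S = 0.02                 # 20 ms control cycle

@dataclass
class Obstacle:
    distance: float            # metres to the obstacle
    relative_speed: float      # m/s, positive when closing in

def fuse_sensors():            # hypothetical LiDAR/radar/camera fusion
    return []

def detect_objects(sensor_data):
    # Hypothetical detector; returns a fixed obstacle only when data is present.
    return [Obstacle(distance=1.5, relative_speed=6.0)] if sensor_data else []

def compute_braking_force(obstacle):
    # Crude illustration: brake harder the closer and faster the obstacle.
    return min(1.0, obstacle.relative_speed / (obstacle.distance + 0.1))

def send_brake_command(force):     # hypothetical CAN-bus interface
    print(f"brake force: {force:.2f}")

while True:
    sensor_data = fuse_sensors()
    for obstacle in detect_objects(sensor_data):
        if obstacle.distance < DISTANCE_THRESHOLD_M and obstacle.relative_speed > CLOSING_SPEED_MPS:
            send_brake_command(compute_braking_force(obstacle))
    time.sleep(CYCLE_S)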

7. Comparative Overview

| Aspect | Smart Home | Traffic‑Light System | Autonomous Vehicle |
|---|---|---|---|
| Primary Goal | Comfort, energy efficiency, security | Safe and efficient traffic flow | Self‑driving transport with high safety |
| Typical Sensors | Light/UV, temperature, humidity, motion, door contacts, gas, sound, IR | Inductive loops, video cameras, radar, infrared, acoustic | LiDAR, radar, cameras, ultrasonic, GNSS/INS, IR |
| Typical Actuators | Thermostat valve, motorised blinds, LED driver, solenoid lock | Electromechanical signal heads, pedestrian displays | Steering motor, drive‑by‑wire throttle, electronic brakes, indicators |
| Control Logic | Rule‑based schedules, AI optimisation, occupancy detection | Fixed‑time, actuated, adaptive network algorithms | Hybrid rule‑based + ML decision‑making across three layers |
| Communication Protocols | Wi‑Fi, Zigbee, Z‑Wave, Thread, BLE | DSRC, cellular IoT, fibre‑optic back‑haul | V2V, V2I, 5G, CAN/LIN bus |
| Safety Criticality | Low–medium (fire alarm, lock control) | High – intersection collision avoidance | Very high – passenger & public safety, real‑time fail‑safe |
| Latency Requirements | Seconds to minutes (energy optimisation) | Sub‑second to a few seconds (phase change) | ≤ 50 ms for braking, ≤ 100 ms for steering |

8. Summary

Control technologies follow the hierarchy sensors → processing → actuators → feedback. Knowing the main sensor families and how to calibrate them, and being able to map actuator families to real‑world examples, enables students to design reliable monitoring and control solutions for the smart‑home, traffic‑light and autonomous‑vehicle domains.

9. Suggested Classroom Activities

  1. Create a detailed flowchart for a smart‑home heating scenario (temperature sensor → PID controller → motorised valve → room‑temperature feedback).
  2. Visit a local road intersection, identify the detector types present, and design an adaptive‑control algorithm that could raise vehicle throughput by ≥ 10 %.
  3. Research a current autonomous‑vehicle platform (Waymo, Tesla, Cruise) and produce a diagram linking each sensor to the strategic, tactical and operational decision layers.
  4. Design an experiment to calibrate a temperature sensor using the ice‑water (0 °C) and boiling‑water (100 °C) method; analyse how a ±0.5 °C calibration error would affect a smart‑home thermostat’s energy consumption.
  5. Calculate the maximum allowable delay for safe operation in each technology (fire‑alarm activation, traffic‑light phase change, emergency‑brake in a vehicle) and compare the resulting latency budgets.
Suggested diagram set: three parallel block diagrams showing the feedback loop for (a) a smart‑home lighting system, (b) a traffic‑light controller, and (c) an autonomous‑vehicle steering controller. Each diagram should label sensors, controller, actuator, and communication link.
