The gulf between measuring and understanding
Designers are creating processes that react not to the measured data but to the state of another complex system.
Recently some developers have partitioned this mechanistic concept into two independent processes. Sometimes, the embedded system is supposed to react not to the measured data, but to the state of another complex system. For example, a patient monitor should respond to a patient's core or brain temperature, not to the temperature of his forehead. In these cases, designers have created a separate process, called sensor fusion, to combine data from many sensors and process it into an estimate of the variable the system is really supposed to control. In our example, a design might fuse several temperature measurements from different sensors, the patient's position, pulse, and respiration, and diagnosis into a single estimate of cranial temperature.
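In its simplest form, such a fusion step might be a confidence-weighted average of the individual readings. Here is a minimal sketch in C; the sensors, weights, and temperatures are invented for illustration, not taken from any real patient monitor.

```c
#include <stddef.h>

/* Illustrative sensor fusion: combine several independent temperature
 * readings into one estimate, weighting each sensor by how much we
 * trust it. A real monitor would derive these weights from the
 * patient's position, pulse, respiration, and diagnosis. */
double fuse_temperature(const double *readings, const double *weights, size_t n)
{
    double weighted_sum = 0.0;
    double weight_total = 0.0;
    for (size_t i = 0; i < n; i++) {
        weighted_sum += readings[i] * weights[i];
        weight_total += weights[i];
    }
    /* Guard against an empty or zero-confidence sensor set. */
    return (weight_total > 0.0) ? weighted_sum / weight_total : 0.0;
}
```

Equal weights reduce this to a plain average; distrusting the forehead probe simply means giving it a smaller weight than, say, an ear-canal sensor.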
Take this a step further, to a collision-avoidance system for a car. How do you know if that reflection from the bumper-mounted radar is the truck in front of you panic-stopping, a half-ton anvil falling from the truck, a pillow bouncing out of a pickup in another lane, or a plume of water from a puddle? You take as many measurements as you can--radar, video, sonar. Perhaps you employ a rules-based fusion process to classify the threat: can’t avoid, must avoid, try to avoid, or forget it. Then you provide the vehicle control system with the classification and your best estimate of the trajectory of the object. Notice that what had been sensor data is now an object.
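A rules-based classifier of this kind can be sketched in a few lines of C. The threat classes follow the ones named above; the inputs and thresholds are hypothetical, chosen only to show the shape of the rules.

```c
/* The four threat classes named in the text. */
typedef enum {
    THREAT_FORGET_IT,   /* a pillow, a plume of spray */
    THREAT_TRY_AVOID,
    THREAT_MUST_AVOID,
    THREAT_CANT_AVOID
} threat_t;

/* Illustrative rules: classify a tracked object from an estimated
 * time to impact and a fused confidence that the object is solid.
 * The 0.2, 0.5, and 2.0 thresholds are invented for this sketch. */
threat_t classify_threat(double time_to_impact_s, double solid_confidence)
{
    if (solid_confidence < 0.2)
        return THREAT_FORGET_IT;   /* probably not a real obstacle */
    if (time_to_impact_s < 0.5)
        return THREAT_CANT_AVOID;  /* too late to maneuver */
    if (time_to_impact_s < 2.0)
        return THREAT_MUST_AVOID;
    return THREAT_TRY_AVOID;
}
```

The control system would receive this classification along with the object's estimated trajectory, which is the hand-off the paragraph above describes.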
One step more. To avoid the object you can brake or swerve. To decide, you will need more data, fused into more objects: the road, other traffic, surroundings, and pedestrians. In his cover story this month, Supreet Oberoi argues that at some point, you pass to the control system not just a list of objects, but awareness of a situation.
The system may still respond to the situation based on rules. Or it may run a dynamic model of the situation to explore responses. Or, as Jack Ganssle speculates in this issue, at some point the system ceases to classify and apply rules, or to simulate and search for an optimum outcome, and instead begins to reason about the situation. We really don't know how to define that transition. But by then, our continuum has taken us from simple state machines to robots--a new land in which the machines understand, and we do not.
Ron Wilson is the editorial director of ESD magazine, Embedded.com, and the Embedded Systems Conferences. You may reach him at email@example.com.