Constructing a robot back in the “good old days” was a difficult, error-prone and time-consuming process. Sensing the environment was achieved with devices built from discrete components, many of which were never designed to function effectively together.
The processors were small and lacked the power to gather information from multiple sensors and then process it.
|Figure 1: Shown is an ultrasonic range sensor.|
As an example, let's look at an ultrasonic range sensor (Figure 1, above). Building a sensor most likely involved purchasing transducers from a camera company. One would then build some interface circuitry to send out the pulse and time the return.
The interface to the robot's processor would have consisted of an output signal indicating when to start taking a measurement and an input of the elapsed count on a timer detecting the echo (Figure 2, below).
|Figure 2: By allowing the processor to handle the echo return, complex multiple-echo processing algorithms could be developed.|
The processor would take the elapsed time and convert it to a distance. The hardware became even more complex if it needed to handle multiple echoes.
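The conversion step can be sketched as follows. The microsecond units and the room-temperature speed-of-sound constant are illustrative assumptions; a real design would also compensate for temperature:

```python
# Sketch: converting an ultrasonic echo's round-trip time to a distance.
# Assumes the timer counts microseconds and sound travels ~343 m/s
# (air at about 20 °C).

SPEED_OF_SOUND_M_PER_S = 343.0

def echo_time_to_distance_mm(elapsed_us: float) -> float:
    """Convert a round-trip echo time (microseconds) to a one-way
    distance (millimeters). The pulse travels out and back, so the
    elapsed time is halved."""
    distance_m = (elapsed_us * 1e-6) * SPEED_OF_SOUND_M_PER_S / 2.0
    return distance_m * 1000.0

# A round trip of roughly 5.83 ms corresponds to an object about 1 m away.
print(round(echo_time_to_distance_mm(5831), 0))  # → 1000.0
```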
This design struggled with more than hardware and software limitations: because everything was created in-house, both cost and time-to-market increased. Processors became more powerful with time and eventually reclaimed the processing that had been delegated out to discrete hardware.
By allowing the processor to handle the echo return, complex multiple-echo processing algorithms could be developed.
Many of the algorithms that are now common were just being invented. This reduced some of the complexity in the hardware, thus reducing cost. The software programming process, however, was still time-consuming. Most of the hardware was custom-built, and so were the software drivers that interfaced to it.
As the software grew more complex, it taxed the processors of the day. Often, this was solved by using multiple processors, which opened up whole new potential for race conditions, deadlocks and difficult-to-duplicate problems.
|Figure 3: Many sensors are now designed to communicate using these common buses, which simplify interfacing.|
Current state of the world
Today, it is fairly common to use an off-the-shelf MCU or microprocessor board equipped with various readily available hardware peripherals. Many of these peripherals provide hardware interface assistance, such as timers and communication buses.
As shown in Figure 3, above, some common communication buses are RS-232, USB, I2C and CAN. The availability of common drivers for these interfaces eases the software implementation burden. Many sensors are now designed to communicate using these common buses, which simplify interfacing.
Processing power has also moved into many sensor components, allowing data to be gathered at a higher level of abstraction. Instead of communicating the number of milliseconds for sending and receiving a sonar echo, the sensor reports the distance to an object in millimeters.
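On the host side, consuming such a high-level report can be as simple as parsing a short message off the bus. The `DIST=<mm>` frame format below is purely hypothetical, invented here for illustration; any real smart sensor defines its own protocol:

```python
def parse_distance_report(line: bytes) -> int:
    """Parse a hypothetical smart-sensor report like b'DIST=1234\r\n'
    into an integer distance in millimeters.
    Raises ValueError on a malformed or unexpected frame."""
    text = line.decode("ascii").strip()
    key, sep, value = text.partition("=")
    if not sep or key != "DIST":
        raise ValueError(f"unexpected report: {text!r}")
    return int(value)

# The main processor gets millimeters directly — no echo timing required.
print(parse_distance_report(b"DIST=1234\r\n"))  # → 1234
```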
The gathered data is processed more efficiently. This relieves the main processor from handling the low-level calculations, allowing it to take on higher-level tasks such as localization and mapping.
With much of the sensor interface being off-the-shelf (e.g. communication link, software drivers, algorithms to handle the sensed data), engineers can develop and deliver solutions more rapidly, gaining a time-to-market advantage. The burden of developing these robotic functions is moved from the robot developer to the sensor supplier.
|Figure 4: Matching an infrared distance sensor with sonar allows a range of materials and situations to be detected in such a way that neither device could accomplish on its own.|
Sensor systems will continue to be affected by the growth of low-cost processing power and data-processing algorithms. One area greatly affected by this growth is “sensor fusion,” in which data streams gathered from multiple sensors are processed together to produce an intelligent and accurate information stream.
The sensor data is being “fused” together into a single view of the environment. Matching an infrared distance sensor with sonar allows a range of materials and situations to be detected in such a way that neither device could accomplish on its own.
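One common way to fuse two distance readings is an inverse-variance weighted average, where the noisier sensor is trusted less. This is a minimal sketch of that general technique, not a description of any specific product; the variance figures are made up for illustration:

```python
def fuse_readings(readings):
    """Inverse-variance weighted average of (distance_mm, variance) pairs.
    A sensor with a smaller variance (less noise) pulls the fused
    estimate more strongly toward its own reading."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    return sum(w * d for w, (d, _) in zip(weights, readings)) / total

# Sonar says 980 mm (noisy); infrared says 1010 mm (tighter spread).
# The fused estimate lands closer to the more trustworthy IR reading.
sonar = (980.0, 400.0)  # (reading in mm, variance in mm^2) — illustrative
ir = (1010.0, 100.0)
print(round(fuse_readings([sonar, ir]), 1))  # → 1004.0
```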
Facial recognition. Maturing software algorithms open the door to exciting areas, including facial recognition. Just a few years ago, the processing power was not available to consider doing this in real time. Now, there are products available that can process faces in a crowd in real time. Soon, the sensor system won't merely report “object 2m in front,” but rather “Bob is 2m in front.”
Localization, mapping. This is another technology area that has seen increased interest in recent years. There are multiple off-the-shelf implementations of simultaneous localization and mapping (SLAM) algorithms available either free or for a minimal charge. This trend is occurring in many software areas and will continue.
Stereo vision. Exciting growth is also being seen in stereo vision. The amount of data that a single camera generates can be enormous, and stereo vision adds to that by requiring two cameras in operation. This was only a remote possibility until communication links, processing power and software algorithms matured. Today, off-the-shelf systems are available that can do distance detection in a limited environment.
As these systems continue to improve, their accuracy and speed will make them a viable alternative to other forms of distance measurement. A “fused” system of ultrasonic, infrared and stereo vision will be able to function in virtually any sort of environment.
In the future, sensor technology integration will continue to mature. The number of sensors that a robot can efficiently process will follow a growth curve similar to the rate of transistor integration predicted by Moore's Law.