New sensor and MCU technologies are key to next-gen robots
Constructing a robot back in the "good old days" was a difficult, error-prone and time-consuming process. Sensing the environment was achieved with devices built from discrete components, many of which were never designed to function together effectively.
The processors of the day were small and lacked the power to gather information from multiple sensors and then process it.
|Figure 1: Shown is an ultrasonic range sensor.|
As an example, let's look at an ultrasonic range sensor (Figure 1, above). Building a sensor most likely involved purchasing transducers from a camera company. One would then build some interface circuitry to send out the pulse and time the return.
The interface to the robot's processor would have consisted of an output signal indicating when to start taking a measurement and an input of the elapsed count on a timer detecting the echo (Figure 2, below).
|Figure 2: By allowing the processor to handle the echo return, complex multiple echo processing algorithms could be developed.|
The processor would take the elapsed time and convert this to a distance. The hardware became even more complex if it needed to handle multiple echoes.
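The elapsed-time-to-distance conversion described above can be sketched as follows. This is a minimal illustration, not any particular product's firmware: the function names are invented, and a real implementation would compensate for air temperature, which shifts the speed of sound.

```python
# Sketch: converting ultrasonic echo times to distances.
# Sound travels roughly 343 m/s in air at 20 C. The measured time covers
# the round trip to the object and back, so the one-way distance is half
# the total path length.

SPEED_OF_SOUND_M_S = 343.0

def echo_time_to_distance(elapsed_s: float) -> float:
    """Return the one-way distance in meters for a round-trip echo time."""
    return elapsed_s * SPEED_OF_SOUND_M_S / 2.0

def distances_from_echoes(echo_times_s: list[float]) -> list[float]:
    """Convert several echo returns (multiple objects) to distances."""
    return [echo_time_to_distance(t) for t in echo_times_s]
```

Handling a list of echo times is where the multiple-echo complexity mentioned above shows up: each later return corresponds to a more distant (or differently angled) surface.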
Beyond its hardware and software limitations, everything in this design was created in-house, increasing cost and time-to-market. Processors became more powerful over time and eventually reclaimed the processing that had been delegated to discrete hardware.
By allowing the processor to handle the echo return, complex multiple echo processing algorithms could be developed.
Many of the algorithms that are now common were just being invented. This reduced some of the complexity in the hardware, thus reducing cost. The software programming process, however, was still time-consuming. Most of the hardware was custom-built and so were the software drivers that interfaced to it.
As the software grew more complex, it taxed the processors of the day. Often, this was solved by using multiple processors that opened whole new potentials for race conditions, deadlocks and difficult-to-duplicate problems.
|Figure 3: Many sensors are now designed to communicate using these common buses, which simplify interfacing.|
Current state of the world
Today, it is fairly common to use an off-the-shelf MCU or microprocessor board equipped with various readily available hardware peripherals. Many of these peripherals provide hardware interface assistance such as timers and communication buses.
As shown in Figure 3, above, some common communication buses are RS-232, USB, I2C or CAN bus. The availability of common drivers for these interfaces eases the software implementation burden. Many sensors are now designed to communicate using these common buses, which simplify interfacing.
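To illustrate how simple these common-bus interfaces can make things, here is a minimal parser for a hypothetical ASCII rangefinder frame of the form `R<millimeters>\r` arriving over a serial link. The frame format is an assumption for illustration, not any particular sensor's protocol:

```python
# Sketch: parsing a distance reading from a hypothetical ASCII serial
# protocol -- the letter "R" followed by a distance in millimeters and a
# carriage return, e.g. "R1234\r". A bus-connected rangefinder might emit
# frames like this over RS-232 or USB.

def parse_range_frame(frame: str) -> int:
    """Return the distance in millimeters encoded in an 'R<mm>' frame."""
    frame = frame.strip()
    if not frame.startswith("R") or not frame[1:].isdigit():
        raise ValueError(f"malformed frame: {frame!r}")
    return int(frame[1:])
```

With a stock serial driver delivering the bytes, this handful of lines replaces the custom timing hardware and hand-built drivers of the earlier era.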
Processing power has also moved into many sensor components, allowing data to be gathered at a higher level of abstraction. Instead of communicating the number of milliseconds between sending and receiving a sonar echo, the sensor reports the distance to an object in standard units of length.
The gathered data is processed more efficiently. This relieves the main processor from handling the low-level calculations, allowing it to take on higher-level tasks such as localization and mapping.
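One way to picture this shift is a "smart" sensor that exposes a computed distance register rather than raw timer counts. The register addresses below are invented, and a plain dictionary stands in for an I2C bus transaction; this is a sketch of the abstraction, not a real driver:

```python
# Sketch: the host-side view of a "smart" range sensor. The sensor does
# the echo-timing math internally and publishes the result in a register;
# the host performs one bus read and gets a distance directly.

RAW_TIMER_REG = 0x00      # hypothetical register addresses
DISTANCE_CM_REG = 0x02

class SmartRangeSensor:
    def __init__(self, registers: dict[int, int]):
        # A dict stands in for the I2C register read in this sketch.
        self._regs = registers

    def read_distance_cm(self) -> int:
        # One bus read; no low-level conversion on the main processor.
        return self._regs[DISTANCE_CM_REG]
```

The main processor never touches `RAW_TIMER_REG`, which is exactly the point: the low-level calculation has moved into the sensor.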
With much of the sensor interface being off-the-shelf (e.g. communication link, software drivers, algorithms to handle the sensed data), engineers can develop and deliver solutions more rapidly, gaining a time-to-market advantage. The burden of developing these robotic functions is moved from the robot developer to the sensor supplier.
|Figure 4: Matching an infrared distance sensor with sonar allows a range of materials and situations to be detected in such a way that neither device could accomplish on its own.|
Sensor systems will continue to benefit from the growth of low-cost processing power and data-processing algorithms. One area strongly affected by this growth is "sensor fusion," in which data streams gathered by multiple sensors are processed together to produce a single, more intelligent and accurate information stream.
The sensor data is being "fused" together into a single view of the environment. Matching an infrared distance sensor with sonar allows a range of materials and situations to be detected in such a way that neither device could accomplish on its own.
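A minimal sketch of such fusion follows, assuming each sensor reports a distance along with a variance expressing how much it should be trusted against the current target (an infrared sensor might carry a large variance on dark surfaces, sonar on soft or angled ones). The inverse-variance weighting shown is one common choice, not the only one:

```python
# Sketch: fusing an infrared rangefinder reading with a sonar reading.
# Each reading comes with a variance; the fused estimate is the
# inverse-variance weighted average, so the more trusted sensor
# dominates the result.

def fuse(ir_cm: float, ir_var: float,
         sonar_cm: float, sonar_var: float) -> float:
    """Inverse-variance weighted average of two range estimates (cm)."""
    w_ir = 1.0 / ir_var
    w_sonar = 1.0 / sonar_var
    return (w_ir * ir_cm + w_sonar * sonar_cm) / (w_ir + w_sonar)
```

With equal variances this reduces to a plain average; as one sensor's variance grows, the fused value slides toward the other sensor's reading, which is how the pair covers materials neither handles alone.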
Facial recognition. Maturing software algorithms open the door to exciting areas, including facial recognition. Just a few years ago, the processing power was not available to attempt this in real time. Now, there are products that can process faces in a crowd in real time. Soon, the sensor system won't merely report "object 2m in front," but rather "Bob is 2m in front."
Localization, mapping. This is another technology area that has seen increased interest in recent years. There are multiple off-the-shelf implementations of simultaneous localization and mapping algorithms available either free or for a minimal charge. This trend is occurring in many software areas and will continue.
Stereo vision. Exciting growth is also being seen in stereo vision. The amount of data that a single camera generates can be enormous, and stereo vision doubles it by requiring two cameras in operation. This was only a remote possibility until communication links, processing power and software algorithms matured. Today, off-the-shelf systems are available that can do distance detection in a limited environment.
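The distance calculation at the heart of such systems can be sketched from the standard geometry of a rectified camera pair: depth Z = fB/d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity (the pixel shift of the same point between the two images). The numbers in the test are illustrative:

```python
# Sketch: recovering depth from stereo disparity for a rectified camera
# pair. A nearby point shifts more between the two images (large
# disparity); a distant point shifts less, and a point at infinity has
# zero disparity.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Return depth in meters via Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

The inverse relationship explains the "limited environment" caveat above: at long range the disparity shrinks toward the pixel-measurement noise floor, and depth accuracy degrades quickly.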
As these systems continue to improve, their accuracy and speed will make them a viable alternative to other forms of distance measurement. A "fused" system of ultrasonic, infrared and stereo vision will be able to function in virtually any sort of environment.
In the future, sensor technology integration will continue to mature. The number of sensors that a robot can efficiently process will achieve a growth curve similar to the rate of transistor integration predicted by Moore's Law.
Jon Mandrell is Managing Consultant at CoroWare Technologies Inc. He can be contacted at firstname.lastname@example.org. To read a PDF version of this story, go to "Equip your robots with sensors, processors."