The prime purpose of any life-saving equipment is to prevent further loss of human life and to find casualties as quickly as possible. With this in mind, the six-legged robotic spider was developed as a complete robotic solution to support rescue operations during catastrophe-response missions, such as searching collapsed buildings after an earthquake.
Thanks to its mobility, small size and onboard intelligence, the spider can avoid a variety of obstacles and reach remote, difficult-to-access locations to search for trapped victims. Another potential application is replacing humans in dangerous missions, such as sweeping and neutralizing minefields.
These challenges are met by a highly mobile walking scheme: the robot has independently actuated legs that allow it to move in an omni-directional fashion, even on terrain where robotic movement is normally impossible or too risky. 'Walking' and 'rotating' are among the basic high-level motion patterns adopted from six-legged insects.
Three moving and lifting 'feet' enable the desired walking speed and provide the required equilibrium for harsh terrain. 'Creeping' is a special motion which allows the robot to squeeze through tight spaces and narrow slots.
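The alternating-tripod idea, three feet swinging while three stay planted, can be sketched in a few lines. This Python sketch is illustrative only; the leg numbering and phase split are assumptions, not the robot's actual gait tables:

```python
# Sketch of an alternating tripod gait: two sets of three legs move in
# antiphase, so three feet always support the body.
# Leg numbering (0-5) is an assumption for illustration.

TRIPOD_A = (0, 2, 4)  # e.g. left-front, right-middle, left-rear
TRIPOD_B = (1, 3, 5)  # e.g. right-front, left-middle, right-rear

def tripod_gait(n_half_cycles):
    """Yield, per half-cycle, which tripod swings (lifts and moves
    forward) and which stays planted to keep the robot in equilibrium."""
    for step in range(n_half_cycles):
        if step % 2 == 0:
            yield {"swing": TRIPOD_A, "stance": TRIPOD_B}
        else:
            yield {"swing": TRIPOD_B, "stance": TRIPOD_A}

steps = list(tripod_gait(4))
```

Because one tripod is always in stance, the support polygon never collapses, which is what provides the equilibrium mentioned above.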
The leg mechanics and motion control are both key features of the spider robot. A total of 24 'smart' brushless DC motors not only drive the legs but also function as integral joints of the walking mechanics. This results in a sturdy yet lightweight construction, reducing power consumption and improving the motion dynamics.
Besides the legs, the robot features typical autonomous robotic subsystems including machine vision, distance measuring and wireless communication. The embedded hardware and two 7.2V lithium polymer batteries including the fuel gauges reside inside the rigid body of the robot. Mission parameters, I/O settings and new motion gaits can be transferred either wirelessly or by removable media.
The spider's low-level movements rely on complex mathematical models calculated at run time. Thanks to the embedded computing power of the Analog Devices Blackfin processor and Schmid Engineering's deterministic real-time services, the motion is dynamic and smooth. High-level LabVIEW VIs as well as hand-optimized Blackfin maths libraries are used for the inverse kinematics algorithm, which runs continuously.
This algorithm, including trigonometric functions and matrix operations, finds suitable joint angles to move the robot exactly along a trajectory within a desired space. The robot's trajectories can be programmed in three different ways:
1. Teach and play back: a common technique for designing and training new or special motion patterns.
2. 3D CAD simulation: the models are exported as virtual-reality files and imported into LabVIEW's picture controls, letting the user visually check the simulated trajectories and tune movements by comparing the virtual model with the real one.
3. Runtime calculation: trajectories are continuously computed at run time by the inverse kinematics algorithm.
All of these calculations must run in parallel for all joint angles of all six legs, resulting in 24 continuously updated setpoints, one per motor, to ensure dynamic motion. These setpoints are transferred to each motor over a serial RS-485 network and turned into physical actions by decentralized PD (proportional-derivative) controllers. Position feedback and temperature readings from all 24 actuators are acquired over the same network.
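One of the 24 per-motor PD position loops can be illustrated in miniature. The gains, the 1 ms update period and the pure-integrator joint model below are illustrative assumptions, not the robot's actual parameters:

```python
# Minimal sketch of a decentralized PD position controller: each motor
# runs its own loop on the setpoint it receives over the network.

class PDController:
    def __init__(self, kp, kd, dt):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_error = 0.0

    def update(self, setpoint, position):
        """Return the drive command for one motor: proportional term on
        the position error plus a derivative term for damping."""
        error = setpoint - position
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

# Toy closed loop: a joint modelled as a pure integrator converging
# on a 1.0 rad setpoint with a 1 ms update period (assumed values).
pd = PDController(kp=8.0, kd=0.5, dt=0.001)
angle = 0.0
for _ in range(5000):
    angle += pd.update(1.0, angle) * 0.001
```

Running the loops next to the motors keeps the RS-485 traffic down to setpoints and feedback readings rather than raw control signals.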
On top of the smart motion and freedom of movement, an intelligent camera and a distance-measurement sensor form the 'eye' of the spider robot. Objects and substances are localized and tracked by high-performance image-processing algorithms, for example by finding the centroid, the balance point of an object's area, within a region of interest.
The 'eye' can also be programmed to identify any color within its vicinity. Future versions will include improved image processing, pattern matching and edge detection, taking the Blackfin processor's computation power and high-speed image acquisition to the next level.
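The centroid operation mentioned above reduces to averaging the coordinates of the object pixels inside the region of interest. A minimal pure-Python sketch on a binary mask (the real system uses optimized Blackfin image-processing routines):

```python
# Centroid of a binary mask: the mean (row, col) of all "object" pixels,
# i.e. the balance point of the object's area within the ROI.

def centroid(mask):
    """mask: 2-D list of 0/1 pixel values (the region of interest).
    Returns (row, col) of the centroid, or None if the ROI is empty."""
    count = row_sum = col_sum = 0
    for r, row in enumerate(mask):
        for c, px in enumerate(row):
            if px:
                count += 1
                row_sum += r
                col_sum += c
    if count == 0:
        return None
    return row_sum / count, col_sum / count

# A 3x3 blob centred at (row 2, col 2) inside a 5x5 ROI.
roi = [[0, 0, 0, 0, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 0, 0, 0, 0]]
```

Tracking then amounts to recomputing the centroid each frame and steering the region of interest to follow it.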
The ultra-low-power mixed-signal target ZMobile is the 'heart' of the spider robot. To provide communication with the robot at every level, a permanent Bluetooth interface to the 'outside world' is maintained.
This interface monitors channels for the ZMobile's Fast Debug Mode during development and test, reading critical parameters such as motor status and battery level for system diagnostics and allowing online acquisition of vital algorithm variables for tuning.
It also enables downloading of new mission data prior to an operation. The LabVIEW/Blackfin target supplied by Schmid Engineering (Münchwilen, Switzerland) integrates sensors, actuators, vision, batteries and wireless communication on a single platform. Nanyang Polytechnic (Singapore), which designed the spider robot, chose the ZMobile platform for three reasons.
First, programming the spider in LabVIEW allowed the robot designers to concentrate on the prime functions of the project from day one. Thanks to the high productivity of graphical programming, the system engineers were able to add more functionality than originally specified within the same development period.
Second, an ultra-low-energy scheme such as the ZMobile's dynamic power management was a vital feature for this autonomous robot, since it significantly prolongs operating time. The same applies to the ZMobile's milliwatt-range power consumption, which leaves most of the energy stored in the onboard batteries available to power the motors.
Third, the scalable process I/O slot makes room for integrating more sensors and actuators in the future.
The entire spider robot application software was programmed using the LabVIEW Embedded Module for ADI Blackfin 2.5, extended by the ZBrain board support package for NI LabVIEW from Schmid Engineering.
This provided an ideal embedded software platform, combining high-level programming, graphical debugging and graphical multitasking with deterministic real-time behavior.
Object-oriented design patterns helped to further manage complexity on the graphical level. Main objects such as motors or sensors were abstracted by functional global variables, representing classes in LabVIEW. The main application framework consists of several tasks:
1. The top-level main loop plans actions and is implemented as a classic state machine, connecting to the other loops via software queues and synchronisation means such as semaphores.
2. The communication task maintains a wireless data connection to the outside world.
3. The vision task is responsible for the low level image processing and distance reading.
4. The motion task manages high-level motion patterns and low-level limb control, and also monitors the motors' position and state.
5. A housekeeping task acts as a common error handler. Events and errors are detected and logged to removable media with timestamps for later retrieval. ZMobile features such as the watchdog, rebooting and shutdown with programmed wake-up provide efficient means to restart from scratch if error self-correction (e.g. error rollback) does not succeed.
These loops run simultaneously as threads in a cooperative multitasking environment. Context switching in the millisecond range, together with microsecond real-time determinism at the driver level, ensures smooth and glitch-free movements.
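The framework of a top-level state machine feeding worker loops over software queues can be sketched as follows. The state names, queue names and events below are illustrative assumptions (the real system is built from LabVIEW loops on the Blackfin, not Python):

```python
# Sketch of the task framework: a main-loop state machine dispatches
# commands to the communication, vision and motion tasks via queues.

from queue import Queue

motion_q, vision_q, comm_q = Queue(), Queue(), Queue()

def main_loop(mission_events):
    """Classic state machine: each state decides the next action and
    which worker queue receives it. States and events are hypothetical."""
    state = "IDLE"
    for event in mission_events:
        if state == "IDLE" and event == "start":
            comm_q.put("open_link")          # bring up wireless link
            state = "SEARCH"
        elif state == "SEARCH" and event == "victim_detected":
            motion_q.put("halt")             # stop walking
            vision_q.put("track_centroid")   # lock onto the target
            state = "TRACK"
        elif event == "error":
            motion_q.put("halt")
            state = "IDLE"                   # housekeeping would log here
    return state

final_state = main_loop(["start", "victim_detected"])
```

Decoupling the loops through queues is what lets each task run at its own rate in the cooperative scheduler without blocking the others.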
Future developments of the common mechatronic platform include advances in vision, a smarter power-management and energy-harvesting scheme, sensor fusion, fuzzy logic and GPS data collection. In terms of future products, it is planned to reuse the modular hardware and software system in other mobile, autonomous robot types such as snakes.