The infrastructure for building sensor networks, a basic element of the digital ecosystem, is growing rapidly. In recent years, sensor networks that operate regardless of time, place, and situation have been applied in many areas.
Mobile robots based on sensor networks are a typical example of this application. Mobile robots can offer services using a sensor network that has been built into indoor spaces such as offices, homes, and airports.
The sensors and robots that are distributed within a space constitute the digital ecosystem. It has been predicted that indoor mobile robots will be human-friendly robots that interact with people. To ensure the safety and stability of human–robot interaction, more accurate and precise mobile robot localization systems are essential.
The classical localization system for mobile robots uses dead-reckoning sensors such as encoders and gyroscopes. Other systems use some combination of cameras, ultrasonic sensors, and GPS. However, dead-reckoning sensors are subject to error accumulation, which can degrade accuracy for cases involving large displacements.
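To illustrate why dead-reckoning alone degrades over large displacements, the following sketch integrates noisy odometry increments along a nominally straight path; the step length and heading-noise values are hypothetical, chosen only to make the drift visible.

```python
import math
import random

def dead_reckon(steps, step_len=0.1, noise=0.01, seed=0):
    """Integrate noisy odometry increments along a nominally straight path.

    step_len -- commanded forward motion per step (m), hypothetical value
    noise    -- std. dev. of per-step heading error (rad), hypothetical value
    Returns the final (x, y) estimate. Because each heading error feeds into
    every later step, the lateral drift grows with the distance travelled.
    """
    rng = random.Random(seed)
    x = y = theta = 0.0
    for _ in range(steps):
        theta += rng.gauss(0.0, noise)   # gyro/encoder noise accumulates in heading
        x += step_len * math.cos(theta)  # integrate the displacement estimate
        y += step_len * math.sin(theta)
    return x, y

# After a long run the lateral error |y| is far larger than any single
# step's noise, even though each per-step error is tiny.
x_est, y_est = dead_reckon(10000)
```

This unbounded drift is what motivates fusing odometry with absolute references such as tags fixed in the environment.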
This paper addresses ZigBee-based mobile robot localization, which adopts ZigBee tags distributed throughout a space.
However, existing stand-alone ZigBee systems for mobile robot localization are hampered by many uncertainties. Therefore, we propose a novel algorithm that improves localization by fusing the ZigBee system with an ultrasonic sensor system. The proposed system partially removes the uncertainties of the ZigBee system by using distance data obtained from the ultrasonic sensors.
We define a global position estimation (GPE) process using the ZigBee system and a local environment cognition (LEC) process using ultrasonic sensors. A hierarchical localization algorithm is then proposed to estimate the position of the mobile robot using both GPE and LEC. Finally, the utility of the proposed algorithm is demonstrated through experiments.
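The two-stage idea can be sketched as follows. The weighted-centroid coarse estimate and the single-wall range correction below are illustrative stand-ins, not the paper's actual GPE and LEC algorithms; the function names, tag layout, and fusion rule are all assumptions made for the example.

```python
# Hypothetical sketch: a coarse global estimate from tag readings (GPE-like
# stage) refined by an ultrasonic range measurement (LEC-like stage).

def gpe_weighted_centroid(tag_readings):
    """Coarse position from detected tags: centroid weighted by signal strength.

    tag_readings -- list of ((x, y), weight) pairs, one per detected tag,
                    where weight is a normalized signal-strength measure.
    """
    wsum = sum(w for _, w in tag_readings)
    x = sum(p[0] * w for p, w in tag_readings) / wsum
    y = sum(p[1] * w for p, w in tag_readings) / wsum
    return x, y

def lec_refine(coarse, wall_x, measured_range):
    """Refine the x coordinate using an ultrasonic range to a known wall.

    wall_x         -- x position of a mapped wall (assumed known)
    measured_range -- ultrasonic distance from robot to that wall
    The coarse y estimate is kept; x is corrected by the range measurement.
    """
    x_refined = wall_x - measured_range
    return x_refined, coarse[1]

# Tags at the corners of a 4 m x 4 m cell, strongest reading near (0, 0):
tags = [((0.0, 0.0), 0.7), ((4.0, 0.0), 0.1),
        ((0.0, 4.0), 0.1), ((4.0, 4.0), 0.1)]
coarse = gpe_weighted_centroid(tags)                       # near (0.8, 0.8)
pose = lec_refine(coarse, wall_x=4.0, measured_range=3.1)  # near (0.9, 0.8)
```

The point of the hierarchy is that the tag stage bounds the error globally (no drift), while the range stage sharpens the estimate locally where geometric references are visible.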
The complete paper is available online from the author archives at IJARECE.