
Research aims to have TV viewers on the edge of the couch

LONDON — Live TV outside broadcasts that combine real action and computer-generated images could become possible as the result of camera navigation technology under development.

Harnessing techniques from mathematics, computing and engineering, the new system is being developed at Oxford University with funding from the Engineering and Physical Sciences Research Council (EPSRC). The project aims to extend the capabilities of a prototype system developed by the same team, also with EPSRC funding.

The 3-year project, 'Real-Time Camera Localisation in Real Environments', began in January 2005 and is receiving EPSRC funding of nearly £255,000. The previous 15-month study, 'Real-Time Ego-Motion Estimation for a Single Camera', received EPSRC funding of nearly £61,000 and was completed in May 2004. Dr Ian Reid and Dr Andrew Davison of Oxford University's Department of Engineering Science are leading the project.

The system works out in real time where a camera is and how it is moving, while simultaneously constructing a detailed visual map of its surroundings. This allows computer graphics to be overlaid accurately onto live pictures as soon as they are produced. Previously, blending live action with computer-generated images was possible only in controlled studio environments.
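As an illustration of what the overlay step involves (not the Oxford team's own software), the following Python sketch assumes the camera's pose and intrinsic parameters are already known and uses OpenCV to project some hypothetical virtual 3D points into a live frame. The intrinsic values and the virtual square are placeholders chosen for the example.

```python
import numpy as np
import cv2

# Assumed pinhole intrinsics for a 640x480 camera; a real system would calibrate these.
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])
DIST = np.zeros(5)  # assume no lens distortion for this sketch

def overlay_virtual_points(frame, points_3d, rvec, tvec):
    """Draw virtual 3D points onto a live frame, given the camera pose (rvec, tvec)."""
    image_pts, _ = cv2.projectPoints(points_3d, rvec, tvec, K, DIST)
    for u, v in image_pts.reshape(-1, 2):
        cv2.circle(frame, (int(u), int(v)), 4, (0, 255, 0), -1)
    return frame

# Example: four corners of a hypothetical 20 cm square, one metre in front of the camera.
square = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.0],
                   [0.2, 0.2, 1.0], [0.0, 0.2, 1.0]])
# With the camera at the world origin, rvec and tvec are zero vectors:
# frame = overlay_virtual_points(frame, square, np.zeros(3), np.zeros(3))
```

Once the camera's position is tracked continuously, the same projection keeps the graphics locked to the scene as the camera moves.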

The system comprises a mobile video camera connected to a laptop computer, which analyses the images it receives using software developed by the researchers. As the camera moves, the system picks out landmarks as reference points and makes a map of their 3D locations against which to measure its position. The challenge is to estimate accurately the camera's position and the layout of its surroundings at the same time – a task known as Simultaneous Localisation and Mapping (SLAM).
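The frame-to-frame part of that pipeline (detecting landmark features, tracking them between frames and recovering the camera's relative motion) can be sketched with off-the-shelf OpenCV calls, as below. This is only a rough illustration under assumed camera intrinsics, not the researchers' system; a full SLAM implementation would also maintain and refine a persistent 3D map of the landmark positions.

```python
import numpy as np
import cv2

# Assumed pinhole intrinsics; a real system would calibrate the camera first.
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])

cap = cv2.VideoCapture(0)                      # the mobile camera feeding a laptop
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
# Pick out salient landmarks to serve as reference points.
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                   qualityLevel=0.01, minDistance=10)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Track the landmarks into the new frame with sparse optical flow.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    good_prev = prev_pts[status.ravel() == 1]
    good_next = next_pts[status.ravel() == 1]

    if len(good_next) >= 8:
        # Estimate the camera's relative motion from the tracked landmarks.
        E, _ = cv2.findEssentialMat(good_prev, good_next, K,
                                    method=cv2.RANSAC, prob=0.999, threshold=1.0)
        if E is not None and E.shape == (3, 3):
            _, R, t, _ = cv2.recoverPose(E, good_prev, good_next, K)
            # R and t give the rotation and (scale-free) direction of translation
            # between frames; a SLAM system would fuse these into a persistent map.

    prev_gray = gray
    prev_pts = good_next.reshape(-1, 1, 2)     # a real system would also re-detect
                                               # landmarks as they drop out of view
```

The hard part, and the focus of the project, is doing this robustly in real time while keeping the pose estimate and the map consistent with each other.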

The work opens up the prospect of outdoor sporting, musical or other TV coverage that blends the excitement of being live with the spectacular visual impact that computer graphics can create. It could also be applied at the consumer level, for example to visualise interior design ideas by adding virtual furniture to the view of a room from a hand-held camera as it moves.

Beyond TV and video applications, the technology under development could provide low-cost, high-performance navigation for domestic robots; the project team is also collaborating with Japan's AIST Research Institute on the development of humanoid robots. The technology could likewise be incorporated into video games or wearable computing, for example in dangerous environments, where it could confirm the wearer's location and overlay relevant guidance onto their view of the surroundings.
