Monday, July 25, 2011

Living With Robots As Interactive Companions

Improving our knowledge of face perception could give researchers the help they need to develop the next generation of life-changing software and robots. Driving this effort are scientists from Queen Mary, University of London, together with colleagues from University College London and Oxford University in the United Kingdom, who are examining whether robots and computers can be made to perceive faces in the same way humans do.

The research is an outcome of the LIREC ('Living with robots and interactive companions') project, which is backed with EUR 8.2 million under the 'Information and communication technologies' (ICT) Theme of the EU's Seventh Framework Programme (FP7). The researchers presented their work at the Royal Society's annual Summer Science Exhibition in London from 5 to 10 July.

Emys, a LIREC robot. Credit: © LIREC

When you interact with other people, your brain processes a number of small and subtle facial cues. At the exhibition, visitors saw how the human brain understands faces, including how motion can be transferred from one person's face to another and what their own faces would look like with their gender switched. Visitors also saw sophisticated computer vision systems capable of recognising facial expressions.

Video: Heriot Watt - Integrated Companions Update (LIREC on Vimeo)

'We will be showing some of the latest research from the EU-funded LIREC project, which aims to create socially aware companion robots and graphical characters,' said Professor Peter McOwan from the School of Electronic Engineering and Computer Science at Queen Mary, University of London before the exhibition. 'There will be the opportunity for those attending to see if our computer vision system can detect their smiles, watch the most recent videos of our robots in action and talk to us about the project.'
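
The article does not describe how the LIREC demonstration actually detects smiles. As a rough, hypothetical illustration of that kind of pipeline, the sketch below uses OpenCV's stock Haar cascades (a face detector first, then a smile cascade inside each detected face region); none of it is taken from the project itself.

```python
# Hypothetical smile-detection sketch using OpenCV's bundled Haar cascades.
# This is not the LIREC system; it only illustrates the general approach:
# find a face, then look for a smile inside that face region.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def detect_smiles(frame):
    """Return a list of (face_box, smiling) tuples for one BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        roi = gray[y:y + h, x:x + w]
        # The smile cascade fires easily, so demand many overlapping detections.
        smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
        results.append(((x, y, w, h), len(smiles) > 0))
    return results

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default webcam
    ok, frame = cap.read()
    if ok:
        for box, smiling in detect_smiles(frame):
            print(box, "smiling" if smiling else "not smiling")
    cap.release()
```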

Being able to decompose facial movement into basic facial actions, and to understand how those actions differ from person to person, will allow computer scientists to analyse facial movement and build realistic motion into avatars. That, in turn, should make people more willing to accept avatars as channels of communication.
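
One way to make the idea of 'basic facial actions' concrete (an assumption for illustration, not the method the LIREC researchers describe) is to track facial landmarks over a video, take each frame's displacement from a neutral frame, and run a principal component analysis: each component then behaves like a basic action, and every frame reduces to a handful of component weights that can be compared across people or replayed on an avatar.

```python
# Hypothetical sketch: treat the principal components of landmark motion as
# "basic facial actions". Assumes landmarks is a (T, N, 2) array of tracked
# facial landmark positions over T frames, with frame 0 roughly neutral.
import numpy as np

def basic_facial_actions(landmarks, n_actions=5):
    """Return (components, weights): a small motion basis and per-frame weights."""
    frames = np.asarray(landmarks, dtype=float)          # (T, N, 2)
    neutral = frames[0]
    disp = (frames - neutral).reshape(len(frames), -1)   # (T, 2N) displacements
    disp -= disp.mean(axis=0)
    # PCA via SVD: rows of vt are orthogonal motion patterns over all landmarks.
    _, _, vt = np.linalg.svd(disp, full_matrices=False)
    components = vt[:n_actions]                          # (n_actions, 2N)
    weights = disp @ components.T                        # (T, n_actions)
    return components, weights

# Synthetic demo: 60 frames of 68 landmarks jittering around a neutral pose.
rng = np.random.default_rng(0)
fake = rng.normal(size=(1, 68, 2)) + 0.01 * rng.normal(size=(60, 68, 2))
components, weights = basic_facial_actions(fake)
print(components.shape, weights.shape)   # (5, 136) (60, 5)
```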

Said Professor McOwan: 'Robots are going to increasingly form part of our daily lives — for instance robotic aids used in hospitals or much later down the road sophisticated machines that we will have working in our homes. Our research aims to develop software, based on biology, that will allow robots to interact with humans in the most natural way possible — understanding the things we take for granted like personal space or reacting to an overt emotion such as happiness.'

Video: University of Hertfordshire - Integrated Companions Update (LIREC on Vimeo)

Commenting on the research and work involved, Professor Alan Johnston from the Psychology and Language Sciences Division at University College London said: 'A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements. Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans as it allows experimenters to study facial motion in isolation from the form of the face.'
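
As a minimal illustration of motion transfer at the landmark level (again an assumption for exposition, not the group's published technique), one can take the source face's per-frame landmark displacements from its own neutral pose and add them to the neutral landmarks of an average face, so that the average face repeats the source's movements without inheriting its shape:

```python
# Hypothetical sketch of landmark-level facial motion transfer: re-animate an
# average face with the movement recorded on a source face. Inputs are assumed
# to be (T, N, 2) landmark tracks for the source and an (N, 2) average face.
import numpy as np

def transfer_motion(source_frames, source_neutral, avg_neutral):
    """Apply the source face's landmark displacements to the average face."""
    source_frames = np.asarray(source_frames, dtype=float)   # (T, N, 2)
    displacement = source_frames - source_neutral            # motion only, face shape removed
    return avg_neutral + displacement                        # (T, N, 2) animated avatar

# Synthetic demo: the average face now moves exactly as the source face did.
rng = np.random.default_rng(1)
source_neutral = rng.normal(size=(68, 2))
source_frames = source_neutral + 0.02 * rng.normal(size=(40, 68, 2))
average_face = rng.normal(size=(68, 2))
animated = transfer_motion(source_frames, source_neutral, average_face)
print(animated.shape)   # (40, 68, 2)
```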

For her part, co-researcher Cecilia Heyes from All Souls College at the University of Oxford noted that this type of technology can lead to the creation of valuable spin-offs. 'We're using it to find out how people imitate facial expressions, which is very important for rapport and cooperation, and why people are better at recognising their own facial movements than those of their friends — even though they see their friends' faces much more often than their own.'

Source: European Commission Research Information Center
