"European researchers have developed a new approach to artificial intelligence that could empower computers to respond intelligently to human behaviour as well as commands," writes Michael Conroy on wired.co.uk, introducing Popeye, the robot with brains not brawn.
"The originality of our project was our attempt to integrate two different sensory modalities, namely sound and vision," project coordinator Radu Horaud explains to Science Daily. Their robot, named Popeye, was built to work out which voices are "relevant" amongst a cacophony of noise by combining video input and image recognition technology with sound analysis. "It is not that easy to decide what is foreground and what is background using sound alone, but by combining the two modalities – sound and vision – it becomes much easier," Horaud continues. "If you are able to locate ten sound sources in ten different directions, but if in one of these directions you see a face, then you can much more easily concentrate on that sound and throw out the other ones."
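Horaud's example suggests a simple fusion rule: of the many directions a microphone array hears sound from, keep only those that line up with a face the camera can see. The sketch below is purely illustrative (the function names, the angular tolerance, and the bearing-based representation are assumptions, not details from the Popeye project):

```python
def select_relevant_sources(sound_directions, face_directions, tolerance_deg=15.0):
    """Keep only sound sources whose bearing (in degrees) lies within
    tolerance_deg of a detected face -- a toy stand-in for combining
    sound localisation with vision."""
    def angular_diff(a, b):
        # Smallest absolute difference between two compass bearings.
        return abs((a - b + 180.0) % 360.0 - 180.0)

    return [src for src in sound_directions
            if any(angular_diff(src, face) <= tolerance_deg
                   for face in face_directions)]

# Ten sound sources in ten directions, but a face visible only near 70 degrees:
sources = [0, 36, 72, 108, 144, 180, 216, 252, 288, 324]
faces = [70]
print(select_relevant_sources(sources, faces))  # [72]
```

Real systems would of course work with noisy probability distributions over directions rather than clean bearings, but the principle is the same: the visual cue disambiguates which of the audible sources deserves attention.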
It may not sound like much of an achievement, but using multiple "senses" to infer meaning – rather than simply throwing more computational power and new algorithms at the problem – is a fundamentally different approach to artificial intelligence.