Washington, May 11 : Robotics engineers have long been trying to develop robots that approach human intelligence, and now European researchers are working towards piecing together a new generation of machines that are more aware of their environment and better able to interact with humans.
This may enable scientists to make robots more responsive and allow them to be used for a number of sophisticated tasks in the manufacturing and service sectors or as home-helpers and caregivers.
As research into artificial cognitive systems (ACS) has advanced, the field has become quite fragmented, with some teams focusing on machine vision, others on spatial cognition or human-robot interaction, among many other disciplines.
But the EU-funded project CoSy (Cognitive Systems for Cognitive Assistants) has shown that researchers can make even greater advances in the field by bringing all these fragments together.
"We have brought together one of the broadest and most varied teams of researchers in this field. This has resulted in an ACS architecture that integrates multiple cognitive functions to create robots that are more self-aware, understand their environment and can better interact with humans," Science Daily quoted Geert-Jan Kruijff, the CoSy project manager at the German Research Centre for Artificial Intelligence, as saying.
The CoSy ACS spans a range of technologies, from a cognitive architecture design, spatial cognition, human-robot interaction and situated dialogue processing, to developmental models of visual processing.
The ACS architecture toolkit developed in the project has been released under an open source licence.
"The integration of different components in an ACS is one of the greatest challenges in robotics. Getting robots to understand their environment from visual inputs and to interact with humans from spoken commands and relate what is said to their environment is enormously complex," said Kruijff.
The majority of robots developed to date simply react to their environment rather than act in it autonomously. Many mobile robots back off when they collide with an object, but have little self-awareness or understanding of the space around them and what they can do there.
In contrast, a demonstrator called the Explorer, developed by the CoSy team, has a more human-like understanding of its environment, and can even talk about its surroundings with a human. Rather than using just geometric data to create a map of its surroundings, the Explorer also incorporates qualitative, topological information.
In fact, it also interacts with humans to learn to recognise objects, spaces and their uses. For example, if it sees a coffee machine it may reason that it is in a kitchen. If it sees a sofa it may conclude it is in a living room.
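This kind of object-to-room reasoning can be illustrated with a minimal sketch. All names and object-room associations below are illustrative assumptions for this article, not the actual CoSy implementation, which uses a far richer conceptual map:

```python
# Minimal sketch of object-based room inference, loosely inspired by the
# Explorer's conceptual understanding of space. The mapping and function
# names here are hypothetical, not taken from the CoSy toolkit.

# Hypothetical mapping from a detected object to its most likely room
OBJECT_TO_ROOM = {
    "coffee machine": "kitchen",
    "sink": "kitchen",
    "sofa": "living room",
    "television": "living room",
    "desk": "office",
}

def infer_room(detected_objects):
    """Return the room category most supported by the detected objects."""
    votes = {}
    for obj in detected_objects:
        room = OBJECT_TO_ROOM.get(obj)
        if room is not None:
            votes[room] = votes.get(room, 0) + 1
    if not votes:
        return "unknown"
    # Pick the room with the most supporting objects
    return max(votes, key=votes.get)

print(infer_room(["coffee machine", "sink"]))  # kitchen
print(infer_room(["sofa"]))                    # living room
```

In a real system, the hard part is everything this sketch assumes away: recognising the objects from camera input in the first place, and handling ambiguity when objects suggest conflicting room types.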
"The robot sees a room much as humans see it because it has a conceptual understanding of space," noted Kruijff.
Another demonstrator, called the PlayMate, applied machine vision and spatial recognition in a different context. PlayMate uses a robotic arm to manipulate objects in response to human instructions.
Kruijff has predicted that robots similar to those developed in the CoSy project will become an everyday sight over the coming years in what he describes as 'gofer scenarios'.