SELFCEPTION – Self/other distinction for interaction under uncertainty

By 2018, more than 35 million private or non-industrial robots are expected to be in use worldwide, a market of 19 billion euros. However, autonomous robot technology in Europe is not yet ready to meet this high demand, owing to the lack of robust functionality in uncertain environments. Safe interaction in particular is an essential requirement. A basic skill, still unachieved, is for the robot to be aware of its own body and to perceive other agents. The goal of the SELFCEPTION research project is to build a synthetic model that allows robots to learn to recognize their own body and distinguish it from other elements in the environment.

Recent evidence suggests that self/other distinction will be a major breakthrough for improving interaction and may be the link between low-level sensorimotor abilities and voluntary actions, or even abstract thinking. The project follows the hypothesis that learning a “sensorimotor self” will enable humanoid robots to distinguish themselves from other agents during interaction. To that end, SELFCEPTION proposes combining advanced sensorimotor learning with new multimodal sensing devices, such as artificial skin, to allow the robot to acquire its own perceptual representation.

SELFCEPTION is an interdisciplinary project that combines robotics and cognitive psychology. To this end, the main researcher will be trained under the supervision of the renowned cognitive psychologist Bernhard Hommel at the Leiden Institute for Brain and Cognition (LIBC). The synthetic model developed will be tested on a whole-body sensing humanoid and validated on a service robot in collaboration with the Spanish company PAL Robotics.

SELFCEPTION will accelerate the arrival of the next generation of perceptive robots: multisensory machines able to build their own perceptual body schema and distinguish their actions from those of other entities. We already have robots that navigate; now it is time to develop robots that interact.

This EU-funded project is led by Pablo Lanillos and coordinated by Gordon Cheng, director of the Institute for Cognitive Systems at the Technical University of Munich (TUM). The project is funded through a Marie Skłodowska-Curie action granted by the European Union.

Project link | EU Cordis link