SELFCEPTION – Self/other distinction for interaction under uncertainty

http://www.selfception.eu

By 2018, more than 35 million private or non-industrial robots are expected to be in use worldwide, a market worth 19 billion euros. However, autonomous robot technology in Europe is not yet ready to meet this expectation, owing to the lack of robust functionality in uncertain environments. Safe interaction, in particular, is an essential requirement. A basic skill, still unachieved, is for the robot to be aware of its own body and to perceive other agents. The goal of the SELFCEPTION research project is to build a synthetic model that allows robots to learn to recognize their own body and distinguish it from other elements in the environment.

Recent evidence suggests that self/other distinction will be a major breakthrough for improving interaction, and that it may link low-level sensorimotor abilities to voluntary action, or even abstract thinking. The project follows the hypothesis that learning the “sensorimotor self” will enable humanoid robots to distinguish themselves from other agents during interaction. To this end, SELFCEPTION proposes combining advanced sensorimotor learning with new multimodal sensing devices, such as artificial skin, so that the robot can acquire its own perceptual representation.
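As an intuition for how a learned sensorimotor self can support self/other distinction (the project's specific model is not detailed here), a common approach in the literature is to learn a forward model that predicts the sensory consequences of the robot's own actions: sensations that match the prediction are attributed to the self, while large prediction errors point to other agents. A minimal sketch, assuming a hypothetical linear forward model and a hand-picked error threshold:

```python
import numpy as np

class SelfOtherDiscriminator:
    """Toy self/other distinction via forward-model prediction error.

    Illustrative only: the forward model (linear), learning rate and
    threshold are hypothetical choices, not the SELFCEPTION model.
    """

    def __init__(self, n_motor, n_sensor, lr=0.01, threshold=0.5):
        self.W = np.zeros((n_sensor, n_motor))  # linear forward model
        self.lr = lr
        self.threshold = threshold

    def predict(self, motor_cmd):
        # Predicted sensory consequence of the robot's own action.
        return self.W @ motor_cmd

    def update(self, motor_cmd, observed):
        # Online learning from self-generated experience (delta rule).
        error = observed - self.predict(motor_cmd)
        self.W += self.lr * np.outer(error, motor_cmd)
        return error

    def is_self(self, motor_cmd, observed):
        # Low prediction error -> sensation attributed to the robot itself;
        # high error -> likely caused by another agent or the environment.
        error = observed - self.predict(motor_cmd)
        return np.linalg.norm(error) < self.threshold
```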

SELFCEPTION is an interdisciplinary project that combines robotics and cognitive psychology. To this end, the main researcher will be trained under the supervision of the renowned cognitive psychologist Bernhard Hommel at the Leiden Institute for Brain and Cognition (LIBC). The developed synthetic model will be tested on a whole-body-sensing humanoid and validated on a service robot in collaboration with the Spanish company PAL Robotics.

SELFCEPTION will boost the development of the next generation of perceptive robots: multisensory machines able to build their perceptual body schema and distinguish their actions from those of other entities. We already have robots that navigate; now it is time to develop robots that interact.

The project is led by Pablo Lanillos and coordinated by Gordon Cheng, director of the Institute for Cognitive Systems at the Technical University of Munich (TUM). It is funded by the European Union through a Marie Skłodowska-Curie action.

Project link | EU CORDIS link

REM – active perception for Reasoning in an Embodied robotic Mind

The REM project aims to bring about a major breakthrough in social robotics by enhancing both the robot’s multisensory active perception and its action-reasoning response. Current social robots are still incapable of deploying behaviour coherent with human expectations, which diminishes the interaction considerably. This project seeks to connect semantic reasoning at the symbolic level more closely to the robot’s real perception, improving the level of reciprocity and awareness and yielding better human-robot interaction (HRI). The societal impact pursued in this research is to get closer to a socially capable robot for healthcare, assistive and social applications (e.g., assisting the elderly population), thus enhancing people’s quality of life and helping robots enter the end-user market.

There are three main lines of research:

  • Multisensory attention: real-time bottom-up attention to visual and tactile cues (see the sketch after this list)
  • Aware robots: intentional state modelling through inference
  • Non-verbal communication: visual and haptic message communication
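
As a rough illustration of the first line of research (the project's actual attention model is not described here), bottom-up multisensory attention is often realized by computing a saliency map per modality and fusing them into a single map whose maximum drives the robot's gaze or reach. A minimal sketch with a crude intensity-contrast saliency and hypothetical fusion weights:

```python
import numpy as np

def saliency(feature_map):
    """Crude bottom-up saliency: deviation from the mean intensity,
    normalized to [0, 1]. A stand-in for richer contrast/motion features."""
    s = np.abs(feature_map - feature_map.mean())
    return s / (s.max() + 1e-9)

def multisensory_attention(visual, tactile, w_visual=0.6, w_tactile=0.4):
    """Fuse visual and tactile saliency (hypothetical weights) and return
    the fused map plus the most salient location as an attention target."""
    fused = w_visual * saliency(visual) + w_tactile * saliency(tactile)
    target = np.unravel_index(np.argmax(fused), fused.shape)
    return fused, target

# Example: a visual blob and a tactile contact competing for attention;
# here the skin taxels are assumed to be mapped onto the same grid.
visual = np.zeros((8, 8)); visual[2, 3] = 1.0
tactile = np.zeros((8, 8)); tactile[6, 6] = 1.0
_, target = multisensory_attention(visual, tactile)
print("attend to:", target)  # -> (2, 3), the stronger-weighted visual cue
```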
