Yielding self-perception in robots through sensorimotor contingencies

Lanillos, P., Dean-Leon, E., Cheng, G. (2016) Yielding self-perception in robots through sensorimotor contingencies. IEEE Transactions on Cognitive and Developmental Systems.

Summary:

We address self-perception in robots as the key to world understanding and causality interpretation. We present a self-perception mechanism that enables a humanoid robot to understand certain sensory changes caused by naive actions during interaction with objects. Visual, proprioceptive and tactile cues are combined via artificial attention and probabilistic reasoning, allowing the robot to discern between in-body and out-body sources in the scene. With that support, and exploiting inter-modal sensory contingencies, the robot can infer simple concepts such as discovering potentially "usable" objects. Theoretically and through experimentation with a real humanoid robot, we show that self-perception is a backdrop ability for higher-order cognitive skills. Moreover, we present a novel model for self-detection that does not need to track the body parts. Furthermore, results show that the proposed approach successfully discovers objects in the reaching space, improving scene understanding by discriminating real objects from visual artefacts.
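The core idea of distinguishing in-body from out-body sources can be illustrated with a toy sketch: regions of the visual field whose motion correlates with the robot's own joint velocities are likely part of its body, and that evidence can be accumulated with a Bayesian update. This is a minimal illustration under assumed simplifications (Pearson correlation as the contingency measure, a logistic likelihood ratio), not the paper's exact probabilistic model.

```python
import math

def correlation(xs, ys):
    # Pearson correlation between two equal-length signals
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def update_inbody_belief(prior, corr, k=5.0):
    # Bayesian update of P(in-body) given an observed sensorimotor
    # correlation; the logistic likelihood is an assumption of this
    # sketch, chosen only to map corr in [-1, 1] onto a likelihood.
    like_in = 1.0 / (1.0 + math.exp(-k * corr))
    like_out = 1.0 - like_in
    return like_in * prior / (like_in * prior + like_out * (1.0 - prior))

# Toy data: a joint velocity trace, the optical-flow magnitude of a
# visual region that moves with the arm, and one that does not.
joint_vel = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
flow_arm  = [0.1, 0.6, 1.1, 0.6, 0.1, -0.4, -0.9, -0.4]   # tracks the arm
flow_bg   = [0.2, -0.1, 0.0, 0.1, -0.2, 0.1, 0.0, -0.1]   # uncorrelated

belief_arm = update_inbody_belief(0.5, correlation(joint_vel, flow_arm))
belief_bg  = update_inbody_belief(0.5, correlation(joint_vel, flow_bg))
```

Running the sketch, the arm region's belief climbs well above the uninformative prior of 0.5, while the background region's does not; repeating the update over successive windows would let weak evidence accumulate, which is the spirit of using sensorimotor contingencies for self-detection without tracking body parts.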