Automated Synthesis of Body Schema using Multiple Sensor Modalities
Proc. of the Int. Conf. on the Simulation and Synthesis of Living Systems (ALIFE X), 2006
Abstract: The way in which organisms create body schema, based on their interactions with the real world, is an unsolved problem in neuroscience. Similarly, in evolutionary robotics, a robot learns to behave in the real world either without recourse to an internal model (requiring at least hundreds of interactions), or with a model hand-designed by the experimenter (requiring much prior knowledge about the robot and its environment). In this paper we present a method that allows a physical robot to automatically synthesize a body schema, using multimodal sensor data that it obtains through interaction with the real world. Furthermore, this synthesis can be either parametric (the experimenter provides an approximate model and the robot then refines it) or topological: the robot synthesizes a predictive model of its own body plan using little prior knowledge. We show that this latter type of synthesis can occur when a physical quadrupedal robot performs only nine 5-second interactions with its environment.