Publications





Automated Synthesis of Body Schema using Multiple Sensor Modalities

Proc. of the Int. Conf. on the Simulation and Synthesis of Living Systems (ALIFE X), 2006


Status: Published




Abstract: The way in which organisms create body schema, based on their interactions with the real world, is an unsolved problem in neuroscience. Similarly, in evolutionary robotics, a robot learns to behave in the real world either without recourse to an internal model (requiring at least hundreds of interactions), or with a model hand-designed by the experimenter (requiring much prior knowledge about the robot and its environment). In this paper we present a method that allows a physical robot to automatically synthesize a body schema, using multimodal sensor data that it obtains through interaction with the real world. Furthermore, this synthesis can be either parametric (the experimenter provides an approximate model and the robot then refines it) or topological (the robot synthesizes a predictive model of its own body plan using little prior knowledge). We show that this latter type of synthesis can occur when a physical quadrupedal robot performs only nine five-second interactions with its environment.
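The core idea of model synthesis from interaction, choosing among candidate body models by how well each predicts the robot's own sensor readings, can be illustrated with a toy sketch. The candidate models, the stand-in sensor function, and their parameters below are all hypothetical; this is not the paper's actual algorithm, only a minimal picture of prediction-error-based model selection:

```python
import random

def true_sensor(action):
    # Hypothetical stand-in for the physical robot's sensor reading
    # observed after executing an action.
    return 2.0 * action + 0.5

# Hypothetical candidate body models, each predicting a sensor reading
# from an action. Only model_a matches the "real" body above.
candidate_models = {
    "model_a": lambda a: 2.0 * a + 0.5,   # correct geometry
    "model_b": lambda a: 1.5 * a,          # wrong limb geometry
    "model_c": lambda a: 2.0 * a - 1.0,    # wrong offset
}

def prediction_error(model, interactions):
    # Sum of squared differences between predicted and observed readings.
    return sum((model(a) - obs) ** 2 for a, obs in interactions)

# Nine interactions, echoing the nine five-second trials in the abstract.
actions = [random.uniform(-1.0, 1.0) for _ in range(9)]
interactions = [(a, true_sensor(a)) for a in actions]

# Select the model that best explains the observed data.
best = min(candidate_models,
           key=lambda name: prediction_error(candidate_models[name],
                                             interactions))
print(best)  # the candidate with the lowest prediction error
```

In the paper's setting the "observations" come from a physical robot and the model space is far richer, but the selection principle (prefer the body model whose predictions match multimodal sensor data) is the same.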


