Here I demonstrate the power of using embodied artificial intelligence
to attack the /action recognition/ problem, which is the challenge of
recognizing actions performed by a creature given limited data about
those actions, such as a video recording. I solve this problem in the
case of a worm-like creature performing actions such as curling and
wiggling.

To attack the action recognition problem, I developed a computational
model of empathy (=EMPATH=) which allows me to recognize actions using
simple, embodied representations (which require rich sensory data),
even when that sensory data is not actually available. The system
``imagines'' the missing sense data by combining previous experiences
gained from unsupervised free play. The worm is a five-segment
creature equipped with touch, proprioception, and muscle tension
senses. It recognizes actions using only proprioceptive data.

In order to build this empathic, action-recognizing system, I created
a program called =CORTEX=, which is a complete platform for embodied
AI research. It provides multiple senses for simulated creatures,
including vision, touch, proprioception, muscle tension, and hearing.
Each of these senses provides a wealth of biologically inspired
parameters. =CORTEX= can simulate any number of creatures and senses,
and provides facilities for easily modeling and creating new
creatures. As a research platform it is more complete than any other
system currently available.
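
To give a flavor of what modeling a creature declaratively could look
like, the sketch below describes the worm as plain data: five
segments, hinge joints between neighbors, and a set of senses. This is
again purely hypothetical; the maps and the =read-senses= function are
illustrative assumptions and not the actual =CORTEX= API.

#+begin_src clojure
(ns cortex-sketch)

;; Purely illustrative creature description; not the real CORTEX API.
;; Five body segments connected by hinge joints, with three senses.
(def worm
  {:segments 5
   :joints   (vec (for [i (range 4)]
                    {:between [i (inc i)] :type :hinge}))
   :senses   #{:touch :proprioception :muscle-tension}})

(defn read-senses
  "One fake simulation step: return a map from each of the creature's
  senses to a vector of per-segment readings (all zero here)."
  [creature]
  (into {}
        (for [sense (:senses creature)]
          [sense (vec (repeat (:segments creature) 0.0))])))

(read-senses worm)
;; => a map from :touch, :proprioception, and :muscle-tension to
;;    [0.0 0.0 0.0 0.0 0.0]
#+end_src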