Here I demonstrate the power of using embodied artificial intelligence
to attack the /action recognition/ problem, which is the challenge of
recognizing actions performed by a creature given limited data about
the creature's actions, such as a video recording. I solve this
problem in the case of a worm-like creature performing actions such as
curling and wiggling.

To attack the action recognition problem, I developed a computational
model of empathy (=EMPATH=) that allows me to use simple, embodied
representations of actions (which require rich sensory data), even
when that sensory data is not actually available. The missing sense
data is ``imagined'' by the system by combining previous experiences
gained from unsupervised free play.
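
A minimal sketch may make this ``imagination'' step concrete. It is
an illustration of the idea, not =EMPATH='s actual implementation: a
partial observation is completed by retrieving the best-matching
remembered experience from free play and borrowing its full sensory
record. All of the names here (=imagine=, =experiences=, the sense
keys) are hypothetical.

#+begin_src clojure
;; Hedged sketch: complete a partial observation (here, only
;; proprioception, as might be recovered from video) by finding the
;; nearest remembered free-play experience and reusing its full
;; sensory record. Names and keys are illustrative assumptions.

(defn euclidean-distance
  "Distance between two equal-length numeric vectors."
  [a b]
  (Math/sqrt (reduce + (map (fn [x y] (let [d (- x y)] (* d d))) a b))))

(defn imagine
  "Return the remembered experience whose :proprioception channel
   best matches the partial observation; its other channels (:touch,
   :vision, ...) serve as the imagined missing sense data."
  [experiences partial-observation]
  (apply min-key
         #(euclidean-distance (:proprioception %)
                              (:proprioception partial-observation))
         experiences))

;; Example: the partial observation's missing :touch data is filled
;; in from the closest remembered experience.
(def experiences
  [{:proprioception [0.0 0.1] :touch [1 0 0]}
   {:proprioception [0.9 0.8] :touch [0 1 1]}])

(:touch (imagine experiences {:proprioception [0.85 0.75]}))
;; => [0 1 1]
#+end_src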

In order to build this empathic, action-recognizing system, I created
a program called =CORTEX=, which is a complete platform for embodied
AI research. It provides multiple senses for simulated creatures,
including vision, touch, proprioception, muscle tension, and
hearing. Each of these senses provides a wealth of biologically
inspired parameters. =CORTEX= is able to simulate any number of
creatures and senses, and provides facilities for easily modeling and
creating new creatures. As a research platform it is more complete
than any other system currently available.
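
To make this architecture concrete, here is a hedged sketch of the
shape such a platform's sense interface can take: each sense as a
function from the world state and the body to a reading, and a
creature as a body bundled with named senses. These names are
illustrative assumptions, not =CORTEX='s actual API.

#+begin_src clojure
;; Hedged architectural sketch, not CORTEX's actual API: a creature
;; bundles a body with named sense functions, and perception polls
;; every sense against the current world state.

(defn make-creature
  "Bundle a physical body with a map of named sense functions.
   Each sense function takes the world state and the body and
   returns that sense's current reading."
  [body senses]
  {:body body :senses senses})

(defn perceive
  "Poll every sense of a creature against the current world state,
   returning a map from sense name to reading."
  [world creature]
  (into {}
        (for [[sense-name sense-fn] (:senses creature)]
          [sense-name (sense-fn world (:body creature))])))

;; Example: a stub worm with two trivial senses.
(def worm
  (make-creature
   {:segments 5}
   {:proprioception (fn [_world body] (vec (repeat (:segments body) 0.0)))
    :touch          (fn [_world body] (vec (repeat (:segments body) 0)))}))

(perceive {} worm)
;; => {:proprioception [0.0 0.0 0.0 0.0 0.0], :touch [0 0 0 0 0]}
#+end_src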