diff thesis/abstract.org @ 432:1e5ea711857d
abstract first draft.
author    Robert McIntyre <rlm@mit.edu>
date      Sun, 23 Mar 2014 16:33:01 -0400
parents
children  d52bff980f0d
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/thesis/abstract.org	Sun Mar 23 16:33:01 2014 -0400
@@ -0,0 +1,31 @@
+Here I explore the design and capabilities of my system (called
+=CORTEX=) which enables experiments in /embodied artificial
+intelligence/ -- that is, AI which uses a physical simulation of
+reality accompanied by a simulated body to solve problems.
+
+In the first half of the thesis I describe the construction of
+=CORTEX= and the rationale behind my architecture choices. =CORTEX= is
+a complete platform for embodied AI research. It provides multiple
+senses for simulated creatures, including vision, touch,
+proprioception, muscle tension, and hearing. Each of these senses
+provides a wealth of parameters that are biologically
+inspired. =CORTEX= is able to simulate any number of creatures and
+senses, and provides facilities for easily modeling and creating new
+creatures. As a research platform it is more complete than any other
+system currently available.
+
+In the second half of the thesis I develop a computational model of
+empathy, using =CORTEX= as a base. Empathy in this context is the
+ability to observe another creature and infer what sorts of sensations
+that creature is feeling. My empathy algorithm involves multiple
+phases. First is free-play, where the creature moves around and gains
+sensory experience. From this experience I construct a representation
+of the creature's sensory state space, which I call \phi-space. Using
+\phi-space, I construct an efficient function for enriching the
+limited data that comes from observing another creature with a full
+complement of imagined sensory data based on previous experience. I
+can then use the imagined sensory data to recognize what the observed
+creature is doing and feeling, using straightforward embodied action
+predicates. This is all demonstrated using a simple worm-like
+creature, recognizing worm actions in video.
+
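
The first half of the abstract describes =CORTEX= creatures as carrying any number of biologically inspired senses, each exposing many parameters. As a loose illustration of that architecture only (all names and types below are hypothetical; the real =CORTEX= API is not shown in this changeset), each sense can be modeled as an independent channel sampled once per simulation tick:

```python
# Hypothetical sketch of a multi-sense creature, loosely modeled on the
# abstract's description; none of these names come from CORTEX itself.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# A sense maps the simulated world plus the sensing creature to a
# vector of biologically inspired parameters.
SenseFn = Callable[["World", "Creature"], List[float]]

@dataclass
class Creature:
    name: str
    senses: Dict[str, SenseFn] = field(default_factory=dict)

    def perceive(self, world: "World") -> Dict[str, List[float]]:
        # Sample every attached sense once per simulation tick.
        return {label: sense(world, self) for label, sense in self.senses.items()}

@dataclass
class World:
    creatures: List[Creature] = field(default_factory=list)

# Usage: senses are independent channels, so a creature can carry any
# number of them (vision, touch, proprioception, muscle tension, ...).
worm = Creature("worm", senses={
    "touch": lambda w, c: [0.0] * 10,           # stub: pressure per segment
    "proprioception": lambda w, c: [0.1, 0.2],  # stub: joint angles
})
print(worm.perceive(World(creatures=[worm])))
```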
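
The empathy paragraph outlines a concrete pipeline: record full sensory experience during free-play, treat the recordings as \phi-space, enrich a partial observation of another creature into a full imagined sensory state, and apply embodied action predicates to the result. A minimal sketch of that pipeline under stated assumptions follows; all names are invented, and the nearest-neighbor lookup is an illustrative stand-in for the "efficient function" the abstract mentions but does not define:

```python
# Hypothetical sketch of the free-play -> phi-space -> empathy pipeline;
# the nearest-neighbor enrichment below is an illustrative stand-in for
# the thesis's actual (unspecified here) enrichment function.
import math
from typing import Callable, Dict, List

SenseState = Dict[str, List[float]]  # one full multi-sense snapshot

def free_play(step: Callable[[], SenseState], ticks: int) -> List[SenseState]:
    """Phase 1: move around and record sensory experience (phi-space)."""
    return [step() for _ in range(ticks)]

def _distance(a: List[float], b: List[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def enrich(observed: SenseState, phi_space: List[SenseState]) -> SenseState:
    """Phase 2: complete a partial observation (e.g. vision alone) with
    the closest full experience recorded during free-play."""
    def gap(candidate: SenseState) -> float:
        return sum(_distance(candidate[s], v) for s, v in observed.items())
    return min(phi_space, key=gap)

def is_curling(state: SenseState) -> bool:
    """Phase 3: an embodied action predicate over the imagined senses
    (a stub threshold on proprioception)."""
    return sum(state["proprioception"]) > 1.0

# Usage: observe one channel of another creature, imagine the rest,
# then classify the action from the completed state.
phi = free_play(lambda: {"touch": [0.2] * 3, "proprioception": [0.6, 0.7]}, ticks=5)
imagined = enrich({"proprioception": [0.5, 0.8]}, phi)
print(is_curling(imagined))
```

In this reading, empathy reduces to retrieval: the observer explains another creature's pose by recalling the nearest moment it felt something similar itself, then judges the action from that recalled, fully sensed state.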