cortex: changeset 542:97d45f796ad6
write more contributions, per Winston.
author | Robert McIntyre <rlm@mit.edu> |
---|---|
date | Sun, 27 Apr 2014 22:27:49 -0400 |
parents | d947636fe0ee |
children | d8f06a80d3ab |
files | thesis/cortex.org |
diffstat | 1 files changed, 35 insertions(+), 18 deletions(-) |
--- a/thesis/cortex.org	Sun Apr 27 21:58:26 2014 -0400
+++ b/thesis/cortex.org	Sun Apr 27 22:27:49 2014 -0400
@@ -3381,23 +3381,38 @@
 
 * Contributions
 
-  In this thesis you have seen the =CORTEX= system, a complete
-  environment for creating simulated creatures. You have seen how to
-  implement five senses: touch, proprioception, hearing, vision, and
-  muscle tension. You have seen how to create new creatures using
-  blender, a 3D modeling tool. I hope that =CORTEX= will be useful in
-  further research projects. To this end I have included the full
-  source to =CORTEX= along with a large suite of tests and examples. I
-  have also created a user guide for =CORTEX= which is included in an
-  appendix to this thesis.
-
-  You have also seen how I used =CORTEX= as a platform to attack the
-  /action recognition/ problem, which is the problem of recognizing
-  actions in video. You saw a simple system called =EMPATH= which
-  identifies actions by first describing actions in a body-centered,
-  rich sense language, then inferring a full range of sensory
-  experience from limited data using previous experience gained from
-  free play.
+  The big idea behind this thesis is a new way to represent and
+  recognize physical actions -- empathic representation. Actions are
+  represented as predicates which have available the totality of a
+  creature's sensory abilities. To recognize the physical actions of
+  another creature similar to yourself, you imagine what they would
+  feel by examining the position of their body and relating it to your
+  own previous experience.
+
+  Empathic description of physical actions is very robust and general.
+  Because the representation is body-centered, it avoids the fragility
+  of learning from example videos. Because it relies on all of a
+  creature's senses, it can describe exactly what an action /feels
+  like/ without getting caught up in irrelevant details such as visual
+  appearance. I think it is important that a correct description of
+  jumping (for example) should not waste even a single bit on the
+  color of a person's clothes or skin; empathic representation can
+  avoid this waste by describing jumping in terms of touch, muscle
+  contractions, and the brief feeling of weightlessness. Empathic
+  representation is very low-level in that it describes actions using
+  concrete sensory data with little abstraction, but it has the
+  generality of much more abstract representations!
+
+  Another important contribution of this thesis is the development of
+  the =CORTEX= system, a complete environment for creating simulated
+  creatures. You have seen how to implement five senses: touch,
+  proprioception, hearing, vision, and muscle tension. You have seen
+  how to create new creatures using blender, a 3D modeling tool.
+
+  I hope that =CORTEX= will be useful in further research projects. To
+  this end I have included the full source to =CORTEX= along with a
+  large suite of tests and examples. I have also created a user guide
+  for =CORTEX= which is included in an appendix to this thesis.
 
   As a minor digression, you also saw how I used =CORTEX= to enable a
   tiny worm to discover the topology of its skin simply by rolling on
@@ -3411,7 +3426,9 @@
      creatures using Blender, a free 3D modeling program.
 
    - =EMPATH=, which uses =CORTEX= to identify the actions of a
-     worm-like creature using a computational model of empathy.
+     worm-like creature using a computational model of empathy. This
+     empathic representation of actions is an important new kind of
+     representation for physical actions.
 
    #+BEGIN_LaTeX
    \newpage
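The new text above describes actions as predicates that have the totality of a creature's senses available to them. As a rough illustration of that idea only (this is a sketch, not the actual =EMPATH= code; the predicate name, the map keys, and the sensor values are all assumed for the example), such a predicate might look like this in Clojure:

#+BEGIN_SRC clojure
;; Hypothetical sketch: a sensory snapshot is a plain map holding the
;; creature's current muscle activations and ventral touch readings.
;; None of these names come from the real =CORTEX=/=EMPATH= source.

(defn resting?
  "A creature is 'resting' when no muscle is contracting and at least
   one touch sensor on its underside reports contact."
  [experience]
  (and (every? zero? (:muscle experience))
       (some pos?    (:touch-bottom experience))))

;; Example snapshot with made-up sensor values:
(resting? {:muscle       [0 0 0]
           :touch-bottom [1 0 1]})
;; => true
#+END_SRC

The point of the sketch is only that such a predicate never consults visual appearance; it is defined directly over body-centered sense data, which is what gives the empathic representation its generality.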