diff thesis/cortex.org @ 531:749452f063e5

elaborate paragraph.
author Robert McIntyre <rlm@mit.edu>
date Sat, 26 Apr 2014 20:10:20 -0400
parents 21b8389922ee
children 686f77b88292
line wrap: on
line diff
     1.1 --- a/thesis/cortex.org	Sat Apr 26 19:36:13 2014 -0400
     1.2 +++ b/thesis/cortex.org	Sat Apr 26 20:10:20 2014 -0400
     1.3 @@ -2381,7 +2381,7 @@
     1.4      #+end_listing
     1.5  
     1.6  
     1.7 -    =movement-kernel= creates a function that controlls the movement
     1.8 +    =movement-kernel= creates a function that controls the movement
     1.9      of the nearest physical node to the muscle node. The muscle exerts
    1.10     a rotational force dependent on its orientation to the object in
    1.11      the blender file. The function returned by =movement-kernel= is
    1.12 @@ -2545,7 +2545,7 @@
    1.13    #+end_src
    1.14    #+end_listing
    1.15  
    1.16 -** COMMENT Embodiment factors action recognition into manageable parts
    1.17 +** Embodiment factors action recognition into manageable parts
    1.18  
    1.19     Using empathy, I divide the problem of action recognition into a
    1.20    recognition process expressed in the language of a full complement
    1.21 @@ -2738,17 +2738,35 @@
    1.22     These action predicates satisfy the recognition requirement of an
    1.23     empathic recognition system. There is power in the simplicity of
    1.24     the action predicates. They describe their actions without getting
    1.25 -   confused in visual details of the worm. Each one is frame
    1.26 -   independent, but more than that, they are each independent of
    1.27 -   irrelevant visual details of the worm and the environment. They
    1.28 -   will work regardless of whether the worm is a different color or
    1.29 -   heavily textured, or if the environment has strange lighting.
    1.30 +   confused by visual details of the worm. Each one is independent of
    1.31 +   position and rotation, but more than that, they are each
    1.32 +   independent of irrelevant visual details of the worm and the
    1.33 +   environment. They will work regardless of whether the worm is a
    1.34 +   different color or heavily textured, or if the environment has
    1.35 +   strange lighting.
    1.36 +
    1.37 +   Consider how the human act of jumping might be described with
    1.38 +   body-centered action predicates: You might specify that jumping is
    1.39 +   mainly the feeling of your knees bending, your thigh muscles
    1.40 +   contracting, and your inner ear experiencing a certain sort of back
    1.41 +   and forth acceleration. This representation is a very concrete
    1.42 +   description of jumping, couched in terms of muscles and senses, but
    1.43 +   it also has the ability to describe almost all kinds of jumping, a
    1.44 +   generality that you might think could only be achieved by a very
    1.45 +   abstract description. The body-centered jumping predicate has no
    1.46 +   terms that consider the color of a person's skin or whether they
    1.47 +   are male or female; instead, it gets right to the meat of what
    1.48 +   jumping actually /is/.
    1.49 +
    1.50 +   Of course, the action predicates are not directly applicable to
    1.51 +   video data, which lacks the advanced sensory information they
    1.52 +   require!
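
   Within the simulation, though, such a body-centered predicate can
   be written directly against the proprioceptive part of an
   experience. The Clojure sketch below only illustrates the general
   form; the =curled?= name, the =[heading pitch bend]= layout, and
   the bend threshold are assumptions for this example rather than
   the definitions used in =CORTEX=.

   #+begin_src clojure
   ;; Hypothetical body-centered predicate: the worm counts as
   ;; "curled" when every joint reports a strong bend.  Each
   ;; experience is assumed to be a map whose :proprioception entry
   ;; holds one [heading pitch bend] triple per joint.
   (defn curled?
     "True if the most recent experience shows every joint strongly bent."
     [experiences]
     (let [bend-threshold 0.64]               ; illustrative value
       (every? (fn [[_ _ bend]]
                 (> (Math/sin bend) bend-threshold))
               (:proprioception (peek experiences)))))
   #+end_src

   Nothing in such a predicate mentions pixels, color, or lighting;
   it reads only joint angles, which is exactly what makes it blind
   to the irrelevant visual details discussed above.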
    1.53  
    1.54     The trick now is to make the action predicates work even when the
    1.55     sensory data on which they depend is absent. If I can do that, then
    1.56     I will have gained much.
    1.57  
    1.58 -** COMMENT \Phi-space describes the worm's experiences
    1.59 +** \Phi-space describes the worm's experiences
    1.60     
    1.61     As a first step towards building empathy, I need to gather all of
    1.62     the worm's experiences during free play. I use a simple vector to
    1.63 @@ -2794,7 +2812,7 @@
    1.64     provided by an experience vector and reliably inferring the rest of
    1.65     the senses.
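
   Concretely, one frame of such an experience vector can be pictured
   as a plain map of sense values keyed by modality, so that matching
   on proprioception alone recovers everything else recorded at the
   same instant. The keys and the nearest-neighbor helper in the
   sketch below are schematic assumptions rather than the exact
   representation.

   #+begin_src clojure
   ;; Hypothetical single frame of the experience vector: one map per
   ;; timestep, holding every sense recorded at that instant.
   (def example-experience
     {:proprioception [[0.0 0.0 1.2] [0.0 0.0 1.1] [0.0 0.0 1.3]]
      :touch          [[0 0 1 1] [0 0 0 0]]
      :muscle         [0.3 0.0 0.7]})

   ;; Given only a proprioceptive reading, find the stored frame whose
   ;; joint angles are closest, and with it all of the other senses.
   (defn infer-senses
     "Return the stored experience nearest to proprio in joint-angle space."
     [phi-space proprio]
     (apply min-key
            (fn [exp]
              (reduce + (map (fn [[_ _ a] [_ _ b]]
                               (Math/abs (double (- a b))))
                             (:proprioception exp) proprio)))
            phi-space))
   #+end_src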
    1.66  
    1.67 -** COMMENT Empathy is the process of tracing though \Phi-space 
    1.68 +** Empathy is the process of tracing through \Phi-space
    1.69  
    1.70     Here is the core of a basic empathy algorithm, starting with an
    1.71     experience vector:
    1.72 @@ -2865,7 +2883,7 @@
    1.73     #+caption: =longest-thread= finds the longest path of consecutive 
    1.74     #+caption: experiences to explain proprioceptive worm data from 
    1.75     #+caption: previous data. Here, the film strip represents the  
    1.76 -   #+caption: creature's previous experience. Sort sequeuces of
    1.77 +   #+caption: creature's previous experience. Short sequences of
    1.78     #+caption: memories are spliced together to match the
    1.79    #+caption: proprioceptive data. They carry the other senses
    1.80     #+caption: along with them.
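
   The splicing the caption describes can be sketched, in outline only
   and not as the actual =longest-thread=, as a search for the longest
   run of stored experiences whose \Phi-space indices are consecutive
   and whose proprioceptive signatures match the incoming timesteps:

   #+begin_src clojure
   ;; Rough sketch of the splicing idea.  `candidates` is assumed to
   ;; be a sequence of sets, one per incoming timestep (oldest first);
   ;; each set holds the \Phi-space indices whose proprioceptive
   ;; signature matches that timestep.  Starting from the newest
   ;; timestep, the run grows backwards for as long as the previous
   ;; timestep offers the immediately preceding index.
   (defn consecutive-run
     "Longest run of consecutive indices ending at the newest timestep."
     [candidates]
     (let [newest-first (reverse candidates)]
       (if-let [start (first (first newest-first))]
         (loop [run (list start), older (rest newest-first)]
           (let [want (dec (first run))]
             (if (and (seq older) (contains? (first older) want))
               (recur (cons want run) (rest older))
               (vec run))))
         [])))
   #+end_src

   When such a run breaks, splicing a fresh run onto the old one lets
   short stretches of memory cover the whole incoming sequence while
   still carrying touch, vision, and muscle data along with them.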
    1.81 @@ -3273,7 +3291,7 @@
    1.82     completely scrambled. The cross shape is just for convenience. This
    1.83     example justifies the use of pre-defined touch regions in =EMPATH=.
    1.84  
    1.85 -* COMMENT Contributions
    1.86 +* Contributions
    1.87    
    1.88    In this thesis you have seen the =CORTEX= system, a complete
    1.89    environment for creating simulated creatures. You have seen how to