changeset 514:447c3c8405a2

accept/reject changes
author Robert McIntyre <rlm@mit.edu>
date Sun, 30 Mar 2014 10:50:05 -0400
parents 4c4d45f6f30b
children 58fa1ffd481e
files thesis/cortex.org thesis/dylan-accept.diff thesis/dylan-cortex-diff.diff
diffstat 3 files changed, 103 insertions(+), 96 deletions(-)
     1.1 --- a/thesis/cortex.org	Sun Mar 30 10:41:18 2014 -0400
     1.2 +++ b/thesis/cortex.org	Sun Mar 30 10:50:05 2014 -0400
     1.3 @@ -498,9 +498,9 @@
     1.4     is a vast and rich place, and for now simulations are a very poor
     1.5     reflection of its complexity. It may be that there is a significant
     1.6    qualitative difference between dealing with senses in the real
     1.7 -   world and dealing with pale facsimiles of them in a simulation.
     1.8 -   What are the advantages and disadvantages of a simulation vs.
     1.9 -   reality?
    1.10 +   world and dealing with pale facsimiles of them in a simulation
    1.11 +   \cite{brooks-representation}. What are the advantages and
    1.12 +   disadvantages of a simulation vs. reality?
    1.13  
    1.14  *** Simulation
    1.15  
    1.16 @@ -578,7 +578,7 @@
    1.17     40x on my machine!
    1.18  
    1.19  ** All sense organs are two-dimensional surfaces
    1.20 -# What is a sense?   
    1.21 +
    1.22     If =CORTEX= is to support a wide variety of senses, it would help
    1.23     to have a better understanding of what a ``sense'' actually is!
    1.24     While vision, touch, and hearing all seem like they are quite
    1.25 @@ -1376,7 +1376,7 @@
    1.26      capturing in-game video to a file.
    1.27  
    1.28  ** ...but hearing must be built from scratch
    1.29 -# is hard; =CORTEX= does it right
    1.30 +
    1.31     At the end of this section I will have simulated ears that work the
    1.32     same way as the simulated eyes in the last section. I will be able to
    1.33     place any number of ear-nodes in a blender file, and they will bind to
    1.34 @@ -3163,18 +3163,18 @@
    1.35   to interpretation, and disagreement between empathy and experience
    1.36    is more excusable.
    1.37  
    1.38 -** Digression: Learn touch sensor layout through haptic experimentation, instead 
    1.39 -# Boostraping touch using free exploration   
    1.40 -In the previous section I showed how to compute actions in terms of
    1.41 +** Digression: Learn touch sensor layout through free play
    1.42 +
    1.43 +   In the previous section I showed how to compute actions in terms of
    1.44    body-centered predicates which relied on average touch activation of
    1.45 -   pre-defined regions of the worm's skin. What if, instead of receiving
    1.46 -   touch pre-grouped into the six faces of each worm segment, the true
    1.47 -   topology of the worm's skin was unknown? This is more similar to how
    1.48 -   a nerve fiber bundle might be arranged. While two fibers that are
    1.49 -   close in a nerve bundle /might/ correspond to two touch sensors that
    1.50 -   are close together on the skin, the process of taking a complicated
    1.51 -   surface and forcing it into essentially a circle requires some cuts
    1.52 -   and rearrangements.
    1.53 +   pre-defined regions of the worm's skin. What if, instead of
    1.54 +   receiving touch pre-grouped into the six faces of each worm
    1.55 +   segment, the true topology of the worm's skin was unknown? This is
    1.56 +   more similar to how a nerve fiber bundle might be arranged. While
    1.57 +   two fibers that are close in a nerve bundle /might/ correspond to
    1.58 +   two touch sensors that are close together on the skin, the process
    1.59 +   of taking a complicated surface and forcing it into essentially a
    1.60 +   circle requires some cuts and rearrangements.
    1.61     
    1.62     In this section I show how to automatically learn the skin-topology of
    1.63     a worm segment by free exploration. As the worm rolls around on the
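
The digression above describes an algorithm in prose: when the skin's true topology is unknown, sensors that fire together during free play are likely neighbors, so co-occurrence statistics can recover a rough sensor layout. A minimal sketch of that idea under assumptions of my own (touch arrives as a sequence of frames, each the set of numeric sensor ids active at that instant; all names are hypothetical, and this is not the thesis implementation):

#+begin_src clojure
;; Sketch: recover probable skin adjacency from co-activation counts.
(defn co-activation-counts
  "Map from #{sensor-a sensor-b} to the number of frames in which both
   sensors were active at once."
  [frames]
  (reduce
   (fn [counts frame]
     (reduce #(update %1 %2 (fnil inc 0))
             counts
             (for [a frame, b frame :when (< a b)] #{a b})))
   {}
   frames))

(defn probable-neighbors
  "The n sensor pairs that co-fired most often during free play."
  [frames n]
  (take n (sort-by (comp - val) (co-activation-counts frames))))

;; Sensors 0 and 1 fire together in every frame; sensor 7 sits
;; somewhere else on the skin, so it never pairs with them.
(probable-neighbors [#{0 1} #{0 1 2} #{7} #{0 1}] 2)
;; => ([#{0 1} 3] ...) -- the 0-1 pair dominates, as expected.
#+end_src
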
     2.1 --- a/thesis/dylan-accept.diff	Sun Mar 30 10:41:18 2014 -0400
     2.2 +++ b/thesis/dylan-accept.diff	Sun Mar 30 10:50:05 2014 -0400
     2.3 @@ -20,3 +20,90 @@
     2.4   
     2.5   You have also seen how I used =CORTEX= as a platform to attack the
     2.6     /action recognition/ problem, which is the problem of recognizing
     2.7 +
     2.8 +@@ -2888,7 +3007,7 @@
     2.9 +    #+end_src
    2.10 +    #+end_listing
    2.11 +   
    2.12 +-** Efficient action recognition with =EMPATH=
    2.13 ++** =EMPATH= recognizes actions efficiently
    2.14 +    
    2.15 +    To use =EMPATH= with the worm, I first need to gather a set of
    2.16 +    experiences from the worm that includes the actions I want to
    2.17 +
    2.18 +@@ -3044,9 +3163,9 @@
    2.19 +  to interpretation, and disagreement between empathy and experience
    2.20 +   is more excusable.
    2.21 + 
    2.22 +-** Digression: bootstrapping touch using free exploration
    2.23 +-
    2.24 +-   In the previous section I showed how to compute actions in terms of
    2.25 ++** Digression: Learn touch sensor layout through haptic experimentation, instead 
    2.26 ++# Bootstrapping touch using free exploration
    2.27 ++In the previous section I showed how to compute actions in terms of
    2.28 +    body-centered predicates which relied on average touch activation of
    2.29 +    pre-defined regions of the worm's skin. What if, instead of receiving
    2.30 +    touch pre-grouped into the six faces of each worm segment, the true
    2.31 +
    2.32 +@@ -2193,7 +2311,7 @@
    2.33 +     #+ATTR_LaTeX: :width 11cm
    2.34 +     [[./images/proprio.png]]
    2.35 + 
    2.36 +-** Muscles are both effectors and sensors
    2.37 ++** Muscles contain both sensors and effectors
    2.38 + 
    2.39 +    Surprisingly enough, terrestrial creatures only move by using
    2.40 +    torque applied about their joints. There's not a single straight
    2.41 +
    2.42 +@@ -2059,7 +2177,7 @@
    2.43 +     #+ATTR_LaTeX: :width 15cm
    2.44 +     [[./images/touch-cube.png]]
    2.45 + 
    2.46 +-** Proprioception is the sense that makes everything ``real''
    2.47 ++** Proprioception provides knowledge of your own body's position
    2.48 + 
    2.49 +    Close your eyes, and touch your nose with your right index finger.
    2.50 +    How did you do it? You could not see your hand, and neither your
    2.51 +
    2.52 +@@ -1257,8 +1375,8 @@
    2.53 +     community and is now (in modified form) part of a system for
    2.54 +     capturing in-game video to a file.
    2.55 + 
    2.56 +-** Hearing is hard; =CORTEX= does it right
    2.57 +-   
    2.58 ++** ...but hearing must be built from scratch
    2.59 ++# is hard; =CORTEX= does it right
    2.60 +    At the end of this section I will have simulated ears that work the
    2.61 +    same way as the simulated eyes in the last section. I will be able to
    2.62 +    place any number of ear-nodes in a blender file, and they will bind to
    2.63 +
    2.64 +@@ -1565,7 +1683,7 @@
    2.65 +     jMonkeyEngine3 community and is used to record audio for demo
    2.66 +     videos.
    2.67 + 
    2.68 +-** Touch uses hundreds of hair-like elements
    2.69 ++** Hundreds of hair-like elements provide a sense of touch
    2.70 + 
    2.71 +    Touch is critical to navigation and spatial reasoning and as such I
    2.72 +    need a simulated version of it to give to my AI creatures.
    2.73 +
    2.74 +@@ -956,7 +1074,7 @@
    2.75 +     #+ATTR_LaTeX: :width 15cm
    2.76 +     [[./images/physical-hand.png]]
    2.77 + 
    2.78 +-** Eyes reuse standard video game components
    2.79 ++** Sight reuses standard video game components...
    2.80 + 
    2.81 +    Vision is one of the most important senses for humans, so I need to
    2.82 +    build a simulated sense of vision for my AI. I will do this with
    2.83 +@@ -459,8 +577,8 @@
    2.84 +    simulations of very simple creatures in =CORTEX= generally run at
    2.85 +    40x on my machine!
    2.86 + 
    2.87 +-** What is a sense?
    2.88 +-   
    2.89 ++** All sense organs are two-dimensional surfaces
    2.90 ++# What is a sense?   
    2.91 +    If =CORTEX= is to support a wide variety of senses, it would help
    2.92 +    to have a better understanding of what a ``sense'' actually is!
    2.93 +    While vision, touch, and hearing all seem like they are quite
     3.1 --- a/thesis/dylan-cortex-diff.diff	Sun Mar 30 10:41:18 2014 -0400
     3.2 +++ b/thesis/dylan-cortex-diff.diff	Sun Mar 30 10:50:05 2014 -0400
     3.3 @@ -294,64 +294,6 @@
     3.4   
     3.5      I envision =CORTEX= being used to support rapid prototyping and
     3.6      iteration of ideas. Even if I could put together a well constructed
     3.7 -@@ -459,8 +577,8 @@
     3.8 -    simulations of very simple creatures in =CORTEX= generally run at
     3.9 -    40x on my machine!
    3.10 - 
    3.11 --** What is a sense?
    3.12 --   
    3.13 -+** All sense organs are two-dimensional surfaces
    3.14 -+# What is a sense?   
    3.15 -    If =CORTEX= is to support a wide variety of senses, it would help
    3.16 -    to have a better understanding of what a ``sense'' actually is!
    3.17 -    While vision, touch, and hearing all seem like they are quite
    3.18 -@@ -956,7 +1074,7 @@
    3.19 -     #+ATTR_LaTeX: :width 15cm
    3.20 -     [[./images/physical-hand.png]]
    3.21 - 
    3.22 --** Eyes reuse standard video game components
    3.23 -+** Sight reuses standard video game components...
    3.24 - 
    3.25 -    Vision is one of the most important senses for humans, so I need to
    3.26 -    build a simulated sense of vision for my AI. I will do this with
    3.27 -@@ -1257,8 +1375,8 @@
    3.28 -     community and is now (in modified form) part of a system for
    3.29 -     capturing in-game video to a file.
    3.30 - 
    3.31 --** Hearing is hard; =CORTEX= does it right
    3.32 --   
    3.33 -+** ...but hearing must be built from scratch
    3.34 -+# is hard; =CORTEX= does it right
    3.35 -    At the end of this section I will have simulated ears that work the
    3.36 -    same way as the simulated eyes in the last section. I will be able to
    3.37 -    place any number of ear-nodes in a blender file, and they will bind to
    3.38 -@@ -1565,7 +1683,7 @@
    3.39 -     jMonkeyEngine3 community and is used to record audio for demo
    3.40 -     videos.
    3.41 - 
    3.42 --** Touch uses hundreds of hair-like elements
    3.43 -+** Hundreds of hair-like elements provide a sense of touch
    3.44 - 
    3.45 -    Touch is critical to navigation and spatial reasoning and as such I
    3.46 -    need a simulated version of it to give to my AI creatures.
    3.47 -@@ -2059,7 +2177,7 @@
    3.48 -     #+ATTR_LaTeX: :width 15cm
    3.49 -     [[./images/touch-cube.png]]
    3.50 - 
    3.51 --** Proprioception is the sense that makes everything ``real''
    3.52 -+** Proprioception provides knowledge of your own body's position
    3.53 - 
    3.54 -    Close your eyes, and touch your nose with your right index finger.
    3.55 -    How did you do it? You could not see your hand, and neither your
    3.56 -@@ -2193,7 +2311,7 @@
    3.57 -     #+ATTR_LaTeX: :width 11cm
    3.58 -     [[./images/proprio.png]]
    3.59 - 
    3.60 --** Muscles are both effectors and sensors
    3.61 -+** Muscles contain both sensors and effectors
    3.62 - 
    3.63 -    Surprisingly enough, terrestrial creatures only move by using
    3.64 -    torque applied about their joints. There's not a single straight
    3.65  @@ -2440,7 +2558,8 @@
    3.66           hard control problems without worrying about physics or
    3.67           senses.
    3.68 @@ -371,25 +313,3 @@
    3.69   
    3.70      Here is the core of a basic empathy algorithm, starting with an
    3.71      experience vector:
    3.72 -@@ -2888,7 +3007,7 @@
    3.73 -    #+end_src
    3.74 -    #+end_listing
    3.75 -   
    3.76 --** Efficient action recognition with =EMPATH=
    3.77 -+** =EMPATH= recognizes actions efficiently
    3.78 -    
    3.79 -    To use =EMPATH= with the worm, I first need to gather a set of
    3.80 -    experiences from the worm that includes the actions I want to
    3.81 -@@ -3044,9 +3163,9 @@
    3.82 -  to interpretation, and disagreement between empathy and experience
    3.83 -   is more excusable.
    3.84 - 
    3.85 --** Digression: bootstrapping touch using free exploration
    3.86 --
    3.87 --   In the previous section I showed how to compute actions in terms of
    3.88 -+** Digression: Learn touch sensor layout through haptic experimentation, instead 
    3.89 -+# Bootstrapping touch using free exploration
    3.90 -+In the previous section I showed how to compute actions in terms of
    3.91 -    body-centered predicates which relied on average touch activation of
    3.92 -    pre-defined regions of the worm's skin. What if, instead of receiving
    3.93 -    touch pre-grouped into the six faces of each worm segment, the true