diff thesis/dylan-cortex-diff.diff @ 514:447c3c8405a2

accept/reject changes
author Robert McIntyre <rlm@mit.edu>
date Sun, 30 Mar 2014 10:50:05 -0400
parents 4c4d45f6f30b
children 58fa1ffd481e
     1.1 --- a/thesis/dylan-cortex-diff.diff	Sun Mar 30 10:41:18 2014 -0400
     1.2 +++ b/thesis/dylan-cortex-diff.diff	Sun Mar 30 10:50:05 2014 -0400
     1.3 @@ -294,64 +294,6 @@
     1.4   
     1.5      I envision =CORTEX= being used to support rapid prototyping and
     1.6 +   iteration of ideas. Even if I could put together a well-constructed
     1.7 -@@ -459,8 +577,8 @@
     1.8 -    simulations of very simple creatures in =CORTEX= generally run at
     1.9 -    40x on my machine!
    1.10 - 
    1.11 --** What is a sense?
    1.12 --   
    1.13 -+** All sense organs are two-dimensional surfaces
    1.14 -+# What is a sense?   
    1.15 -    If =CORTEX= is to support a wide variety of senses, it would help
    1.16 -    to have a better understanding of what a ``sense'' actually is!
    1.17 -    While vision, touch, and hearing all seem like they are quite
    1.18 -@@ -956,7 +1074,7 @@
    1.19 -     #+ATTR_LaTeX: :width 15cm
    1.20 -     [[./images/physical-hand.png]]
    1.21 - 
    1.22 --** Eyes reuse standard video game components
    1.23 -+** Sight reuses standard video game components...
    1.24 - 
    1.25 -    Vision is one of the most important senses for humans, so I need to
    1.26 -    build a simulated sense of vision for my AI. I will do this with
    1.27 -@@ -1257,8 +1375,8 @@
    1.28 -     community and is now (in modified form) part of a system for
    1.29 -     capturing in-game video to a file.
    1.30 - 
    1.31 --** Hearing is hard; =CORTEX= does it right
    1.32 --   
    1.33 -+** ...but hearing must be built from scratch
    1.34 -+# is hard; =CORTEX= does it right
    1.35 -    At the end of this section I will have simulated ears that work the
    1.36 -    same way as the simulated eyes in the last section. I will be able to
    1.37 -    place any number of ear-nodes in a blender file, and they will bind to
    1.38 -@@ -1565,7 +1683,7 @@
    1.39 -     jMonkeyEngine3 community and is used to record audio for demo
    1.40 -     videos.
    1.41 - 
    1.42 --** Touch uses hundreds of hair-like elements
    1.43 -+** Hundreds of hair-like elements provide a sense of touch
    1.44 - 
    1.45 -    Touch is critical to navigation and spatial reasoning and as such I
    1.46 -    need a simulated version of it to give to my AI creatures.
    1.47 -@@ -2059,7 +2177,7 @@
    1.48 -     #+ATTR_LaTeX: :width 15cm
    1.49 -     [[./images/touch-cube.png]]
    1.50 - 
    1.51 --** Proprioception is the sense that makes everything ``real''
    1.52 -+** Proprioception provides knowledge of your own body's position
    1.53 - 
    1.54 -    Close your eyes, and touch your nose with your right index finger.
    1.55 -    How did you do it? You could not see your hand, and neither your
    1.56 -@@ -2193,7 +2311,7 @@
    1.57 -     #+ATTR_LaTeX: :width 11cm
    1.58 -     [[./images/proprio.png]]
    1.59 - 
    1.60 --** Muscles are both effectors and sensors
    1.61 -+** Muscles contain both sensors and effectors
    1.62 - 
    1.63 -    Surprisingly enough, terrestrial creatures only move by using
    1.64 -    torque applied about their joints. There's not a single straight
    1.65  @@ -2440,7 +2558,8 @@
    1.66           hard control problems without worrying about physics or
    1.67           senses.
    1.68 @@ -371,25 +313,3 @@
    1.69   
    1.70      Here is the core of a basic empathy algorithm, starting with an
    1.71      experience vector:
    1.72 -@@ -2888,7 +3007,7 @@
    1.73 -    #+end_src
    1.74 -    #+end_listing
    1.75 -   
    1.76 --** Efficient action recognition with =EMPATH=
    1.77 -+** =EMPATH= recognizes actions efficiently
    1.78 -    
    1.79 -    To use =EMPATH= with the worm, I first need to gather a set of
    1.80 -    experiences from the worm that includes the actions I want to
    1.81 -@@ -3044,9 +3163,9 @@
    1.82 -  to interpretation, and disagreement between empathy and experience
    1.83 -   is more excusable.
    1.84 - 
    1.85 --** Digression: bootstrapping touch using free exploration
    1.86 --
    1.87 --   In the previous section I showed how to compute actions in terms of
    1.88 -+** Digression: Learn touch sensor layout through haptic experimentation instead
    1.89 -+# Bootstrapping touch using free exploration
    1.90 -+In the previous section I showed how to compute actions in terms of
    1.91 -   body-centered predicates which relied on the average touch activation of
    1.92 -    pre-defined regions of the worm's skin. What if, instead of receiving
    1.93 -    touch pre-grouped into the six faces of each worm segment, the true
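
The ``core of a basic empathy algorithm, starting with an experience
vector'' referenced in the context lines above is elided by the hunk
boundary. Below is a minimal sketch of that idea in Clojure, the
language =CORTEX= and =EMPATH= are written in, assuming each moment
of experience is a map holding a =:proprioception= entry alongside
the other senses. Every name here (=proprio-key=, =build-phi-space=,
=infer-experience=) is a hypothetical stand-in for illustration, not
the thesis's actual code.

#+begin_src clojure
(defn proprio-key
  "Bin a proprioceptive reading (a seq of joint angles) so that
   similar body positions map to the same lookup key."
  [angles]
  (mapv #(Math/round (* 10.0 (double %))) angles))

(defn build-phi-space
  "Index every fully-sensed moment of an experience vector by the
   (binned) proprioception recorded at that moment."
  [experience-vector]
  (group-by (comp proprio-key :proprioception) experience-vector))

(defn infer-experience
  "Given only a proprioceptive reading, recall the fully-sensed
   moments (touch, vision, ...) from prior free play that had a
   similar body position."
  [phi-space angles]
  (get phi-space (proprio-key angles) []))
#+end_src

Body-centered action predicates like those renamed in the headings
above would then run over the recalled moments rather than over the
impoverished proprioceptive input alone.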