diff thesis/cortex.org @ 562:6299450f9618

Corrections for Jessica Noss.
author Robert McIntyre <rlm@mit.edu>
date Mon, 12 May 2014 10:04:16 -0400
parents 6a61b637a4c5
children 3c3dc4dbb973
     1.1 --- a/thesis/cortex.org	Sun May 04 15:38:11 2014 -0400
     1.2 +++ b/thesis/cortex.org	Mon May 12 10:04:16 2014 -0400
     1.3 @@ -261,7 +261,7 @@
     1.4          it gathers experiences that satisfy the embodied action
     1.5          definitions. 
     1.6  
     1.7 -   - Posture imitation :: When trying to interpret a video or image,
     1.8 +   - Posture Imitation :: When trying to interpret a video or image,
     1.9          the creature takes a model of itself and aligns it with
    1.10          whatever it sees. This alignment might even cross species, as
    1.11          when humans try to align themselves with things like ponies,
    1.12 @@ -285,7 +285,7 @@
    1.13          retrieved, and if it is analogous enough to the scene, then
    1.14          the creature will correctly identify the action in the scene.
    1.15  
    1.16 -   My program, =EMPATH= uses this empathic problem solving technique
    1.17 +   My program =EMPATH= uses this empathic problem solving technique
    1.18     to interpret the actions of a simple, worm-like creature. 
    1.19     
    1.20     #+caption: The worm performs many actions during free play such as 
    1.21 @@ -324,7 +324,7 @@
    1.22       structure of =EMPATH= and =CORTEX= will make future work to
    1.23       enable video analysis much easier than it would otherwise be.
    1.24  
    1.25 -** =EMPATH= is built on =CORTEX=, a creature builder.
    1.26 +** COMMENT =EMPATH= is built on =CORTEX=, a creature builder.
    1.27  
    1.28     I built =CORTEX= to be a general AI research platform for doing
    1.29     experiments involving multiple rich senses and a wide variety and
    1.30 @@ -347,7 +347,7 @@
    1.31  
    1.32     - You can design new creatures using Blender (\cite{blender}), a
    1.33       popular, free 3D modeling program. Each sense can be specified
    1.34 -     using special blender nodes with biologically inspired
    1.35 +     using special Blender nodes with biologically inspired
    1.36       parameters. You need not write any code to create a creature, and
    1.37       can use a wide library of pre-existing blender models as a base
    1.38       for your own creatures.
    1.39 @@ -412,7 +412,8 @@
    1.40     I modeled my own right hand in Blender and rigged it with all the
    1.41     senses that {\tt CORTEX} supports. My simulated hand has a
    1.42     biologically inspired distribution of touch sensors. The senses are
    1.43 -   displayed on the right, and the simulation is displayed on the
    1.44 +   displayed on the right (the red/black squares are raw sensory output), 
    1.45 +   and the simulation is displayed on the
    1.46     left. Notice that my hand is curling its fingers, that it can see
    1.47     its own finger from the eye in its palm, and that it can feel its
    1.48     own thumb touching its palm.}
    1.49 @@ -764,8 +765,8 @@
    1.50  *** Solidifying/Connecting a body
    1.51  
    1.52      =CORTEX= creates a creature in two steps: first, it traverses the
    1.53 -    nodes in the blender file and creates physical representations for
    1.54 -    any of them that have mass defined in their blender meta-data.
    1.55 +    nodes in the Blender file and creates physical representations for
    1.56 +    any of them that have mass defined in their Blender meta-data.
    1.57  
    1.58     #+caption: Program for iterating through the nodes in a blender file
    1.59     #+caption: and generating physical jMonkeyEngine3 objects with mass
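
A minimal sketch of that first step (not the listing named above; =node-seq= and =meta-data= are assumed helpers for walking the scene graph and reading a node's Blender metadata):

#+begin_src clojure
;; Sketch only: attach a jMonkeyEngine3 rigid body to every node
;; whose Blender metadata declares a mass. `node-seq` and
;; `meta-data` are assumptions, not the thesis listing.
(import 'com.jme3.bullet.control.RigidBodyControl)

(defn physical-sketch!
  [creature]
  (doseq [node (node-seq creature)]
    (when-let [mass (meta-data node "mass")]
      ;; a mass of 0 would make the node static; any positive
      ;; mass makes it respond to gravity and collisions.
      (.addControl node (RigidBodyControl. (float mass))))))
#+end_src
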
    1.60 @@ -830,7 +831,7 @@
    1.61     #+begin_listing clojure
    1.62     #+begin_src clojure
    1.63  (defn sense-nodes
    1.64 -  "For some senses there is a special empty blender node whose
    1.65 +  "For some senses there is a special empty Blender node whose
    1.66     children are considered markers for an instance of that sense. This
    1.67     function generates functions to find those children, given the name
    1.68     of the special parent node."
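
A hypothetical usage of =sense-nodes=:

#+begin_src clojure
;; Hypothetical usage: build a finder for the children of the
;; empty Blender node named "eyes", then apply it to a creature.
(def eyes (sense-nodes "eyes"))
;; (eyes creature) => the eye marker nodes, or an empty sequence
;; if the creature has no "eyes" parent node.
#+end_src
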
    1.69 @@ -891,13 +892,13 @@
    1.70      Once =CORTEX= finds all joints and targets, it creates them using
    1.71      a dispatch on the metadata of each joint node.
    1.72  
    1.73 -    #+caption: Program to dispatch on blender metadata and create joints
    1.74 +    #+caption: Program to dispatch on Blender metadata and create joints
    1.75      #+caption: suitable for physical simulation.
    1.76      #+name: joint-dispatch
    1.77      #+begin_listing clojure
    1.78      #+begin_src clojure
    1.79  (defmulti joint-dispatch
    1.80 -  "Translate blender pseudo-joints into real JME joints."
    1.81 +  "Translate Blender pseudo-joints into real JME joints."
    1.82    (fn [constraints & _] 
    1.83      (:type constraints)))
    1.84  
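
One dispatch case might look like the following sketch (the =SixDofJoint= construction here is an assumption, not the thesis listing): a six-degree-of-freedom joint with all linear motion locked behaves like a point joint.

#+begin_src clojure
(import 'com.jme3.bullet.joints.SixDofJoint
        'com.jme3.math.Vector3f)

;; Sketch of the :point case: lock linear motion so the two
;; bodies stay pinned together at the pivot points.
(defmethod joint-dispatch :point
  [constraints control-a control-b pivot-a pivot-b rotation]
  (doto (SixDofJoint. control-a control-b pivot-a pivot-b false)
    (.setLinearLowerLimit Vector3f/ZERO)
    (.setLinearUpperLimit Vector3f/ZERO)))
#+end_src
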
    1.85 @@ -930,10 +931,10 @@
    1.86  
    1.87      All that is left for joints is to combine the above pieces into
    1.88      something that can operate on the collection of nodes that a
    1.89 -    blender file represents.
    1.90 +    Blender file represents.
    1.91  
    1.92      #+caption: Program to completely create a joint given information 
    1.93 -    #+caption: from a blender file.
    1.94 +    #+caption: from a Blender file.
    1.95      #+name: connect
    1.96      #+begin_listing clojure
    1.97     #+begin_src clojure
    1.98 @@ -948,7 +949,7 @@
    1.99  
    1.100     {:type :cone :limit-xz <lim-xz>
    1.101                  :limit-xy <lim-xy>
    1.102 -                :twist <lim-twist>}   (use XZY rotation mode in blender!)"
    1.103 +                :twist <lim-twist>}   (use XZY rotation mode in Blender!)"
   1.104    [#^Node obj-a #^Node obj-b #^Node joint]
   1.105    (let [control-a (.getControl obj-a RigidBodyControl)
   1.106          control-b (.getControl obj-b RigidBodyControl)
   1.107 @@ -993,7 +994,7 @@
   1.108  (defn body!
   1.109    "Endow the creature with a physical body connected with joints.  The
   1.110     particulars of the joints and the masses of each body part are
   1.111 -   determined in blender."
   1.112 +   determined in Blender."
   1.113    [#^Node creature]
   1.114    (physical! creature)
   1.115    (joints! creature))
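
A hypothetical invocation (the model path here is an assumption):

#+begin_src clojure
;; Hypothetical usage: load a creature and give it a body.
(def hand
  (body! (load-blender-model "Models/test-creature/hand.blend")))
#+end_src
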
   1.116 @@ -1008,7 +1009,7 @@
   1.117      my own right hand, can now be given joints and simulated as a
   1.118      creature.
   1.119     
   1.120 -    #+caption: With the ability to create physical creatures from blender,
   1.121 +    #+caption: With the ability to create physical creatures from Blender,
   1.122      #+caption: =CORTEX= gets one step closer to becoming a full creature
   1.123      #+caption: simulation environment.
   1.124      #+name: physical-hand
   1.125 @@ -1125,7 +1126,7 @@
   1.126      images. Now, =CORTEX= needs simulated eyes to serve as the source
   1.127      of these images.
   1.128  
   1.129 -    An eye is described in blender in the same way as a joint. They
    1.130 +    Eyes are described in Blender in the same way as joints. They
   1.131      are zero dimensional empty objects with no geometry whose local
   1.132      coordinate system determines the orientation of the resulting eye.
   1.133      All eyes are children of a parent node named "eyes" just as all
   1.134 @@ -1142,7 +1143,7 @@
   1.135    "Create a Camera centered on the current position of 'eye which
   1.136     follows the closest physical node in 'creature. The camera will
   1.137     point in the X direction and use the Z vector as up as determined
   1.138 -   by the rotation of these vectors in blender coordinate space. Use
   1.139 +   by the rotation of these vectors in Blender coordinate space. Use
   1.140     XZY rotation for the node in blender."
   1.141    [#^Node creature #^Spatial eye]
   1.142    (let [target (closest-node creature eye)
   1.143 @@ -1156,7 +1157,7 @@
   1.144      (.lookAtDirection
   1.145       cam                           ; this part is not a mistake and
   1.146       (.mult rot Vector3f/UNIT_X)   ; is consistent with using Z in
   1.147 -     (.mult rot Vector3f/UNIT_Y))  ; blender as the UP vector.
   1.148 +     (.mult rot Vector3f/UNIT_Y))  ; Blender as the UP vector.
   1.149      (.setFrustumPerspective
   1.150       cam (float 45)
   1.151       (float (/ (.getWidth cam) (.getHeight cam)))
   1.152 @@ -1179,7 +1180,7 @@
   1.153      distance from the fovea.
   1.154  
   1.155      I want to be able to model any retinal configuration, so my
   1.156 -    eye-nodes in blender contain metadata pointing to images that
   1.157 +    eye-nodes in Blender contain metadata pointing to images that
   1.158      describe the precise position of the individual sensors using
   1.159      white pixels. The meta-data also describes the precise sensitivity
   1.160      to light that the sensors described in the image have. An eye can
   1.161 @@ -1322,7 +1323,7 @@
   1.162  
   1.163     At the end of this chapter I will have simulated ears that work the
   1.164     same way as the simulated eyes in the last chapter. I will be able to
   1.165 -   place any number of ear-nodes in a blender file, and they will bind to
   1.166 +   place any number of ear-nodes in a Blender file, and they will bind to
   1.167     the closest physical object and follow it as it moves around. Each ear
   1.168     will provide access to the sound data it picks up between every frame.
   1.169  
   1.170 @@ -2146,7 +2147,7 @@
   1.171     It's also particularly easy to implement in jMonkeyEngine.
   1.172     
   1.173     My simulated proprioception calculates the relative angles of each
   1.174 -   joint from the rest position defined in the blender file. This
   1.175 +   joint from the rest position defined in the Blender file. This
   1.176     simulates the muscle-spindles and joint capsules. I will deal with
   1.177     Golgi tendon organs, which calculate muscle strain, in the next
   1.178     chapter.
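
As a sketch of the underlying computation (assuming jMonkeyEngine's =Quaternion= class; this is not the thesis listing), the rotation of one part relative to another is obtained by composing the inverse of the first orientation with the second:

#+begin_src clojure
(import 'com.jme3.math.Quaternion)

;; Sketch: the rotation of part `b` expressed in part `a`'s
;; frame. Joint angles relative to the Blender rest position
;; follow by comparing this with the rest-pose relative rotation.
(defn relative-rotation
  [#^Quaternion a #^Quaternion b]
  (.mult (.inverse a) b))
#+end_src
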
   1.179 @@ -2309,7 +2310,7 @@
   1.180  
   1.181  *** Muscle meta-data
   1.182  
   1.183 -    #+caption: Program to deal with loading muscle data from a blender
   1.184 +    #+caption: Program to deal with loading muscle data from a Blender
   1.185      #+caption: file's metadata.
   1.186      #+name: motor-pool
   1.187      #+begin_listing clojure
   1.188 @@ -2390,7 +2391,7 @@
   1.189      =movement-kernel= creates a function that controls the movement
   1.190      of the nearest physical node to the muscle node. The muscle exerts
    1.191     a rotational force dependent on its orientation to the object in
   1.192 -    the blender file. The function returned by =movement-kernel= is
   1.193 +    the Blender file. The function returned by =movement-kernel= is
   1.194      also a sense function: it returns the percent of the total muscle
   1.195      strength that is currently being employed. This is analogous to
   1.196      muscle tension in humans and completes the sense of proprioception
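
A minimal sketch of this act-and-sense pattern (the names and the force model here are assumptions, not the thesis listing):

#+begin_src clojure
(import 'com.jme3.bullet.control.RigidBodyControl
        'com.jme3.math.Vector3f)

;; Sketch: a muscle function that applies a torque about a fixed
;; axis and reports its current activation as a sense value.
(defn movement-kernel-sketch
  [#^RigidBodyControl target #^Vector3f axis max-force]
  (fn [activation]            ; activation in [0, 1]
    (.applyTorque target
                  (.mult axis (float (* activation max-force))))
    activation))              ; doubles as the muscle sense
#+end_src
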
   1.197 @@ -2534,10 +2535,10 @@
   1.198    #+ATTR_LaTeX: :width 10cm
   1.199    [[./images/basic-worm-view.png]]
   1.200  
   1.201 -  #+caption: Program for reading a worm from a blender file and 
   1.202 +  #+caption: Program for reading a worm from a Blender file and 
   1.203    #+caption: outfitting it with the senses of proprioception, 
   1.204    #+caption: touch, and the ability to move, as specified in the 
   1.205 -  #+caption: blender file.
   1.206 +  #+caption: Blender file.
   1.207    #+name: get-worm
   1.208    #+begin_listing clojure
   1.209    #+begin_src clojure
   1.210 @@ -2829,19 +2830,19 @@
   1.211     that defines the worm's life. It is a time-stamp for each set of
   1.212     sensations the worm has experienced.
   1.213  
   1.214 -   First, group the experience-indices into bins according to the
   1.215 +   First, I group the experience-indices into bins according to the
   1.216     similarity of their proprioceptive data. I organize my bins into a
    1.217    three-level hierarchy. The smallest bins have an approximate size of
   1.218     0.001 radians in all proprioceptive dimensions. Each higher level
   1.219     is 10x bigger than the level below it.
   1.220  
   1.221     The bins serve as a hashing function for proprioceptive data. Given
   1.222 -   a single piece of proprioceptive experience, the bins allow us to
   1.223 +   a single piece of proprioceptive experience, the bins allow me to
    1.224    rapidly find the indices of all past experiences that had a very
    1.225    similar proprioceptive configuration.
   1.226     When looking up a proprioceptive experience, if the smallest bin
   1.227 -   does not match any previous experience, then successively larger
   1.228 -   bins are used until a match is found or we reach the largest bin.
   1.229 +   does not match any previous experience, then I use successively
   1.230 +   larger bins until a match is found or I reach the largest bin.
   1.231     
   1.232     Given a sequence of proprioceptive input, I use the bins to
   1.233     generate a set of similar experiences for each input using the
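
A minimal sketch of such a binning function (an assumption, not the thesis listing): quantize each joint angle at a fixed resolution and use the quantized vector as a hash key, with three resolutions forming the hierarchy.

#+begin_src clojure
;; Sketch: joint angles (radians) quantized at a given resolution.
;; Equal quantized vectors land in the same bin, so the vectors
;; serve directly as keys into a map of experience-indices.
(defn bin-fn
  [resolution]
  (fn [angles]
    (mapv #(Math/round (/ (double %) resolution)) angles)))

;; Three levels: 0.001 radians, then 10x coarser at each level.
(def bin-levels (map bin-fn [0.001 0.01 0.1]))
#+end_src
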
   1.234 @@ -3529,7 +3530,7 @@
   1.235    the =CORTEX= system, a complete environment for creating simulated
   1.236    creatures. You have seen how to implement five senses: touch,
   1.237    proprioception, hearing, vision, and muscle tension. You have seen
   1.238 -  how to create new creatures using blender, a 3D modeling tool.
   1.239 +  how to create new creatures using Blender, a 3D modeling tool.
   1.240  
   1.241    As a minor digression, you also saw how I used =CORTEX= to enable a
   1.242    tiny worm to discover the topology of its skin simply by rolling on
   1.243 @@ -3628,7 +3629,7 @@
   1.244  {:type :cone
   1.245   :limit-xz <lim-xz>
   1.246   :limit-xy <lim-xy>
   1.247 - :twist    <lim-twist>}   ;(use XZY rotation mode in blender!)
   1.248 + :twist    <lim-twist>}   ;(use XZY rotation mode in Blender!)
   1.249      #+END_SRC
   1.250  
   1.251  *** Eyes
   1.252 @@ -3719,7 +3720,7 @@
   1.253  *** Proprioception
   1.254  
   1.255      Proprioception is tied to each joint node -- nothing special must
   1.256 -    be done in a blender model to enable proprioception other than
   1.257 +    be done in a Blender model to enable proprioception other than
   1.258      creating joint nodes.
   1.259  
   1.260  *** Muscles
   1.261 @@ -3820,7 +3821,7 @@
   1.262          Options are the same as in =box=.
   1.263  
   1.264     - =(load-blender-model file-name)= :: create a node structure
   1.265 -        representing the model described in a blender file.
   1.266 +        representing the model described in a Blender file.
   1.267  
    1.268     - =(light-up-everything world)= :: distribute a standard complement
   1.269          of lights throughout the simulation. Should be adequate for most
   1.270 @@ -3855,7 +3856,7 @@
   1.271          during a simulation, return the vision data for the channel of
   1.272          one of the eyes. The functions are ordered depending on the
   1.273          alphabetical order of the names of the eye nodes in the
   1.274 -        blender file. The data returned by the functions is a vector
   1.275 +        Blender file. The data returned by the functions is a vector
   1.276          containing the eye's /topology/, a vector of coordinates, and
   1.277          the eye's /data/, a vector of RGB values filtered by the eye's
   1.278          sensitivity. 
   1.279 @@ -3864,7 +3865,7 @@
   1.280          Returns a list of functions, one for each ear, that when
   1.281          called will return a frame's worth of hearing data for that
   1.282          ear. The functions are ordered depending on the alphabetical
   1.283 -        order of the names of the ear nodes in the blender file. The
   1.284 +        order of the names of the ear nodes in the Blender file. The
   1.285          data returned by the functions is an array of PCM (pulse code
   1.286          modulated) wav data.
   1.287