diff thesis/cortex.org @ 449:09b7c8dd4365

first chapter done, half of last chapter done.
author Robert McIntyre <rlm@mit.edu>
date Wed, 26 Mar 2014 02:42:01 -0400
parents af13fc73e851
children 432f2c4646cb
line wrap: on
line diff
     1.1 --- a/thesis/cortex.org	Tue Mar 25 22:54:41 2014 -0400
     1.2 +++ b/thesis/cortex.org	Wed Mar 26 02:42:01 2014 -0400
     1.3 @@ -226,7 +226,7 @@
     1.4     #+end_listing
     1.5  
     1.6  
     1.7 -** =CORTEX= is a toolkit for building sensate creatures
     1.8 +**  =CORTEX= is a toolkit for building sensate creatures
     1.9  
    1.10     I built =CORTEX= to be a general AI research platform for doing
    1.11     experiments involving multiple rich senses and a wide variety and
    1.12 @@ -269,14 +269,16 @@
    1.13     engine designed to create cross-platform 3D desktop games. =CORTEX=
    1.14     is mainly written in clojure, a dialect of =LISP= that runs on the
    1.15     java virtual machine (JVM). The API for creating and simulating
    1.16 -   creatures is entirely expressed in clojure. Hearing is implemented
    1.17 -   as a layer of clojure code on top of a layer of java code on top of
    1.18 -   a layer of =C++= code which implements a modified version of
    1.19 -   =OpenAL= to support multiple listeners. =CORTEX= is the only
    1.20 -   simulation environment that I know of that can support multiple
    1.21 -   entities that can each hear the world from their own perspective.
    1.22 -   Other senses also require a small layer of Java code. =CORTEX= also
    1.23 -   uses =bullet=, a physics simulator written in =C=.
    1.24 +   creatures and senses is entirely expressed in clojure, though many
    1.25 +   senses are implemented at the layer of jMonkeyEngine or below. For
    1.26 +   example, for the sense of hearing I use a layer of clojure code on
    1.27 +   top of a layer of Java JNI bindings that drive a layer of =C++=
    1.28 +   code which implements a modified version of =OpenAL= to support
    1.29 +   multiple listeners. =CORTEX= is the only simulation environment
    1.30 +   that I know of that can support multiple entities that can each
    1.31 +   hear the world from their own perspective. Other senses also
    1.32 +   require a small layer of Java code. =CORTEX= also uses =bullet=, a
    1.33 +   physics simulator written in =C++=.
    1.34  
    1.35     #+caption: Here is the worm from above modeled in Blender, a free 
    1.36     #+caption: 3D-modeling program. Senses and joints are described
    1.37 @@ -285,26 +287,46 @@
    1.38     #+ATTR_LaTeX: :width 12cm
    1.39     [[./images/blender-worm.png]]
    1.40  
    1.41 +   Here are some things I anticipate that =CORTEX= might be used for:
    1.42 +
    1.43 +   - exploring new ideas about sensory integration
    1.44 +   - distributed communication among swarm creatures
    1.45 +   - self-learning using free exploration
    1.46 +   - evolutionary algorithms involving creature construction
    1.47 +   - exploration of exotic senses and effectors that are not possible
    1.48 +     in the real world (such as telekinesis or a semantic sense)
    1.49 +   - imagination using subworlds
    1.50 +
    1.51     During one test with =CORTEX=, I created 3,000 entities each with
    1.52     their own independent senses and ran them all at only 1/80 real
    1.53     time. In another test, I created a detailed model of my own hand,
    1.54     equipped with a realistic distribution of touch (more sensitive at
    1.55     the fingertips), as well as eyes and ears, and it ran at around 1/4
    1.56 -   real time.
    1.57 +   real time. 
    1.58  
    1.59 -   #+caption: Here is the worm from above modeled in Blender, a free 
    1.60 -   #+caption: 3D-modeling program. Senses and joints are described
    1.61 -   #+caption: using special nodes in Blender.
    1.62 -   #+name: worm-recognition-intro
    1.63 -   #+ATTR_LaTeX: :width 15cm
    1.64 -   [[./images/full-hand.png]]
    1.65 -   
    1.66 -   
    1.67 -   
    1.68 +   #+BEGIN_LaTeX
    1.69 +   \begin{sidewaysfigure}
    1.70 +   \includegraphics[width=9.5in]{images/full-hand.png}
    1.71 +   \caption{The hand model from the test above, created in Blender, 
    1.72 +   a free 3D-modeling program. Senses and joints are described
    1.73 +   using special nodes in Blender. The senses are displayed on 
    1.74 +   the right, and the simulation is displayed on the left. Notice
    1.75 +   that the hand is curling its fingers, that it can see its own 
    1.76 +   finger from the eye in its palm, and that it can feel its own 
    1.77 +   thumb touching its palm.}
    1.78 +   \end{sidewaysfigure}
    1.79 +   #+END_LaTeX
    1.80  
    1.81 -   
    1.82  ** Contributions
    1.83  
    1.84 +   I built =CORTEX=, a comprehensive platform for embodied AI
    1.85 +   experiments. =CORTEX= has many new features lacking in other systems,
    1.86 +   such as sound. It is easy to create new creatures using Blender, a
    1.87 +   free 3D modeling program.
    1.88 +
    1.89 +   I built =EMPATH=, which uses =CORTEX= to identify the actions of a
    1.90 +   worm-like creature using a computational model of empathy.
    1.91 +   
    1.92  * Building =CORTEX=
    1.93  
    1.94  ** To explore embodiment, we need a world, body, and senses
    1.95 @@ -331,52 +353,409 @@
    1.96  
    1.97  * Empathy in a simulated worm
    1.98  
    1.99 +  Here I develop a computational model of empathy, using =CORTEX= as a
   1.100 +  base. Empathy in this context is the ability to observe another
   1.101 +  creature and infer what sorts of sensations that creature is
   1.102 +  feeling. My empathy algorithm involves multiple phases. First is
   1.103 +  free-play, where the creature moves around and gains sensory
   1.104 +  experience. From this experience I construct a representation of the
   1.105 +  creature's sensory state space, which I call \Phi-space. Using
   1.106 +  \Phi-space, I construct an efficient function which takes the
   1.107 +  limited data that comes from observing another creature and enriches
   1.108 +  it with a full complement of imagined sensory data. I can then use the
   1.109 +  imagined sensory data to recognize what the observed creature is
   1.110 +  doing and feeling, using straightforward embodied action predicates.
   1.111 +  This is all demonstrated with a simple worm-like creature, by
   1.112 +  recognizing worm-actions based on limited data.
   1.113 +
   1.114 +  #+caption: Here is the worm with which we will be working. 
   1.115 +  #+caption: It is composed of 5 segments. Each segment has a 
   1.116 +  #+caption: pair of extensor and flexor muscles. Each of the 
   1.117 +  #+caption: worm's four joints is a hinge joint which allows 
   1.118 +  #+caption: 30 degrees of rotation to either side. Each segment
   1.119 +  #+caption: of the worm is touch-capable and has a uniform 
   1.120 +  #+caption: distribution of touch sensors on each of its faces.
   1.121 +  #+caption: Each joint has a proprioceptive sense to detect 
   1.122 +  #+caption: relative positions. The worm segments are all the 
   1.123 +  #+caption: same except for the first one, which has a much
   1.124 +  #+caption: higher weight than the others to allow for easy 
   1.125 +  #+caption: manual motor control.
   1.126 +  #+name: basic-worm-view
   1.127 +  #+ATTR_LaTeX: :width 10cm
   1.128 +  [[./images/basic-worm-view.png]]
   1.129 +
   1.130 +  #+caption: Program for reading a worm from a blender file and 
   1.131 +  #+caption: outfitting it with the senses of proprioception, 
   1.132 +  #+caption: touch, and the ability to move, as specified in the 
   1.133 +  #+caption: blender file.
   1.134 +  #+name: get-worm
   1.135 +  #+begin_listing clojure
   1.136 +  #+begin_src clojure
   1.137 +(defn worm []
   1.138 +  (let [model (load-blender-model "Models/worm/worm.blend")]
   1.139 +    {:body (doto model (body!))
   1.140 +     :touch (touch! model)
   1.141 +     :proprioception (proprioception! model)
   1.142 +     :muscles (movement! model)}))
   1.143 +  #+end_src
   1.144 +  #+end_listing
   1.145 +  
   1.146  ** Embodiment factors action recognition into manageable parts
   1.147  
   1.148 +   Using empathy, I divide the problem of action recognition into a
   1.149 +   recognition process expressed in the language of a full complement
   1.150 +   of senses, and an imaginative process that generates full sensory
   1.151 +   data from partial sensory data. Splitting the action recognition
   1.152 +   problem in this manner greatly reduces the total amount of work to
   1.153 +   recognize actions: the imaginative process is mostly just matching
   1.154 +   previous experience, and the recognition process gets to use all
   1.155 +   the senses to directly describe any action.
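         +
         +   Schematically, the factoring looks like the following sketch. Here
         +   =imagine= and =action-predicates= are hypothetical placeholders
         +   for the two halves of the problem: =imagine= stands for the
         +   imaginative process developed later in this chapter, and
         +   =action-predicates= for embodied predicates like those defined in
         +   the next section.
         +
         +   #+begin_src clojure
         +;; Sketch only: enrich partial observations into imagined full
         +;; sensory experience, then let simple embodied predicates name the
         +;; action.
         +(defn recognize-action
         +  [imagine action-predicates partial-observations]
         +  (let [imagined (imagine partial-observations)]
         +    (some (fn [[action-name action?]]
         +            (when (action? imagined) action-name))
         +          action-predicates)))
         +   #+end_src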
   1.156 +
   1.157  ** Action recognition is easy with a full gamut of senses
   1.158  
   1.159 -** Digression: bootstrapping touch using free exploration
   1.160 +   Embodied representations using multiple senses such as touch,
   1.161 +   proprioception, and muscle tension turn out to be exceedingly
   1.162 +   efficient at describing body-centered actions. They are the ``right
   1.163 +   language for the job''. For example, it takes only around 5 lines
   1.164 +   of LISP code to describe the action of ``curling'' using embodied
   1.165 +   primitives. It takes about 8 lines to describe the seemingly
   1.166 +   complicated action of wiggling.
   1.167 +
   1.168 +   The following action predicates each take a stream of sensory
   1.169 +   experience, observe however much of it they desire, and decide
   1.170 +   whether the worm is doing the action they describe. =curled?=
   1.171 +   relies on proprioception, =resting?= relies on touch, =wiggling?=
   1.172 +   relies on a Fourier analysis of muscle contraction, and
   1.173 +   =grand-circle?= relies on touch and reuses =curled?= as a guard.
   1.174 +   
   1.175 +   #+caption: Program for detecting whether the worm is curled. This is the 
   1.176 +   #+caption: simplest action predicate, because it only uses the last frame 
   1.177 +   #+caption: of sensory experience, and only uses proprioceptive data. Even 
   1.178 +   #+caption: this simple predicate, however, is automatically frame 
   1.179 +   #+caption: independent and ignores vermopomorphic differences such as 
   1.180 +   #+caption: worm textures and colors.
   1.181 +   #+name: curled
   1.182 +   #+begin_listing clojure
   1.183 +   #+begin_src clojure
   1.184 +(defn curled?
   1.185 +  "Is the worm curled up?"
   1.186 +  [experiences]
   1.187 +  (every?
   1.188 +   (fn [[_ _ bend]]
   1.189 +     (> (Math/sin bend) 0.64))
   1.190 +   (:proprioception (peek experiences))))
   1.191 +   #+end_src
   1.192 +   #+end_listing
   1.193 +
   1.194 +   #+caption: Program for summarizing the touch information in a patch 
   1.195 +   #+caption: of skin.
   1.196 +   #+name: touch-summary
   1.197 +   #+begin_listing clojure
   1.198 +   #+begin_src clojure
   1.199 +(defn contact
   1.200 +  "Determine how much contact a particular worm segment has with
   1.201 +   other objects. Returns a value between 0 and 1, where 1 is full
   1.202 +   contact and 0 is no contact."
   1.203 +  [touch-region [coords contact :as touch]]
   1.204 +  (-> (zipmap coords contact)
   1.205 +      (select-keys touch-region)
   1.206 +      (vals)
   1.207 +      (#(map first %))
   1.208 +      (average)
   1.209 +      (* 10)
   1.210 +      (- 1)
   1.211 +      (Math/abs)))
   1.212 +   #+end_src
   1.213 +   #+end_listing
   1.214 +
   1.215 +
   1.216 +   #+caption: Program for detecting whether the worm is at rest. This program
   1.217 +   #+caption: uses a summary of the tactile information from the underbelly 
   1.218 +   #+caption: of the worm, and is only true if every segment is touching the 
   1.219 +   #+caption: floor. Note that this function contains no references to 
   1.220 +   #+caption: proprioception at all.
   1.221 +   #+name: resting
   1.222 +   #+begin_listing clojure
   1.223 +   #+begin_src clojure
   1.224 +(def worm-segment-bottom (rect-region [8 15] [14 22]))
   1.225 +
   1.226 +(defn resting?
   1.227 +  "Is the worm resting on the ground?"
   1.228 +  [experiences]
   1.229 +  (every?
   1.230 +   (fn [touch-data]
   1.231 +     (< 0.9 (contact worm-segment-bottom touch-data)))
   1.232 +   (:touch (peek experiences))))
   1.233 +   #+end_src
   1.234 +   #+end_listing
   1.235 +
   1.236 +   #+caption: Program for detecting whether the worm is curled up into a 
   1.237 +   #+caption: full circle. Here the embodied approach begins to shine, as
   1.238 +   #+caption: I am able to both use a previous action predicate (=curled?=)
   1.239 +   #+caption: as well as the direct tactile experience of the head and tail.
   1.240 +   #+name: grand-circle
   1.241 +   #+begin_listing clojure
   1.242 +   #+begin_src clojure
   1.243 +(def worm-segment-bottom-tip (rect-region [15 15] [22 22]))
   1.244 +
   1.245 +(def worm-segment-top-tip (rect-region [0 15] [7 22]))
   1.246 +
   1.247 +(defn grand-circle?
   1.248 +  "Does the worm form a majestic circle (one end touching the other)?"
   1.249 +  [experiences]
   1.250 +  (and (curled? experiences)
   1.251 +       (let [worm-touch (:touch (peek experiences))
   1.252 +             tail-touch (worm-touch 0)
   1.253 +             head-touch (worm-touch 4)]
   1.254 +         (and (< 0.55 (contact worm-segment-bottom-tip tail-touch))
   1.255 +              (< 0.55 (contact worm-segment-top-tip    head-touch))))))
   1.256 +   #+end_src
   1.257 +   #+end_listing
   1.258 +
   1.259 +
   1.260 +   #+caption: Program for detecting whether the worm has been wiggling for 
   1.261 +   #+caption: the last few frames. It uses a fourier analysis of the muscle 
   1.262 +   #+caption: contractions of the worm's tail to determine wiggling. This is 
   1.263 +   #+caption: significant because there is no particular frame that clearly 
   1.264 +   #+caption: indicates that the worm is wiggling --- only when multiple frames 
   1.265 +   #+caption: are analyzed together is the wiggling revealed. Defining 
   1.266 +   #+caption: wiggling this way also gives the worm an opportunity to learn 
   1.267 +   #+caption: and recognize ``frustrated wiggling'', where the worm tries to 
   1.268 +   #+caption: wiggle but can't. Frustrated wiggling is very visually different 
   1.269 +   #+caption: from actual wiggling, but this definition gives it to us for free.
   1.270 +   #+name: wiggling
   1.271 +   #+begin_listing clojure
   1.272 +   #+begin_src clojure
   1.273 +(defn fft [nums]
   1.274 +  (map
   1.275 +   #(.getReal %)
   1.276 +   (.transform
   1.277 +    (FastFourierTransformer. DftNormalization/STANDARD)
   1.278 +    (double-array nums) TransformType/FORWARD)))
   1.279 +
   1.280 +(def indexed (partial map-indexed vector))
   1.281 +
   1.282 +(defn max-indexed [s]
   1.283 +  (first (sort-by (comp - second) (indexed s))))
   1.284 +
   1.285 +(defn wiggling?
   1.286 +  "Is the worm wiggling?"
   1.287 +  [experiences]
   1.288 +  (let [analysis-interval 0x40]
   1.289 +    (when (> (count experiences) analysis-interval)
   1.290 +      (let [a-flex 3
   1.291 +            a-ex   2
   1.292 +            muscle-activity
   1.293 +            (map :muscle (vector:last-n experiences analysis-interval))
   1.294 +            base-activity
   1.295 +            (map #(- (% a-flex) (% a-ex)) muscle-activity)]
   1.296 +        (= 2
   1.297 +           (first
   1.298 +            (max-indexed
   1.299 +             (map #(Math/abs %)
   1.300 +                  (take 20 (fft base-activity))))))))))
   1.301 +   #+end_src
   1.302 +   #+end_listing
   1.303 +
   1.304 +   With these action predicates, I can now recognize the actions of
   1.305 +   the worm while it is moving under my control and I have access to
   1.306 +   all the worm's senses.
   1.307 +
   1.308 +   #+caption: Use the action predicates defined earlier to report on 
   1.309 +   #+caption: what the worm is doing while in simulation.
   1.310 +   #+name: report-worm-activity
   1.311 +   #+begin_listing clojure
   1.312 +   #+begin_src clojure
   1.313 +(defn debug-experience
   1.314 +  [experiences text]
   1.315 +  (cond
   1.316 +   (grand-circle? experiences) (.setText text "Grand Circle")
   1.317 +   (curled? experiences)       (.setText text "Curled")
   1.318 +   (wiggling? experiences)     (.setText text "Wiggling")
   1.319 +   (resting? experiences)      (.setText text "Resting")))
   1.320 +   #+end_src
   1.321 +   #+end_listing
   1.322 +
   1.323 +   #+caption: Using =debug-experience=, the body-centered predicates
   1.324 +   #+caption: work together to classify the behaviour of the worm
   1.325 +   #+caption: while under manual motor control.
   1.326 +   #+name: worm-identify-init
   1.327 +   #+ATTR_LaTeX: :width 10cm
   1.328 +   [[./images/worm-identify-init.png]]
   1.329 +
   1.330 +   These action predicates satisfy the recognition requirement of an
   1.331 +   empathic recognition system. There is a lot of power in the
   1.332 +   simplicity of the action predicates. They describe their actions
   1.333 +   without getting confused by visual details of the worm. Each one is
   1.334 +   frame independent, but more than that, they are each independent of
   1.335 +   irrelevant visual details of the worm and the environment. They
   1.336 +   will work regardless of whether the worm is a different color or
   1.337 +   heavily textured, or if the environment has strange lighting.
   1.338 +
   1.339 +   The trick now is to make the action predicates work even when the
   1.340 +   sensory data on which they depend is absent. If I can do that, then
   1.341 +   I will have gained much.
   1.342  
   1.343  ** \Phi-space describes the worm's experiences
   1.344 +   
   1.345 +   As a first step towards building empathy, I need to gather all of
   1.346 +   the worm's experiences during free play. I use a simple vector to
   1.347 +   store all the experiences. 
   1.348 +   
   1.349 +   #+caption: Program to gather the worm's experiences into a vector for 
   1.350 +   #+caption: further processing. The =motor-control-program= line uses
   1.351 +   #+caption: a motor control script that causes the worm to execute a series
   1.352 +   #+caption: of ``exercises'' that include all the action predicates.
   1.353 +   #+name: generate-phi-space
   1.354 +   #+begin_listing clojure
   1.355 +   #+begin_src clojure
   1.356 +(defn generate-phi-space []
   1.357 +  (let [experiences (atom [])]
   1.358 +    (run-world
   1.359 +     (apply-map 
   1.360 +      worm-world
   1.361 +      (merge
   1.362 +       (worm-world-defaults)
   1.363 +       {:end-frame 700
   1.364 +        :motor-control
   1.365 +        (motor-control-program worm-muscle-labels do-all-the-things)
   1.366 +        :experiences experiences})))
   1.367 +    @experiences))
   1.368 +   #+end_src
   1.369 +   #+end_listing
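         +
         +   Presumably the result of one free-play run is captured once for
         +   later analysis, along these lines:
         +
         +   #+begin_src clojure
         +;; phi-space is a vector of experience maps, indexable by frame number.
         +(def phi-space (generate-phi-space))
         +   #+end_src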
   1.370 +
   1.371 +   Each element of the experience vector exists in the vast space of
   1.372 +   all possible worm-experiences. Most of this vast space is actually
   1.373 +   unreachable due to physical constraints of the worm's body. For
   1.374 +   example, the worm's segments are connected by hinge joints that put
   1.375 +   a practical limit on the worm's degrees of freedom. Also, the worm
   1.376 +   cannot be bent into a circle so that its ends touch without also
   1.377 +   experiencing the sensation of touching itself.
   1.378 +
   1.379 +   As the worm moves around during free play and the vector grows
   1.380 +   larger, the vector begins to define a subspace which is all the
   1.381 +   practical experiences the worm can experience during normal
   1.382 +   operation, which I call \Phi-space, short for physical-space. The
   1.383 +   vector defines a path through \Phi-space. This path has interesting
   1.384 +   properties that all derive from embodiment. The proprioceptive
   1.385 +   components are completely smooth, because in order for the worm to
   1.386 +   move from one position to another, it must pass through the
   1.387 +   intermediate positions. The path invariably forms loops as actions
   1.388 +   are repeated. Finally and most importantly, proprioception actually
   1.389 +   gives very strong inference about the other senses. For example,
   1.390 +   when the worm is flat, you can infer that it is touching the ground
   1.391 +   and that its muscles are not active, because if the muscles were
   1.392 +   active, the worm would be moving and would not be perfectly flat.
   1.393 +   In order to stay flat, the worm has to be touching the ground, or
   1.394 +   it would again be moving out of the flat position due to gravity.
   1.395 +   If the worm is positioned in such a way that it interacts with
   1.396 +   itself, then it is very likely to be feeling the same tactile
   1.397 +   feelings as the last time it was in that position, because it has
   1.398 +   the same body as then. If you observe multiple frames of
   1.399 +   proprioceptive data, then you can become increasingly confident
   1.400 +   about the exact activations of the worm's muscles, because it
   1.401 +   generally takes a unique combination of muscle contractions to
   1.402 +   transform the worm's body along a specific path through \Phi-space.
   1.403 +
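         +   As a small illustrative sketch (not part of the =CORTEX= code),
         +   the smoothness claim could be checked directly: consecutive
         +   experiences along the path differ only slightly in any joint's
         +   bend angle.
         +
         +   #+begin_src clojure
         +(defn proprio-change
         +  "Largest change in joint bend between two proprioceptive frames."
         +  [proprio-a proprio-b]
         +  (apply max
         +         (map (fn [[_ _ bend-a] [_ _ bend-b]]
         +                (Math/abs (- bend-a bend-b)))
         +              proprio-a proprio-b)))
         +
         +(defn smooth-path?
         +  "Does proprioception change gradually along the experience vector?"
         +  [experiences tolerance]
         +  (every? (fn [[a b]]
         +            (< (proprio-change (:proprioception a)
         +                               (:proprioception b))
         +               tolerance))
         +          (partition 2 1 experiences)))
         +   #+end_src
         +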
   1.404 +   There is a simple way of taking \Phi-space and the total ordering
   1.405 +   provided by an experience vector and reliably inferring the rest of
   1.406 +   the senses.
   1.407  
   1.408  ** Empathy is the process of tracing through \Phi-space
   1.409 +
         +   The first step is a fast lookup from proprioceptive data to
         +   previously-experienced points in \Phi-space. Proprioceptive angles
         +   are coarsely binned via their sines and cosines, and each bin maps
         +   to the indices of the experiences that fall into it.
         +
         +   #+caption: Program to convert a proprioceptive frame into a coarse 
         +   #+caption: spatial bin key. The =digits= parameter controls the 
         +   #+caption: resolution of the binning.
         +   #+name: bin
         +   #+begin_listing clojure
         +   #+begin_src clojure
         +(defn bin [digits]
         +  (fn [angles]
         +    (->> angles
         +         (flatten)
         +         (map (juxt #(Math/sin %) #(Math/cos %)))
         +         (flatten)
         +         (mapv #(Math/round (* % (Math/pow 10 (dec digits))))))))
         +   #+end_src
         +   #+end_listing
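         +
         +   For example, binning a single joint with all angles at zero, at
         +   the finest of the three resolutions used below (an illustrative
         +   check, not from the original text):
         +
         +   #+begin_src clojure
         +((bin 3) [[0 0 0]])
         +;; => [0 100 0 100 0 100]
         +   #+end_src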
   1.419 +
         +   =gen-phi-scan= builds one binned lookup table per resolution and
         +   tries them from finest to coarsest, returning the set of
         +   \Phi-space indices whose binned proprioception matches the query.
         +
         +   #+caption: Program to convert proprioceptive data into a lookup 
         +   #+caption: function into \Phi-space, using nearest-neighbor matching 
         +   #+caption: with spatial binning at three resolutions.
         +   #+name: gen-phi-scan
         +   #+begin_listing clojure
         +   #+begin_src clojure
         +(defn gen-phi-scan
         +  "Nearest-neighbors with spatial binning. Only returns a result if
         +   the proprioceptive data is within 10% of a previously recorded
         +   result in all dimensions."
         +  [phi-space]
         +  (let [bin-keys (map bin [3 2 1])
         +        bin-maps
         +        (map (fn [bin-key]
         +               (group-by
         +                (comp bin-key :proprioception phi-space)
         +                (range (count phi-space)))) bin-keys)
         +        lookups (map (fn [bin-key bin-map]
         +                      (fn [proprio] (bin-map (bin-key proprio))))
         +                    bin-keys bin-maps)]
         +    (fn lookup [proprio-data]
         +      (set (some #(% proprio-data) lookups)))))
         +   #+end_src
         +   #+end_listing
   1.437 +
   1.438 +
         +   A single proprioceptive frame can match many points in \Phi-space.
         +   To disambiguate, =longest-thread= chains the candidate index sets
         +   of successive frames into the longest run of consecutive indices,
         +   i.e. the longest stretch of previous experience consistent with
         +   the observations.
         +
         +   #+caption: Program to find the longest thread of consecutive 
         +   #+caption: \Phi-space indices through a sequence of candidate 
         +   #+caption: index sets, ordered from most recent to least recent.
         +   #+name: longest-thread
         +   #+begin_listing clojure
         +   #+begin_src clojure
         +(defn longest-thread
         +  "Find the longest thread from phi-index-sets. The index sets should
         +   be ordered from most recent to least recent."
         +  [phi-index-sets]
         +  (loop [result '()
         +         [thread-bases & remaining :as phi-index-sets] phi-index-sets]
         +    (if (empty? phi-index-sets)
         +      (vec result)
         +      (let [threads
         +            (for [thread-base thread-bases]
         +              (loop [thread (list thread-base)
         +                     remaining remaining]
         +                (let [next-index (dec (first thread))]
         +                  (cond (empty? remaining) thread
         +                        (contains? (first remaining) next-index)
         +                        (recur
         +                         (cons next-index thread) (rest remaining))
         +                        :else thread))))
         +            longest-thread
         +            (reduce (fn [thread-a thread-b]
         +                      (if (> (count thread-a) (count thread-b))
         +                        thread-a thread-b))
         +                    '(nil)
         +                    threads)]
         +        (recur (concat longest-thread result)
         +               (drop (count longest-thread) phi-index-sets))))))
         +   #+end_src
         +   #+end_listing
   1.465 +
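         +   As an illustrative check (indices chosen arbitrarily): each set
         +   below holds the \Phi-space indices whose binned proprioception
         +   matched one observed frame, most recent frame first. The longest
         +   chain of consecutive indices wins.
         +
         +   #+begin_src clojure
         +(longest-thread [#{7 9} #{6} #{5 2} #{4}])
         +;; => [4 5 6 7]
         +   #+end_src
         +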
   1.466 +   There is one final piece, which is to replace missing sensory
   1.467 +   data with a best-guess estimate. While I could fill in missing
   1.468 +   data by using a gradient over the closest known sensory data
   1.469 +   points, averages can be misleading. It is certainly possible to
   1.470 +   create an impossible sensory state by averaging two possible
   1.471 +   sensory states. Therefore, I simply replicate the most recent
   1.472 +   sensory experience to fill in the gaps.
   1.473 +
   1.474 +   #+caption: Fill in blanks in sensory experience by replicating the most 
   1.475 +   #+caption: recent experience.
   1.476 +   #+name: infer-nils
   1.477 +   #+begin_listing clojure
   1.478 +   #+begin_src clojure
   1.479 +(defn infer-nils
   1.480 +  "Replace nils with the next available non-nil element in the
   1.481 +   sequence, or barring that, 0."
   1.482 +  [s]
   1.483 +  (loop [i (dec (count s))
   1.484 +         v (transient s)]
   1.485 +    (if (zero? i) (persistent! v)
   1.486 +        (if-let [cur (v i)]
   1.487 +          (if (get v (dec i) 0)
   1.488 +            (recur (dec i) v)
   1.489 +            (recur (dec i) (assoc! v (dec i) cur)))
   1.490 +          (recur i (assoc! v i 0))))))
   1.491 +   #+end_src
   1.492 +   #+end_listing
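         +
         +   For example, a quick check of the intended behaviour:
         +
         +   #+begin_src clojure
         +(infer-nils [1 nil 1 1]) ;; => [1 1 1 1]
         +   #+end_src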
   1.493 +
   1.494 +
   1.495 +
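         +   Putting the pieces together: a \Phi-space scan per observed
         +   proprioceptive frame, the longest thread through the resulting
         +   index sets, and nil-inference over that thread yield a sequence of
         +   imagined full experiences. A minimal sketch of the composition
         +   (=infer-experiences= is a hypothetical name, not the final
         +   =EMPATH= implementation):
         +
         +   #+begin_src clojure
         +(defn infer-experiences
         +  "Given phi-space and a most-recent-first sequence of observed
         +   proprioceptive frames, imagine the corresponding full experiences."
         +  [phi-space observed-proprio]
         +  (let [phi-scan (gen-phi-scan phi-space)
         +        thread   (longest-thread (map phi-scan observed-proprio))]
         +    (mapv phi-space (infer-nils thread))))
         +   #+end_src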
   1.496 +
   1.497    
   1.498  ** Efficient action recognition with =EMPATH=
   1.499  
   1.500 +** Digression: bootstrapping touch using free exploration
   1.501 +
   1.502  * Contributions
   1.503 -  - Built =CORTEX=, a comprehensive platform for embodied AI
   1.504 -    experiments. Has many new features lacking in other systems, such
   1.505 -    as sound. Easy to model/create new creatures.
   1.506 -  - created a novel concept for action recognition by using artificial
   1.507 -    imagination. 
   1.508 -
   1.509 -In the second half of the thesis I develop a computational model of
   1.510 -empathy, using =CORTEX= as a base. Empathy in this context is the
   1.511 -ability to observe another creature and infer what sorts of sensations
   1.512 -that creature is feeling. My empathy algorithm involves multiple
   1.513 -phases. First is free-play, where the creature moves around and gains
   1.514 -sensory experience. From this experience I construct a representation
   1.515 -of the creature's sensory state space, which I call \Phi-space. Using
   1.516 -\Phi-space, I construct an efficient function for enriching the
   1.517 -limited data that comes from observing another creature with a full
   1.518 -compliment of imagined sensory data based on previous experience. I
   1.519 -can then use the imagined sensory data to recognize what the observed
   1.520 -creature is doing and feeling, using straightforward embodied action
   1.521 -predicates. This is all demonstrated with using a simple worm-like
   1.522 -creature, and recognizing worm-actions based on limited data.
   1.523 -
   1.524 -Embodied representation using multiple senses such as touch,
   1.525 -proprioception, and muscle tension turns out be be exceedingly
   1.526 -efficient at describing body-centered actions. It is the ``right
   1.527 -language for the job''. For example, it takes only around 5 lines of
   1.528 -LISP code to describe the action of ``curling'' using embodied
   1.529 -primitives. It takes about 8 lines to describe the seemingly
   1.530 -complicated action of wiggling.
   1.531 -
   1.532 -
   1.533 -
   1.534 -* COMMENT names for cortex
   1.535 - - bioland
   1.536  
   1.537  
   1.538