diff thesis/cortex.org @ 451:0a4362d1f138

finishing up chapter 3.
author Robert McIntyre <rlm@mit.edu>
date Wed, 26 Mar 2014 20:38:17 -0400
parents 432f2c4646cb
children f339e3d5cc8c
line wrap: on
line diff
     1.1 --- a/thesis/cortex.org	Wed Mar 26 03:18:57 2014 -0400
     1.2 +++ b/thesis/cortex.org	Wed Mar 26 20:38:17 2014 -0400
     1.3 @@ -3,9 +3,9 @@
     1.4  #+email: rlm@mit.edu
     1.5  #+description: Using embodied AI to facilitate Artificial Imagination.
     1.6  #+keywords: AI, clojure, embodiment
     1.7 +#+LaTeX_CLASS_OPTIONS: [nofloat]
     1.8  
     1.9 -
    1.10 -* Empathy and Embodiment as problem solving strategies
    1.11 +* Empathy and Embodiment as problem solving strategies
    1.12    
    1.13    By the end of this thesis, you will have seen a novel approach to
    1.14    interpreting video using embodiment and empathy. You will have also
    1.15 @@ -297,35 +297,36 @@
    1.16       in the real world (such as telekenisis or a semantic sense)
    1.17     - imagination using subworlds
    1.18  
    1.19 -   During one test with =CORTEX=, I created 3,000 entities each with
    1.20 +   During one test with =CORTEX=, I created 3,000 creatures each with
    1.21     their own independent senses and ran them all at only 1/80 real
    1.22     time. In another test, I created a detailed model of my own hand,
    1.23     equipped with a realistic distribution of touch (more sensitive at
    1.24     the fingertips), as well as eyes and ears, and it ran at around 1/4
    1.25 -   real time. 
    1.26 +   real time.
    1.27  
    1.28 -   #+BEGIN_LaTeX
    1.29 +#+BEGIN_LaTeX
    1.30     \begin{sidewaysfigure}
    1.31     \includegraphics[width=9.5in]{images/full-hand.png}
    1.32 -   \caption{Here is the worm from above modeled in Blender, 
    1.33 -   a free 3D-modeling program. Senses and joints are described
    1.34 -   using special nodes in Blender. The senses are displayed on 
    1.35 -   the right, and the simulation is displayed on the left. Notice
    1.36 -   that the hand is curling its fingers, that it can see its own 
    1.37 -   finger from the eye in its palm, and thta it can feel its own 
    1.38 -   thumb touching its palm.}
    1.39 +   \caption{
    1.40 +   I modeled my own right hand in Blender and rigged it with all the
    1.41 +   senses that {\tt CORTEX} supports. My simulated hand has a
    1.42 +   biologically inspired distribution of touch sensors. The senses are
    1.43 +   displayed on the right, and the simulation is displayed on the
    1.44 +   left. Notice that my hand is curling its fingers, that it can see
    1.45 +   its own finger from the eye in its palm, and that it can feel its
    1.46 +   own thumb touching its palm.}
    1.47     \end{sidewaysfigure}
    1.48 -   #+END_LaTeX
    1.49 +#+END_LaTeX
    1.50  
    1.51  ** Contributions
    1.52  
    1.53 -   I built =CORTEX=, a comprehensive platform for embodied AI
    1.54 -   experiments. =CORTEX= many new features lacking in other systems,
    1.55 -   such as sound. It is easy to create new creatures using Blender, a
    1.56 -   free 3D modeling program.
    1.57 +   - I built =CORTEX=, a comprehensive platform for embodied AI
    1.58 +     experiments. =CORTEX= supports many features lacking in other
    1.59 +     systems, such as proper simulation of hearing. It is easy to create
    1.60 +     new =CORTEX= creatures using Blender, a free 3D modeling program.
    1.61  
    1.62 -   I built =EMPATH=, which uses =CORTEX= to identify the actions of a
    1.63 -   worm-like creature using a computational model of empathy.
    1.64 +   - I built =EMPATH=, which uses =CORTEX= to identify the actions of
    1.65 +     a worm-like creature using a computational model of empathy.
    1.66     
    1.67  * Building =CORTEX=
    1.68  
    1.69 @@ -372,7 +373,7 @@
    1.70    #+caption: It is composed of 5 segments. Each segment has a 
    1.71    #+caption: pair of extensor and flexor muscles. Each of the 
    1.72    #+caption: worm's four joints is a hinge joint which allows 
    1.73 -  #+caption: 30 degrees of rotation to either side. Each segment
    1.74 +  #+caption: about 30 degrees of rotation to either side. Each segment
    1.75    #+caption: of the worm is touch-capable and has a uniform 
    1.76    #+caption: distribution of touch sensors on each of its faces.
    1.77    #+caption: Each joint has a proprioceptive sense to detect 
    1.78 @@ -418,7 +419,7 @@
    1.79     efficient at describing body-centered actions. It is the ``right
    1.80     language for the job''. For example, it takes only around 5 lines
    1.81     of LISP code to describe the action of ``curling'' using embodied
    1.82 -   primitives. It takes about 8 lines to describe the seemingly
    1.83 +   primitives. It takes about 10 lines to describe the seemingly
    1.84     complicated action of wiggling.
    1.85  
    1.86     The following action predicates each take a stream of sensory
    1.87 @@ -578,19 +579,20 @@
    1.88  
    1.89     #+caption: Using =debug-experience=, the body-centered predicates
    1.90     #+caption: work together to classify the behaviour of the worm. 
    1.91 -   #+caption: while under manual motor control.
    1.92 +   #+caption: The predicates are operating with access to the worm's
    1.93 +   #+caption: full sensory data.
    1.94     #+name: basic-worm-view
    1.95     #+ATTR_LaTeX: :width 10cm
    1.96     [[./images/worm-identify-init.png]]
    1.97  
    1.98     These action predicates satisfy the recognition requirement of an
    1.99 -   empathic recognition system. There is a lot of power in the
   1.100 -   simplicity of the action predicates. They describe their actions
   1.101 -   without getting confused in visual details of the worm. Each one is
   1.102 -   frame independent, but more than that, they are each indepent of
   1.103 +   empathic recognition system. There is power in the simplicity of
   1.104 +   the action predicates. They describe their actions without getting
   1.105 +   confused in visual details of the worm. Each one is frame
   1.106 +   independent, but more than that, they are each independent of
   1.107     irrelevant visual details of the worm and the environment. They
   1.108     will work regardless of whether the worm is a different color or
   1.109 -   hevaily textured, or of the environment has strange lighting.
   1.110 +   heavily textured, or if the environment has strange lighting.
   1.111  
   1.112     The trick now is to make the action predicates work even when the
   1.113     sensory data on which they depend is absent. If I can do that, then
   1.114 @@ -601,61 +603,42 @@
   1.115     As a first step towards building empathy, I need to gather all of
   1.116     the worm's experiences during free play. I use a simple vector to
   1.117     store all the experiences. 
   1.118 -   
   1.119 -   #+caption: Program to gather the worm's experiences into a vector for 
   1.120 -   #+caption: further processing. The =motor-control-program= line uses
   1.121 -   #+caption: a motor control script that causes the worm to execute a series
   1.122 -   #+caption: of ``exercices'' that include all the action predicates.
   1.123 -   #+name: generate-phi-space
   1.124 -   #+begin_listing clojure
   1.125 -   #+begin_src clojure
   1.126 -(defn generate-phi-space []
   1.127 -  (let [experiences (atom [])]
   1.128 -    (run-world
   1.129 -     (apply-map 
   1.130 -      worm-world
   1.131 -      (merge
   1.132 -       (worm-world-defaults)
   1.133 -       {:end-frame 700
   1.134 -        :motor-control
   1.135 -        (motor-control-program worm-muscle-labels do-all-the-things)
   1.136 -        :experiences experiences})))
   1.137 -    @experiences))
   1.138 -   #+end_src
   1.139 -   #+end_listing
   1.140  
   1.141     Each element of the experience vector exists in the vast space of
   1.142     all possible worm-experiences. Most of this vast space is actually
   1.143     unreachable due to physical constraints of the worm's body. For
   1.144     example, the worm's segments are connected by hinge joints that put
   1.145 -   a practical limit on the worm's degrees of freedom. Also, the worm
   1.146 -   can not be bent into a circle so that its ends are touching and at
   1.147 -   the same time not also experience the sensation of touching itself.
   1.148 +   a practical limit on the worm's range of motion without limiting
   1.149 +   its degrees of freedom. Some groupings of senses are impossible;
   1.150 +   the worm can not be bent into a circle so that its ends are
   1.151 +   touching and at the same time not also experience the sensation of
   1.152 +   touching itself.
   1.153  
   1.154 -   As the worm moves around during free play and the vector grows
   1.155 -   larger, the vector begins to define a subspace which is all the
   1.156 -   practical experiences the worm can experience during normal
   1.157 -   operation, which I call \Phi-space, short for physical-space. The
   1.158 -   vector defines a path through \Phi-space. This path has interesting
   1.159 -   properties that all derive from embodiment. The proprioceptive
   1.160 -   components are completely smooth, because in order for the worm to
   1.161 -   move from one position to another, it must pass through the
   1.162 -   intermediate positions. The path invariably forms loops as actions
   1.163 -   are repeated. Finally and most importantly, proprioception actually
   1.164 -   gives very strong inference about the other senses. For example,
   1.165 -   when the worm is flat, you can infer that it is touching the ground
   1.166 -   and that its muscles are not active, because if the muscles were
   1.167 -   active, the worm would be moving and would not be perfectly flat.
   1.168 -   In order to stay flat, the worm has to be touching the ground, or
   1.169 -   it would again be moving out of the flat position due to gravity.
   1.170 -   If the worm is positioned in such a way that it interacts with
   1.171 -   itself, then it is very likely to be feeling the same tactile
   1.172 -   feelings as the last time it was in that position, because it has
   1.173 -   the same body as then. If you observe multiple frames of
   1.174 -   proprioceptive data, then you can become increasingly confident
   1.175 -   about the exact activations of the worm's muscles, because it
   1.176 -   generally takes a unique combination of muscle contractions to
   1.177 -   transform the worm's body along a specific path through \Phi-space.
   1.178 +   As the worm moves around during free play and its experience vector
   1.179 +   grows larger, the vector begins to define a subspace which is all
   1.180 +   the sensations the worm can practically experience during normal
   1.181 +   operation. I call this subspace \Phi-space, short for
   1.182 +   physical-space. The experience vector defines a path through
   1.183 +   \Phi-space. This path has interesting properties that all derive
   1.184 +   from physical embodiment. The proprioceptive components are
   1.185 +   completely smooth, because in order for the worm to move from one
   1.186 +   position to another, it must pass through the intermediate
   1.187 +   positions. The path invariably forms loops as actions are repeated.
   1.188 +   Finally and most importantly, proprioception actually gives very
   1.189 +   strong inference about the other senses. For example, when the worm
   1.190 +   is flat, you can infer that it is touching the ground and that its
   1.191 +   muscles are not active, because if the muscles were active, the
   1.192 +   worm would be moving and would not be perfectly flat. In order to
   1.193 +   stay flat, the worm has to be touching the ground, or it would
   1.194 +   again be moving out of the flat position due to gravity. If the
   1.195 +   worm is positioned in such a way that it interacts with itself,
   1.196 +   then it is very likely to be feeling the same tactile feelings as
   1.197 +   the last time it was in that position, because it has the same body
   1.198 +   as then. If you observe multiple frames of proprioceptive data,
   1.199 +   then you can become increasingly confident about the exact
   1.200 +   activations of the worm's muscles, because it generally takes a
   1.201 +   unique combination of muscle contractions to transform the worm's
   1.202 +   body along a specific path through \Phi-space.
   1.203  
   1.204     There is a simple way of taking \Phi-space and the total ordering
   1.205     provided by an experience vector and reliably inferring the rest of
   1.206 @@ -664,34 +647,38 @@
   1.207  ** Empathy is the process of tracing though \Phi-space 
   1.208  
   1.209     Here is the core of a basic empathy algorithm, starting with an
   1.210 -   experience vector: First, group the experiences into tiered
   1.211 -   proprioceptive bins. I use powers of 10 and 3 bins, and the
   1.212 -   smallest bin has and approximate size of 0.001 radians in all
   1.213 -   proprioceptive dimensions.
   1.214 +   experience vector:
   1.215 +
   1.216 +   First, group the experiences into tiered proprioceptive bins. I use
   1.217 +   three tiers of bins whose sizes are powers of 10; the smallest bin
   1.218 +   has an approximate size of 0.001 radians in all proprioceptive dimensions.
   1.219     
   1.220     Then, given a sequence of proprioceptive input, generate a set of
   1.221 -   matching experience records for each input. 
   1.222 +   matching experience records for each input, using the tiered
   1.223 +   proprioceptive bins. 
   1.224  
   1.225     Finally, to infer sensory data, select the longest consecutive chain
   1.226 -   of experiences as determined by the indexes into the experience
   1.227 -   vector. 
   1.228 +   of experiences. Consecutive experiences are those that appear next
   1.229 +   to each other in the experience vector.
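         +
         +   To make the binning step concrete, here is a minimal sketch of the
         +   idea, assuming proprioception arrives as a flat sequence of joint
         +   angles. It is not the actual =CORTEX= code (the real implementation
         +   is =gen-phi-scan=, listed below), and the names =bin-key-sketch=
         +   and =build-bins-sketch= are illustrative only.
         +
         +   #+begin_src clojure
         +;; Sketch only -- gen-phi-scan below is the real implementation.
         +;; Three tiers of bin sizes, powers of 10, finest 0.001 radians.
         +(def bin-sizes [0.001 0.01 0.1])
         +
         +(defn bin-key-sketch
         +  "Quantize each joint angle to the nearest multiple of bin-size."
         +  [bin-size proprioception]
         +  (mapv #(Math/round (/ (double %) bin-size)) proprioception))
         +
         +(defn build-bins-sketch
         +  "Map each binned proprioceptive signature to the set of indexes of
         +   the experiences in the experience vector that fall in that bin."
         +  [bin-size experiences]
         +  (reduce
         +   (fn [bins [index experience]]
         +     (let [signature
         +           (bin-key-sketch bin-size (:proprioception experience))]
         +       (assoc bins signature (conj (get bins signature #{}) index))))
         +   {} (map-indexed vector experiences)))
         +   #+end_src
         +
         +   In this sketch, one such map is built for each entry in
         +   =bin-sizes=, and looking up a new proprioceptive frame means
         +   computing its signature in each tier and taking the matching index
         +   set from the finest tier that is not empty.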
   1.230  
   1.231     This algorithm has three advantages: 
   1.232  
   1.233     1. It's simple
   1.234  
   1.235 -   3. It's very fast -- both tracing through possibilites and
   1.236 -      retrieving possible interpretations take essentially constant
   1.237 -      time. 
   1.238 +   2. It's very fast -- retrieving possible interpretations takes
   1.239 +      constant time. Tracing through chains of interpretations takes
   1.240 +      time proportional to the average number of experiences in a
   1.241 +      proprioceptive bin. Redundant experiences in \Phi-space can be
   1.242 +      merged to save computation.
   1.243  
   1.244     3. It protects from wrong interpretations of transient ambiguous
   1.245 -      proprioceptive data : for example, if the worm is flat for just
   1.246 +      proprioceptive data. For example, if the worm is flat for just
   1.247        an instant, this flatness will not be interpreted as implying
   1.248        that the worm has its muscles relaxed, since the flatness is
   1.249        part of a longer chain which includes a distinct pattern of
   1.250 -      muscle activation. A memoryless statistical model such as a
   1.251 -      markov model that operates on individual frames may very well
   1.252 -      make this mistake.
   1.253 +      muscle activation. Markov chains or other memoryless statistical
   1.254 +      models that operate on individual frames may very well make this
   1.255 +      mistake.
   1.256  
   1.257     #+caption: Program to convert an experience vector into a 
   1.258     #+caption: proprioceptively binned lookup function.
   1.259 @@ -725,6 +712,30 @@
   1.260     #+end_src
   1.261     #+end_listing
   1.262  
   1.263 +   #+caption: =longest-thread= finds the longest path of consecutive 
   1.264 +   #+caption: experiences to explain proprioceptive worm data.
   1.265 +   #+name: phi-space-history-scan
   1.266 +   #+ATTR_LaTeX: :width 10cm
   1.267 +   [[./images/aurellem-gray.png]]
   1.268 +
   1.269 +   =longest-thread= infers sensory data by stitching together pieces
   1.270 +   from previous experience. It prefers longer chains of previous
   1.271 +   experience to shorter ones. For example, during training the worm
   1.272 +   might rest on the ground for one second before it performs its
   1.273 +   exercises. If during recognition the worm rests on the ground for
   1.274 +   five seconds, =longest-thread= will accommodate this five-second
   1.275 +   rest period by looping the one-second rest chain five times.
   1.276 +
   1.277 +   =longest-thread= takes time proportional to the average number of
   1.278 +   entries in a proprioceptive bin, because for each element in the
   1.279 +   starting bin it performs a series of set lookups in the preceding
   1.280 +   bins. If the total history is limited, then this is only a constant
   1.281 +   multiple of the number of entries in the starting bin. This
   1.282 +   analysis also applies even if the action requires multiple longest
   1.283 +   chains -- it's still the average number of entries in a
   1.284 +   proprioceptive bin times the desired chain length. Because
   1.285 +   =longest-thread= is so efficient and simple, I can interpret
   1.286 +   worm-actions in real time.
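         +
         +   The chain-tracing idea can be sketched as follows. This is an
         +   illustration only, not the actual =longest-thread= code: it assumes
         +   the matches for each frame arrive as sets of experience-vector
         +   indexes, most recent frame first, and it simply extends a chain
         +   backwards from every candidate index for the newest frame.
         +
         +   #+begin_src clojure
         +;; Sketch only -- not the actual longest-thread implementation.
         +;; phi-index-sets: a sequence of sets of experience-vector indexes,
         +;; most recent frame first.
         +(defn longest-chain-sketch [phi-index-sets]
         +  (let [[newest & older] phi-index-sets
         +        chain-from
         +        (fn [start]
         +          ;; extend the chain backwards while the previous frame's
         +          ;; matches contain the immediately preceding index
         +          (loop [index start, remaining older, chain [start]]
         +            (if (and (seq remaining)
         +                     (contains? (first remaining) (dec index)))
         +              (recur (dec index) (rest remaining)
         +                     (conj chain (dec index)))
         +              chain)))]
         +    (if (seq newest)
         +      (apply max-key count (map chain-from newest))
         +      [])))
         +   #+end_src
         +
         +   The looping behaviour described above, where a short rest chain is
         +   reused to cover a longer rest, is a refinement of the real
         +   implementation that this sketch leaves out.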
   1.287  
   1.288     #+caption: Program to calculate empathy by tracing though \Phi-space
   1.289     #+caption: and finding the longest (ie. most coherent) interpretation
   1.290 @@ -761,14 +772,13 @@
   1.291     #+end_src
   1.292     #+end_listing
   1.293  
   1.294 -
   1.295 -There is one final piece, which is to replace missing sensory data
   1.296 -with a best-guess estimate. While I could fill in missing data by
   1.297 -using a gradient over the closest known sensory data points, averages
   1.298 -can be misleading. It is certainly possible to create an impossible
   1.299 -sensory state by averaging two possible sensory states. Therefore, I
   1.300 -simply replicate the most recent sensory experience to fill in the
   1.301 -gaps. 
   1.302 +   There is one final piece, which is to replace missing sensory data
   1.303 +   with a best-guess estimate. While I could fill in missing data by
   1.304 +   using a gradient over the closest known sensory data points,
   1.305 +   averages can be misleading. It is certainly possible to create an
   1.306 +   impossible sensory state by averaging two possible sensory states.
   1.307 +   Therefore, I simply replicate the most recent sensory experience to
   1.308 +   fill in the gaps.
   1.309  
   1.310     #+caption: Fill in blanks in sensory experience by replicating the most 
   1.311     #+caption: recent experience.
   1.312 @@ -789,14 +799,158 @@
   1.313            (recur i (assoc! v i 0))))))
   1.314     #+end_src
   1.315     #+end_listing
   1.316 -
   1.317    
   1.318  ** Efficient action recognition with =EMPATH=
   1.319 +   
   1.320 +   To use =EMPATH= with the worm, I first need to gather a set of
   1.321 +   experiences from the worm that includes the actions I want to
   1.322 +   recognize. The =generate-phi-space= program (listing
   1.323 +   \ref{generate-phi-space}) runs the worm through a series of
   1.324 +   exercises and gathers those experiences into a vector. The
   1.325 +   =do-all-the-things= program is a routine expressed in a simple
   1.326 +   muscle contraction script language for automated worm control.
   1.327  
   1.328 -   In my exploration with the worm, I can generally infer actions from
   1.329 -   proprioceptive data exactly as well as when I have the complete
   1.330 -   sensory data. To reach this level, I have to train the worm with
   1.331 -   verious exercices for about 1 minute.
   1.332 +   #+caption: Program to gather the worm's experiences into a vector for 
   1.333 +   #+caption: further processing. The =motor-control-program= line uses
   1.334 +   #+caption: a motor control script that causes the worm to execute a series
   1.335 +   #+caption: of ``exercises'' that include all the action predicates.
   1.336 +   #+name: generate-phi-space
   1.337 +   #+attr_latex: [!H]
   1.338 +   #+begin_listing clojure 
   1.339 +   #+begin_src clojure
   1.340 +(def do-all-the-things 
   1.341 +  (concat
   1.342 +   curl-script
   1.343 +   [[300 :d-ex 40]
   1.344 +    [320 :d-ex 0]]
   1.345 +   (shift-script 280 (take 16 wiggle-script))))
   1.346 +
   1.347 +(defn generate-phi-space []
   1.348 +  (let [experiences (atom [])]
   1.349 +    (run-world
   1.350 +     (apply-map 
   1.351 +      worm-world
   1.352 +      (merge
   1.353 +       (worm-world-defaults)
   1.354 +       {:end-frame 700
   1.355 +        :motor-control
   1.356 +        (motor-control-program worm-muscle-labels do-all-the-things)
   1.357 +        :experiences experiences})))
   1.358 +    @experiences))
   1.359 +   #+end_src
   1.360 +   #+end_listing
   1.361 +
   1.362 +   #+caption: Use longest thread and a phi-space generated from a short
   1.363 +   #+caption: exercise routine to interpret actions during free play.
   1.364 +   #+name: empathy-debug
   1.365 +   #+begin_listing clojure
   1.366 +   #+begin_src clojure
   1.367 +(defn init []
   1.368 +  (def phi-space (generate-phi-space))
   1.369 +  (def phi-scan (gen-phi-scan phi-space)))
   1.370 +
   1.371 +(defn empathy-demonstration []
   1.372 +  (let [proprio (atom ())]
   1.373 +    (fn
   1.374 +      [experiences text]
   1.375 +      (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
   1.376 +        (swap! proprio (partial cons phi-indices))
   1.377 +        (let [exp-thread (longest-thread (take 300 @proprio))
   1.378 +              empathy (mapv phi-space (infer-nils exp-thread))]
   1.379 +          (println-repl (vector:last-n exp-thread 22))
   1.380 +          (cond
   1.381 +           (grand-circle? empathy) (.setText text "Grand Circle")
   1.382 +           (curled? empathy)       (.setText text "Curled")
   1.383 +           (wiggling? empathy)     (.setText text "Wiggling")
   1.384 +           (resting? empathy)      (.setText text "Resting")
   1.385 +           :else                       (.setText text "Unknown")))))))
   1.386 +
   1.387 +(defn empathy-experiment [record]
   1.388 +  (.start (worm-world :experience-watch (debug-experience-phi)
   1.389 +                      :record record :worm worm*)))
   1.390 +   #+end_src
   1.391 +   #+end_listing
   1.392 +   
   1.393 +   The result of running =empathy-experiment= is that the system is
   1.394 +   generally able to interpret worm actions using the action predicates
   1.395 +   on simulated sensory data just as well as with actual data. Figure
   1.396 +   \ref{empathy-debug-image} was generated using =empathy-experiment=:
   1.397 +
   1.398 +  #+caption: From only proprioceptive data, =EMPATH= was able to infer 
   1.399 +  #+caption: the complete sensory experience and classify four poses
   1.400 +  #+caption: (The last panel shows a composite image of \emph{wiggling},
   1.401 +  #+caption: a dynamic pose.)
   1.402 +  #+name: empathy-debug-image
   1.403 +  #+ATTR_LaTeX: :width 10cm :placement [H]
   1.404 +  [[./images/empathy-1.png]]
   1.405 +
   1.406 +  One way to measure the performance of =EMPATH= is to check how well
   1.407 +  the imagined sense experience triggers the same action predicates as
   1.408 +  the real sensory experience.
   1.409 +  
   1.410 +   #+caption: Determine how closely empathy approximates actual 
   1.411 +   #+caption: sensory data.
   1.412 +   #+name: test-empathy-accuracy
   1.413 +   #+begin_listing clojure
   1.414 +   #+begin_src clojure
   1.415 +(def worm-action-label
   1.416 +  (juxt grand-circle? curled? wiggling?))
   1.417 +
   1.418 +(defn compare-empathy-with-baseline [matches]
   1.419 +  (let [proprio (atom ())]
   1.420 +    (fn
   1.421 +      [experiences text]
   1.422 +      (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
   1.423 +        (swap! proprio (partial cons phi-indices))
   1.424 +        (let [exp-thread (longest-thread (take 300 @proprio))
   1.425 +              empathy (mapv phi-space (infer-nils exp-thread))
   1.426 +              experience-matches-empathy
   1.427 +              (= (worm-action-label experiences)
   1.428 +                 (worm-action-label empathy))]
   1.429 +          (println-repl experience-matches-empathy)
   1.430 +          (swap! matches #(conj % experience-matches-empathy)))))))
   1.431 +              
   1.432 +(defn accuracy [v]
   1.433 +  (float (/ (count (filter true? v)) (count v))))
   1.434 +
   1.435 +(defn test-empathy-accuracy []
   1.436 +  (let [res (atom [])]
   1.437 +    (run-world
   1.438 +     (worm-world :experience-watch
   1.439 +                 (compare-empathy-with-baseline res)
   1.440 +                 :worm worm*))
   1.441 +    (accuracy @res)))
   1.442 +   #+end_src
   1.443 +   #+end_listing
   1.444 +
   1.445 +  Running =test-empathy-accuracy= using the very short exercise
   1.446 +  program defined in listing \ref{generate-phi-space}, and then doing
   1.447 +  a similar pattern of activity manually yields an accuracy of around
   1.448 +  73%. This is based on very limited worm experience. By training the
   1.449 +  worm for longer, the accuracy dramatically improves.
   1.450 +
   1.451 +   #+caption: Program to generate \Phi-space using manual training.
   1.452 +   #+name: manual-phi-space
   1.453 +   #+begin_listing clojure
   1.454 +   #+begin_src clojure
   1.455 +(defn init-interactive []
   1.456 +  (def phi-space
   1.457 +    (let [experiences (atom [])]
   1.458 +      (run-world
   1.459 +       (apply-map 
   1.460 +        worm-world
   1.461 +        (merge
   1.462 +         (worm-world-defaults)
   1.463 +         {:experiences experiences})))
   1.464 +      @experiences))
   1.465 +  (def phi-scan (gen-phi-scan phi-space)))
   1.466 +   #+end_src
   1.467 +   #+end_listing
   1.468 +
   1.469 +  After about 1 minute of manual training, I was able to achieve 95%
   1.470 +  accuracy on manual testing of the worm using =init-interactive= and
   1.471 +  =test-empathy-accuracy=. The ability of the system to infer sensory
   1.472 +  states is truly impressive.
   1.473  
   1.474  ** Digression: bootstrapping touch using free exploration
   1.475