diff thesis/cortex.org @ 443:d3c5f9b70574

working on thesis render.
author Robert McIntyre <rlm@mit.edu>
date Tue, 25 Mar 2014 00:20:01 -0400
parents eaf8c591372b
children 47cfbe84f00e
     1.1 --- a/thesis/cortex.org	Mon Mar 24 22:43:06 2014 -0400
     1.2 +++ b/thesis/cortex.org	Tue Mar 25 00:20:01 2014 -0400
     1.3 @@ -142,7 +142,7 @@
     1.4        experience. This will include relevant muscle contractions, a
     1.5        close-up view of the stream from the cat's perspective, and most
     1.6        importantly, the imagined feeling of water entering the
     1.7 -      mouth. The imagined sensory experience can come from both a
     1.8 +      mouth. The imagined sensory experience can come from a
     1.9        simulation of the event, but can also be pattern-matched from
    1.10        previous, similar embodied experience.
    1.11  
    1.12 @@ -162,8 +162,8 @@
     1.13         sensory experience associated with that particular proprioceptive
    1.14         feeling.
    1.15  
    1.16 -    4. Retrieve the feeling of your bottom resting on a surface and
    1.17 -       your leg muscles relaxed.
    1.18 +    4. Retrieve the feeling of your bottom resting on a surface, your
    1.19 +       knees bent, and your leg muscles relaxed.
    1.20  
    1.21      5. This sensory information is consistent with the =sitting?=
    1.22         sensory predicate, so you (and the entity in the image) must be
    1.23 @@ -181,23 +181,9 @@
    1.24     #+caption: The worm performs many actions during free play such as 
    1.25     #+caption: curling, wiggling, and resting.
    1.26     #+name: worm-intro
    1.27 -   #+ATTR_LaTeX: :width 10cm
    1.28 -   [[./images/wall-push.png]]
    1.29 +   #+ATTR_LaTeX: :width 13cm
    1.30 +   [[./images/worm-free-play.png]]
    1.31  
    1.32 -   #+caption: This sensory predicate detects when the worm is resting on the 
    1.33 -   #+caption: ground.
    1.34 -   #+name: resting-intro
    1.35 -   #+begin_listing clojure
    1.36 -   #+begin_src clojure
    1.37 -(defn resting?
    1.38 -  "Is the worm resting on the ground?"
    1.39 -  [experiences]
    1.40 -  (every?
    1.41 -   (fn [touch-data]
    1.42 -     (< 0.9 (contact worm-segment-bottom touch-data)))
    1.43 -   (:touch (peek experiences))))
    1.44 -   #+end_src
    1.45 -   #+end_listing
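The =resting?= listing removed above is compact enough to restate as a short Python sketch. The `contact` helper and the experience-frame layout (a dict with a `"touch"` key holding per-segment sensor readings) are assumptions inferred from the Clojure code, not part of the original thesis:

```python
# Hedged sketch of the removed resting? predicate: the worm counts as
# "resting" when every bottom-segment's contact measure exceeds 0.9 in
# the most recent experience frame.

def contact(touch_data):
    """Fraction of sensors in one segment reporting contact (assumed helper)."""
    return sum(touch_data) / len(touch_data)

def resting(experiences):
    """Is the worm resting on the ground in its latest experience frame?"""
    latest_touch = experiences[-1]["touch"]  # like (:touch (peek experiences))
    return all(contact(segment) > 0.9 for segment in latest_touch)
```

The 0.9 threshold comes straight from the deleted listing; everything else about the data shapes is a guess at the surrounding API.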
    1.46  
     1.47     #+caption: Body-centered actions are best expressed in a body-centered 
    1.48     #+caption: language. This code detects when the worm has curled into a 
    1.49 @@ -218,30 +204,6 @@
    1.50     #+end_src
    1.51     #+end_listing
    1.52  
    1.53 -   #+caption: Even complicated actions such as ``wiggling'' are fairly simple
    1.54 -   #+caption: to describe with a rich enough language.
    1.55 -   #+name: wiggling-intro
    1.56 -   #+begin_listing clojure
    1.57 -   #+begin_src clojure
    1.58 -(defn wiggling?
    1.59 -  "Is the worm wiggling?"
    1.60 -  [experiences]
    1.61 -  (let [analysis-interval 0x40]
    1.62 -    (when (> (count experiences) analysis-interval)
    1.63 -      (let [a-flex 3
    1.64 -            a-ex   2
    1.65 -            muscle-activity
    1.66 -            (map :muscle (vector:last-n experiences analysis-interval))
    1.67 -            base-activity
    1.68 -            (map #(- (% a-flex) (% a-ex)) muscle-activity)]
    1.69 -        (= 2
    1.70 -           (first
    1.71 -            (max-indexed
    1.72 -             (map #(Math/abs %)
    1.73 -                  (take 20 (fft base-activity))))))))))
    1.74 -   #+end_src
    1.75 -   #+end_listing
    1.76 -
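The FFT-based =wiggling?= predicate deleted above can be approximated in Python. The muscle-index layout (flexor at index 3, extensor at index 2) and the frame format are assumptions read off the Clojure, and a plain DFT over the first 20 bins stands in for the thesis's `fft` and `max-indexed` helpers:

```python
import cmath
import math

ANALYSIS_INTERVAL = 0x40  # 64 frames, matching the removed listing
A_FLEX, A_EX = 3, 2       # flexor / extensor muscle indices (assumed layout)

def dft_magnitudes(signal, n_bins):
    """Magnitudes of the first n_bins DFT coefficients (stand-in for fft)."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(signal)))
            for k in range(n_bins)]

def wiggling(experiences):
    """Is the worm wiggling? True when the dominant frequency of the
    flexor-minus-extensor activity over the last 64 frames is DFT bin 2."""
    if len(experiences) <= ANALYSIS_INTERVAL:
        return False
    muscle = [e["muscle"] for e in experiences[-ANALYSIS_INTERVAL:]]
    base = [m[A_FLEX] - m[A_EX] for m in muscle]
    spectrum = dft_magnitudes(base, 20)
    return max(range(len(spectrum)), key=spectrum.__getitem__) == 2
```

The design point survives translation: once muscle activity is expressed in a body-centered signal, "wiggling" reduces to asking whether a particular frequency dominates the spectrum.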
    1.77     #+caption: The actions of a worm in a video can be recognized by
     1.78     #+caption: proprioceptive data and sensory predicates by filling
     1.79     #+caption: in the missing sensory detail with previous experience.
    1.80 @@ -249,7 +211,6 @@
    1.81     #+ATTR_LaTeX: :width 10cm
    1.82     [[./images/wall-push.png]]
    1.83  
    1.84 -
    1.85     
    1.86     One powerful advantage of empathic problem solving is that it
    1.87     factors the action recognition problem into two easier problems. To