comparison thesis/cortex.org @ 443:d3c5f9b70574

working on thesis render.
author Robert McIntyre <rlm@mit.edu>
date Tue, 25 Mar 2014 00:20:01 -0400
parents eaf8c591372b
children 47cfbe84f00e
comparison: 442:eaf8c591372b (old) vs. 443:d3c5f9b70574 (new)
("-" marks lines only in 442; "+" marks lines only in 443)
 140 140
 141 141  2. Play out this simulated scene and generate imagined sensory
 142 142     experience. This will include relevant muscle contractions, a
 143 143     close up view of the stream from the cat's perspective, and most
 144 144     importantly, the imagined feeling of water entering the
-145         mouth. The imagined sensory experience can come from both a
+    145     mouth. The imagined sensory experience can come from a
 146 146     simulation of the event, but can also be pattern-matched from
 147 147     previous, similar embodied experience.
 148 148
 149 149  3. The action is now easily identified as drinking by the sense of
 150 150     taste alone. The other senses (such as the tongue moving in and
 160 160
 161 161  3. Use the imagined proprioceptive data as a key to look up related
 162 162     sensory experience associated with that particular proprioceptive
 163 163     feeling.
 164 164
-165         4. Retrieve the feeling of your bottom resting on a surface and
-166            your leg muscles relaxed.
+    165  4. Retrieve the feeling of your bottom resting on a surface, your
+    166     knees bent, and your leg muscles relaxed.
 167 167
 168 168  5. This sensory information is consistent with the =sitting?=
 169 169     sensory predicate (sketched below), so you (and the entity in
 170 170     the image) must be sitting.
 171 171
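
A minimal sketch of what such a =sitting?= predicate could look like,
written in the same style as the worm predicates below. The =contact=
helper, the =seat-bottom= region, the =leg-muscles= indices, and the
thresholds are hypothetical stand-ins for illustration, not code from
the thesis:

#+begin_src clojure
(defn sitting?
  "Hypothetical sketch: true when the latest experience shows firm
   contact under the seat and near-zero activation in the leg
   muscles."
  [experiences]
  (let [exp (peek experiences)]
    (and (every? (fn [touch-data]
                   (< 0.9 (contact seat-bottom touch-data)))
                 (:touch exp))
         (every? #(< % 0.1)
                 (map (:muscle exp) leg-muscles)))))
#+end_src
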
 179 179  to interpret the actions of a simple, worm-like creature.
 180 180
 181 181  #+caption: The worm performs many actions during free play such as
 182 182  #+caption: curling, wiggling, and resting.
 183 183  #+name: worm-intro
-184       #+ATTR_LaTeX: :width 10cm
-185       [[./images/wall-push.png]]
+    184  #+ATTR_LaTeX: :width 13cm
+    185  [[./images/worm-free-play.png]]
 186 186
-187       #+caption: This sensory predicate detects when the worm is resting on the
-188       #+caption: ground.
-189       #+name: resting-intro
-190       #+begin_listing clojure
-191       #+begin_src clojure
-192       (defn resting?
-193         "Is the worm resting on the ground?"
-194         [experiences]
-195         (every?
-196          (fn [touch-data]
-197            (< 0.9 (contact worm-segment-bottom touch-data)))
-198          (:touch (peek experiences))))
-199       #+end_src
-200       #+end_listing
 201 187
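
=resting?= leans on a =contact= helper that scores how completely a
region of the worm's skin is touching something, on a scale from 0.0
to 1.0. The thesis supplies its own definition; a rough sketch of the
general idea, assuming touch-data is an indexed collection of
[proximity activation] sensor pairs and a region is a collection of
sensor indices, might be:

#+begin_src clojure
(defn contact
  "Hypothetical sketch: the fraction of the sensors in `region`
   currently reporting activation, 0.0 meaning no contact and 1.0
   meaning full contact."
  [region touch-data]
  (/ (count (filter #(pos? (second (touch-data %))) region))
     (double (count region))))
#+end_src
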
 202 188  #+caption: Body-centered actions are best expressed in a body-centered
 203 189  #+caption: language. This code detects when the worm has curled into a
 204 190  #+caption: full circle. Imagine how you would replicate this functionality
 205 191  #+caption: using low-level pixel features such as HOG filters!
 216 202      (and (< 0.55 (contact worm-segment-bottom-tip tail-touch))
 217 203           (< 0.55 (contact worm-segment-top-tip head-touch))))))
 218 204  #+end_src
 219 205  #+end_listing
 220 206
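
Because these predicates take the entire history of experiences as
their argument, they can also be lifted over time. A hypothetical
helper (not from the thesis) that finds every frame during which the
worm was curled, by running the predicate on successive prefixes of
the experience vector:

#+begin_src clojure
(defn curled-frames
  "Hypothetical sketch: indices of the experience frames for which
   `curled?` holds, assuming `experiences` is a vector."
  [experiences]
  (filter #(curled? (subvec experiences 0 (inc %)))
          (range (count experiences))))
#+end_src
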
-221       #+caption: Even complicated actions such as ``wiggling'' are fairly simple
-222       #+caption: to describe with a rich enough language.
-223       #+name: wiggling-intro
-224       #+begin_listing clojure
-225       #+begin_src clojure
-226       (defn wiggling?
-227         "Is the worm wiggling?"
-228         [experiences]
-229         (let [analysis-interval 0x40]
-230           (when (> (count experiences) analysis-interval)
-231             (let [a-flex 3
-232                   a-ex   2
-233                   muscle-activity
-234                   (map :muscle (vector:last-n experiences analysis-interval))
-235                   base-activity
-236                   (map #(- (% a-flex) (% a-ex)) muscle-activity)]
-237               (= 2
-238                  (first
-239                   (max-indexed
-240                    (map #(Math/abs %)
-241                         (take 20 (fft base-activity))))))))))
-242       #+end_src
-243       #+end_listing
-244
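
=wiggling?= assumes a few helpers beyond the Clojure core: an =fft=
returning a sequence of real coefficients, plus =vector:last-n= and
=max-indexed=. The thesis defines its own versions; plausible
sketches of the latter two, under those assumptions:

#+begin_src clojure
(defn vector:last-n
  "Sketch: the last n elements of the vector v."
  [v n]
  (subvec v (- (count v) n)))

(defn max-indexed
  "Sketch: the [index value] pair of the largest element of s."
  [s]
  (apply max-key second (map-indexed vector s)))
#+end_src

Read this way, =(= 2 (first (max-indexed ...)))= checks that, over
the last 0x40 frames, the dominant component of the flexor/extensor
activity difference sits in frequency bin 2: a steady oscillation,
rather than the constant offset or slow drift that bins 0 and 1
would indicate.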
 245 207  #+caption: The actions of a worm in a video can be recognized by
 246 208  #+caption: proprioceptive data and sensory predicates by filling
 247 209  #+caption: in the missing sensory detail with previous experience.
 248 210  #+name: worm-recognition-intro
 249 211  #+ATTR_LaTeX: :width 10cm
 250 212  [[./images/wall-push.png]]
-251
 252 213
 253 214
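
To make the recognition step concrete: once the missing sensory
detail has been filled in, classifying the action amounts to trying
each sensory predicate against the reconstructed experience. A
hypothetical combinator, assuming the predicates above are in scope:

#+begin_src clojure
(defn classify-action
  "Hypothetical sketch: name the first sensory predicate that holds
   for the reconstructed experiences, or :unknown if none do."
  [experiences]
  (cond (resting?  experiences) :resting
        (curled?   experiences) :curled
        (wiggling? experiences) :wiggling
        :else                   :unknown))
#+end_src
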
 254 215  One powerful advantage of empathic problem solving is that it
 255 216  factors the action recognition problem into two easier problems. To
 256 217  use empathy, you need an /aligner/, which takes the video and a