#+title: =CORTEX=
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Using embodied AI to facilitate Artificial Imagination.
#+keywords: AI, clojure, embodiment

* Artificial Imagination

Imagine watching a video of someone skateboarding. When you watch
the video, you can imagine yourself skateboarding, and your
knowledge of the human body and its dynamics guides your
interpretation of the scene. For example, even if the skateboarder
is partially occluded, you can infer the positions of his arms and
body from your own knowledge of how your body would be positioned if
you were skateboarding. If the skateboarder suffers an accident, you
wince in sympathy, imagining the pain your own body would experience
if it were in the same situation. This empathy with other people
guides your understanding of whatever they are doing because it is a
powerful constraint on what is probable and possible. In order to
make use of this powerful empathy constraint, I need a system that
can generate and make sense of sensory data from the many different
senses that humans possess. The two key properties of such a system
are /embodiment/ and /imagination/.

** What is imagination?

One kind of imagination is /sympathetic/ imagination: you imagine
yourself in the position of something or someone you are
observing. This type of imagination comes into play when you follow
along visually while watching someone perform actions, or when you
sympathetically grimace when someone hurts themselves. This type of
imagination uses the constraints you have learned about your own
body to highly constrain the possibilities in whatever you are
seeing. It uses all your senses, including your senses of touch,
proprioception, etc. Humans are flexible when it comes to "putting
themselves in another's shoes," and can sympathetically understand
not only other humans, but entities ranging from animals to cartoon
characters to [[http://www.youtube.com/watch?v=0jz4HcwTQmU][single dots]] on a screen!

#+caption: A cat drinking some water. Identifying this action is
#+caption: beyond the state of the art for computers.
#+ATTR_LaTeX: :width 5cm
[[./images/cat-drinking.jpg]]

#+caption: This is a basic test for the vision system. It only tests
#+caption: the vision-pipeline and does not deal with loading eyes from
#+caption: a Blender file. The code creates two videos of the same
#+caption: rotating cube from different angles.
#+name: test-1
#+begin_listing clojure
#+begin_src clojure
(defn test-pipeline
  "Testing vision:
   Tests the vision system by creating two views of the same rotating
   object from different angles and displaying both of those views in
   JFrames.

   You should see a rotating cube, and two windows,
   each displaying a different view of the cube."
  ([] (test-pipeline false))
  ([record?]
   (let [candy
         (box 1 1 1 :physical? false :color ColorRGBA/Blue)]
     (world
      (doto (Node.)
        (.attachChild candy))
      {}
      (fn [world]
        (let [cam (.clone (.getCamera world))
              width (.getWidth cam)
              height (.getHeight cam)]
          (add-camera! world cam
                       (comp
                        (view-image
                         (if record?
                           (File. "/home/r/proj/cortex/render/vision/1")))
                        BufferedImage!))
          (add-camera! world
                       (doto (.clone cam)
                         (.setLocation (Vector3f. -10 0 0))
                         (.lookAt Vector3f/ZERO Vector3f/UNIT_Y))
                       (comp
                        (view-image
                         (if record?
                           (File. "/home/r/proj/cortex/render/vision/2")))
                        BufferedImage!))
          (let [timer (IsoTimer. 60)]
            (.setTimer world timer)
            (display-dilated-time world timer))
          ;; This is here to restore the main view
          ;; after the other views have completed processing
          (add-camera! world (.getCamera world) no-op)))
      (fn [world tpf]
        (.rotate candy (* tpf 0.2) 0 0))))))
#+end_src
#+end_listing
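
A usage sketch, assuming =test-pipeline= above is loaded in the
current namespace:

#+begin_src clojure
;; Usage sketch, assuming test-pipeline above is loaded in the
;; current namespace. With no argument nothing is recorded; passing
;; true writes each view's frames to the directories named above.
(test-pipeline)        ;; rotating blue cube, two camera windows
;; (test-pipeline true) ;; same, but also record frames for both views
#+end_src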

- This is test 1 \cite{Tappert77}.

#+caption: This sensory predicate detects when the worm is resting on the
#+caption: ground using the worm's sense of touch.
#+name: resting-intro
#+begin_listing clojure
#+begin_src clojure
(defn resting?
  "Is the worm resting on the ground?"
  [experiences]
  (every?
   (fn [touch-data]
     (< 0.9 (contact worm-segment-bottom touch-data)))
   (:touch (peek experiences))))
#+end_src
#+end_listing
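
The =resting?= predicate leans on a =contact= helper and a
=worm-segment-bottom= region that are defined elsewhere. A minimal
sketch of such a helper, assuming touch data is a map from sensor
index to an activation in [0,1] and a region is a collection of
sensor indices (an assumption for illustration, not necessarily the
representation used here):

#+begin_src clojure
;; Illustrative sketch only, not the implementation used in this
;; thesis. Assumes touch-data maps sensor indices to activations in
;; [0,1], and a region such as worm-segment-bottom is a collection of
;; sensor indices.
(defn contact
  "Average activation over the sensors in region: 0.0 means no
   contact, 1.0 means every sensor in the region is fully pressed."
  [region touch-data]
  (/ (reduce + (map #(get touch-data % 0.0) region))
     (count region)))
#+end_src

Read this way, =resting?= asks whether, in the most recent experience
frame, every segment reports more than 90% contact along its bottom
surface.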

#+caption: Even complicated actions such as ``wiggling'' are fairly simple
#+caption: to describe with a rich enough language.
#+name: wiggling-intro
#+begin_listing clojure
#+begin_src clojure
(defn wiggling?
  "Is the worm wiggling?"
  [experiences]
  (let [analysis-interval 0x40]
    (when (> (count experiences) analysis-interval)
      (let [a-flex 3
            a-ex   2
            muscle-activity
            (map :muscle (vector:last-n experiences analysis-interval))
            base-activity
            (map #(- (% a-flex) (% a-ex)) muscle-activity)]
        (= 2
           (first
            (max-indexed
             (map #(Math/abs %)
                  (take 20 (fft base-activity))))))))))
#+end_src
#+end_listing
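
=wiggling?= relies on a few helpers defined elsewhere
(=vector:last-n=, =fft=, =max-indexed=). The only non-obvious one is
=max-indexed=; a sketch consistent with its use above, assuming it
returns the index of the largest element first (an assumption, not
necessarily the definition used here):

#+begin_src clojure
;; Sketch of a max-indexed helper, assuming it returns [index value]
;; for the largest element, so (first (max-indexed coll)) is the index
;; of the maximum. This is an assumption for illustration only.
(defn max-indexed
  "Return [index value] for the maximum element of coll."
  [coll]
  (apply max-key second (map-indexed vector coll)))
#+end_src

Under that reading, the predicate looks at the last 64 (=0x40=)
experience frames, subtracts extensor from flexor activity for one
flexor/extensor pair (muscle channels 3 and 2), and declares the worm
to be wiggling when the strongest of the first 20 Fourier components
of that signal is the one at index 2, i.e. when the motion is
dominated by a single characteristic oscillation frequency.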