view thesis/cortex.tex @ 429:b5d0f0adf19f

improvements by Dylan!
author Robert McIntyre <rlm@mit.edu>
date Fri, 21 Mar 2014 20:56:56 -0400

\section{Artificial Imagination}
\label{sec-1}
Imagine watching a video of someone skateboarding. When you watch
the video, you can imagine yourself skateboarding, and your
knowledge of the human body and its dynamics guides your
interpretation of the scene. For example, even if the skateboarder
is partially occluded, you can infer the positions of his arms and
body from your own knowledge of how your body would be positioned if
you were skateboarding. If the skateboarder suffers an accident, you
wince in sympathy, imagining the pain your own body would experience
if it were in the same situation. This empathy with other people
guides our understanding of whatever they are doing because it is a
powerful constraint on what is probable and possible. In order to
make use of this powerful empathy constraint, I need a system that
can generate and make sense of sensory data from the many different
senses that humans possess. The two key properties of such a system
are \emph{embodiment} and \emph{imagination}.
\subsection{What is imagination?}
\label{sec-1-1}
One kind of imagination is \emph{sympathetic} imagination: you imagine
yourself in the position of something or someone you are
observing. This type of imagination comes into play when you follow
along visually while watching someone perform actions, or when you
sympathetically grimace when someone hurts themselves. This type of
imagination uses the constraints you have learned about your own
body to tightly constrain the possibilities in whatever you are
seeing. It draws on all of your senses, including your senses of
touch, proprioception, etc. Humans are flexible when it comes to
``putting themselves in another's shoes,'' and can sympathetically
understand not only other humans, but entities ranging from animals
to cartoon characters to
\href{http://www.youtube.com/watch?v=0jz4HcwTQmU}{single dots} on a
screen!
\begin{figure}[htb]
\centering
\includegraphics[width=5cm]{./images/cat-drinking.jpg}
\caption{A cat drinking some water. Identifying this action is beyond the state of the art for computers.}
\end{figure}
\begin{listing}
\caption{This is a basic test for the vision system. It only tests the vision pipeline and does not deal with loading eyes from a Blender file. The code creates two videos of the same rotating cube from different angles.}
\begin{clojurecode}
(defn test-pipeline
  "Testing vision:
   Tests the vision system by creating two views of the same rotating
   object from different angles and displaying both of those views in
   JFrames.

   You should see a rotating cube, and two windows,
   each displaying a different view of the cube."
  ([] (test-pipeline false))
  ([record?]
     (let [candy
           (box 1 1 1 :physical? false :color ColorRGBA/Blue)]
       (world
        (doto (Node.)
          (.attachChild candy))
        {}
        (fn [world]
          (let [cam (.clone (.getCamera world))
                width (.getWidth cam)
                height (.getHeight cam)]
            (add-camera! world cam
                         (comp
                          (view-image
                           (if record?
                             (File. "/home/r/proj/cortex/render/vision/1")))
                          BufferedImage!))
            (add-camera! world
                         (doto (.clone cam)
                           (.setLocation (Vector3f. -10 0 0))
                           (.lookAt Vector3f/ZERO Vector3f/UNIT_Y))
                         (comp
                          (view-image
                           (if record?
                             (File. "/home/r/proj/cortex/render/vision/2")))
                          BufferedImage!))
            (let [timer (IsoTimer. 60)]
              (.setTimer world timer)
              (display-dilated-time world timer))
            ;; This is here to restore the main view
            ;; after the other views have completed processing
            (add-camera! world (.getCamera world) no-op)))
        (fn [world tpf]
          (.rotate candy (* tpf 0.2) 0 0))))))
\end{clojurecode}
\end{listing}
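Assuming the \texttt{cortex} namespaces that define \texttt{world}, \texttt{add-camera!}, and the other helpers above are loaded, the test can be invoked from a REPL as a sketch of typical usage; note that the recording paths hard-coded above are machine-specific:

\begin{clojurecode}
(test-pipeline)       ;; interactive display only
(test-pipeline true)  ;; additionally record frames to disk
\end{clojurecode}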