view thesis/garbage_cortex.org @ 570:9647f0168287
update for interview.

| author   | Robert McIntyre <rlm@mit.edu>   |
| date     | Tue, 15 Jul 2014 02:46:00 -0400 |
| parents  | 90b236381642                    |
| children |                                 |
#+title: =CORTEX=
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Using embodied AI to facilitate Artificial Imagination.
#+keywords: AI, clojure, embodiment

* Artificial Imagination

Imagine watching a video of someone skateboarding. When you watch
the video, you can imagine yourself skateboarding, and your
knowledge of the human body and its dynamics guides your
interpretation of the scene. For example, even if the skateboarder
is partially occluded, you can infer the positions of his arms and
body from your own knowledge of how your body would be positioned if
you were skateboarding. If the skateboarder suffers an accident, you
wince in sympathy, imagining the pain your own body would experience
if it were in the same situation. This empathy with other people
guides our understanding of whatever they are doing because it is a
powerful constraint on what is probable and possible. In order to
make use of this powerful empathy constraint, I need a system that
can generate and make sense of sensory data from the many different
senses that humans possess. The two key properties of such a system
are /embodiment/ and /imagination/.

** What is imagination?

One kind of imagination is /sympathetic/ imagination: you imagine
yourself in the position of something/someone you are observing.
This type of imagination comes into play when you visually follow
along as you watch someone perform actions, or when you
sympathetically grimace when someone hurts themselves. This type of
imagination uses the constraints you have learned about your own
body to highly constrain the possibilities in whatever you are
seeing. It uses all of your senses, including your senses of touch,
proprioception, etc. Humans are flexible when it comes to "putting
themselves in another's shoes," and can sympathetically understand
not only other humans, but entities ranging from animals to cartoon
characters to [[http://www.youtube.com/watch?v=0jz4HcwTQmU][single dots]] on a screen!

#+caption: A cat drinking some water. Identifying this action is
#+caption: beyond the state of the art for computers.
#+ATTR_LaTeX: :width 5cm
[[./images/cat-drinking.jpg]]

#+begin_listing clojure
\caption{This is a basic test for the vision system. It only tests
the vision-pipeline and does not deal with loading eyes from a
blender file. The code creates two videos of the same rotating cube
from different angles.}
#+name: test-1
#+begin_src clojure
(defn test-pipeline
  "Testing vision:
   Tests the vision system by creating two views of the same rotating
   object from different angles and displaying both of those views in
   JFrames.

   You should see a rotating cube, and two windows,
   each displaying a different view of the cube."
  ([] (test-pipeline false))
  ([record?]
   (let [candy
         (box 1 1 1 :physical? false :color ColorRGBA/Blue)]
     (world
      (doto (Node.)
        (.attachChild candy))
      {}
      (fn [world]
        (let [cam    (.clone (.getCamera world))
              width  (.getWidth cam)
              height (.getHeight cam)]
          (add-camera! world cam
                       (comp
                        (view-image
                         (if record?
                           (File. "/home/r/proj/cortex/render/vision/1")))
                        BufferedImage!))
          (add-camera! world
                       (doto (.clone cam)
                         (.setLocation (Vector3f. -10 0 0))
                         (.lookAt Vector3f/ZERO Vector3f/UNIT_Y))
                       (comp
                        (view-image
                         (if record?
                           (File. "/home/r/proj/cortex/render/vision/2")))
                        BufferedImage!))
          (let [timer (IsoTimer. 60)]
            (.setTimer world timer)
            (display-dilated-time world timer))
          ;; This is here to restore the main view
          ;; after the other views have completed processing
          (add-camera! world (.getCamera world) no-op)))
      (fn [world tpf]
        (.rotate candy (* tpf 0.2) 0 0))))))
#+end_src
#+end_listing
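
As a usage sketch only: assuming the functions the listing relies on
(=world=, =add-camera!=, =view-image=, =BufferedImage!=, =IsoTimer=)
have already been loaded from the rest of the =CORTEX= codebase, the
test can be run from a REPL in either mode. The namespace name below
is an assumption made for illustration, not necessarily how the
sources are organized.

#+begin_src clojure
;; Hypothetical REPL session; the namespace name is an assumption.
(require '[cortex.test.vision :refer [test-pipeline]])

;; Interactive mode: opens the simulation window plus two JFrames,
;; each showing the rotating cube from a different camera.
(test-pipeline)

;; Recording mode: also writes each camera's frames to the
;; directories hard-coded in the listing above.
(test-pipeline true)
#+end_src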

- This is test1 \cite{Tappert77}.

\cite{Tappert77}
lol
\cite{Tappert77}

#+caption: This sensory predicate detects when the worm is resting on the
#+caption: ground using the worm's sense of touch.
#+name: resting-intro
#+begin_listing clojure
#+begin_src clojure
(defn resting?
  "Is the worm resting on the ground?"
  [experiences]
  (every?
   (fn [touch-data]
     (< 0.9 (contact worm-segment-bottom touch-data)))
   (:touch (peek experiences))))
#+end_src
#+end_listing

#+caption: Even complicated actions such as ``wiggling'' are fairly simple
#+caption: to describe with a rich enough language.
#+name: wiggling-intro
#+begin_listing clojure
#+begin_src clojure
(defn wiggling?
  "Is the worm wiggling?"
  [experiences]
  (let [analysis-interval 0x40]
    (when (> (count experiences) analysis-interval)
      (let [a-flex 3
            a-ex   2
            muscle-activity
            (map :muscle (vector:last-n experiences analysis-interval))
            base-activity
            (map #(- (% a-flex) (% a-ex)) muscle-activity)]
        (= 2
           (first
            (max-indexed
             (map #(Math/abs %)
                  (take 20 (fft base-activity))))))))))
#+end_src
#+end_listing
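
The two predicates above lean on helpers that are not defined in this
file: =contact=, =vector:last-n=, =max-indexed=, and =fft=. As a
minimal sketch of the shapes the middle two might take, with behavior
inferred purely from how =wiggling?= calls them (these are
illustrative stand-ins, not the definitions from the =CORTEX=
sources):

#+begin_src clojure
;; Illustrative sketches only; the real definitions live elsewhere.

(defn vector:last-n
  "Return the last n elements of the vector v, in order.
   Assumed behavior: wiggling? uses it to take the most recent
   analysis-interval entries of the experience log (a vector)."
  [v n]
  (subvec v (max 0 (- (count v) n))))

(defn max-indexed
  "Return [index value] for the maximum element of coll.
   Assumed behavior: wiggling? takes (first ...) of the result,
   i.e. the index of the dominant FFT component."
  [coll]
  (apply max-key second (map-indexed vector coll)))
#+end_src

Read with these in mind, =wiggling?= takes the last 64 experience
frames, forms the net flexor-minus-extensor muscle activity, and
reports wiggling exactly when the strongest of the first 20 Fourier
components is the one at index 2, i.e. a characteristic oscillation
frequency.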