# HG changeset patch
# User Robert McIntyre
# Date 1328898847 25200
# Node ID f5ea63245b3b2cea4bc4562667dd8e58028ce41a
# Parent f283c62bd2128f27881f5f7d78f9b697e9b65175
completed vision demonstration video and first draft of vision.org

diff -r f283c62bd212 -r f5ea63245b3b assets/Models/subtitles/test-render.png
Binary file assets/Models/subtitles/test-render.png has changed
diff -r f283c62bd212 -r f5ea63245b3b assets/Models/subtitles/worm-vision-subtitles.blend
Binary file assets/Models/subtitles/worm-vision-subtitles.blend has changed
diff -r f283c62bd212 -r f5ea63245b3b images/worm-with-eye.png
Binary file images/worm-with-eye.png has changed
diff -r f283c62bd212 -r f5ea63245b3b org/util.org
--- a/org/util.org	Fri Feb 10 02:19:24 2012 -0700
+++ b/org/util.org	Fri Feb 10 11:34:07 2012 -0700
@@ -403,7 +403,8 @@
          cannon-ball
          (sphere 0.7
                  :material "Common/MatDefs/Misc/Unshaded.j3md"
-                 :texture "Textures/PokeCopper.jpg"
+                 :color ColorRGBA/White
+                 :name "cannonball!"
                  :position
                  (.add (.getLocation camera)
                        (.mult (.getDirection camera) (float 1)))
@@ -411,7 +412,9 @@
        (.setLinearVelocity
         (.getControl cannon-ball RigidBodyControl)
         (.mult (.getDirection camera) (float 50))) ;50
-       (add-element game cannon-ball (if node node (.getRootNode game)))))))
+       (add-element game cannon-ball (if node node (.getRootNode
+                                                    game)))
+       cannon-ball))))

     ([]
        (fire-cannon-ball false)))
diff -r f283c62bd212 -r f5ea63245b3b org/vision.org
--- a/org/vision.org	Fri Feb 10 02:19:24 2012 -0700
+++ b/org/vision.org	Fri Feb 10 11:34:07 2012 -0700
@@ -8,8 +8,7 @@
 #+babel: :mkdirp yes :noweb yes :exports both

 * Vision
-
-
+
 Vision is one of the most important senses for humans, so I need to
 build a simulated sense of vision for my AI. I will do this with
 simulated eyes. Each eye can be independently moved and should see its
@@ -143,8 +142,6 @@
 They can be queried every cycle, but their information may not
 necessarily be different every cycle.
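Editor's aside: the point above — that the rendered-image sensors may be polled every simulation cycle even though the renderer refreshes them at its own pace — can be sketched outside of jMonkeyEngine. The class and method names below are hypothetical, not from this repository; this is only an illustration of the polling pattern, not the cortex implementation.

```python
# Hypothetical sketch: a frame cache that is safe to poll more often
# than it is refreshed.  poll() may return the same frame twice; the
# frame id lets a consumer detect whether anything actually changed.
class FrameCache:
    def __init__(self):
        self.frame = None   # latest rendered image
        self.frame_id = 0   # bumped once per *new* render

    def deliver(self, frame):
        """Called by the render loop when a fresh frame is ready."""
        self.frame = frame
        self.frame_id += 1

    def poll(self):
        """Called every simulation cycle, regardless of render rate."""
        return self.frame_id, self.frame

cache = FrameCache()
cache.deliver("frame-A")
a = cache.poll()
b = cache.poll()            # no new render in between
assert a == b == (1, "frame-A")
cache.deliver("frame-B")
assert cache.poll() == (2, "frame-B")
```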
-
-
 * Physical Eyes

 The vision pipeline described above handles the flow of rendered
@@ -162,24 +159,6 @@
 #+begin_src clojure
 (in-ns 'cortex.vision)

-(import com.jme3.math.Vector3f)
-
-(def blender-rotation-correction
-  (doto (Quaternion.)
-    (.fromRotationMatrix
-     (doto (Matrix3f.)
-       (.setColumn 0
-                   (Vector3f. 1 0 0))
-       (.setColumn 1
-                   (Vector3f. 0 -1 0))
-       (.setColumn 2
-                   (Vector3f. 0 0 -1)))
-
-     (doto (Matrix3f.)
-       (.setColumn 0
-                   (Vector3f.
-
 (defn add-eye!
   "Create a Camera centered on the current position of 'eye which
    follows the closest physical node in 'creature and sends visual
@@ -197,10 +176,6 @@
            ;; this part is consistent with using Z in
            ;; blender as the UP vector.
            (.mult rot Vector3f/UNIT_Y))
-
-    (println-repl "eye unit-z ->" (.mult rot Vector3f/UNIT_Z))
-    (println-repl "eye unit-y ->" (.mult rot Vector3f/UNIT_Y))
-    (println-repl "eye unit-x ->" (.mult rot Vector3f/UNIT_X))
     (.setFrustumPerspective cam 45 (/ (.getWidth cam) (.getHeight cam)) 1 1000)
     (bind-sense target cam)
     cam))
@@ -279,6 +254,7 @@
 dimensions of the smallest image required to contain all the retinal
 sensor maps.

+#+name: retina
 #+begin_src clojure
 (defn retina-sensor-profile
   "Return a map of pixel sensitivity numbers to BufferedImages
@@ -306,7 +282,7 @@
 First off, get the children of the "eyes" empty node to find all the
 eyes the creature has.
-
+
+#+name: eye-node
 #+begin_src clojure
 (defvar
   ^{:arglists '([creature])}
@@ -318,6 +294,7 @@
 Then, add the camera created by =(add-eye!)= to the simulation by
 creating a new viewport.

+#+name: add-camera
 #+begin_src clojure
 (defn add-camera!
   "Add a camera to the world, calling continuation on every frame
@@ -343,6 +320,7 @@
 massive gain in speed. =(vision-kernel)= generates a list of such
 continuation functions, one for each channel of the eye.
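Editor's aside: the retina images above mark active sensor positions in white, and those positions become the eye's topology. The following Python sketch mirrors the *role* of a white-pixel scan (the helper name is hypothetical; the real Clojure code uses =white-coordinates= and =collapse= from cortex.sense, which are not reproduced here).

```python
# Illustrative sketch: find the (x, y) coordinates of "white" pixels in
# a sensitivity image, represented here as a 2-D list of packed RGB ints.
# These coordinates play the part of the retina's sensor positions.
WHITE = 0xFFFFFF

def white_coordinates(image):
    """Return [(x, y), ...] for every fully white pixel, row by row."""
    return [(x, y)
            for y, row in enumerate(image)
            for x, pixel in enumerate(row)
            if pixel == WHITE]

# a 3x3 sensitivity map with two active (white) sensor locations
img = [[0x000000, 0xFFFFFF, 0x000000],
       [0x000000, 0x000000, 0x000000],
       [0xFFFFFF, 0x000000, 0x000000]]

assert white_coordinates(img) == [(1, 0), (0, 2)]
```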
+#+name: kernel
 #+begin_src clojure
 (in-ns 'cortex.vision)
@@ -351,6 +329,21 @@
   (invoke [this world] (vision-fn world))
   (applyTo [this args] (apply vision-fn args)))

+(defn pixel-sense [sensitivity pixel]
+  (let [s-r (bit-shift-right (bit-and 0xFF0000 sensitivity) 16)
+        s-g (bit-shift-right (bit-and 0x00FF00 sensitivity) 8)
+        s-b (bit-and 0x0000FF sensitivity)
+
+        p-r (bit-shift-right (bit-and 0xFF0000 pixel) 16)
+        p-g (bit-shift-right (bit-and 0x00FF00 pixel) 8)
+        p-b (bit-and 0x0000FF pixel)
+
+        total-sensitivity (* 255 (+ s-r s-g s-b))]
+    (float (/ (+ (* s-r p-r)
+                 (* s-g p-g)
+                 (* s-b p-b))
+              total-sensitivity))))
+
 (defn vision-kernel
   "Returns a list of functions, each of which will return a color
    channel's worth of visual information when called inside a running
@@ -378,7 +371,7 @@
          (fn [[key image]]
            (let [whites (white-coordinates image)
                  topology (vec (collapse whites))
-                 mask (color-channel-presets key key)]
+                 sensitivity (sensitivity-presets key key)]
              (attached-viewport.
               (fn [world]
                 (register-eye! world)
@@ -386,8 +379,9 @@
                  topology
                  (vec
                   (for [[x y] whites]
-                    (bit-and
-                     mask (.getRGB @vision-image x y))))))
+                    (pixel-sense
+                     sensitivity
+                     (.getRGB @vision-image x y))))))
               register-eye!)))
          retinal-map))))
@@ -398,7 +392,6 @@
   (runonce
    (fn [world]
      (add-camera! world (.getCamera world) no-op))))
-
 #+end_src

 Note that since each of the functions generated by =(vision-kernel)=
@@ -419,6 +412,7 @@
 =(vision-kernel)= to each eye in the creature and gather the results
 into one list of functions.

+#+name: main
 #+begin_src clojure
 (defn vision!
   "Returns a function which returns visual sensory data when called
@@ -436,7 +430,10 @@
 =(view-sense)= to construct a function that will create a display for
 visual data.

+#+name: display
 #+begin_src clojure
+(in-ns 'cortex.vision)
+
 (defn view-vision
   "Creates a function which accepts a list of visual sensor-data and
   displays each element of the list to the screen.
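Editor's aside: the arithmetic of the new =pixel-sense= function is easy to check in isolation. Here is a direct Python transcription (not part of the changeset): the sensitivity value is itself a packed RGB triple, and the result is the sensitivity-weighted average of the pixel's channels, normalized to [0, 1].

```python
# Transcription of pixel-sense: weight each pixel channel by the
# corresponding sensitivity channel, then normalize by the maximum
# possible response (255 per unit of sensitivity).
def pixel_sense(sensitivity, pixel):
    s_r, s_g, s_b = (sensitivity >> 16) & 0xFF, (sensitivity >> 8) & 0xFF, sensitivity & 0xFF
    p_r, p_g, p_b = (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF
    total_sensitivity = 255 * (s_r + s_g + s_b)
    return (s_r * p_r + s_g * p_g + s_b * p_b) / total_sensitivity

# a pure-red sensor (0xFF0000) responds only to the red channel:
assert pixel_sense(0xFF0000, 0xFF0000) == 1.0   # saturated red pixel
assert pixel_sense(0xFF0000, 0x00FFFF) == 0.0   # pixel with no red at all
# a gray sensor (0xFFFFFF) averages the three channels:
assert abs(pixel_sense(0xFFFFFF, 0x808080) - 128 / 255) < 1e-9
```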
@@ -448,7 +445,7 @@
          (dorun
           (for [i (range (count coords))]
             (.setRGB image ((coords i) 0) ((coords i) 1)
-                      (sensor-data i))))
+                      (gray (int (* 255 (sensor-data i)))))))
          image))))
 #+end_src
@@ -547,10 +544,18 @@
 This is the approximation to the human eye described earlier.

+#+name: test-2
 #+begin_src clojure
 (in-ns 'cortex.test.vision)

-(import com.aurellem.capture.Capture)
+(defn change-color [obj color]
+  (println-repl obj)
+  (if obj
+    (.setColor (.getMaterial obj) "Color" color)))
+
+(defn colored-cannon-ball [color]
+  (comp #(change-color % color)
+        (fire-cannon-ball)))

 (defn test-worm-vision []
   (let [the-worm (doto (worm)(body!))
@@ -566,12 +571,19 @@
               :position (Vector3f. 0 -5 0))
         z-axis
         (box 0.01 0.01 1 :physical? false :color ColorRGBA/Blue
-             :position (Vector3f. 0 -5 0))]
+             :position (Vector3f. 0 -5 0))
+        timer (RatchetTimer. 60)]

     (world (nodify [(floor) the-worm x-axis y-axis z-axis me])
-           standard-debug-controls
+           (assoc standard-debug-controls
+             "key-r" (colored-cannon-ball ColorRGBA/Red)
+             "key-b" (colored-cannon-ball ColorRGBA/Blue)
+             "key-g" (colored-cannon-ball ColorRGBA/Green))
           (fn [world]
             (light-up-everything world)
+            (speed-up world)
+            (.setTimer world timer)
+            (display-dialated-time world timer)
             ;; add a view from the worm's perspective
             (add-camera!
              world
@@ -583,9 +595,11 @@
               (File. "/home/r/proj/cortex/render/worm-vision/worm-view"))
              BufferedImage!))
            (set-gravity world Vector3f/ZERO)
-           (Capture/captureVideo
-            world
-            (File. "/home/r/proj/cortex/render/worm-vision/main-view")))
+           (try
+            (Capture/captureVideo
+             world
+             (File. "/home/r/proj/cortex/render/worm-vision/main-view"))))
+
           (fn [world _ ]
             (.setLocalTranslation me (.getLocation (.getCamera world)))
             (vision-display
@@ -594,6 +608,69 @@
              (map #(% world) vision))
            (fix-display world)))))
 #+end_src

+** Methods to Generate the Worm Video
+#+name: magick2
+#+begin_src clojure
+(ns cortex.video.magick2
+  (:import java.io.File)
+  (:use clojure.contrib.shell-out))
+
+(defn images [path]
+  (sort (rest (file-seq (File. path)))))
+
+(def base "/home/r/proj/cortex/render/worm-vision/")
+
+(defn pics [file]
+  (images (str base file)))
+
+(defn combine-images []
+  (let [main-view (pics "main-view")
+        worm-view (pics "worm-view")
+        blue (pics "0")
+        green (pics "1")
+        red (pics "2")
+        gray (pics "3")
+        blender (let [b-pics (pics "blender")]
+                  (concat b-pics (repeat 9001 (last b-pics))))
+        background (repeat 9001 (File. (str base "background.png")))
+        targets (map
+                 #(File. (str base "out/" (format "%07d.png" %)))
+                 (range 0 (count main-view)))]
+    (dorun
+     (pmap
+      (comp
+       (fn [[background main-view worm-view red green blue gray blender target]]
+         (println target)
+         (sh "convert"
+             background
+             main-view "-geometry" "+18+17" "-composite"
+             worm-view "-geometry" "+677+17" "-composite"
+             green "-geometry" "+685+430" "-composite"
+             red "-geometry" "+788+430" "-composite"
+             blue "-geometry" "+894+430" "-composite"
+             gray "-geometry" "+1000+430" "-composite"
+             blender "-geometry" "+0+0" "-composite"
+             target))
+       (fn [& args] (map #(.getCanonicalPath %) args)))
+      background main-view worm-view red green blue gray blender targets))))
+#+end_src
+
+#+begin_src sh :results silent
+cd /home/r/proj/cortex/render/worm-vision
+ffmpeg -r 25 -b 9001k -i out/%07d.png -vcodec libtheora worm-vision.ogg
+#+end_src
+
+* Demonstration of Vision
+#+begin_html
+
+Simulated Vision in a Virtual Environment
+#+end_html

 * Headers
 #+name: vision-header
@@ -602,7 +679,7 @@
   "Simulate the sense of vision in jMonkeyEngine3. Enables multiple
   eyes from different positions to observe the same world, and pass
   the observed data to any arbitrary function. Automatically reads
-  eye-nodes from specially prepared blender files and instanttiates
+  eye-nodes from specially prepared blender files and instantiates
   them in the world as actual eyes."
  {:author "Robert McIntyre"}
  (:use (cortex world sense util))
@@ -612,7 +689,7 @@
  (:import java.nio.ByteBuffer)
  (:import java.awt.image.BufferedImage)
  (:import (com.jme3.renderer ViewPort Camera))
- (:import com.jme3.math.ColorRGBA)
+ (:import (com.jme3.math ColorRGBA Vector3f Matrix3f))
  (:import com.jme3.renderer.Renderer)
  (:import com.jme3.app.Application)
  (:import com.jme3.texture.FrameBuffer)
@@ -632,21 +709,46 @@
  (:import com.jme3.math.ColorRGBA)
  (:import com.jme3.scene.Node)
  (:import com.jme3.math.Vector3f)
- (:import java.io.File))
+ (:import java.io.File)
+ (:import (com.aurellem.capture Capture RatchetTimer)))
 #+end_src

+* Onward!
+  - As a neat bonus, this idea behind simulated vision also enables one
+    to [[../../cortex/html/capture-video.html][capture live video feeds from jMonkeyEngine]].
+  - Now that we have vision, it's time to tackle [[./hearing.org][hearing]].
-
-- As a neat bonus, this idea behind simulated vision also enables one
-  to [[../../cortex/html/capture-video.html][capture live video feeds from jMonkeyEngine]].

+* Source Listing
+  - [[../src/cortex/vision.clj][cortex.vision]]
+  - [[../src/cortex/test/vision.clj][cortex.test.vision]]
+  - [[../src/cortex/video/magick2.clj][cortex.video.magick2]]
+  - [[../assets/Models/subtitles/worm-vision-subtitles.blend][worm-vision-subtitles.blend]]
+#+html:
+  - [[http://hg.bortreb.com ][source-repository]]
+
 * COMMENT Generate Source
 #+begin_src clojure :tangle ../src/cortex/vision.clj
-<>
+<>
+<>
+<>
+<>
+<>
+<>
+<>
+<>
+<>
+<>
+<>
 #+end_src

 #+begin_src clojure :tangle ../src/cortex/test/vision.clj
 <>
 <>
+<>
 #+end_src
+
+#+begin_src clojure :tangle ../src/cortex/video/magick2.clj
+<>
+#+end_src
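Editor's aside: one detail of =combine-images= worth isolating is the frame-padding trick — the blender stream is shorter than the main view, so it is extended by repeating its last frame (=(concat b-pics (repeat 9001 (last b-pics)))=) before the streams are zipped together frame-for-frame. A Python sketch of the same idea (function name is hypothetical):

```python
# Pad a short frame sequence by repeating its final frame, so that it
# can be combined element-wise with a longer sequence of frames.
from itertools import chain, islice, repeat

def pad_stream(frames, n):
    """Return frames extended with copies of the last frame, n items total."""
    return list(islice(chain(frames, repeat(frames[-1])), n))

assert pad_stream(["b0.png", "b1.png"], 5) == \
       ["b0.png", "b1.png", "b1.png", "b1.png", "b1.png"]
```

The effect is the same as the Clojure version's oversized `repeat 9001`, but sized exactly to the longer stream.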