#+title: Simulated Sense of Sight
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulated sight for AI research using JMonkeyEngine3 and clojure
#+keywords: computer vision, jMonkeyEngine3, clojure
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
#+babel: :mkdirp yes :noweb yes :exports both

* Vision

I want to make creatures with eyes. Each eye can be independently
moved and should see its own version of the world depending on where
it is.

#+name: eyes
#+begin_src clojure
(ns cortex.vision
  "Simulate the sense of vision in jMonkeyEngine3. Enables multiple
  eyes from different positions to observe the same world, and pass
  the observed data to any arbitrary function."
  {:author "Robert McIntyre"}
  (:use cortex.world)
  (:import com.jme3.post.SceneProcessor)
  (:import (com.jme3.util BufferUtils Screenshots))
  (:import java.nio.ByteBuffer)
  (:import java.awt.image.BufferedImage)
  (:import com.jme3.renderer.ViewPort)
  (:import com.jme3.texture.FrameBuffer)
  (:import com.jme3.math.ColorRGBA)
  (:import com.jme3.renderer.Renderer))

(defn vision-pipeline
  "Create a SceneProcessor object which wraps a vision processing
  continuation function. The continuation is a function that takes
  [#^Renderer r #^FrameBuffer fb #^ByteBuffer b #^BufferedImage bi],
  each of which has already been appropriately sized."
  [continuation]
  (let [byte-buffer (atom nil)
        renderer (atom nil)
        image (atom nil)]
    (proxy [SceneProcessor] []
      (initialize
        [renderManager viewPort]
        (let [cam (.getCamera viewPort)
              width (.getWidth cam)
              height (.getHeight cam)]
          (reset! renderer (.getRenderer renderManager))
          (reset! byte-buffer
                  (BufferUtils/createByteBuffer
                   (* width height 4)))
          (reset! image (BufferedImage.
                         width height
                         BufferedImage/TYPE_4BYTE_ABGR))))
      (isInitialized [] (not (nil? @byte-buffer)))
      (reshape [_ _ _])
      (preFrame [_])
      (postQueue [_])
      (postFrame
        [#^FrameBuffer fb]
        (.clear @byte-buffer)
        (continuation @renderer fb @byte-buffer @image))
      (cleanup []))))

(defn frameBuffer->byteBuffer!
  "Transfer the data in the graphics card (Renderer, FrameBuffer) to
  the CPU (ByteBuffer)."
  [#^Renderer r #^FrameBuffer fb #^ByteBuffer bb]
  (.readFrameBuffer r fb bb) bb)

(defn byteBuffer->bufferedImage!
  "Convert the C-style BGRA image data in the ByteBuffer bb to the AWT
  style ABGR image data and place it in the BufferedImage bi."
  [#^ByteBuffer bb #^BufferedImage bi]
  (Screenshots/convertScreenShot bb bi) bi)

(defn BufferedImage!
  "Continuation which will grab the buffered image from the materials
  provided by (vision-pipeline)."
  [#^Renderer r #^FrameBuffer fb #^ByteBuffer bb #^BufferedImage bi]
  (byteBuffer->bufferedImage!
   (frameBuffer->byteBuffer! r fb bb) bi))

(defn add-eye
  "Add an eye to the world, calling continuation on every frame
  produced."
  [world camera continuation]
  (let [width (.getWidth camera)
        height (.getHeight camera)
        render-manager (.getRenderManager world)
        viewport (.createMainView render-manager "eye-view" camera)]
    (doto viewport
      (.setClearFlags true true true)
      (.setBackgroundColor ColorRGBA/Black)
      (.addProcessor (vision-pipeline continuation))
      (.attachScene (.getRootNode world)))))
#+end_src

#+results: eyes
: #'cortex.vision/add-eye

Note the use of continuation passing style for connecting the eye to
a function that processes its output. You can create any number of
eyes, and each of them will see the world from its own =Camera=. Once
every frame, the rendered image is copied to a =BufferedImage=, and
that data is sent off to the continuation function. Moving the
=Camera= which was used to create the eye changes what the eye sees.
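A continuation can do anything it likes with the rendered frame. The
following is a minimal sketch, not part of the =cortex.vision=
namespace, of a continuation that writes every frame to disk as a
numbered PNG. The name =save-frame-continuation= and the output
directory are hypothetical; the sketch assumes the =BufferedImage!=
helper defined above and a namespace that refers to =cortex.vision=.

#+begin_src clojure :eval no
(import '(javax.imageio ImageIO)
        '(java.io File))

(defn save-frame-continuation
  "Hypothetical continuation: converts each rendered frame to a
   BufferedImage and writes it to output-dir (which must already
   exist) as a numbered PNG."
  [output-dir]
  (let [frame-number (atom 0)]
    (fn [renderer framebuffer byte-buffer buffered-image]
      (let [image (BufferedImage! renderer framebuffer
                                  byte-buffer buffered-image)]
        (ImageIO/write
         image "png"
         (File. output-dir
                (format "frame-%07d.png" (swap! frame-number inc))))))))

;; It plugs into add-eye like any other continuation:
;; (add-eye world (.getCamera world)
;;          (save-frame-continuation "/tmp/eye-frames"))
#+end_src

The test in the next section uses the same pattern, composing a Swing
image viewer (=view-image=) with =BufferedImage!=.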
* Example

#+name: test-vision
#+begin_src clojure
(ns cortex.test.vision
  (:use (cortex world util vision))
  (:import java.awt.image.BufferedImage)
  (:import javax.swing.JPanel)
  (:import javax.swing.SwingUtilities)
  (:import java.awt.Dimension)
  (:import javax.swing.JFrame)
  (:import com.jme3.math.ColorRGBA)
  (:import com.jme3.scene.Node)
  (:import com.jme3.math.Vector3f))

(defn test-two-eyes
  "Testing vision:
   Tests the vision system by creating two views of the same rotating
   object from different angles and displaying both of those views in
   JFrames.

   You should see a rotating cube, and two windows,
   each displaying a different view of the cube."
  []
  (let [candy
        (box 1 1 1 :physical? false :color ColorRGBA/Blue)]
    (world
     (doto (Node.)
       (.attachChild candy))
     {}
     (fn [world]
       (let [cam (.clone (.getCamera world))]
         ;; first eye: the default camera position
         (add-eye world cam
                  (comp (view-image) BufferedImage!))
         ;; second eye: the same scene from a different angle
         (add-eye world
                  (doto (.clone cam)
                    (.setLocation (Vector3f. -10 0 0))
                    (.lookAt Vector3f/ZERO Vector3f/UNIT_Y))
                  (comp (view-image) BufferedImage!))
         ;; This is here to restore the main view
         ;; after the other views have completed processing.
         (add-eye world (.getCamera world) no-op)))
     (fn [world tpf]
       (.rotate candy (* tpf 0.2) 0 0)))))
#+end_src

#+results: test-vision
: #'cortex.test.vision/test-two-eyes

The example code creates two live views of the same rotating object
from different angles. The same technique works both for simulating
stereoscopic vision and for simulating multiple creatures, each with
its own sense of vision (a sketch of a stereo pair appears at the end
of this document).

- As a neat bonus, the idea behind simulated vision also enables one
  to [[../../cortex/html/capture-video.html][capture live video feeds from jMonkeyEngine]].

* COMMENT code generation
#+begin_src clojure :tangle ../src/cortex/vision.clj
<<eyes>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/test/vision.clj
<<test-vision>>
#+end_src
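* Stereoscopic eyes

The stereoscopic use mentioned above only requires attaching two eyes
whose cameras are offset horizontally. Here is a rough sketch of how
such a pair could be wired up with =add-eye=. The helper name
=add-stereo-eyes= and the separation of 0.06 world units (roughly a
human interpupillary distance, if one unit is a meter) are assumptions
for illustration and are not part of =cortex.vision=.

#+begin_src clojure :eval no
(defn add-stereo-eyes
  "Hypothetical helper: attach a left/right pair of eyes whose cameras
   are offset from `camera` along its left axis by `separation` world
   units, each reporting to its own continuation."
  [world camera separation left-continuation right-continuation]
  (let [offset    (.mult (.getLeft camera) (float (/ separation 2)))
        left-cam  (doto (.clone camera)
                    (.setLocation (.add (.getLocation camera) offset)))
        right-cam (doto (.clone camera)
                    (.setLocation (.subtract (.getLocation camera) offset)))]
    (add-eye world left-cam  left-continuation)
    (add-eye world right-cam right-continuation)))

;; For example, to open a window for each half of the stereo pair:
;; (add-stereo-eyes world (.getCamera world) 0.06
;;                  (comp (view-image) BufferedImage!)
;;                  (comp (view-image) BufferedImage!))
#+end_src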