comparison org/eyes.org @ 151:aaacf087504c

refactored vision code
author Robert McIntyre <rlm@mit.edu>
date Fri, 03 Feb 2012 05:52:18 -0700
parents 9d0fe7f54e14
children 9e6a30b8c99a
* Vision

I want to make creatures with eyes. Each eye can be independently moved
and should see its own version of the world depending on where it is.

Here's how vision will work.

Make the continuation in scene-processor take a FrameBuffer,
byte-buffer, and BufferedImage already sized to the correct
dimensions. The continuation will decide whether to "mix" them
into the BufferedImage, lazily ignore them, or mix them halfway
and call C/graphics-card routines.

(vision creature) will take an optional :skip argument which will
inform the continuations in scene-processor to skip the given
number of cycles; 0 means that no cycles will be skipped.

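A minimal, self-contained sketch of that skip logic (the jME types are
left out; =skip-updater= is a hypothetical helper, not part of the code
below):

#+begin_src clojure
(defn skip-updater
  "Returns a fn that invokes f only once every (inc skip) calls."
  [skip f]
  (let [counter (atom 0)]
    (fn [frame]
      (when (zero? (rem (swap! counter inc) (inc skip)))
        (f frame)))))

(def updated (atom []))
(def process! (skip-updater 2 #(swap! updated conj %)))
(doseq [frame (range 1 7)] (process! frame))
@updated
;; => [3 6]  ; with :skip 2, only every third frame triggers an update
#+end_src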
(vision creature) will return [init-functions sensor-functions].
The init-functions are each single-arg functions that take the
world and register the cameras; they must each be called before the
corresponding sensor-functions. Each init-function returns the
viewport for that eye, which can be manipulated, saved, etc. Each
sensor-function is a thunk and will return data in the same
format as the tactile-sensor functions: the structure is
[topology, sensor-data]. Internally, these sensor-functions
maintain a reference to sensor-data which is periodically updated
by the continuation function established by its init-function.
They can be queried every cycle, but their information may not
necessarily be different every cycle.

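The init/sensor contract above can be sketched with stand-in values
(the real functions need a jMonkeyEngine world; =fake-vision= and the
data it returns are hypothetical):

#+begin_src clojure
;; Hypothetical stand-in for (vision creature).
(defn fake-vision []
  [[(fn [world] :viewport-for-eye-0)]        ; init-functions
   [(fn [] [[[0 0] [1 0]] [255 0]])]])       ; sensor-functions (thunks)

(let [[init-fns sensor-fns] (fake-vision)]
  (doseq [init init-fns] (init :world))      ; register cameras first
  ((first sensor-fns)))
;; => [[[0 0] [1 0]] [255 0]]  ; [topology sensor-data]
#+end_src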
Each eye in the creature in blender will work the same way as
joints -- a zero-dimensional object with no geometry whose local
coordinate system determines the orientation of the resulting
eye. All eyes will have a parent named "eyes", just as all joints
have a parent named "joints". The resulting camera will be a
ChaseCamera or a CameraNode bound to the geometry that is closest to
the eye marker. The eye marker will contain the metadata for the
eye, and will be moved by its bound geometry. The dimensions of
the eye's camera are equal to the dimensions of the eye's "UV"
map.
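For illustration, that metadata could be a string containing a Clojure
map from channel keywords to retina-image paths, which is the shape
=retina-sensor-image= reads back with =read-string= (the path shown is
hypothetical):

#+begin_src clojure
;; Hypothetical "eye" metadata string; the image path is illustrative.
(def example-eye-meta
  "{:red \"Models/retina.png\" :all \"Models/retina.png\"}")

(keys (read-string example-eye-meta))
;; => (:red :all)
#+end_src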
#+name: eyes
#+begin_src clojure
(ns cortex.vision
  "Simulate the sense of vision in jMonkeyEngine3. Enables multiple
  eyes from different positions to observe the same world, and pass
  the observed data to any arbitrary function."
  {:author "Robert McIntyre"}
  (:use (cortex world sense util))
  (:import com.jme3.post.SceneProcessor)
  (:import (com.jme3.util BufferUtils Screenshots))
  (:import java.nio.ByteBuffer)
  (:import java.awt.image.BufferedImage)
  (:import com.jme3.renderer.ViewPort)
  (:import com.jme3.math.ColorRGBA)
  (:import com.jme3.renderer.Renderer)
  (:import jme3tools.converters.ImageToAwt)
  (:import com.jme3.scene.Node))

(cortex.import/mega-import-jme3)


(defn vision-pipeline
  "Create a SceneProcessor object which wraps a vision processing
  continuation function. The continuation is a function that takes
;; ... (lines elided in this excerpt) ...
      (doto viewport
        (.setClearFlags true true true)
        (.setBackgroundColor ColorRGBA/Black)
        (.addProcessor (vision-pipeline continuation))
        (.attachScene (.getRootNode world)))))

(defn retina-sensor-image
  "Return a map of pixel-selection keywords to BufferedImages
  describing the distribution of light-sensitive components on this
  geometry's surface. Each selection function creates an integer from
  the RGB values found in the pixel. :red, :green, :blue, :gray are
  already defined as extracting the red, green, blue, and average
  components respectively."
  [#^Spatial eye]
  (if-let [eye-map (meta-data eye "eye")]
    (map-vals
     #(ImageToAwt/convert
       (.getImage (.loadTexture (asset-manager) %))
       false false 0)
     (eval (read-string eye-map)))))

(defn eye-dimensions
  "Returns the width and height specified in the metadata of the eye."
  [#^Spatial eye]
  (let [dimensions
        (map #(vector (.getWidth %) (.getHeight %))
             (vals (retina-sensor-image eye)))]
    [(apply max (map first dimensions))
     (apply max (map second dimensions))]))
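
;; Sketch of the dimension rule above: the camera is as wide as the
;; widest retina image and as tall as the tallest (sizes illustrative).
(comment
  (let [dimensions [[64 32] [32 64]]]
    [(apply max (map first dimensions))
     (apply max (map second dimensions))])
  ;; => [64 64]
  )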

(defn creature-eyes
  "Return the children of the creature's \"eyes\" node."
  [#^Node creature]
  (if-let [eye-node (.getChild creature "eyes")]
    (seq (.getChildren eye-node))
    (do (println-repl "could not find eyes node") [])))

(defn attach-eye
  "Attach a Camera to the appropriate area and return the Camera."
  [#^Node creature #^Spatial eye]
  (let [target (closest-node creature eye)
        [cam-width cam-height] (eye-dimensions eye)
        cam (Camera. cam-width cam-height)]
    (.setLocation cam (.getWorldTranslation eye))
    (.setRotation cam (.getWorldRotation eye))
    (.setFrustumPerspective
     cam 45 (/ (.getWidth cam) (.getHeight cam))
     1 1000)
    (bind-sense target cam)
    cam))

(def presets
  {:all   0xFFFFFF
   :red   0xFF0000
   :blue  0x0000FF
   :green 0x00FF00})

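;; Sketch: masking a packed RGB pixel with one of the presets keeps
;; only that channel's bits (the pixel value 0xAABBCC is arbitrary).
(comment
  (bit-and 0xFF0000 0xAABBCC)   ; :red   => 0xAA0000
  (bit-and 0x00FF00 0xAABBCC)   ; :green => 0x00BB00
  (bit-and 0xFFFFFF 0xAABBCC))  ; :all   => 0xAABBCC
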
(defn enable-vision
  "Return [init-function sensor-functions] for a particular eye."
  [#^Node creature #^Spatial eye & {skip :skip :or {skip 0}}]
  (let [retinal-map (retina-sensor-image eye)
        camera (attach-eye creature eye)
        vision-image
        (atom
         (BufferedImage. (.getWidth camera)
                         (.getHeight camera)
                         BufferedImage/TYPE_BYTE_BINARY))]
    [(fn [world]
       (add-eye
        world camera
        (let [counter (atom 0)]
          (fn [r fb bb bi]
            (if (zero? (rem (swap! counter inc) (inc skip)))
              (reset! vision-image (BufferedImage! r fb bb bi)))))))
     (vec
      (map
       (fn [[key image]]
         (let [whites (white-coordinates image)
               topology (vec (collapse whites))
               mask (presets key)]
           (fn []
             (vector
              topology
              (vec
               (for [[x y] whites]
                 (bit-and
                  mask (.getRGB @vision-image x y))))))))
       retinal-map))]))

(defn vision
  "Return [init-functions sensor-functions] for all of the creature's
  eyes."
  [#^Node creature & {skip :skip :or {skip 0}}]
  (reduce
   (fn [[init-a senses-a]
        [init-b senses-b]]
     [(conj init-a init-b)
      (into senses-a senses-b)])
   [[] []]
   (for [eye (creature-eyes creature)]
     ;; forward the :skip option to each eye
     (enable-vision creature eye :skip skip))))

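;; Sketch of how the reduce in `vision` merges per-eye [inits senses]
;; pairs (keywords stand in for the actual functions):
(comment
  (reduce (fn [[init-a senses-a] [init-b senses-b]]
            [(conj init-a init-b) (into senses-a senses-b)])
          [[] []]
          [[:init-0 [:sense-0 :sense-1]] [:init-1 [:sense-2]]])
  ;; => [[:init-0 :init-1] [:sense-0 :sense-1 :sense-2]]
  )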
#+end_src

Note the use of continuation-passing style for connecting the eye to a
function to process the output. You can create any number of eyes, and
each of them will see the world from its own =Camera=. Once every
frame, the rendered image is copied to a =BufferedImage=, and that