#+title: Sensory Utilities
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: sensory utilities
#+keywords: simulation, jMonkeyEngine3, clojure, simulated senses
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org

* Blender Utilities
In blender, any object can be assigned an arbitrary number of
key-value pairs which are called "Custom Properties". These are
accessible in jMonkeyEngine when blender files are imported with the
=BlenderLoader=. =(meta-data)= extracts these properties.

#+name: blender-1
#+begin_src clojure
(defn meta-data
  "Get the meta-data for a node created with blender."
  [blender-node key]
  (if-let [data (.getUserData blender-node "properties")]
    (.findValue data key) nil))
#+end_src

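For example, assuming =creature= is bound to a Node loaded from a
blender file, a lookup might look like the following sketch (the node
name "finger" and the property name "mass" are hypothetical):

#+begin_src clojure
;; hypothetical usage sketch (not tangled): assumes creature is a
;; Node loaded from a blender file; "finger" and "mass" are made-up
;; names for one of its children and a custom property.
(meta-data (.getChild creature "finger") "mass")
;; => the value stored under "mass", or nil if there is none
#+end_src
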
Blender uses a different coordinate system than jMonkeyEngine, so it
is useful to be able to convert between the two. These only come into
play when the meta-data of a node refers to a vector in the blender
coordinate system.

#+name: blender-2
#+begin_src clojure
(defn jme-to-blender
  "Convert from JME coordinates to Blender coordinates"
  [#^Vector3f in]
  (Vector3f. (.getX in) (- (.getZ in)) (.getY in)))

(defn blender-to-jme
  "Convert from Blender coordinates to JME coordinates"
  [#^Vector3f in]
  (Vector3f. (.getX in) (.getZ in) (- (.getY in))))
#+end_src

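The two conversions are inverses of each other; here is a quick REPL
sanity check (a sketch, not part of the tangled source):

#+begin_src clojure
;; round-trip sanity check: converting to blender coordinates and
;; back should return the original vector.
(blender-to-jme (jme-to-blender (Vector3f. 1 2 3)))
;; => #<Vector3f (1.0, 2.0, 3.0)>
#+end_src
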
* Sense Topology

Human beings are three-dimensional objects, and the nerves that
transmit data from our various sense organs to our brain are
essentially one-dimensional. This leaves up to two dimensions in which
our sensory information may flow. For example, imagine your skin: it
is a two-dimensional surface around a three-dimensional object (your
body). It has discrete touch sensors embedded at various points, and
the density of these sensors corresponds to the sensitivity of that
region of skin. Each touch sensor connects to a nerve, all of which
eventually are bundled together as they travel up the spinal cord to
the brain. Intersect the spinal nerves with a guillotining plane and
you will see all of the sensory data of the skin revealed in a roughly
circular two-dimensional image which is the cross section of the
spinal cord. Points on this image that are close together in this
circle represent touch sensors that are /probably/ close together on
the skin, although there is of course some cutting and rearrangement
that has to be done to transfer the complicated surface of the skin
onto a two-dimensional image.

Most human senses consist of many discrete sensors of various
properties distributed along a surface at various densities. For
skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's
disks, and Ruffini's endings, which detect pressure and vibration of
various intensities. For ears, it is the stereocilia distributed
along the basilar membrane inside the cochlea; each one is sensitive
to a slightly different frequency of sound. For eyes, it is rods
and cones distributed along the surface of the retina. In each case,
we can describe the sense with a surface and a distribution of sensors
along that surface.

** UV-maps

Blender and jMonkeyEngine already have support for exactly this sort
of data structure because it is used to "skin" models for games. It is
called [[http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV][UV-mapping]]. The three-dimensional surface is cut and smooshed
until it fits on a two-dimensional image. You paint whatever you want
on that image, and when the three-dimensional shape is rendered in a
game the smooshing and cutting is reversed and the image appears on
the three-dimensional object.

To make a sense, interpret the UV-image as describing the distribution
of that sense's sensors. To get different types of sensors, you can
either use a different color for each type of sensor, or use multiple
UV-maps, each labeled with that sensor type. I generally use a white
pixel to mean the presence of a sensor and a black pixel to mean the
absence of a sensor, and use one UV-map for each sensor-type within a
given sense. The paths to the images are not stored as the actual
UV-map of the blender object but are instead referenced in the
meta-data of the node.

#+CAPTION: The UV-map for an elongated icosphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
#+ATTR_HTML: width="300"
[[../images/finger-UV.png]]

#+CAPTION: Ventral side of the UV-mapped finger. Notice the density of touch sensors at the tip.
#+ATTR_HTML: width="300"
[[../images/finger-1.png]]

#+CAPTION: Side view of the UV-mapped finger.
#+ATTR_HTML: width="300"
[[../images/finger-2.png]]

#+CAPTION: Head on view of the finger. In both the head and side views you can see the divide where the touch-sensors transition from high density to low density.
#+ATTR_HTML: width="300"
[[../images/finger-3.png]]

The following code loads images and gets the locations of the white
pixels so that they can be used to create senses. =(load-image)= finds
images using jMonkeyEngine's asset-manager, so the image path is
expected to be relative to the =assets= directory. Thanks to Dylan
for the beautiful version of =(filter-pixels)=.

#+name: topology-1
#+begin_src clojure
(defn load-image
  "Load an image as a BufferedImage using the asset-manager system."
  [asset-relative-path]
  (ImageToAwt/convert
   (.getImage (.loadTexture (asset-manager) asset-relative-path))
   false false 0))

(def white 0xFFFFFF)

(defn white? [rgb]
  (= (bit-and white rgb) white))

(defn filter-pixels
  "List the coordinates of all pixels matching pred, within the bounds
   provided. If bounds are not specified then the entire image is
   searched.
   bounds -> [x0 y0 width height]"
  {:author "Dylan Holmes"}
  ([pred #^BufferedImage image]
     (filter-pixels pred image [0 0 (.getWidth image) (.getHeight image)]))
  ([pred #^BufferedImage image [x0 y0 width height]]
     ((fn accumulate [x y matches]
        (cond
         (>= y (+ height y0)) matches
         ;; start each new row at x0, not 0, so that the given
         ;; bounds are respected.
         (>= x (+ width x0)) (recur x0 (inc y) matches)
         (pred (.getRGB image x y))
         (recur (inc x) y (conj matches [x y]))
         :else (recur (inc x) y matches)))
      x0 y0 [])))

(defn white-coordinates
  "Coordinates of all the white pixels in a subset of the image."
  ([#^BufferedImage image bounds]
     (filter-pixels white? image bounds))
  ([#^BufferedImage image]
     (filter-pixels white? image)))
#+end_src

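Together these let a sense read a UV-map off disk and recover its
sensor locations. A quick sketch (the asset path below is
hypothetical):

#+begin_src clojure
;; sketch (not tangled): list the sensor locations in a UV-map.
;; The asset path is a made-up example.
(white-coordinates (load-image "Models/finger/finger-UV.png"))
;; => ([x1 y1] [x2 y2] ...), one pair per white pixel
#+end_src
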
** Topology

Information from the senses is transmitted to the brain via bundles of
axons, whether it be the optic nerve or the spinal cord. While these
bundles more or less preserve the overall topology of a sense's
two-dimensional surface, they do not preserve the precise Euclidean
distances between every sensor. =(collapse)= is here to smoosh the
sensors described by a UV-map into a contiguous region that still
preserves the topology of the original sense.

#+name: topology-2
#+begin_src clojure
(defn average [coll]
  (/ (reduce + coll) (count coll)))

(defn collapse-1d
  "One dimensional analogue of collapse."
  [center line]
  (let [length (count line)
        num-above (count (filter (partial < center) line))
        num-below (- length num-above)]
    (range (- center num-below)
           (+ center num-above))))

(defn collapse
  "Take a set of pairs of integers and collapse them into a
   contiguous bitmap with no \"holes\"."
  [points]
  (if (empty? points) []
      (let
          [num-points (count points)
           ;; the y-component of the center averages the second
           ;; coordinates.
           center (vector
                   (int (average (map first points)))
                   (int (average (map second points))))
           flattened
           (reduce
            concat
            (map
             (fn [column]
               (map vector
                    (map first column)
                    (collapse-1d (second center)
                                 (map second column))))
             (partition-by first (sort-by first points))))
           squeezed
           (reduce
            concat
            (map
             (fn [row]
               (map vector
                    (collapse-1d (first center)
                                 (map first row))
                    (map second row)))
             (partition-by second (sort-by second flattened))))
           relocated
           (let [min-x (apply min (map first squeezed))
                 min-y (apply min (map second squeezed))]
             (map (fn [[x y]]
                    [(- x min-x)
                     (- y min-y)])
                  squeezed))]
        relocated)))
#+end_src

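To see what these do, here is a small worked example at the REPL (a
sketch, not part of the tangled source):

#+begin_src clojure
;; collapse-1d shifts a sparse line of coordinates into a contiguous
;; run around the center: of [1 2 8], two points lie below the
;; center 5 and one above, giving the run (3 4 5).
(collapse-1d 5 [1 2 8])
;; => (3 4 5)

;; collapse squeezes the holes out of a sparse point set while
;; keeping neighbors next to each other, then moves the result to
;; the origin.
(collapse [[0 0] [5 0] [5 5]])
;; => ([0 0] [1 0] [1 1])
#+end_src
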
* Viewing Sense Data

It's vital to /see/ the sense data to make sure that everything is
behaving as it should. =(view-sense)= is here so that each sense can
define its own way of turning sense-data into pictures, while the
actual rendering of said pictures stays in one central place.
=(points->image)= helps senses generate a base image onto which they
can overlay actual sense data.

#+name: view-senses
#+begin_src clojure
(in-ns 'cortex.sense)

(defn points->image
  "Take a collection of points and visualize it as a BufferedImage."
  [points]
  (if (empty? points)
    (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY)
    (let [xs (vec (map first points))
          ys (vec (map second points))
          x0 (apply min xs)
          y0 (apply min ys)
          width (- (apply max xs) x0)
          height (- (apply max ys) y0)
          image (BufferedImage. (inc width) (inc height)
                                BufferedImage/TYPE_INT_RGB)]
      ;; paint the background red ...
      (dorun
       (for [x (range (.getWidth image))
             y (range (.getHeight image))]
         (.setRGB image x y 0xFF0000)))
      ;; ... and each point white (-1 is 0xFFFFFFFF).
      (dorun
       (for [index (range (count points))]
         (.setRGB image (- (xs index) x0) (- (ys index) y0) -1)))
      image)))

(defn view-image
  "Initializes a JPanel on which you may draw a BufferedImage.
   Returns a function that accepts a BufferedImage and draws it to the
   JPanel. If given a directory it will save the images as png files
   starting at 0000000.png and incrementing from there."
  ([#^File save]
     (let [idx (atom -1)
           image
           (atom
            (BufferedImage. 1 1 BufferedImage/TYPE_4BYTE_ABGR))
           panel
           (proxy [JPanel] []
             (paint
               [graphics]
               (proxy-super paintComponent graphics)
               (.drawImage graphics @image 0 0 nil)))
           frame (JFrame. "Display Image")]
       (SwingUtilities/invokeLater
        (fn []
          (doto frame
            (-> (.getContentPane) (.add panel))
            (.pack)
            (.setLocationRelativeTo nil)
            (.setResizable true)
            (.setVisible true))))
       (fn [#^BufferedImage i]
         (reset! image i)
         (.setSize frame (+ 8 (.getWidth i)) (+ 28 (.getHeight i)))
         (.repaint panel 0 0 (.getWidth i) (.getHeight i))
         (if save
           (ImageIO/write
            i "png"
            (File. save (format "%07d.png" (swap! idx inc))))))))
  ([] (view-image nil)))

(defn view-sense
  "Take a kernel that produces a BufferedImage from some sense data
   and return a function which takes a list of sense data, uses the
   kernel to convert to images, and displays those images, each in
   its own JFrame."
  [sense-display-kernel]
  (let [windows (atom [])]
    (fn [data]
      (if (> (count data) (count @windows))
        (reset!
         windows (map (fn [_] (view-image)) (range (count data)))))
      (dorun
       (map
        (fn [display datum]
          (display (sense-display-kernel datum)))
        @windows data)))))

(defn gray
  "Create a gray RGB pixel with R, G, and B set to num. num must be
   between 0 and 255."
  [num]
  (+ num
     (bit-shift-left num 8)
     (bit-shift-left num 16)))
#+end_src

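For instance, =points->image= itself works as a display kernel,
giving a viewer for lists of point collections. A sketch (not part of
the tangled source; =sensor-coords= is a hypothetical collection of
points):

#+begin_src clojure
;; sketch: use points->image as the kernel so that each collection
;; of points gets its own JFrame.
(def view-points (view-sense points->image))
;; (view-points [sensor-coords]) -- sensor-coords is assumed to be
;; a collection of [x y] pairs, e.g. the output of collapse.

;; gray packs a single intensity into all three RGB channels:
(gray 128)
;; => 8421504, i.e. 0x808080
#+end_src
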
* Building a Sense from Nodes
My method for defining senses in blender is the following:

Senses like vision and hearing are localized to a single point
and follow a particular object around. For these:

- Create a single top-level empty node whose name is the name of the sense.
- Add empty nodes which each contain meta-data relevant to the sense,
  including a UV-map describing the number/distribution of sensors if
  applicable.
- Make each empty-node the child of the top-level node.
  =(sense-nodes)= below generates functions to find these children.

For touch, store the path to the UV-map which describes touch-sensors in the
meta-data of the object to which that map applies.

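As a hedged sketch of how such meta-data might be read back (the
property name "touch" and the helper itself are assumptions for
illustration, not part of the tangled source):

#+begin_src clojure
;; sketch: read a UV-map image path out of an object's meta-data and
;; load the image. The property name "touch" is an assumption.
(defn touch-sensor-image
  [#^Spatial obj]
  (if-let [image-path (meta-data obj "touch")]
    (load-image image-path)))
#+end_src
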
Each sense provides code that analyzes the Node structure of the
creature and creates sense-functions. They also modify the Node
structure if necessary.

Empty nodes created in blender have no appearance or physical presence
in jMonkeyEngine, but do appear in the scene graph. Empty nodes that
represent a sense which "follows" another geometry (like eyes and
ears) follow the closest physical object. =(closest-node)= finds this
closest object given the Creature and a particular empty node.

#+name: node-1
#+begin_src clojure
(defn sense-nodes
  "For some senses there is a special empty blender node whose
   children are considered markers for an instance of that sense. This
   function generates functions to find those children, given the name
   of the special parent node."
  [parent-name]
  (fn [#^Node creature]
    (if-let [sense-node (.getChild creature parent-name)]
      (seq (.getChildren sense-node))
      (do (println-repl "could not find" parent-name "node") []))))

(defn closest-node
  "Return the node in creature which is closest to the given node."
  [#^Node creature #^Node empty]
  (loop [radius (float 0.01)]
    (let [results (CollisionResults.)]
      (.collideWith
       creature
       (BoundingBox. (.getWorldTranslation empty)
                     radius radius radius)
       results)
      (if-let [target (first results)]
        (.getGeometry target)
        (recur (float (* 2 radius)))))))

(defn world-to-local
  "Convert the world coordinates into coordinates relative to the
   object (i.e. local coordinates), taking into account the rotation
   of object."
  [#^Spatial object world-coordinate]
  (.worldToLocal object world-coordinate nil))

(defn local-to-world
  "Convert the local coordinates into world relative coordinates"
  [#^Spatial object local-coordinate]
  (.localToWorld object local-coordinate nil))
#+end_src

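For example, a sense whose markers live under a top-level empty node
named "eyes" could build its finder like this (a sketch; the node
name is an assumption for illustration):

#+begin_src clojure
;; sketch (not tangled): assumes the creature has a top-level empty
;; node named "eyes" whose children mark individual eyes.
(def eyes (sense-nodes "eyes"))
;; (eyes creature) => a seq of the marker nodes, or [] if the
;; "eyes" node is missing.
#+end_src
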
=(bind-sense)= binds either a Camera or a Listener object to any
object so that they will follow that object no matter how it
moves. It is used to create both eyes and ears.

#+name: node-2
#+begin_src clojure
(defn bind-sense
  "Bind the sense to the Spatial such that it will maintain its
   current position relative to the Spatial no matter how the spatial
   moves. 'sense can be either a Camera or Listener object."
  [#^Spatial obj sense]
  (let [sense-offset (.subtract (.getLocation sense)
                                (.getWorldTranslation obj))
        initial-sense-rotation (Quaternion. (.getRotation sense))
        base-anti-rotation (.inverse (.getWorldRotation obj))]
    (.addControl
     obj
     (proxy [AbstractControl] []
       (controlUpdate [tpf]
         (let [total-rotation
               (.mult base-anti-rotation (.getWorldRotation obj))]
           (.setLocation
            sense
            (.add
             (.mult total-rotation sense-offset)
             (.getWorldTranslation obj)))
           (.setRotation
            sense
            (.mult total-rotation initial-sense-rotation))))
       (controlRender [_ _])))))
#+end_src

Here is some example code which shows a camera bound to a blue
box as it is buffeted by white cannonballs.

#+name: test
#+begin_src clojure
(ns cortex.test.sense
  (:use (cortex world util sense vision))
  (:import
   java.io.File
   (com.jme3.math Vector3f ColorRGBA)
   (com.aurellem.capture RatchetTimer Capture)))

(defn test-bind-sense
  "Show a camera that stays in the same relative position to a blue cube."
  []
  (let [camera-pos (Vector3f. 0 30 0)
        rock (box 1 1 1 :color ColorRGBA/Blue
                  :position (Vector3f. 0 10 0)
                  :mass 30)
        rot (.getWorldRotation rock)
        table (box 3 1 10 :color ColorRGBA/Gray :mass 0
                   :position (Vector3f. 0 -3 0))]
    (world
     (nodify [rock table])
     standard-debug-controls
     (fn [world]
       (let
           [cam (doto (.clone (.getCamera world))
                  (.setLocation camera-pos)
                  (.lookAt Vector3f/ZERO
                           Vector3f/UNIT_X))]
         (bind-sense rock cam)
         (.setTimer world (RatchetTimer. 60))
         (Capture/captureVideo
          world (File. "/home/r/proj/cortex/render/bind-sense0"))
         (add-camera!
          world cam
          (comp (view-image
                 (File. "/home/r/proj/cortex/render/bind-sense1"))
                BufferedImage!))
         (add-camera! world (.getCamera world) no-op)))
     no-op)))
#+end_src

** Demo Video

#+begin_html
<video controls="controls" width="755">
  <source src="../video/bind-sense.ogg" type="video/ogg"
          preload="none" poster="../images/aurellem-1280x480.png" />
</video>
#+end_html

note to self: the video was created with the following commands:

#+begin_src clojure :results silent
(in-ns 'user)
(import java.io.File)
(use 'clojure.contrib.shell-out)

(let
    [idx (atom -1)
     left (rest
           (sort
            (file-seq (File. "/home/r/proj/cortex/render/bind-sense0/"))))
     right (rest
            (sort
             (file-seq (File. "/home/r/proj/cortex/render/bind-sense1/"))))]
  (dorun
   (map
    (fn [im-1 im-2]
      (println idx)
      (sh "convert" (.getCanonicalPath im-1)
          (.getCanonicalPath im-2) "+append"
          (.getCanonicalPath
           (File. "/home/r/proj/cortex/render/bind-sense/"
                  (format "%07d.png" (swap! idx inc))))))
    left right)))
#+end_src

#+begin_src sh :results silent
cd /home/r/proj/cortex/render/
cp ../images/aurellem-1280x480.png bind-sense/0000000.png
ffmpeg -r 60 -b 9000k -i bind-sense/%07d.png bind-sense.ogg
#+end_src

* Bookkeeping
Here is the header for this namespace, included for completeness.

#+name: header
#+begin_src clojure
(ns cortex.sense
  "Here are functions useful in the construction of two or more
   sensors/effectors."
  {:author "Robert McIntyre"}
  (:use (cortex world util))
  (:import ij.process.ImageProcessor)
  (:import jme3tools.converters.ImageToAwt)
  (:import java.awt.image.BufferedImage)
  (:import com.jme3.collision.CollisionResults)
  (:import com.jme3.bounding.BoundingBox)
  (:import (com.jme3.scene Node Spatial))
  (:import com.jme3.scene.control.AbstractControl)
  (:import (com.jme3.math Quaternion Vector3f))
  (:import javax.imageio.ImageIO)
  (:import java.io.File)
  (:import (javax.swing JPanel JFrame SwingUtilities)))
#+end_src

* Source Listing
Full source: [[../src/cortex/sense.clj][sense.clj]]

* COMMENT generate source
#+begin_src clojure :tangle ../src/cortex/sense.clj
<<header>>
<<blender-1>>
<<blender-2>>
<<topology-1>>
<<topology-2>>
<<node-1>>
<<node-2>>
<<view-senses>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/test/sense.clj
<<test>>
#+end_src