#+title: Sensory Utilities
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: sensory utilities
#+keywords: simulation, jMonkeyEngine3, clojure, simulated senses
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org


* Blender Utilities
In Blender, any object can be assigned an arbitrary number of key-value
pairs called "Custom Properties". These are accessible in
jMonkeyEngine when blender files are imported with the
=BlenderLoader=. =(meta-data)= extracts these properties.

#+name: blender-1
#+begin_src clojure
(defn meta-data
  "Get the meta-data for a node created with blender."
  [blender-node key]
  (if-let [data (.getUserData blender-node "properties")]
    (.findValue data key) nil))
#+end_src
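
For example, if a blender object has been given a custom property
named "touch" whose value is the path to its touch-sensor UV-map (a
hypothetical setup; the node and property names here are illustrative):

#+begin_src clojure
;; (meta-data (.getChild creature "finger") "touch")
;; => "Models/finger/finger-UV.png"
#+end_src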

Blender uses a different coordinate system than jMonkeyEngine, so it
is useful to be able to convert between the two. These conversions
only come into play when the meta-data of a node refers to a vector
in the blender coordinate system.

#+name: blender-2
#+begin_src clojure
(defn jme-to-blender
  "Convert from JME coordinates to Blender coordinates"
  [#^Vector3f in]
  (Vector3f. (.getX in) (- (.getZ in)) (.getY in)))

(defn blender-to-jme
  "Convert from Blender coordinates to JME coordinates"
  [#^Vector3f in]
  (Vector3f. (.getX in) (.getZ in) (- (.getY in))))
#+end_src
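
Blender uses a right-handed coordinate system with +Z as "up", while
jMonkeyEngine uses +Y as "up". As a quick sanity-check sketch:

#+begin_src clojure
;; Blender's up axis becomes jMonkeyEngine's up axis:
;; (blender-to-jme (Vector3f. 0 0 1)) => (0.0, 1.0, 0.0)
;; and the two conversions are inverses of one another:
;; (jme-to-blender (blender-to-jme v)) => v
#+end_src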

* Sense Topology

Human beings are three-dimensional objects, and the nerves that
transmit data from our various sense organs to our brain are
essentially one-dimensional. This leaves up to two dimensions in which
our sensory information may flow. For example, imagine your skin: it
is a two-dimensional surface around a three-dimensional object (your
body). It has discrete touch sensors embedded at various points, and
the density of these sensors corresponds to the sensitivity of that
region of skin. Each touch sensor connects to a nerve, all of which
eventually are bundled together as they travel up the spinal cord to
the brain. Intersect the spinal nerves with a guillotining plane and
you will see all of the sensory data of the skin revealed in a roughly
circular two-dimensional image which is the cross section of the
spinal cord. Points that are close together in this circle represent
touch sensors that are /probably/ close together on the skin, although
of course some cutting and rearrangement has to be done to transfer
the complicated surface of the skin onto a two-dimensional image.

Most human senses consist of many discrete sensors of various
properties distributed along a surface at various densities. For
skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's
disks, and Ruffini's endings, which detect pressure and vibration of
various intensities. For ears, it is the stereocilia distributed
along the basilar membrane inside the cochlea; each one is sensitive
to a slightly different frequency of sound. For eyes, it is rods
and cones distributed along the surface of the retina. In each case,
we can describe the sense with a surface and a distribution of sensors
along that surface.

** UV-maps

Blender and jMonkeyEngine already have support for exactly this sort
of data structure because it is used to "skin" models for games. It is
called [[http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV][UV-mapping]]. The three-dimensional surface is cut and smooshed
until it fits on a two-dimensional image. You paint whatever you want
on that image, and when the three-dimensional shape is rendered in a
game, the smooshing and cutting is reversed and the image appears on
the three-dimensional object.

To make a sense, interpret the UV-image as describing the distribution
of that sense's sensors. To get different types of sensors, you can
either use a different color for each type of sensor, or use multiple
UV-maps, each labeled with that sensor type. I generally use a white
pixel to mean the presence of a sensor and a black pixel to mean the
absence of a sensor, and use one UV-map for each sensor-type within a
given sense. The paths to the images are not stored as the actual
UV-map of the blender object but are instead referenced in the
meta-data of the node.

#+CAPTION: The UV-map for an elongated icosphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
#+ATTR_HTML: width="300"
[[../images/finger-UV.png]]

#+CAPTION: Ventral side of the UV-mapped finger. Notice the density of touch sensors at the tip.
#+ATTR_HTML: width="300"
[[../images/finger-1.png]]

#+CAPTION: Side view of the UV-mapped finger.
#+ATTR_HTML: width="300"
[[../images/finger-2.png]]

#+CAPTION: Head-on view of the finger. In both the head-on and side views you can see the divide where the touch-sensors transition from high density to low density.
#+ATTR_HTML: width="300"
[[../images/finger-3.png]]

The following code loads images and gets the locations of the white
pixels so that they can be used to create senses. =(load-image)= finds
images using jMonkeyEngine's asset-manager, so the image path is
expected to be relative to the =assets= directory. Thanks to Dylan
for the beautiful version of =(filter-pixels)=.

#+name: topology-1
#+begin_src clojure
(defn load-image
  "Load an image as a BufferedImage using the asset-manager system."
  [asset-relative-path]
  (ImageToAwt/convert
   (.getImage (.loadTexture (asset-manager) asset-relative-path))
   false false 0))

(def white 0xFFFFFF)

(defn white? [rgb]
  (= (bit-and white rgb) white))

(defn filter-pixels
  "List the coordinates of all pixels matching pred, within the bounds
   provided. If bounds are not specified then the entire image is
   searched.
   bounds -> [x0 y0 width height]"
  {:author "Dylan Holmes"}
  ([pred #^BufferedImage image]
     (filter-pixels pred image [0 0 (.getWidth image) (.getHeight image)]))
  ([pred #^BufferedImage image [x0 y0 width height]]
     ((fn accumulate [x y matches]
        (cond
         (>= y (+ height y0)) matches
         ;; row finished: move to the start of the next row
         (>= x (+ width x0)) (recur x0 (inc y) matches)
         (pred (.getRGB image x y))
         (recur (inc x) y (conj matches [x y]))
         :else (recur (inc x) y matches)))
      x0 y0 [])))

(defn white-coordinates
  "Coordinates of all the white pixels in a subset of the image."
  ([#^BufferedImage image bounds]
     (filter-pixels white? image bounds))
  ([#^BufferedImage image]
     (filter-pixels white? image)))
#+end_src
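
As a usage sketch (the image path and output here are hypothetical),
extracting the sensor locations from a UV-map image stored under the
=assets= directory looks like:

#+begin_src clojure
;; (white-coordinates (load-image "Models/finger/finger-UV.png"))
;; => [[127 48] [128 48] [128 49] ...] ; one [x y] pair per white pixel
#+end_src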

** Topology

Information from the senses is transmitted to the brain via bundles of
axons, whether it be the optic nerve or the spinal cord. While these
bundles more or less preserve the overall topology of a sense's
two-dimensional surface, they do not preserve the precise Euclidean
distances between every sensor. =(collapse)= is here to smoosh the
sensors described by a UV-map into a contiguous region that still
preserves the topology of the original sense.

#+name: topology-2
#+begin_src clojure
(defn average [coll]
  (/ (reduce + coll) (count coll)))

(defn collapse-1d
  "One dimensional analogue of collapse."
  [center line]
  (let [length (count line)
        num-above (count (filter (partial < center) line))
        num-below (- length num-above)]
    (range (- center num-below)
           (+ center num-above))))

(defn collapse
  "Take a set of pairs of integers and collapse them into a
   contiguous bitmap with no \"holes\"."
  [points]
  (if (empty? points) []
      (let
          [num-points (count points)
           center (vector
                   (int (average (map first points)))
                   (int (average (map second points))))
           flattened
           (reduce
            concat
            (map
             (fn [column]
               (map vector
                    (map first column)
                    (collapse-1d (second center)
                                 (map second column))))
             (partition-by first (sort-by first points))))
           squeezed
           (reduce
            concat
            (map
             (fn [row]
               (map vector
                    (collapse-1d (first center)
                                 (map first row))
                    (map second row)))
             (partition-by second (sort-by second flattened))))
           relocated
           (let [min-x (apply min (map first squeezed))
                 min-y (apply min (map second squeezed))]
             (map (fn [[x y]]
                    [(- x min-x)
                     (- y min-y)])
                  squeezed))]
        relocated)))
#+end_src
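
As a small worked example, four sensors at the corners of a square
with a hole in the middle collapse into a contiguous 2x2 block
anchored at the origin:

#+begin_src clojure
;; (collapse [[0 0] [2 0] [0 2] [2 2]])
;; => ([0 0] [1 0] [0 1] [1 1])
#+end_src
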
* Viewing Sense Data

It's vital to /see/ the sense data to make sure that everything is
behaving as it should. =(view-sense)= is here so that each sense can
define its own way of turning sense-data into pictures, while the
actual rendering of said pictures stays in one central place.
=(points->image)= helps senses generate a base image onto which they
can overlay actual sense data.

#+name: view-senses
#+begin_src clojure
(defn view-sense
  "Take a kernel that produces a BufferedImage from some sense data
   and return a function which takes a list of sense data, uses the
   kernel to convert to images, and displays those images, each in
   its own JFrame."
  [sense-display-kernel]
  (let [windows (atom [])]
    (fn [data]
      (if (> (count data) (count @windows))
        (reset!
         windows (map (fn [_] (view-image)) (range (count data)))))
      (dorun
       (map
        (fn [display datum]
          (display (sense-display-kernel datum)))
        @windows data)))))

(defn points->image
  "Take a collection of points and visualize it as a BufferedImage."
  [points]
  (if (empty? points)
    (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY)
    (let [xs (vec (map first points))
          ys (vec (map second points))
          x0 (apply min xs)
          y0 (apply min ys)
          width (- (apply max xs) x0)
          height (- (apply max ys) y0)
          image (BufferedImage. (inc width) (inc height)
                                BufferedImage/TYPE_INT_RGB)]
      (dorun
       (for [x (range (.getWidth image))
             y (range (.getHeight image))]
         (.setRGB image x y 0xFF0000)))
      (dorun
       (for [index (range (count points))]
         (.setRGB image (- (xs index) x0) (- (ys index) y0) -1)))
      image)))

(defn gray
  "Create a gray RGB pixel with R, G, and B set to num. num must be
   between 0 and 255."
  [num]
  (+ num
     (bit-shift-left num 8)
     (bit-shift-left num 16)))
#+end_src
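
As a usage sketch, combining =(view-sense)= with =(points->image)=
yields a quick debugging display for collections of points, one
JFrame per collection:

#+begin_src clojure
;; (def view-points (view-sense points->image))
;; (view-points [(collapse (white-coordinates some-UV-image))])
;; (some-UV-image stands in for any loaded BufferedImage.)
#+end_src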

* Building a Sense from Nodes
My method for defining senses in blender is the following:

Senses like vision and hearing are localized to a single point
and follow a particular object around. For these:

- Create a single top-level empty node whose name is the name of the
  sense.
- Add empty nodes which each contain meta-data relevant to the sense,
  including a UV-map describing the number/distribution of sensors if
  applicable.
- Make each empty-node the child of the top-level node.
  =(sense-nodes)= below generates functions to find these children.

For touch, store the path to the UV-map which describes touch-sensors
in the meta-data of the object to which that map applies.

Each sense provides code that analyzes the Node structure of the
creature and creates sense-functions. This code also modifies the
Node structure if necessary.

Empty nodes created in blender have no appearance or physical presence
in jMonkeyEngine, but do appear in the scene graph. Empty nodes that
represent a sense which "follows" another geometry (like eyes and
ears) follow the closest physical object. =(closest-node)= finds this
closest object given the Creature and a particular empty node.

#+name: node-1
#+begin_src clojure
(defn sense-nodes
  "For some senses there is a special empty blender node whose
   children are considered markers for an instance of that sense. This
   function generates functions to find those children, given the name
   of the special parent node."
  [parent-name]
  (fn [#^Node creature]
    (if-let [sense-node (.getChild creature parent-name)]
      (seq (.getChildren sense-node))
      (do (println-repl "could not find" parent-name "node") []))))

(defn closest-node
  "Return the node in creature which is closest to the given node."
  [#^Node creature #^Node empty]
  (loop [radius (float 0.01)]
    (let [results (CollisionResults.)]
      (.collideWith
       creature
       (BoundingBox. (.getWorldTranslation empty)
                     radius radius radius)
       results)
      (if-let [target (first results)]
        (.getGeometry target)
        (recur (float (* 2 radius)))))))

(defn world-to-local
  "Convert the world coordinates into coordinates relative to the
   object (i.e. local coordinates), taking into account the rotation
   of object."
  [#^Spatial object world-coordinate]
  (.worldToLocal object world-coordinate nil))

(defn local-to-world
  "Convert the local coordinates into world relative coordinates."
  [#^Spatial object local-coordinate]
  (.localToWorld object local-coordinate nil))
#+end_src
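
A sketch of how =(sense-nodes)= is meant to be used (the node name
"ears" is illustrative; it is whatever name the blender file uses for
the sense's top-level node):

#+begin_src clojure
;; (def ears (sense-nodes "ears"))
;; (ears creature) ; => a seq of the empty nodes marking each ear,
;;                 ;    or [] if creature has no "ears" node
#+end_src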

=(bind-sense)= binds either a Camera or a Listener object to any
object so that it will follow that object no matter how it moves. A
camera bound to a blue box, for example, stays with the box even as
it is buffeted by white cannonballs.

#+name: node-2
#+begin_src clojure
(defn bind-sense
  "Bind the sense to the Spatial such that it will maintain its
   current position relative to the Spatial no matter how the spatial
   moves. 'sense can be either a Camera or Listener object."
  [#^Spatial obj sense]
  (let [sense-offset (.subtract (.getLocation sense)
                                (.getWorldTranslation obj))
        initial-sense-rotation (Quaternion. (.getRotation sense))
        base-anti-rotation (.inverse (.getWorldRotation obj))]
    (.addControl
     obj
     (proxy [AbstractControl] []
       (controlUpdate [tpf]
         (let [total-rotation
               (.mult base-anti-rotation (.getWorldRotation obj))]
           (.setLocation
            sense
            (.add
             (.mult total-rotation sense-offset)
             (.getWorldTranslation obj)))
           (.setRotation
            sense
            (.mult total-rotation initial-sense-rotation))))
       (controlRender [_ _])))))
#+end_src
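
A hypothetical usage sketch, assuming =app= is a jMonkeyEngine
Application (whose default camera comes from =(.getCamera app)=) and
=creature-head= is some Spatial in the scene:

#+begin_src clojure
;; (bind-sense creature-head (.getCamera app))
;; From now on, moving or rotating creature-head carries the default
;; camera along with it.
#+end_src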

* Bookkeeping
Here is the header for this namespace, included for completeness.
#+name: header
#+begin_src clojure
(ns cortex.sense
  "Here are functions useful in the construction of two or more
   sensors/effectors."
  {:author "Robert McIntyre"}
  (:use (cortex world util))
  (:import ij.process.ImageProcessor)
  (:import jme3tools.converters.ImageToAwt)
  (:import java.awt.image.BufferedImage)
  (:import com.jme3.collision.CollisionResults)
  (:import com.jme3.bounding.BoundingBox)
  (:import (com.jme3.scene Node Spatial))
  (:import com.jme3.scene.control.AbstractControl)
  (:import (com.jme3.math Quaternion Vector3f)))
#+end_src

* Source Listing
Full source: [[../src/cortex/sense.clj][sense.clj]]


* COMMENT generate source
#+begin_src clojure :tangle ../src/cortex/sense.clj
<<header>>
<<blender-1>>
<<blender-2>>
<<topology-1>>
<<topology-2>>
<<node-1>>
<<node-2>>
<<view-senses>>
#+end_src