diff org/sense.org @ 198:fc0bf33bded2

fleshing out prose in sense.org
author Robert McIntyre <rlm@mit.edu>
date Sun, 05 Feb 2012 14:01:47 -0700
parents 16cbce075a0b
children 305439cec54d
line wrap: on
line diff
     1.1 --- a/org/sense.org	Sun Feb 05 06:55:41 2012 -0700
     1.2 +++ b/org/sense.org	Sun Feb 05 14:01:47 2012 -0700
     1.3 @@ -1,4 +1,4 @@
     1.4 -#+title: General Sense/Effector Utilities
     1.5 +#+title: 
     1.6  #+author: Robert McIntyre
     1.7  #+email: rlm@mit.edu
     1.8  #+description: sensory utilities
     1.9 @@ -6,51 +6,114 @@
    1.10  #+SETUPFILE: ../../aurellem/org/setup.org
    1.11  #+INCLUDE: ../../aurellem/org/level-0.org
    1.12  
    1.13 -* Namespace/Imports
    1.14 -#+name header
    1.15 -#+begin_src clojure
    1.16 -(ns cortex.sense
    1.17 -  "Here are functions useful in the construction of two or more
    1.18 -   sensors/effectors."
    1.19 -  {:author "Robert McInytre"}
    1.20 -  (:use (cortex world util))
    1.21 -  (:import ij.process.ImageProcessor)
    1.22 -  (:import jme3tools.converters.ImageToAwt)
    1.23 -  (:import java.awt.image.BufferedImage)
    1.24 -  (:import com.jme3.collision.CollisionResults)
    1.25 -  (:import com.jme3.bounding.BoundingBox)
    1.26 -  (:import (com.jme3.scene Node Spatial))
    1.27 -  (:import com.jme3.scene.control.AbstractControl)
    1.28 -  (:import (com.jme3.math Quaternion Vector3f)))
    1.29 -#+end_src
    1.30  
    1.31  * Blender Utilities
    1.32 -#+name: blender
    1.33 +In blender, any object can be assigned an arbitrary number of key-value
    1.34 +pairs which are called "Custom Properties". These are accessible in
    1.35 +jMonkeyEngine when blender files are imported with the
    1.36 +=BlenderLoader=. =(meta-data)= extracts these properties.
    1.37 +
    1.38 +#+name: blender-1
    1.39  #+begin_src clojure
    1.40  (defn meta-data
    1.41    "Get the meta-data for a node created with blender."
    1.42    [blender-node key]
    1.43    (if-let [data (.getUserData blender-node "properties")]
    1.44 -    (.findValue data key)
    1.45 -    nil))
    1.46 +    (.findValue data key) nil))
    1.47 +#+end_src
    1.48  
    1.49 +Blender uses a different coordinate system than jMonkeyEngine, so it
    1.50 +is useful to be able to convert between the two. These conversions
    1.51 +only come into play when the meta-data of a node refers to a vector
    1.52 +in the blender coordinate system.
    1.53 +
    1.54 +#+name: blender-2
    1.55 +#+begin_src clojure
    1.56  (defn jme-to-blender
    1.57    "Convert from JME coordinates to Blender coordinates"
    1.58    [#^Vector3f in]
    1.59 -  (Vector3f. (.getX in)
    1.60 -             (- (.getZ in))
    1.61 -             (.getY in)))
    1.62 +  (Vector3f. (.getX in) (- (.getZ in)) (.getY in)))
    1.63  
    1.64  (defn blender-to-jme
    1.65    "Convert from Blender coordinates to JME coordinates"
    1.66    [#^Vector3f in]
    1.67 -  (Vector3f. (.getX in)
    1.68 -             (.getZ in)
    1.69 -             (- (.getY in))))
    1.70 +  (Vector3f. (.getX in) (.getZ in) (- (.getY in))))
    1.71  #+end_src
    1.72  
    1.73 -* Topology Related stuff
    1.74 -#+name: topology
    1.75 +* Sense Topology
    1.76 +
    1.77 +Human beings are three-dimensional objects, and the nerves that
    1.78 +transmit data from our various sense organs to our brain are
    1.79 +essentially one-dimensional. This leaves up to two dimensions in which
    1.80 +our sensory information may flow.  For example, imagine your skin: it
    1.81 +is a two-dimensional surface around a three-dimensional object (your
    1.82 +body). It has discrete touch sensors embedded at various points, and
    1.83 +the density of these sensors corresponds to the sensitivity of that
    1.84 +region of skin. Each touch sensor connects to a nerve, all of which
    1.85 +eventually are bundled together as they travel up the spinal cord to
    1.86 +the brain. Intersect the spinal nerves with a guillotining plane and
    1.87 +you will see all of the sensory data of the skin revealed in a roughly
    1.88 +circular two-dimensional image which is the cross section of the
    1.89 +spinal cord.  Points on this image that are close together in this
    1.90 +circle represent touch sensors that are /probably/ close together on
    1.91 +the skin, although there is of course some cutting and rearrangement
    1.92 +that has to be done to transfer the complicated surface of the skin
    1.93 +onto a two-dimensional image.
    1.94 +
    1.95 +Most human senses consist of many discrete sensors of various
    1.96 +properties distributed along a surface at various densities.  For
    1.97 +skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's
    1.98 +disks, and Ruffini's endings, which detect pressure and vibration of
    1.99 +various intensities.  For ears, it is the stereocilia distributed
   1.100 +along the basilar membrane inside the cochlea; each one is sensitive
   1.101 +to a slightly different frequency of sound. For eyes, it is rods
   1.102 +and cones distributed along the surface of the retina. In each case,
   1.103 +we can describe the sense with a surface and a distribution of sensors
   1.104 +along that surface.
   1.105 +
   1.106 +** UV-maps
   1.107 +
   1.108 +Blender and jMonkeyEngine already have support for exactly this sort
   1.109 +of data structure because it is used to "skin" models for games. It is
   1.110 +called [[http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV][UV-mapping]].  The three-dimensional surface is cut and smooshed
   1.111 +until it fits on a two-dimensional image. You paint whatever you want
   1.112 +on that image, and when the three-dimensional shape is rendered in a
   1.113 +game, the smooshing and cutting is reversed and the image
   1.114 +appears on the three-dimensional object.
   1.115 +
   1.116 +To make a sense, interpret the UV-image as describing the distribution
   1.117 +of that sense's sensors. To get different types of sensors, you can
   1.118 +either use a different color for each type of sensor, or use multiple
   1.119 +UV-maps, each labeled with that sensor type. I generally use a white
   1.120 +pixel to mean the presence of a sensor and a black pixel to mean the
   1.121 +absence of a sensor, and use one UV-map for each sensor-type within a
   1.122 +given sense.  The paths to the images are not stored as the actual
   1.123 +UV-map of the blender object but are instead referenced in the
   1.124 +meta-data of the node.
   1.125 +
   1.126 +#+CAPTION: The UV-map for an elongated icosphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
   1.127 +#+ATTR_HTML: width="300"
   1.128 +[[../images/finger-UV.png]]
   1.129 +
   1.130 +#+CAPTION: Ventral side of the UV-mapped finger. Notice the density of touch sensors at the tip.
   1.131 +#+ATTR_HTML: width="300"
   1.132 +[[../images/finger-1.png]]
   1.133 +
   1.134 +#+CAPTION: Side view of the UV-mapped finger.
   1.135 +#+ATTR_HTML: width="300"
   1.136 +[[../images/finger-2.png]]
   1.137 +
   1.138 +#+CAPTION: Head on view of the finger. In both the head and side views you can see the divide where the touch-sensors transition from high density to low density.
   1.139 +#+ATTR_HTML: width="300"
   1.140 +[[../images/finger-3.png]]
   1.141 +
   1.142 +The following code loads images and gets the locations of the white
   1.143 +pixels so that they can be used to create senses. =(load-image)= finds
   1.144 +images using jMonkeyEngine's asset-manager, so the image path is
   1.145 +expected to be relative to the =assets= directory.  Thanks to Dylan
   1.146 +for the beautiful version of =(filter-pixels)=.
   1.147 +
   1.148 +#+name: topology-1
   1.149  #+begin_src clojure
   1.150  (defn load-image
   1.151    "Load an image as a BufferedImage using the asset-manager system." 
   1.152 @@ -66,7 +129,8 @@
   1.153  
   1.154  (defn filter-pixels
   1.155    "List the coordinates of all pixels matching pred, within the bounds
   1.156 -   provided.
   1.157 +   provided. If bounds are not specified then the entire image is
   1.158 +   searched.
   1.159     bounds -> [x0 y0 width height]"
   1.160    {:author "Dylan Holmes"}
   1.161    ([pred #^BufferedImage image]
   1.162 @@ -87,30 +151,20 @@
   1.163       (filter-pixels white? image bounds))
   1.164    ([#^BufferedImage image]
   1.165       (filter-pixels white? image)))
   1.166 +#+end_src
   1.167  
   1.168 -(defn points->image
   1.169 -  "Take a sparse collection of points and visuliaze it as a
   1.170 -   BufferedImage."
   1.171 -  [points]
   1.172 -  (if (empty? points)
   1.173 -    (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY)
   1.174 -    (let [xs (vec (map first points))
   1.175 -          ys (vec (map second points))
   1.176 -          x0 (apply min xs)
   1.177 -          y0 (apply min ys)
   1.178 -          width (- (apply max xs) x0)
   1.179 -          height (- (apply max ys) y0)
   1.180 -          image (BufferedImage. (inc width) (inc height)
   1.181 -                                BufferedImage/TYPE_INT_RGB)]
   1.182 -      (dorun
   1.183 -       (for [x (range (.getWidth image))
   1.184 -             y (range (.getHeight image))]
   1.185 -         (.setRGB image x y 0xFF0000)))
   1.186 -      (dorun 
   1.187 -       (for [index (range (count points))]
   1.188 -         (.setRGB image (- (xs index) x0) (- (ys index) y0) -1)))
   1.189 -      image)))
   1.190 +** Topology
   1.191  
   1.192 +Information from the senses is transmitted to the brain via bundles of
   1.193 +axons, whether it be the optic nerve or the spinal cord. While these
   1.194 +bundles more or less preserve the overall topology of a sense's
   1.195 +two-dimensional surface, they do not preserve the precise Euclidean
   1.196 +distances between every sensor. =(collapse)= is here to smoosh the
   1.197 +sensors described by a UV-map into a contiguous region that still
   1.198 +preserves the topology of the original sense.
   1.199 +
   1.200 +#+name: topology-2
   1.201 +#+begin_src clojure
   1.202  (defn average [coll]
   1.203    (/ (reduce + coll) (count coll)))
   1.204  
   1.205 @@ -161,26 +215,139 @@
   1.206                       (- y min-y)])
   1.207                    squeezed))]
   1.208          relocated)))
   1.209 +#+end_src
   1.210 +* Viewing Sense Data
   1.211  
   1.212 +It's vital to /see/ the sense data to make sure that everything is
   1.213 +behaving as it should. =(view-sense)= is here so that each sense can
   1.214 +define its own way of turning sense-data into pictures, while the
   1.215 +actual rendering of said pictures stays in one central place.
   1.216 +=(points->image)= helps senses generate a base image onto which they
   1.217 +can overlay actual sense data.
   1.218 +
   1.219 +#+name: view-senses
   1.220 +#+begin_src clojure
   1.221 +(defn view-sense 
   1.222 +  "Take a kernel that produces a BufferedImage from some sense data
   1.223 +   and return a function which takes a list of sense data, uses the
   1.224 +   kernel to convert to images, and displays those images, each in
   1.225 +   its own JFrame."
   1.226 +  [sense-display-kernel]
   1.227 +  (let [windows (atom [])]
   1.228 +    (fn [data]
   1.229 +      (if (> (count data) (count @windows))
   1.230 +        (reset! 
   1.231 +         windows (map (fn [_] (view-image)) (range (count data)))))
   1.232 +      (dorun
   1.233 +       (map
   1.234 +        (fn [display datum]
   1.235 +          (display (sense-display-kernel datum)))
   1.236 +        @windows data)))))
   1.237 +
   1.238 +(defn points->image
   1.239 +  "Take a collection of points and visualize it as a BufferedImage."
   1.240 +  [points]
   1.241 +  (if (empty? points)
   1.242 +    (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY)
   1.243 +    (let [xs (vec (map first points))
   1.244 +          ys (vec (map second points))
   1.245 +          x0 (apply min xs)
   1.246 +          y0 (apply min ys)
   1.247 +          width (- (apply max xs) x0)
   1.248 +          height (- (apply max ys) y0)
   1.249 +          image (BufferedImage. (inc width) (inc height)
   1.250 +                                BufferedImage/TYPE_INT_RGB)]
   1.251 +      (dorun
   1.252 +       (for [x (range (.getWidth image))
   1.253 +             y (range (.getHeight image))]
   1.254 +         (.setRGB image x y 0xFF0000)))
   1.255 +      (dorun 
   1.256 +       (for [index (range (count points))]
   1.257 +         (.setRGB image (- (xs index) x0) (- (ys index) y0) -1)))
   1.258 +      image)))
   1.259 +
   1.260 +(defn gray
   1.261 +  "Create a gray RGB pixel with R, G, and B set to num. num must be
   1.262 +   between 0 and 255."
   1.263 +  [num]
   1.264 +  (+ num
   1.265 +     (bit-shift-left num 8)
   1.266 +     (bit-shift-left num 16)))
   1.267  #+end_src
   1.268  
   1.269 -* Node level stuff
   1.270 -#+name: node
   1.271 +* Building a Sense from Nodes
   1.272 +My method for defining senses in blender is the following:
   1.273 +
   1.274 +Senses like vision and hearing are localized to a single point
   1.275 +and follow a particular object around.  For these:
   1.276 +
   1.277 + - Create a single top-level empty node whose name is the name of the sense
   1.278 + - Add empty nodes which each contain meta-data relevant
   1.279 +   to the sense, including a UV-map describing the number/distribution
   1.280 +   of sensors, if applicable.
   1.281 + - Make each empty-node the child of the top-level
   1.282 +   node. =(sense-nodes)= below generates functions to find these children.
   1.283 +
   1.284 +For touch, store the path to the UV-map which describes touch-sensors in the
   1.285 +meta-data of the object to which that map applies.
   1.286 +
   1.287 +Each sense provides code that analyzes the Node structure of the
   1.288 +creature and creates sense-functions.  They also modify the Node
   1.289 +structure if necessary.
   1.290 +
   1.291 +Empty nodes created in blender have no appearance or physical presence
   1.292 +in jMonkeyEngine, but do appear in the scene graph. Empty nodes that
   1.293 +represent a sense which "follows" another geometry (like eyes and
   1.294 +ears) follow the closest physical object.  =(closest-node)= finds this
   1.295 +closest object given the Creature and a particular empty node.
   1.296 +
   1.297 +#+name: node-1
   1.298  #+begin_src clojure
   1.299 +(defn sense-nodes
   1.300 +  "For some senses there is a special empty blender node whose
   1.301 +   children are considered markers for an instance of that sense. This
   1.302 +   function generates functions to find those children, given the name
   1.303 +   of the special parent node."
   1.304 +  [parent-name]
   1.305 +  (fn [#^Node creature]
   1.306 +    (if-let [sense-node (.getChild creature parent-name)]
   1.307 +      (seq (.getChildren sense-node))
   1.308 +      (do (println-repl "could not find" parent-name "node") []))))
   1.309 +
   1.310  (defn closest-node
   1.311    "Return the node in creature which is closest to the given node."
   1.312 -  [#^Node creature #^Node eye]
   1.313 +  [#^Node creature #^Node empty]
   1.314    (loop [radius (float 0.01)]
   1.315      (let [results (CollisionResults.)]
   1.316        (.collideWith
   1.317         creature
   1.318 -       (BoundingBox. (.getWorldTranslation eye)
   1.319 +       (BoundingBox. (.getWorldTranslation empty)
   1.320                       radius radius radius)
   1.321         results)
   1.322        (if-let [target (first results)]
   1.323          (.getGeometry target)
   1.324          (recur (float (* 2 radius)))))))
   1.325  
   1.326 +(defn world-to-local
   1.327 +  "Convert the world coordinates into coordinates relative to the 
   1.328 +   object (i.e. local coordinates), taking into account the rotation
   1.329 +   of object."
   1.330 +  [#^Spatial object world-coordinate]
   1.331 +  (.worldToLocal object world-coordinate nil))
   1.332 +
   1.333 +(defn local-to-world
   1.334 +  "Convert the local coordinates into world relative coordinates"
   1.335 +  [#^Spatial object local-coordinate]
   1.336 +  (.localToWorld object local-coordinate nil))
   1.337 +#+end_src
   1.338 +
   1.339 +=(bind-sense)= binds either a Camera or a Listener object to any
   1.340 +object so that it will follow that object no matter how the object
   1.341 +moves. Here is some example code which shows a camera bound to a blue
   1.342 +box as it is buffeted by white cannonballs.
   1.343 +
   1.344 +#+name: node-2
   1.345 +#+begin_src clojure
   1.346  (defn bind-sense
   1.347    "Bind the sense to the Spatial such that it will maintain its
   1.348     current position relative to the Spatial no matter how the spatial
   1.349 @@ -205,65 +372,41 @@
   1.350              sense
   1.351              (.mult total-rotation initial-sense-rotation))))
   1.352         (controlRender [_ _])))))
   1.353 -
   1.354 -(defn world-to-local
   1.355 -  "Convert the world coordinates into coordinates relative to the 
   1.356 -   object (i.e. local coordinates), taking into account the rotation
   1.357 -   of object."
   1.358 -  [#^Spatial object world-coordinate]
   1.359 -  (.worldToLocal object world-coordinate nil))
   1.360 -
   1.361 -(defn local-to-world
   1.362 -  "Convert the local coordinates into world relative coordinates"
   1.363 -  [#^Spatial object local-coordinate]
   1.364 -  (.localToWorld object local-coordinate nil))
   1.365 -
   1.366 -(defn sense-nodes
   1.367 -  "For each sense there is a special blender node whose children are
   1.368 -   considered markers for an instance of a that sense. This function
   1.369 -   generates functions to find those children, given the name of the
   1.370 -   special parent node."
   1.371 -  [parent-name]
   1.372 -  (fn [#^Node creature]
   1.373 -    (if-let [sense-node (.getChild creature parent-name)]
   1.374 -      (seq (.getChildren sense-node))
   1.375 -      (do (println-repl "could not find" parent-name "node") []))))
   1.376  #+end_src
   1.377  
   1.378 -* Viewing Senses
   1.379 -#+name view-senses
   1.380 +
   1.381 +
   1.382 +* Bookkeeping
   1.383 +Here is the header for this namespace, included for completeness.
   1.384 +#+name: header
   1.385  #+begin_src clojure
   1.386 -(defn view-sense 
   1.387 -  "Take a kernel that produces a BufferedImage from some sense data
   1.388 -   and return a function which takes a list of sense data, uses the
   1.389 -   kernem to convert to images, and displays those images, each in
   1.390 -   its own JFrame."
   1.391 -  [sense-display-kernel]
   1.392 -  (let [windows (atom [])]
   1.393 -    (fn [data]
   1.394 -      (if (> (count data) (count @windows))
   1.395 -        (reset! 
   1.396 -         windows (map (fn [_] (view-image)) (range (count data)))))
   1.397 -      (dorun
   1.398 -       (map
   1.399 -        (fn [display datum]
   1.400 -          (display (sense-display-kernel datum)))
   1.401 -        @windows data)))))
   1.402 +(ns cortex.sense
   1.403 +  "Here are functions useful in the construction of two or more
   1.404 +   sensors/effectors."
   1.405 +  {:author "Robert McIntyre"}
   1.406 +  (:use (cortex world util))
   1.407 +  (:import ij.process.ImageProcessor)
   1.408 +  (:import jme3tools.converters.ImageToAwt)
   1.409 +  (:import java.awt.image.BufferedImage)
   1.410 +  (:import com.jme3.collision.CollisionResults)
   1.411 +  (:import com.jme3.bounding.BoundingBox)
   1.412 +  (:import (com.jme3.scene Node Spatial))
   1.413 +  (:import com.jme3.scene.control.AbstractControl)
   1.414 +  (:import (com.jme3.math Quaternion Vector3f)))
   1.415 +#+end_src
   1.416  
   1.417 -(defn gray
   1.418 -  "Create a gray RGB pixel with R, G, and B set to num. num must be
   1.419 -   between 0 and 255."
   1.420 -  [num]
   1.421 -  (+ num
   1.422 -     (bit-shift-left num 8)
   1.423 -     (bit-shift-left num 16)))
   1.424 -#+end_src
   1.425 +* Source Listing
   1.426 +  Full source: [[../src/cortex/sense.clj][sense.clj]]
   1.427 +
   1.428  
   1.429  * COMMENT generate source
   1.430  #+begin_src clojure :tangle ../src/cortex/sense.clj
   1.431  <<header>>
   1.432 -<<blender>>
   1.433 -<<topology>>
   1.434 -<<node>>
   1.435 +<<blender-1>>
   1.436 +<<blender-2>>
   1.437 +<<topology-1>>
   1.438 +<<topology-2>>
   1.439 +<<node-1>>
   1.440 +<<node-2>>
   1.441  <<view-senses>>
   1.442  #+end_src
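
As a quick sanity check on the axis mapping used by =(jme-to-blender)= and =(blender-to-jme)= in this changeset, here is a minimal sketch using plain Clojure vectors in place of jME's =Vector3f=, so it runs without jMonkeyEngine on the classpath. The =->= names are hypothetical stand-ins for illustration, not the functions tangled into sense.clj.

```clojure
;; Sketch of the blender <-> jME axis swap, using plain [x y z]
;; vectors instead of com.jme3.math.Vector3f so no jMonkeyEngine
;; jar is needed. Names are hypothetical stand-ins.
(defn jme->blender
  "JME (x, y, z) maps to Blender (x, -z, y)."
  [[x y z]]
  [x (- z) y])

(defn blender->jme
  "Blender (x, y, z) maps back to JME (x, z, -y)."
  [[x y z]]
  [x z (- y)])

;; The two conversions invert each other:
(blender->jme (jme->blender [1.0 2.0 3.0])) ; => [1.0 2.0 3.0]
```

Round-tripping a point through both conversions returns it unchanged, which is the property the real =Vector3f= versions rely on when translating node meta-data between the two coordinate systems.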