#+title: Sensory Utilities
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: sensory utilities
#+keywords: simulation, jMonkeyEngine3, clojure, simulated senses
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org

* Blender Utilities
In blender, any object can be assigned an arbitrary number of
key-value pairs which are called "Custom Properties". These are
accessible in jMonkeyEngine when blender files are imported with the
=BlenderLoader=. =(meta-data)= extracts these properties.

#+name: blender-1
#+begin_src clojure
(defn meta-data
  "Get the meta-data for a node created with blender."
  [blender-node key]
  (if-let [data (.getUserData blender-node "properties")]
    (.findValue data key) nil))
#+end_src

Blender uses a different coordinate system than jMonkeyEngine, so it
is useful to be able to convert between the two. These conversions
only come into play when the meta-data of a node refers to a vector
in the blender coordinate system.

#+name: blender-2
#+begin_src clojure
(defn jme-to-blender
  "Convert from JME coordinates to Blender coordinates"
  [#^Vector3f in]
  (Vector3f. (.getX in) (- (.getZ in)) (.getY in)))

(defn blender-to-jme
  "Convert from Blender coordinates to JME coordinates"
  [#^Vector3f in]
  (Vector3f. (.getX in) (.getZ in) (- (.getY in))))
#+end_src
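The two conversions are inverses of each other; for example, at the
REPL:

#+begin_src clojure
;; Blender is Z-up while jMonkeyEngine is Y-up, so the Y and Z axes
;; trade places (with a sign flip to keep the system right-handed).
(jme-to-blender (Vector3f. 1 2 3))
;; => (1.0, -3.0, 2.0)

(blender-to-jme (jme-to-blender (Vector3f. 1 2 3)))
;; => (1.0, 2.0, 3.0)
#+end_src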
* Sense Topology

Human beings are three-dimensional objects, and the nerves that
transmit data from our various sense organs to our brains are
essentially one-dimensional. This leaves up to two dimensions in
which our sensory information may flow. For example, imagine your
skin: it is a two-dimensional surface around a three-dimensional
object (your body). It has discrete touch sensors embedded at
various points, and the density of these sensors corresponds to the
sensitivity of that region of skin. Each touch sensor connects to a
nerve, all of which eventually are bundled together as they travel
up the spinal cord to the brain. Intersect the spinal nerves with a
guillotining plane and you will see all of the sensory data of the
skin revealed in a roughly circular two-dimensional image which is
the cross section of the spinal cord. Points on this image that are
close together in this circle represent touch sensors that are
/probably/ close together on the skin, although there is of course
some cutting and rearrangement that has to be done to transfer the
complicated surface of the skin onto a two-dimensional image.

Most human senses consist of many discrete sensors of various
properties distributed along a surface at various densities. For
skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's
disks, and Ruffini's endings, which detect pressure and vibration of
various intensities. For ears, it is the stereocilia distributed
along the basilar membrane inside the cochlea; each one is sensitive
to a slightly different frequency of sound. For eyes, it is rods and
cones distributed along the surface of the retina. In each case, we
can describe the sense with a surface and a distribution of sensors
along that surface.

** UV-maps

Blender and jMonkeyEngine already have support for exactly this sort
of data structure because it is used to "skin" models for games. It
is called [[http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV][UV-mapping]]. The three-dimensional surface is cut and
smooshed until it fits on a two-dimensional image. You paint
whatever you want on that image, and when the three-dimensional
shape is rendered in a game, the smooshing and cutting is reversed
and the image appears on the three-dimensional object.

To make a sense, interpret the UV-image as describing the
distribution of that sense's sensors. To get different types of
sensors, you can either use a different color for each type of
sensor, or use multiple UV-maps, each labeled with that sensor
type. I generally use a white pixel to mean the presence of a sensor
and a black pixel to mean the absence of a sensor, and use one
UV-map for each sensor-type within a given sense. The paths to the
images are not stored as the actual UV-map of the blender object but
are instead referenced in the meta-data of the node.

#+CAPTION: The UV-map for an elongated icosphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
#+ATTR_HTML: width="300"
[[../images/finger-UV.png]]

#+CAPTION: Ventral side of the UV-mapped finger. Notice the density of touch sensors at the tip.
#+ATTR_HTML: width="300"
[[../images/finger-1.png]]

#+CAPTION: Side view of the UV-mapped finger.
#+ATTR_HTML: width="300"
[[../images/finger-2.png]]

#+CAPTION: Head-on view of the finger. In both the head and side views you can see the divide where the touch-sensors transition from high density to low density.
#+ATTR_HTML: width="300"
[[../images/finger-3.png]]

The following code loads images and gets the locations of the white
pixels so that they can be used to create senses. =(load-image)=
finds images using jMonkeyEngine's asset-manager, so the image path
is expected to be relative to the =assets= directory. Thanks to
Dylan for the beautiful version of =filter-pixels=.

#+name: topology-1
#+begin_src clojure
(defn load-image
  "Load an image as a BufferedImage using the asset-manager system."
  [asset-relative-path]
  (ImageToAwt/convert
   (.getImage (.loadTexture (asset-manager) asset-relative-path))
   false false 0))

(def white 0xFFFFFF)

(defn white? [rgb]
  (= (bit-and white rgb) white))

(defn filter-pixels
  "List the coordinates of all pixels matching pred, within the
   bounds provided. If bounds are not specified then the entire
   image is searched.
   bounds -> [x0 y0 width height]"
  {:author "Dylan Holmes"}
  ([pred #^BufferedImage image]
     (filter-pixels pred image [0 0 (.getWidth image) (.getHeight image)]))
  ([pred #^BufferedImage image [x0 y0 width height]]
     ((fn accumulate [x y matches]
        (cond
         (>= y (+ height y0)) matches
         (>= x (+ width x0)) (recur x0 (inc y) matches) ; each row starts at x0
         (pred (.getRGB image x y))
         (recur (inc x) y (conj matches [x y]))
         :else (recur (inc x) y matches)))
      x0 y0 [])))

(defn white-coordinates
  "Coordinates of all the white pixels in a subset of the image."
  ([#^BufferedImage image bounds]
     (filter-pixels white? image bounds))
  ([#^BufferedImage image]
     (filter-pixels white? image)))
#+end_src
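A quick sanity check at the REPL (a sketch using a hand-built 2x2
test image rather than a loaded asset):

#+begin_src clojure
(let [img (BufferedImage. 2 2 BufferedImage/TYPE_INT_RGB)]
  (.setRGB img 1 0 0xFFFFFF)  ; paint a single white pixel at (1,0)
  (white-coordinates img))
;; => [[1 0]]
#+end_src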
** Topology

Information from the senses is transmitted to the brain via bundles
of axons, whether it be the optic nerve or the spinal cord. While
these bundles more or less preserve the overall topology of a
sense's two-dimensional surface, they do not preserve the precise
Euclidean distances between every sensor. =(collapse)= is here to
smoosh the sensors described by a UV-map into a contiguous region
that still preserves the topology of the original sense.

#+name: topology-2
#+begin_src clojure
(defn average [coll]
  (/ (reduce + coll) (count coll)))

(defn collapse-1d
  "One dimensional analogue of collapse."
  [center line]
  (let [length (count line)
        num-above (count (filter (partial < center) line))
        num-below (- length num-above)]
    (range (- center num-below)
           (+ center num-above))))

(defn collapse
  "Take a set of pairs of integers and collapse them into a
   contiguous bitmap with no \"holes\"."
  [points]
  (if (empty? points) []
      (let
          [num-points (count points)
           ;; center is the average x and average y of the points
           center (vector
                   (int (average (map first points)))
                   (int (average (map second points))))
           flattened
           (reduce
            concat
            (map
             (fn [column]
               (map vector
                    (map first column)
                    (collapse-1d (second center)
                                 (map second column))))
             (partition-by first (sort-by first points))))
           squeezed
           (reduce
            concat
            (map
             (fn [row]
               (map vector
                    (collapse-1d (first center)
                                 (map first row))
                    (map second row)))
             (partition-by second (sort-by second flattened))))
           relocated
           (let [min-x (apply min (map first squeezed))
                 min-y (apply min (map second squeezed))]
             (map (fn [[x y]]
                    [(- x min-x)
                     (- y min-y)])
                  squeezed))]
        relocated)))
#+end_src
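For example, three widely spaced points are packed into adjacent
coordinates anchored at the origin (a REPL sketch; the exact layout
depends on how =collapse= squeezes rows and columns):

#+begin_src clojure
(collapse [[0 0] [10 10] [20 20]])
;; => ([0 0] [1 0] [2 1])
#+end_src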
* Viewing Sense Data

It's vital to /see/ the sense data to make sure that everything is
behaving as it should. =(view-sense)= is here so that each sense can
define its own way of turning sense-data into pictures, while the
actual rendering of said pictures stays in one central place.
=(points->image)= helps senses generate a base image onto which they
can overlay actual sense data.

#+name: view-senses
#+begin_src clojure
(defn view-sense
  "Take a kernel that produces a BufferedImage from some sense data
   and return a function which takes a list of sense data, uses the
   kernel to convert to images, and displays those images, each in
   its own JFrame."
  [sense-display-kernel]
  (let [windows (atom [])]
    (fn [data]
      (if (> (count data) (count @windows))
        (reset!
         windows (map (fn [_] (view-image)) (range (count data)))))
      (dorun
       (map
        (fn [display datum]
          (display (sense-display-kernel datum)))
        @windows data)))))

(defn points->image
  "Take a collection of points and visualize it as a BufferedImage."
  [points]
  (if (empty? points)
    (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY)
    (let [xs (vec (map first points))
          ys (vec (map second points))
          x0 (apply min xs)
          y0 (apply min ys)
          width (- (apply max xs) x0)
          height (- (apply max ys) y0)
          image (BufferedImage. (inc width) (inc height)
                                BufferedImage/TYPE_INT_RGB)]
      (dorun
       (for [x (range (.getWidth image))
             y (range (.getHeight image))]
         (.setRGB image x y 0xFF0000)))
      (dorun
       (for [index (range (count points))]
         (.setRGB image (- (xs index) x0) (- (ys index) y0) -1)))
      image)))

(defn gray
  "Create a gray RGB pixel with R, G, and B set to num. num must be
   between 0 and 255."
  [num]
  (+ num
     (bit-shift-left num 8)
     (bit-shift-left num 16)))
#+end_src
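Composing =view-sense= with =points->image= yields a ready-made
viewer for collections of points (a sketch; =view-image= comes from
=cortex.util= via the namespace's =:use= clause):

#+begin_src clojure
;; A function that pops up (and reuses) one JFrame per collection
;; of points, rendering each collection with points->image.
(def view-points (view-sense points->image))

;; e.g. (view-points [(collapse (white-coordinates image))])
#+end_src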
* Building a Sense from Nodes
My method for defining senses in blender is the following:

Senses like vision and hearing are localized to a single point and
follow a particular object around. For these:

- Create a single top-level empty node whose name is the name of the
  sense.
- Add empty nodes which each contain meta-data relevant to the
  sense, including a UV-map describing the number/distribution of
  sensors if applicable.
- Make each empty-node the child of the top-level
  node. =(sense-nodes)= below generates functions to find these
  children.

For touch, store the path to the UV-map which describes
touch-sensors in the meta-data of the object to which that map
applies.

Each sense provides code that analyzes the Node structure of the
creature and creates sense-functions. They also modify the Node
structure if necessary.

Empty nodes created in blender have no appearance or physical
presence in jMonkeyEngine, but do appear in the scene graph. Empty
nodes that represent a sense which "follows" another geometry (like
eyes and ears) follow the closest physical object. =(closest-node)=
finds this closest object given the Creature and a particular empty
node.

#+name: node-1
#+begin_src clojure
(defn sense-nodes
  "For some senses there is a special empty blender node whose
   children are considered markers for an instance of that sense. This
   function generates functions to find those children, given the name
   of the special parent node."
  [parent-name]
  (fn [#^Node creature]
    (if-let [sense-node (.getChild creature parent-name)]
      (seq (.getChildren sense-node))
      (do (println-repl "could not find" parent-name "node") []))))

(defn closest-node
  "Return the node in creature which is closest to the given node."
  [#^Node creature #^Node empty]
  (loop [radius (float 0.01)]
    (let [results (CollisionResults.)]
      (.collideWith
       creature
       (BoundingBox. (.getWorldTranslation empty)
                     radius radius radius)
       results)
      (if-let [target (first results)]
        (.getGeometry target)
        (recur (float (* 2 radius)))))))

(defn world-to-local
  "Convert the world coordinates into coordinates relative to the
   object (i.e. local coordinates), taking into account the rotation
   of object."
  [#^Spatial object world-coordinate]
  (.worldToLocal object world-coordinate nil))

(defn local-to-world
  "Convert the local coordinates into world relative coordinates"
  [#^Spatial object local-coordinate]
  (.localToWorld object local-coordinate nil))
#+end_src
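As an illustrative sketch (the sense name "ears" and the =creature=
node are hypothetical here), a localized sense defines its
node-finding function in one line and pairs each marker with the
geometry it should follow:

#+begin_src clojure
;; Find all empty nodes filed under the top-level "ears" node.
(def ears (sense-nodes "ears"))

;; For each ear marker, the physical geometry nearest to it:
;; (map (partial closest-node creature) (ears creature))
#+end_src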
=(bind-sense)= binds either a Camera or a Listener object to any
object so that it will follow that object no matter how the object
moves.

#+name: node-2
#+begin_src clojure
(defn bind-sense
  "Bind the sense to the Spatial such that it will maintain its
   current position relative to the Spatial no matter how the spatial
   moves. 'sense can be either a Camera or Listener object."
  [#^Spatial obj sense]
  (let [sense-offset (.subtract (.getLocation sense)
                                (.getWorldTranslation obj))
        initial-sense-rotation (Quaternion. (.getRotation sense))
        base-anti-rotation (.inverse (.getWorldRotation obj))]
    (.addControl
     obj
     (proxy [AbstractControl] []
       (controlUpdate [tpf]
         (let [total-rotation
               (.mult base-anti-rotation (.getWorldRotation obj))]
           (.setLocation
            sense
            (.add
             (.mult total-rotation sense-offset)
             (.getWorldTranslation obj)))
           (.setRotation
            sense
            (.mult total-rotation initial-sense-rotation))))
       (controlRender [_ _])))))
#+end_src
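For example, a camera bound to a box will smoothly track the box as
it is buffeted around the scene. Here is a minimal sketch of that
idea (not part of the tangled source; it assumes a running
=SimpleApplication= named =app= and additionally imports
=com.jme3.renderer.Camera=):

#+begin_src clojure
(defn follow-cam-sketch
  "Create a camera slightly above and behind obj, bind it to obj
   with bind-sense, and render its view in a new viewport."
  [app #^Spatial obj]
  (let [cam (Camera. 640 480)]
    (.setFrustumPerspective cam 45 (/ 640.0 480.0) 1 1000)
    (.setLocation cam (.add (.getWorldTranslation obj)
                            (Vector3f. 0 2 -5)))
    (.lookAt cam (.getWorldTranslation obj) Vector3f/UNIT_Y)
    (bind-sense obj cam)                ; cam now follows obj
    (doto (.createPostView (.getRenderManager app) "follow view" cam)
      (.setClearFlags true true true)
      (.attachScene (.getRootNode app)))))
#+end_src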
* Bookkeeping
Here is the header for this namespace, included for completeness.
#+name: header
#+begin_src clojure
(ns cortex.sense
  "Here are functions useful in the construction of two or more
   sensors/effectors."
  {:author "Robert McIntyre"}
  (:use (cortex world util))
  (:import ij.process.ImageProcessor)
  (:import jme3tools.converters.ImageToAwt)
  (:import java.awt.image.BufferedImage)
  (:import com.jme3.collision.CollisionResults)
  (:import com.jme3.bounding.BoundingBox)
  (:import (com.jme3.scene Node Spatial))
  (:import com.jme3.scene.control.AbstractControl)
  (:import (com.jme3.math Quaternion Vector3f)))
#+end_src

* Source Listing
Full source: [[../src/cortex/sense.clj][sense.clj]]

* COMMENT generate source
#+begin_src clojure :tangle ../src/cortex/sense.clj
<<header>>
<<blender-1>>
<<blender-2>>
<<topology-1>>
<<topology-2>>
<<view-senses>>
<<node-1>>
<<node-2>>
#+end_src