comparison org/sense.org @ 198:fc0bf33bded2
fleshing out prose in sense.org
author   | Robert McIntyre <rlm@mit.edu>
date     | Sun, 05 Feb 2012 14:01:47 -0700
parents  | 16cbce075a0b
children | 305439cec54d
197:16cbce075a0b | 198:fc0bf33bded2 |
---|---|
1 #+title: General Sense/Effector Utilities | 1 #+title: |
2 #+author: Robert McIntyre | 2 #+author: Robert McIntyre |
3 #+email: rlm@mit.edu | 3 #+email: rlm@mit.edu |
4 #+description: sensory utilities | 4 #+description: sensory utilities |
5 #+keywords: simulation, jMonkeyEngine3, clojure, simulated senses | 5 #+keywords: simulation, jMonkeyEngine3, clojure, simulated senses |
6 #+SETUPFILE: ../../aurellem/org/setup.org | 6 #+SETUPFILE: ../../aurellem/org/setup.org |
7 #+INCLUDE: ../../aurellem/org/level-0.org | 7 #+INCLUDE: ../../aurellem/org/level-0.org |
8 | 8 |
9 * Namespace/Imports | |
10 #+name: header | |
11 #+begin_src clojure | |
12 (ns cortex.sense | |
13 "Here are functions useful in the construction of two or more | |
14 sensors/effectors." | |
15 {:author "Robert McIntyre"} | |
16 (:use (cortex world util)) | |
17 (:import ij.process.ImageProcessor) | |
18 (:import jme3tools.converters.ImageToAwt) | |
19 (:import java.awt.image.BufferedImage) | |
20 (:import com.jme3.collision.CollisionResults) | |
21 (:import com.jme3.bounding.BoundingBox) | |
22 (:import (com.jme3.scene Node Spatial)) | |
23 (:import com.jme3.scene.control.AbstractControl) | |
24 (:import (com.jme3.math Quaternion Vector3f))) | |
25 #+end_src | |
26 | 9 |
27 * Blender Utilities | 10 * Blender Utilities |
28 #+name: blender | 11 In blender, any object can be assigned an arbitrary number of key-value
12 pairs which are called "Custom Properties". These are accessible in | 
13 jMonkeyEngine when blender files are imported with the | 
14 =BlenderLoader=. =(meta-data)= extracts these properties. | |
15 | |
16 #+name: blender-1 | |
29 #+begin_src clojure | 17 #+begin_src clojure |
30 (defn meta-data | 18 (defn meta-data |
31 "Get the meta-data for a node created with blender." | 19 "Get the meta-data for a node created with blender." |
32 [blender-node key] | 20 [blender-node key] |
33 (if-let [data (.getUserData blender-node "properties")] | 21 (if-let [data (.getUserData blender-node "properties")] |
34 (.findValue data key) | 22 (.findValue data key) nil)) |
35 nil)) | 23 #+end_src |
36 | 24 |
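For example, if a creature's blender file contains a child node that was
given a custom property in blender, a REPL sketch might look like this
(the node name "eyes" and the property name "eye" are invented for
illustration):

#+begin_src clojure
(comment
  ;; hypothetical usage of =(meta-data)=; names are invented:
  (meta-data (.getChild creature "eyes") "eye")
  ;; => the string stored under "eye" in blender, or nil if the
  ;;    node carries no custom properties
  )
#+end_src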
25 Blender uses a different coordinate system than jMonkeyEngine so it | |
26 is useful to be able to convert between the two. These only come into | |
27 play when the meta-data of a node refers to a vector in the blender | |
28 coordinate system. | |
29 | |
30 #+name: blender-2 | |
31 #+begin_src clojure | |
37 (defn jme-to-blender | 32 (defn jme-to-blender |
38 "Convert from JME coordinates to Blender coordinates" | 33 "Convert from JME coordinates to Blender coordinates" |
39 [#^Vector3f in] | 34 [#^Vector3f in] |
40 (Vector3f. (.getX in) | 35 (Vector3f. (.getX in) (- (.getZ in)) (.getY in))) |
41 (- (.getZ in)) | |
42 (.getY in))) | |
43 | 36 |
44 (defn blender-to-jme | 37 (defn blender-to-jme |
45 "Convert from Blender coordinates to JME coordinates" | 38 "Convert from Blender coordinates to JME coordinates" |
46 [#^Vector3f in] | 39 [#^Vector3f in] |
47 (Vector3f. (.getX in) | 40 (Vector3f. (.getX in) (.getZ in) (- (.getY in)))) |
48 (.getZ in) | 41 #+end_src |
49 (- (.getY in)))) | 42 |
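Since the two conversions are inverses of each other, a round trip
should return the original vector; a quick REPL check:

#+begin_src clojure
(comment
  (blender-to-jme (jme-to-blender (Vector3f. 1 2 3)))
  ;; => a Vector3f equal to (1.0, 2.0, 3.0)
  )
#+end_src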
50 #+end_src | 43 * Sense Topology |
51 | 44 |
52 * Topology Related stuff | 45 Human beings are three-dimensional objects, and the nerves that |
53 #+name: topology | 46 transmit data from our various sense organs to our brain are |
47 essentially one-dimensional. This leaves up to two dimensions in which | |
48 our sensory information may flow. For example, imagine your skin: it | |
49 is a two-dimensional surface around a three-dimensional object (your | |
50 body). It has discrete touch sensors embedded at various points, and | |
51 the density of these sensors corresponds to the sensitivity of that | |
52 region of skin. Each touch sensor connects to a nerve, all of which | |
53 eventually are bundled together as they travel up the spinal cord to | |
54 the brain. Intersect the spinal nerves with a guillotining plane and | |
55 you will see all of the sensory data of the skin revealed in a roughly | |
56 circular two-dimensional image which is the cross section of the | |
57 spinal cord. Points on this image that are close together in this | |
58 circle represent touch sensors that are /probably/ close together on | |
59 the skin, although there is of course some cutting and rearrangement | 
60 that has to be done to transfer the complicated surface of the skin | 
61 onto a two-dimensional image. | 
62 | |
63 Most human senses consist of many discrete sensors of various | |
64 properties distributed along a surface at various densities. For | |
65 skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's | |
66 disks, and Ruffini's endings, which detect pressure and vibration of | |
67 various intensities. For ears, it is the stereocilia distributed | |
68 along the basilar membrane inside the cochlea; each one is sensitive | |
69 to a slightly different frequency of sound. For eyes, it is rods | |
70 and cones distributed along the surface of the retina. In each case, | |
71 we can describe the sense with a surface and a distribution of sensors | |
72 along that surface. | |
73 | |
74 ** UV-maps | |
75 | |
76 Blender and jMonkeyEngine already have support for exactly this sort | |
77 of data structure because it is used to "skin" models for games. It is | |
78 called [[http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV][UV-mapping]]. The three-dimensional surface is cut and smooshed | |
79 until it fits on a two-dimensional image. You paint whatever you want | |
80 on that image, and when the three-dimensional shape is rendered in a | |
81 game, the smooshing and cutting is reversed and the image | 
82 appears on the three-dimensional object. | |
83 | |
84 To make a sense, interpret the UV-image as describing the distribution | |
85 of that sense's sensors. To get different types of sensors, you can | 
86 either use a different color for each type of sensor, or use multiple | |
87 UV-maps, each labeled with that sensor type. I generally use a white | |
88 pixel to mean the presence of a sensor and a black pixel to mean the | 
89 absence of a sensor, and use one UV-map for each sensor-type within a | 
90 given sense. The paths to the images are not stored as the actual | |
91 UV-map of the blender object but are instead referenced in the | |
92 meta-data of the node. | |
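For instance, a geometry covered in touch sensors might carry the image
path in its meta-data like this (a sketch; the property name "touch"
and the path are invented):

#+begin_src clojure
(comment
  (meta-data finger-geometry "touch")
  ;; => "Models/finger/finger-UV.png"
  )
#+end_src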
93 | |
94 #+CAPTION: The UV-map for an elongated icosphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip. | 
95 #+ATTR_HTML: width="300" | |
96 [[../images/finger-UV.png]] | |
97 | |
98 #+CAPTION: Ventral side of the UV-mapped finger. Notice the density of touch sensors at the tip. | |
99 #+ATTR_HTML: width="300" | |
100 [[../images/finger-1.png]] | |
101 | |
102 #+CAPTION: Side view of the UV-mapped finger. | |
103 #+ATTR_HTML: width="300" | |
104 [[../images/finger-2.png]] | |
105 | |
106 #+CAPTION: Head on view of the finger. In both the head and side views you can see the divide where the touch-sensors transition from high density to low density. | |
107 #+ATTR_HTML: width="300" | |
108 [[../images/finger-3.png]] | |
109 | |
110 The following code loads images and gets the locations of the white | |
111 pixels so that they can be used to create senses. =(load-image)= finds | |
112 images using jMonkeyEngine's asset-manager, so the image path is | |
113 expected to be relative to the =assets= directory. Thanks to Dylan | |
114 for the beautiful version of filter-pixels. | |
115 | |
116 #+name: topology-1 | |
54 #+begin_src clojure | 117 #+begin_src clojure |
55 (defn load-image | 118 (defn load-image |
56 "Load an image as a BufferedImage using the asset-manager system." | 119 "Load an image as a BufferedImage using the asset-manager system." |
57 [asset-relative-path] | 120 [asset-relative-path] |
58 (ImageToAwt/convert | 121 (ImageToAwt/convert |
64 (defn white? [rgb] | 127 (defn white? [rgb] |
65 (= (bit-and white rgb) white)) | 128 (= (bit-and white rgb) white)) |
66 | 129 |
67 (defn filter-pixels | 130 (defn filter-pixels |
68 "List the coordinates of all pixels matching pred, within the bounds | 131 "List the coordinates of all pixels matching pred, within the bounds |
69 provided. | 132 provided. If bounds are not specified then the entire image is |
133 searched. | |
70 bounds -> [x0 y0 width height]" | 134 bounds -> [x0 y0 width height]" |
71 {:author "Dylan Holmes"} | 135 {:author "Dylan Holmes"} |
72 ([pred #^BufferedImage image] | 136 ([pred #^BufferedImage image] |
73 (filter-pixels pred image [0 0 (.getWidth image) (.getHeight image)])) | 137 (filter-pixels pred image [0 0 (.getWidth image) (.getHeight image)])) |
74 ([pred #^BufferedImage image [x0 y0 width height]] | 138 ([pred #^BufferedImage image [x0 y0 width height]] |
85 "Coordinates of all the white pixels in a subset of the image." | 149 "Coordinates of all the white pixels in a subset of the image." |
86 ([#^BufferedImage image bounds] | 150 ([#^BufferedImage image bounds] |
87 (filter-pixels white? image bounds)) | 151 (filter-pixels white? image bounds)) |
88 ([#^BufferedImage image] | 152 ([#^BufferedImage image] |
89 (filter-pixels white? image))) | 153 (filter-pixels white? image))) |
90 | 154 #+end_src |
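A hypothetical REPL session (the image path is invented; any
asset-relative path to a black-and-white UV-image will do):

#+begin_src clojure
(comment
  (def finger-UV (load-image "Models/finger/finger-UV.png"))
  (white-coordinates finger-UV)
  ;; => ([37 212] [38 212] ...) -- one [x y] pair per touch sensor
  )
#+end_src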
91 (defn points->image | 155 |
92 "Take a sparse collection of points and visuliaze it as a | 156 ** Topology |
93 BufferedImage." | 157 |
94 [points] | 158 Information from the senses is transmitted to the brain via bundles of |
95 (if (empty? points) | 159 axons, whether it be the optic nerve or the spinal cord. While these |
96 (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY) | 160 bundles more or less preserve the overall topology of a sense's
97 (let [xs (vec (map first points)) | 161 two-dimensional surface, they do not preserve the precise Euclidean
98 ys (vec (map second points)) | 162 distances between every sensor. =(collapse)= is here to smoosh the
99 x0 (apply min xs) | 163 sensors described by a UV-map into a contiguous region that still
100 y0 (apply min ys) | 164 preserves the topology of the original sense.
101 width (- (apply max xs) x0) | 165 |
102 height (- (apply max ys) y0) | 166 #+name: topology-2 |
103 image (BufferedImage. (inc width) (inc height) | 167 #+begin_src clojure |
104 BufferedImage/TYPE_INT_RGB)] | |
105 (dorun | |
106 (for [x (range (.getWidth image)) | |
107 y (range (.getHeight image))] | |
108 (.setRGB image x y 0xFF0000))) | |
109 (dorun | |
110 (for [index (range (count points))] | |
111 (.setRGB image (- (xs index) x0) (- (ys index) y0) -1))) | |
112 image))) | |
113 | |
114 (defn average [coll] | 168 (defn average [coll] |
115 (/ (reduce + coll) (count coll))) | 169 (/ (reduce + coll) (count coll))) |
116 | 170 |
117 (defn collapse-1d | 171 (defn collapse-1d |
118 "One dimensional analogue of collapse." | 172 "One dimensional analogue of collapse." |
159 (map (fn [[x y]] | 213 (map (fn [[x y]] |
160 [(- x min-x) | 214 [(- x min-x) |
161 (- y min-y)]) | 215 (- y min-y)]) |
162 squeezed))] | 216 squeezed))] |
163 relocated))) | 217 relocated))) |
164 | 218 #+end_src |
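A sketch of the intended contract (the point values are invented, and
the exact packing =(collapse)= chooses may differ):

#+begin_src clojure
(comment
  (average [2 4 6])  ;; => 4
  ;; collapse packs scattered UV-map coordinates into a contiguous
  ;; block while keeping neighboring sensors adjacent, e.g. something
  ;; like: (collapse #{[0 0] [10 0] [20 0]}) => #{[0 0] [1 0] [2 0]}
  )
#+end_src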
165 #+end_src | 219 * Viewing Sense Data |
166 | 220 |
167 * Node level stuff | 221 It's vital to /see/ the sense data to make sure that everything is |
168 #+name: node | 222 behaving as it should. =(view-sense)= is here so that each sense can |
169 #+begin_src clojure | 223 define its own way of turning sense-data into pictures, while the |
224 actual rendering of said pictures stays in one central place. | |
225 =(points->image)= helps senses generate a base image onto which they | |
226 can overlay actual sense data. | |
227 | |
228 #+name: view-senses | 
229 #+begin_src clojure | |
230 (defn view-sense | |
231 "Take a kernel that produces a BufferedImage from some sense data | |
232 and return a function which takes a list of sense data, uses the | |
233 kernel to convert to images, and displays those images, each in | |
234 its own JFrame." | |
235 [sense-display-kernel] | |
236 (let [windows (atom [])] | |
237 (fn [data] | |
238 (if (> (count data) (count @windows)) | |
239 (reset! | |
240 windows (map (fn [_] (view-image)) (range (count data))))) | |
241 (dorun | |
242 (map | |
243 (fn [display datum] | |
244 (display (sense-display-kernel datum))) | |
245 @windows data))))) | |
246 | |
247 (defn points->image | |
248 "Take a collection of points and visuliaze it as a BufferedImage." | |
249 [points] | |
250 (if (empty? points) | |
251 (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY) | |
252 (let [xs (vec (map first points)) | |
253 ys (vec (map second points)) | |
254 x0 (apply min xs) | |
255 y0 (apply min ys) | |
256 width (- (apply max xs) x0) | |
257 height (- (apply max ys) y0) | |
258 image (BufferedImage. (inc width) (inc height) | |
259 BufferedImage/TYPE_INT_RGB)] | |
260 (dorun | |
261 (for [x (range (.getWidth image)) | |
262 y (range (.getHeight image))] | |
263 (.setRGB image x y 0xFF0000))) | |
264 (dorun | |
265 (for [index (range (count points))] | |
266 (.setRGB image (- (xs index) x0) (- (ys index) y0) -1))) | |
267 image))) | |
268 | |
269 (defn gray | |
270 "Create a gray RGB pixel with R, G, and B set to num. num must be | |
271 between 0 and 255." | |
272 [num] | |
273 (+ num | |
274 (bit-shift-left num 8) | |
275 (bit-shift-left num 16))) | |
276 #+end_src | |
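For example, =(view-sense)= composes with =(points->image)= to display
collections of 2D points, one JFrame per collection (the point values
are invented):

#+begin_src clojure
(comment
  (def view-points (view-sense points->image))
  (view-points [#{[0 0] [1 1] [2 2]}
                #{[5 5] [5 6]}])
  (gray 128)  ;; => 8421504, i.e. the RGB pixel 0x808080
  )
#+end_src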
277 | |
278 * Building a Sense from Nodes | |
279 My method for defining senses in blender is the following: | |
280 | |
281 Senses like vision and hearing are localized to a single point | |
282 and follow a particular object around. For these: | |
283 | |
284 - Create a single top-level empty node whose name is the name of the sense | |
285 - Add empty nodes which each contain meta-data relevant | |
286 to the sense, including a UV-map describing the number/distribution | |
287 of sensors if applicable. | 
288 - Make each empty-node the child of the top-level | |
289 node. =(sense-nodes)= below generates functions to find these children. | |
290 | |
291 For touch, store the path to the UV-map which describes touch-sensors in the | |
292 meta-data of the object to which that map applies. | |
293 | |
294 Each sense provides code that analyzes the Node structure of the | |
295 creature and creates sense-functions. They also modify the Node | |
296 structure if necessary. | |
297 | |
298 Empty nodes created in blender have no appearance or physical presence | |
299 in jMonkeyEngine, but do appear in the scene graph. Empty nodes that | |
300 represent a sense which "follows" another geometry (like eyes and | |
301 ears) follow the closest physical object. =(closest-node)= finds this | |
302 closest object given the Creature and a particular empty node. | |
303 | |
304 #+name: node-1 | |
305 #+begin_src clojure | |
306 (defn sense-nodes | |
307 "For some senses there is a special empty blender node whose | |
308 children are considered markers for an instance of that sense. This | |
309 function generates functions to find those children, given the name | |
310 of the special parent node." | |
311 [parent-name] | |
312 (fn [#^Node creature] | |
313 (if-let [sense-node (.getChild creature parent-name)] | |
314 (seq (.getChildren sense-node)) | |
315 (do (println-repl "could not find" parent-name "node") [])))) | |
316 | |
170 (defn closest-node | 317 (defn closest-node |
171 "Return the node in creature which is closest to the given node." | 318 "Return the node in creature which is closest to the given node." |
172 [#^Node creature #^Node eye] | 319 [#^Node creature #^Node empty] |
173 (loop [radius (float 0.01)] | 320 (loop [radius (float 0.01)] |
174 (let [results (CollisionResults.)] | 321 (let [results (CollisionResults.)] |
175 (.collideWith | 322 (.collideWith |
176 creature | 323 creature |
177 (BoundingBox. (.getWorldTranslation eye) | 324 (BoundingBox. (.getWorldTranslation empty) |
178 radius radius radius) | 325 radius radius radius) |
179 results) | 326 results) |
180 (if-let [target (first results)] | 327 (if-let [target (first results)] |
181 (.getGeometry target) | 328 (.getGeometry target) |
182 (recur (float (* 2 radius))))))) | 329 (recur (float (* 2 radius))))))) |
183 | 330 |
331 (defn world-to-local | |
332 "Convert the world coordinates into coordinates relative to the | |
333 object (i.e. local coordinates), taking into account the rotation | |
334 of object." | |
335 [#^Spatial object world-coordinate] | |
336 (.worldToLocal object world-coordinate nil)) | |
337 | |
338 (defn local-to-world | |
339 "Convert the local coordinates into world relative coordinates" | |
340 [#^Spatial object local-coordinate] | |
341 (.localToWorld object local-coordinate nil)) | |
342 #+end_src | |
343 | |
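For example, a hearing module might locate its markers and their
nearest geometries like this (the node name "ears" is illustrative):

#+begin_src clojure
(comment
  (def ears (sense-nodes "ears"))
  (ears creature)
  ;; => a seq of empty nodes, one per ear, or [] plus a warning if
  ;;    the creature has no node named "ears"
  (closest-node creature (first (ears creature)))
  ;; => the Geometry nearest to that empty node
  )
#+end_src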
344 =(bind-sense)= binds either a Camera or a Listener object to any | |
345 object so that they will follow that object no matter how it | |
346 moves. Here is some example code which shows a camera bound to a blue | |
347 box as it is buffeted by white cannonballs. | |
348 | |
349 #+name: node-2 | |
350 #+begin_src clojure | |
184 (defn bind-sense | 351 (defn bind-sense |
185 "Bind the sense to the Spatial such that it will maintain its | 352 "Bind the sense to the Spatial such that it will maintain its |
186 current position relative to the Spatial no matter how the spatial | 353 current position relative to the Spatial no matter how the spatial |
187 moves. 'sense can be either a Camera or Listener object." | 354 moves. 'sense can be either a Camera or Listener object." |
188 [#^Spatial obj sense] | 355 [#^Spatial obj sense] |
203 (.getWorldTranslation obj))) | 370 (.getWorldTranslation obj))) |
204 (.setRotation | 371 (.setRotation |
205 sense | 372 sense |
206 (.mult total-rotation initial-sense-rotation)))) | 373 (.mult total-rotation initial-sense-rotation)))) |
207 (controlRender [_ _]))))) | 374 (controlRender [_ _]))))) |
208 | 375 #+end_src |
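A minimal sketch of using =(bind-sense)= (assumes =box= is a Spatial
already in the scene; the Camera dimensions are arbitrary):

#+begin_src clojure
(comment
  (import com.jme3.renderer.Camera)
  (let [cam (Camera. 640 480)]
    (bind-sense box cam))
  ;; cam now keeps its position and rotation relative to box,
  ;; no matter how box moves
  )
#+end_src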
209 (defn world-to-local | 376 |
210 "Convert the world coordinates into coordinates relative to the | 377 |
211 object (i.e. local coordinates), taking into account the rotation | 378 |
212 of object." | 379 * Bookkeeping |
213 [#^Spatial object world-coordinate] | 380 Here is the header for this namespace, included for completeness.
214 (.worldToLocal object world-coordinate nil)) | 381 #+name: header
215 | 382 #+begin_src clojure |
216 (defn local-to-world | 383 (ns cortex.sense |
217 "Convert the local coordinates into world relative coordinates" | 384 "Here are functions useful in the construction of two or more |
218 [#^Spatial object local-coordinate] | 385 sensors/effectors." |
218 (.localToWorld object local-coordinate nil)) | 386 {:author "Robert McIntyre"}
220 | 387 (:use (cortex world util)) |
221 (defn sense-nodes | 388 (:import ij.process.ImageProcessor) |
222 "For each sense there is a special blender node whose children are | 389 (:import jme3tools.converters.ImageToAwt) |
223 considered markers for an instance of that sense. This function | 390 (:import java.awt.image.BufferedImage)
224 generates functions to find those children, given the name of the | 391 (:import com.jme3.collision.CollisionResults) |
225 special parent node." | 392 (:import com.jme3.bounding.BoundingBox) |
226 [parent-name] | 393 (:import (com.jme3.scene Node Spatial)) |
227 (fn [#^Node creature] | 394 (:import com.jme3.scene.control.AbstractControl) |
228 (if-let [sense-node (.getChild creature parent-name)] | 395 (:import (com.jme3.math Quaternion Vector3f))) |
229 (seq (.getChildren sense-node)) | 396 #+end_src |
230 (do (println-repl "could not find" parent-name "node") [])))) | 397 |
231 #+end_src | 398 * Source Listing |
232 | 399 Full source: [[../src/cortex/sense.clj][sense.clj]] |
233 * Viewing Senses | 400 |
234 #+name: view-senses | 
235 #+begin_src clojure | |
236 (defn view-sense | |
237 "Take a kernel that produces a BufferedImage from some sense data | |
238 and return a function which takes a list of sense data, uses the | |
239 kernel to convert to images, and displays those images, each in | 
240 its own JFrame." | |
241 [sense-display-kernel] | |
242 (let [windows (atom [])] | |
243 (fn [data] | |
244 (if (> (count data) (count @windows)) | |
245 (reset! | |
246 windows (map (fn [_] (view-image)) (range (count data))))) | |
247 (dorun | |
248 (map | |
249 (fn [display datum] | |
250 (display (sense-display-kernel datum))) | |
251 @windows data))))) | |
252 | |
253 (defn gray | |
254 "Create a gray RGB pixel with R, G, and B set to num. num must be | |
255 between 0 and 255." | |
256 [num] | |
257 (+ num | |
258 (bit-shift-left num 8) | |
259 (bit-shift-left num 16))) | |
260 #+end_src | |
261 | 401 |
262 * COMMENT generate source | 402 * COMMENT generate source |
263 #+begin_src clojure :tangle ../src/cortex/sense.clj | 403 #+begin_src clojure :tangle ../src/cortex/sense.clj |
264 <<header>> | 404 <<header>> |
265 <<blender>> | 405 <<blender-1>> |
266 <<topology>> | 406 <<blender-2>> |
267 <<node>> | 407 <<topology-1>> |
408 <<topology-2>> | |
409 <<node-1>> | |
410 <<node-2>> | |
268 <<view-senses>> | 411 <<view-senses>> |
269 #+end_src | 412 #+end_src |