rlm@37 1 #+title: Simulated Sense of Touch
rlm@0 2 #+author: Robert McIntyre
rlm@0 3 #+email: rlm@mit.edu
rlm@37 4 #+description: Simulated touch for AI research using JMonkeyEngine and clojure.
rlm@37 5 #+keywords: simulation, tactile sense, jMonkeyEngine3, clojure
rlm@4 6 #+SETUPFILE: ../../aurellem/org/setup.org
rlm@4 7 #+INCLUDE: ../../aurellem/org/level-0.org
rlm@0 8
rlm@37 9 * Touch
rlm@0 10
Touch is critical to navigation and spatial reasoning, and as such I
need a simulated version of it to give to my AI creatures.
rlm@0 13
Human skin has a wide array of touch sensors, each of which
specializes in detecting different vibrational modes and pressures.
These sensors can integrate a vast expanse of skin (e.g. your entire
palm), or a tiny patch of skin at the tip of your finger. The hairs of
the skin help detect objects before they even come into contact with
the skin proper.
rlm@228 20
However, touch in my simulated world cannot exactly correspond to
human touch because my creatures are made out of completely rigid
segments that don't deform like human skin.
rlm@248 24
rlm@228 25 Instead of measuring deformation or vibration, I surround each rigid
rlm@247 26 part with a plenitude of hair-like objects (/feelers/) which do not
rlm@247 27 interact with the physical world. Physical objects can pass through
rlm@248 28 them with no effect. The feelers are able to tell when other objects
rlm@248 29 pass through them, and they constantly report how much of their extent
is covered. So even though the creature's body parts do not deform,
the feelers create a margin around those body parts which yields a
sense of touch that is a hybrid between a human's sense of
deformation and sense from hairs.
rlm@228 34
rlm@306 35 Implementing touch in jMonkeyEngine follows a different technical route
rlm@228 36 than vision and hearing. Those two senses piggybacked off
rlm@228 37 jMonkeyEngine's 3D audio and video rendering subsystems. To simulate
rlm@247 38 touch, I use jMonkeyEngine's physics system to execute many small
rlm@247 39 collision detections, one for each feeler. The placement of the
rlm@247 40 feelers is determined by a UV-mapped image which shows where each
rlm@247 41 feeler should be on the 3D surface of the body.
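
At the lowest level, each feeler is just a jMonkeyEngine =Ray= cast
against the scene. A minimal sketch of that per-feeler test might look
like the following, assuming =node= is the scene to collide against
and =ray= is one feeler with its limit set to the feeler's length (the
real version appears later in =touch-kernel=):

#+begin_src clojure
(let [results (com.jme3.collision.CollisionResults.)]
  (.collideWith node ray results)
  (if (> (.size results) 0)
    ;; distance along the feeler to the first thing it hit
    (.getDistance (.getClosestCollision results))
    ;; nothing touched; report the feeler's full length
    (.getLimit ray)))
#+end_src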
rlm@228 42
rlm@229 43 * Defining Touch Meta-Data in Blender
rlm@229 44
Each geometry can have a single UV map which describes the position
of the feelers which will constitute its sense of touch. The path to
this image is stored under the "touch" key. The image itself is black
and white, with black meaning a feeler length of 0 (no feeler is
present) and white meaning a feeler length of =scale=, a float stored
under the key "scale".
rlm@229 51
rlm@231 52 #+name: meta-data
rlm@0 53 #+begin_src clojure
rlm@229 54 (defn tactile-sensor-profile
rlm@229 55 "Return the touch-sensor distribution image in BufferedImage format,
rlm@229 56 or nil if it does not exist."
rlm@229 57 [#^Geometry obj]
rlm@229 58 (if-let [image-path (meta-data obj "touch")]
rlm@229 59 (load-image image-path)))
rlm@233 60
(defn tactile-scale
  "Return the length of each feeler. Default scale is 0.1
  jMonkeyEngine units."
  [#^Geometry obj]
  (if-let [scale (meta-data obj "scale")]
    scale 0.1))
rlm@228 67 #+end_src
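
As a quick sanity check, here is how these two functions might be used
at the REPL, assuming =geo= is bound to some hypothetical Geometry
exported from blender:

#+begin_src clojure
;; The profile is a BufferedImage (or nil if the geometry has no
;; "touch" key); the scale is a plain float.
(when-let [profile (tactile-sensor-profile geo)]
  [(.getWidth profile) (.getHeight profile) (tactile-scale geo)])
#+end_src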
rlm@156 68
rlm@246 69 Here is an example of a UV-map which specifies the position of touch
rlm@247 70 sensors along the surface of the upper segment of the worm.
rlm@229 71
rlm@246 72 #+attr_html: width=755
rlm@246 73 #+caption: This is the tactile-sensor-profile for the upper segment of the worm. It defines regions of high touch sensitivity (where there are many white pixels) and regions of low sensitivity (where white pixels are sparse).
rlm@246 74 [[../images/finger-UV.png]]
rlm@234 75
rlm@247 76 * Implementation Summary
rlm@247 77
To simulate touch there are three conceptual steps. For each solid
object in the creature, you first have to get the UV image and scale
parameter which define the position and length of the feelers. Then,
you use the triangles which compose the mesh and the UV data stored in
the mesh to determine the world-space position and orientation of each
feeler. Finally, once every frame, you update these positions and
orientations to match the current position and orientation of the
object, and use physics collision detection to gather tactile data.
rlm@238 86
rlm@247 87 Extracting the meta-data has already been described. The third step,
rlm@273 88 physics collision detection, is handled in =touch-kernel=.
rlm@247 89 Translating the positions and orientations of the feelers from the
rlm@248 90 UV-map to world-space is itself a three-step process.
rlm@239 91
- Find the triangles which make up the mesh in pixel-space and in
  world-space (=triangles=, =pixel-triangles=).

- Find the coordinates of each feeler in world-space. These are the
  origins of the feelers (=feeler-origins=).

- Calculate the normals of the triangles in world space, and add
  them to each of the origins of the feelers. These are the
  coordinates of the tips of the feelers (=feeler-tips=).
rlm@239 101
rlm@247 102 * Triangle Math
rlm@306 103 ** Shrapnel Conversion Functions
rlm@239 104
rlm@247 105 #+name: triangles-1
rlm@247 106 #+begin_src clojure
rlm@247 107 (defn vector3f-seq [#^Vector3f v]
rlm@247 108 [(.getX v) (.getY v) (.getZ v)])
rlm@247 109
rlm@247 110 (defn triangle-seq [#^Triangle tri]
rlm@247 111 [(vector3f-seq (.get1 tri))
rlm@247 112 (vector3f-seq (.get2 tri))
rlm@247 113 (vector3f-seq (.get3 tri))])
rlm@247 114
rlm@247 115 (defn ->vector3f
rlm@247 116 ([coords] (Vector3f. (nth coords 0 0)
rlm@247 117 (nth coords 1 0)
rlm@247 118 (nth coords 2 0))))
rlm@247 119
rlm@247 120 (defn ->triangle [points]
rlm@247 121 (apply #(Triangle. %1 %2 %3) (map ->vector3f points)))
rlm@247 122 #+end_src
rlm@247 123
It is convenient to treat a =Triangle= as a vector of vectors, and a
=Vector2f= or =Vector3f= as a vector of floats. =->vector3f= and
=->triangle= undo the operations of =vector3f-seq= and
=triangle-seq=. If these classes implemented =Iterable= then =seq=
would work on them automatically.
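
For example, a triangle survives the round trip unchanged (the
coordinates come back as floats because =Vector3f= stores floats):

#+begin_src clojure
(triangle-seq
 (->triangle [[0 0 0] [1 0 0] [0 1 0]]))
;; => [[0.0 0.0 0.0] [1.0 0.0 0.0] [0.0 1.0 0.0]]
#+end_src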
rlm@248 129
rlm@247 130 ** Decomposing a 3D shape into Triangles
rlm@247 131
rlm@248 132 The rigid objects which make up a creature have an underlying
rlm@247 133 =Geometry=, which is a =Mesh= plus a =Material= and other important
rlm@248 134 data involved with displaying the object.
rlm@247 135
rlm@247 136 A =Mesh= is composed of =Triangles=, and each =Triangle= has three
rlm@306 137 vertices which have coordinates in world space and UV space.
rlm@247 138
rlm@273 139 Here, =triangles= gets all the world-space triangles which compose a
rlm@273 140 mesh, while =pixel-triangles= gets those same triangles expressed in
rlm@247 141 pixel coordinates (which are UV coordinates scaled to fit the height
rlm@247 142 and width of the UV image).
rlm@247 143
rlm@247 144 #+name: triangles-2
rlm@247 145 #+begin_src clojure
rlm@247 146 (in-ns 'cortex.touch)
rlm@247 147 (defn triangle
rlm@247 148 "Get the triangle specified by triangle-index from the mesh."
rlm@247 149 [#^Geometry geo triangle-index]
rlm@247 150 (triangle-seq
rlm@247 151 (let [scratch (Triangle.)]
rlm@247 152 (.getTriangle (.getMesh geo) triangle-index scratch) scratch)))
rlm@247 153
rlm@247 154 (defn triangles
rlm@247 155 "Return a sequence of all the Triangles which compose a given
rlm@247 156 Geometry."
rlm@247 157 [#^Geometry geo]
rlm@247 158 (map (partial triangle geo) (range (.getTriangleCount (.getMesh geo)))))
rlm@247 159
rlm@247 160 (defn triangle-vertex-indices
rlm@247 161 "Get the triangle vertex indices of a given triangle from a given
rlm@247 162 mesh."
rlm@247 163 [#^Mesh mesh triangle-index]
rlm@247 164 (let [indices (int-array 3)]
rlm@247 165 (.getTriangle mesh triangle-index indices)
rlm@247 166 (vec indices)))
rlm@247 167
rlm@247 168 (defn vertex-UV-coord
rlm@247 169 "Get the UV-coordinates of the vertex named by vertex-index"
rlm@247 170 [#^Mesh mesh vertex-index]
rlm@247 171 (let [UV-buffer
rlm@247 172 (.getData
rlm@247 173 (.getBuffer
rlm@247 174 mesh
rlm@247 175 VertexBuffer$Type/TexCoord))]
rlm@247 176 [(.get UV-buffer (* vertex-index 2))
rlm@247 177 (.get UV-buffer (+ 1 (* vertex-index 2)))]))
rlm@247 178
(defn pixel-triangle
  "Get the pixel-space triangle of the Geometry at the given triangle
  index, using the width and height of the UV image."
  [#^Geometry geo image index]
rlm@247 180 (let [mesh (.getMesh geo)
rlm@247 181 width (.getWidth image)
rlm@247 182 height (.getHeight image)]
rlm@247 183 (vec (map (fn [[u v]] (vector (* width u) (* height v)))
rlm@247 184 (map (partial vertex-UV-coord mesh)
rlm@247 185 (triangle-vertex-indices mesh index))))))
rlm@247 186
rlm@248 187 (defn pixel-triangles
rlm@248 188 "The pixel-space triangles of the Geometry, in the same order as
rlm@248 189 (triangles geo)"
rlm@248 190 [#^Geometry geo image]
rlm@248 191 (let [height (.getHeight image)
rlm@248 192 width (.getWidth image)]
rlm@248 193 (map (partial pixel-triangle geo image)
rlm@248 194 (range (.getTriangleCount (.getMesh geo))))))
rlm@247 195 #+end_src
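
The pixel coordinates are nothing more than UV coordinates scaled by
the image dimensions. For a hypothetical 512x512 profile image, a
vertex with UV coordinate [0.25 0.5] lands at pixel [128.0 256.0]:

#+begin_src clojure
(let [width 512 height 512 [u v] [0.25 0.5]]
  [(* width u) (* height v)])
;; => [128.0 256.0]
#+end_src
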
rlm@247 196 ** The Affine Transform from one Triangle to Another
rlm@247 197
rlm@273 198 =pixel-triangles= gives us the mesh triangles expressed in pixel
rlm@273 199 coordinates and =triangles= gives us the mesh triangles expressed in
rlm@247 200 world coordinates. The tactile-sensor-profile gives the position of
rlm@248 201 each feeler in pixel-space. In order to convert pixel-space
rlm@247 202 coordinates into world-space coordinates we need something that takes
rlm@247 203 coordinates on the surface of one triangle and gives the corresponding
rlm@247 204 coordinates on the surface of another triangle.
rlm@247 205
Triangles are [[http://mathworld.wolfram.com/AffineTransformation.html][affine]], which means any triangle can be transformed into
any other by a combination of translation, scaling, and
rotation. The affine transformation from one triangle to another
is readily computable if each triangle is expressed in terms of a
$4 \times 4$ matrix.
rlm@247 211
\begin{equation}
\begin{bmatrix}
x_1 & x_2 & x_3 & n_x \\
y_1 & y_2 & y_3 & n_y \\
z_1 & z_2 & z_3 & n_z \\
1 & 1 & 1 & 1
\end{bmatrix}
\end{equation}
rlm@247 218
rlm@306 219 Here, the first three columns of the matrix are the vertices of the
rlm@247 220 triangle. The last column is the right-handed unit normal of the
rlm@247 221 triangle.
rlm@247 222
With two triangles $T_{1}$ and $T_{2}$ each expressed as a matrix like
the one above, the affine transform from $T_{1}$ to $T_{2}$ is

$T_{2}T_{1}^{-1}$

Because each vertex appears as a column with a final entry of 1, this
transform carries every vertex of $T_{1}$ onto the corresponding
vertex of $T_{2}$. The clojure code below recapitulates this formula
using jMonkeyEngine's =Matrix4f= objects, which can describe any
affine transformation.
rlm@247 231
rlm@247 232 #+name: triangles-3
rlm@247 233 #+begin_src clojure
rlm@247 234 (in-ns 'cortex.touch)
rlm@247 235
rlm@247 236 (defn triangle->matrix4f
rlm@247 237 "Converts the triangle into a 4x4 matrix: The first three columns
rlm@247 238 contain the vertices of the triangle; the last contains the unit
rlm@247 239 normal of the triangle. The bottom row is filled with 1s."
rlm@247 240 [#^Triangle t]
rlm@247 241 (let [mat (Matrix4f.)
rlm@247 242 [vert-1 vert-2 vert-3]
rlm@247 243 ((comp vec map) #(.get t %) (range 3))
rlm@247 244 unit-normal (do (.calculateNormal t)(.getNormal t))
rlm@247 245 vertices [vert-1 vert-2 vert-3 unit-normal]]
rlm@247 246 (dorun
rlm@247 247 (for [row (range 4) col (range 3)]
rlm@247 248 (do
rlm@247 249 (.set mat col row (.get (vertices row) col))
rlm@247 250 (.set mat 3 row 1)))) mat))
rlm@247 251
rlm@247 252 (defn triangles->affine-transform
rlm@247 253 "Returns the affine transformation that converts each vertex in the
rlm@247 254 first triangle into the corresponding vertex in the second
rlm@247 255 triangle."
rlm@247 256 [#^Triangle tri-1 #^Triangle tri-2]
rlm@247 257 (.mult
rlm@247 258 (triangle->matrix4f tri-2)
rlm@247 259 (.invert (triangle->matrix4f tri-1))))
rlm@247 260 #+end_src
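
As a sanity check, the transform built from two (hypothetical)
triangles should carry each vertex of the first onto the corresponding
vertex of the second:

#+begin_src clojure
(let [tri-1 (->triangle [[0 0 0] [1 0 0] [0 1 0]])
      tri-2 (->triangle [[2 0 0] [4 0 0] [2 2 0]])
      T     (triangles->affine-transform tri-1 tri-2)]
  ;; the second vertex of tri-1 maps to the second vertex of tri-2
  (.mult T (Vector3f. 1 0 0)))
;; => a Vector3f approximately equal to (4.0, 0.0, 0.0)
#+end_src
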
rlm@247 261 ** Triangle Boundaries
rlm@247 262
For efficiency's sake I will divide the tactile-profile image into
small rectangles which bound each pixel-triangle, then extract the
points which lie inside the triangle and map them to 3D-space using
=triangles->affine-transform= above. To do this I need a function,
=convex-bounds=, which finds the smallest box containing a 2D
triangle.

=inside-triangle?= determines whether a point is inside a triangle
in 2D pixel-space.
rlm@247 272
rlm@247 273 #+name: triangles-4
rlm@247 274 #+begin_src clojure
(defn convex-bounds
  "Returns the smallest axis-aligned rectangle containing the given
  vertices, as a vector [left top width height]."
  [verts]
rlm@247 279 (let [xs (map first verts)
rlm@247 280 ys (map second verts)
rlm@247 281 x0 (Math/floor (apply min xs))
rlm@247 282 y0 (Math/floor (apply min ys))
rlm@247 283 x1 (Math/ceil (apply max xs))
rlm@247 284 y1 (Math/ceil (apply max ys))]
rlm@247 285 [x0 y0 (- x1 x0) (- y1 y0)]))
rlm@247 286
rlm@247 287 (defn same-side?
rlm@247 288 "Given the points p1 and p2 and the reference point ref, is point p
rlm@247 289 on the same side of the line that goes through p1 and p2 as ref is?"
rlm@247 290 [p1 p2 ref p]
rlm@247 291 (<=
rlm@247 292 0
rlm@247 293 (.dot
rlm@247 294 (.cross (.subtract p2 p1) (.subtract p p1))
rlm@247 295 (.cross (.subtract p2 p1) (.subtract ref p1)))))
rlm@247 296
rlm@247 297 (defn inside-triangle?
rlm@247 298 "Is the point inside the triangle?"
rlm@247 299 {:author "Dylan Holmes"}
rlm@247 300 [#^Triangle tri #^Vector3f p]
rlm@247 301 (let [[vert-1 vert-2 vert-3] [(.get1 tri) (.get2 tri) (.get3 tri)]]
rlm@247 302 (and
rlm@247 303 (same-side? vert-1 vert-2 vert-3 p)
rlm@247 304 (same-side? vert-2 vert-3 vert-1 p)
rlm@247 305 (same-side? vert-3 vert-1 vert-2 p))))
rlm@247 306 #+end_src
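
A couple of (hypothetical) REPL examples show what these helpers do:
=convex-bounds= returns the bounding box of a 2D triangle, and
=inside-triangle?= is an ordinary point-in-triangle test:

#+begin_src clojure
(convex-bounds [[0.2 0.1] [3.7 0.4] [1.5 2.8]])
;; => [0.0 0.0 4.0 3.0]

(let [tri (->triangle [[0 0 0] [4 0 0] [0 4 0]])]
  [(inside-triangle? tri (Vector3f. 1 1 0))
   (inside-triangle? tri (Vector3f. 5 5 0))])
;; => [true false]
#+end_src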
rlm@247 307
rlm@247 308 * Feeler Coordinates
rlm@247 309
rlm@247 310 The triangle-related functions above make short work of calculating
rlm@247 311 the positions and orientations of each feeler in world-space.
rlm@247 312
rlm@247 313 #+name: sensors
rlm@247 314 #+begin_src clojure
rlm@247 315 (in-ns 'cortex.touch)
rlm@247 316
rlm@247 317 (defn feeler-pixel-coords
rlm@247 318 "Returns the coordinates of the feelers in pixel space in lists, one
rlm@247 319 list for each triangle, ordered in the same way as (triangles) and
rlm@247 320 (pixel-triangles)."
rlm@247 321 [#^Geometry geo image]
rlm@247 322 (map
rlm@247 323 (fn [pixel-triangle]
rlm@247 324 (filter
rlm@247 325 (fn [coord]
rlm@247 326 (inside-triangle? (->triangle pixel-triangle)
rlm@247 327 (->vector3f coord)))
rlm@247 328 (white-coordinates image (convex-bounds pixel-triangle))))
rlm@247 329 (pixel-triangles geo image)))
rlm@247 330
rlm@247 331 (defn feeler-world-coords
rlm@247 332 "Returns the coordinates of the feelers in world space in lists, one
rlm@247 333 list for each triangle, ordered in the same way as (triangles) and
rlm@247 334 (pixel-triangles)."
rlm@247 335 [#^Geometry geo image]
rlm@247 336 (let [transforms
rlm@247 337 (map #(triangles->affine-transform
rlm@247 338 (->triangle %1) (->triangle %2))
rlm@247 339 (pixel-triangles geo image)
rlm@247 340 (triangles geo))]
rlm@247 341 (map (fn [transform coords]
rlm@247 342 (map #(.mult transform (->vector3f %)) coords))
rlm@247 343 transforms (feeler-pixel-coords geo image))))
rlm@247 344
rlm@247 345 (defn feeler-origins
rlm@247 346 "The world space coordinates of the root of each feeler."
rlm@247 347 [#^Geometry geo image]
rlm@247 348 (reduce concat (feeler-world-coords geo image)))
rlm@247 349
rlm@247 350 (defn feeler-tips
rlm@247 351 "The world space coordinates of the tip of each feeler."
rlm@247 352 [#^Geometry geo image]
rlm@247 353 (let [world-coords (feeler-world-coords geo image)
rlm@247 354 normals
rlm@247 355 (map
rlm@247 356 (fn [triangle]
rlm@247 357 (.calculateNormal triangle)
rlm@247 358 (.clone (.getNormal triangle)))
rlm@247 359 (map ->triangle (triangles geo)))]
rlm@247 360
rlm@247 361 (mapcat (fn [origins normal]
rlm@247 362 (map #(.add % normal) origins))
rlm@247 363 world-coords normals)))
rlm@247 364
(defn touch-topology
  "Return the touch-sensor topology of the Geometry: the pixel
  coordinates of every feeler, collapsed into a contiguous region
  that preserves their relative positions."
  [#^Geometry geo image]
rlm@247 368 (collapse (reduce concat (feeler-pixel-coords geo image))))
rlm@247 369 #+end_src
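
Each feeler contributes exactly one origin, one tip, and one point in
the 2D touch topology, so a quick consistency check (with =geo=
standing in for any hypothetical Geometry that has a touch profile)
might look like this:

#+begin_src clojure
(let [image (tactile-sensor-profile geo)]
  [(count (feeler-origins geo image))
   (count (feeler-tips geo image))
   (count (touch-topology geo image))])
;; => three equal numbers, one per feeler
#+end_src
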
rlm@247 370 * Simulated Touch
rlm@247 371
=touch-kernel= generates a function to be called from within a
simulation that performs the necessary physics collisions to collect
tactile data, and =touch!= applies it to every =Geometry= in the
creature.
rlm@238 376
rlm@233 377 #+name: kernel
rlm@233 378 #+begin_src clojure
rlm@233 379 (in-ns 'cortex.touch)
rlm@233 380
rlm@244 381 (defn set-ray [#^Ray ray #^Matrix4f transform
rlm@244 382 #^Vector3f origin #^Vector3f tip]
rlm@306 383 ;; Doing everything locally reduces garbage collection by enough to
rlm@243 384 ;; be worth it.
rlm@243 385 (.mult transform origin (.getOrigin ray))
rlm@243 386 (.mult transform tip (.getDirection ray))
rlm@249 387 (.subtractLocal (.getDirection ray) (.getOrigin ray))
rlm@249 388 (.normalizeLocal (.getDirection ray)))
rlm@242 389
rlm@249 390 (import com.jme3.math.FastMath)
rlm@249 391
rlm@249 392
rlm@233 393 (defn touch-kernel
rlm@234 394 "Constructs a function which will return tactile sensory data from
rlm@234 395 'geo when called from inside a running simulation"
rlm@234 396 [#^Geometry geo]
rlm@243 397 (if-let
rlm@243 398 [profile (tactile-sensor-profile geo)]
rlm@243 399 (let [ray-reference-origins (feeler-origins geo profile)
rlm@243 400 ray-reference-tips (feeler-tips geo profile)
rlm@244 401 ray-length (tactile-scale geo)
rlm@243 402 current-rays (map (fn [_] (Ray.)) ray-reference-origins)
rlm@249 403 topology (touch-topology geo profile)
rlm@249 404 correction (float (* ray-length -0.2))]
rlm@249 405
rlm@249 406 ;; slight tolerance for very close collisions.
rlm@249 407 (dorun
rlm@249 408 (map (fn [origin tip]
rlm@249 409 (.addLocal origin (.mult (.subtract tip origin)
rlm@249 410 correction)))
rlm@249 411 ray-reference-origins ray-reference-tips))
rlm@244 412 (dorun (map #(.setLimit % ray-length) current-rays))
rlm@233 413 (fn [node]
rlm@243 414 (let [transform (.getWorldMatrix geo)]
rlm@243 415 (dorun
rlm@244 416 (map (fn [ray ref-origin ref-tip]
rlm@244 417 (set-ray ray transform ref-origin ref-tip))
rlm@243 418 current-rays ray-reference-origins
rlm@244 419 ray-reference-tips))
rlm@233 420 (vector
rlm@243 421 topology
rlm@233 422 (vec
rlm@243 423 (for [ray current-rays]
rlm@233 424 (do
rlm@233 425 (let [results (CollisionResults.)]
rlm@233 426 (.collideWith node ray results)
rlm@233 427 (let [touch-objects
rlm@233 428 (filter #(not (= geo (.getGeometry %)))
rlm@249 429 results)
rlm@249 430 limit (.getLimit ray)]
rlm@233 431 [(if (empty? touch-objects)
rlm@249 432 limit
rlm@249 433 (let [response
rlm@249 434 (apply min (map #(.getDistance %)
rlm@249 435 touch-objects))]
rlm@249 436 (FastMath/clamp
rlm@249 437 (float
rlm@283 438 (if (> response limit) (float 0.0)
rlm@249 439 (+ response correction)))
rlm@249 440 (float 0.0)
rlm@249 441 limit)))
rlm@249 442 limit])))))))))))
rlm@233 443
rlm@233 444 (defn touch!
rlm@233 445 "Endow the creature with the sense of touch. Returns a sequence of
rlm@306 446 functions, one for each body part with a tactile-sensor-profile,
rlm@233 447 each of which when called returns sensory data for that body part."
rlm@233 448 [#^Node creature]
rlm@233 449 (filter
rlm@233 450 (comp not nil?)
rlm@233 451 (map touch-kernel
rlm@233 452 (filter #(isa? (class %) Geometry)
rlm@233 453 (node-seq creature)))))
rlm@233 454 #+end_src
rlm@233 455
rlm@249 456 #+results: kernel
rlm@249 457 : #'cortex.touch/touch!
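
Here is a sketch of how the functions produced by =touch!= might be
consumed from inside a simulation loop, assuming =creature= is a
hypothetical physics-enabled =Node= that has already been added to the
world (the tests below do essentially this):

#+begin_src clojure
(def touch-fns (touch! creature))

(defn tactile-frame
  "Collect one frame of touch data: a [topology sensor-data] pair
  for each body part that has a tactile-sensor-profile."
  [root-node]
  (map (fn [touch-fn] (touch-fn root-node)) touch-fns))
#+end_src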
rlm@249 458
rlm@247 459 * Visualizing Touch
rlm@238 460
rlm@249 461 Each feeler is represented in the image as a single pixel. The
rlm@306 462 greyscale value of each pixel represents how deep the feeler
rlm@249 463 represented by that pixel is inside another object. Black means that
rlm@249 464 nothing is touching the feeler, while white means that the feeler is
rlm@249 465 completely inside another object, which is presumably flush with the
rlm@249 466 surface of the triangle from which the feeler originates.
rlm@249 467
rlm@233 468 #+name: visualization
rlm@233 469 #+begin_src clojure
rlm@233 470 (in-ns 'cortex.touch)
rlm@233 471
rlm@233 472 (defn touch->gray
rlm@306 473 "Convert a pair of [distance, max-distance] into a gray-scale pixel."
rlm@233 474 [distance max-distance]
rlm@245 475 (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256))))
rlm@233 476
rlm@233 477 (defn view-touch
rlm@245 478 "Creates a function which accepts a list of touch sensor-data and
rlm@233 479 displays each element to the screen."
rlm@233 480 []
rlm@233 481 (view-sense
rlm@246 482 (fn [[coords sensor-data]]
rlm@233 483 (let [image (points->image coords)]
rlm@233 484 (dorun
rlm@233 485 (for [i (range (count coords))]
rlm@250 486 (.setRGB image ((coords i) 0) ((coords i) 1)
rlm@250 487 (apply touch->gray (sensor-data i)))))
rlm@283 488 image))))
rlm@233 489 #+end_src
rlm@249 490
rlm@249 491 #+results: visualization
rlm@249 492 : #'cortex.touch/view-touch
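
Two quick checks of the greyscale mapping, for a feeler of length 0.1:
an untouched feeler reports its full limit and maps to black, while a
feeler whose collision distance is zero maps to white.

#+begin_src clojure
(touch->gray 0.1 0.1) ; => (gray 0),   black
(touch->gray 0.0 0.1) ; => (gray 255), white
#+end_src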
rlm@249 493
rlm@250 494 * Basic Test of Touch
rlm@249 495
rlm@249 496 The worm's sense of touch is a bit complicated, so for this basic test
rlm@249 497 I'll use a new creature --- a simple cube which has touch sensors
rlm@249 498 evenly distributed along each of its sides.
rlm@249 499
rlm@253 500 #+name: test-touch-0
rlm@249 501 #+begin_src clojure
rlm@249 502 (in-ns 'cortex.test.touch)
rlm@249 503
rlm@249 504 (defn touch-cube []
rlm@249 505 (load-blender-model "Models/test-touch/touch-cube.blend"))
rlm@249 506 #+end_src
rlm@249 507
rlm@253 508 ** The Touch Cube
rlm@249 509 #+begin_html
rlm@249 510 <div class="figure">
rlm@249 511 <center>
rlm@249 512 <video controls="controls" width="500">
rlm@249 513 <source src="../video/touch-cube.ogg" type="video/ogg"
rlm@249 514 preload="none" poster="../images/aurellem-1280x480.png" />
rlm@249 515 </video>
rlm@309 516 <br> <a href="http://youtu.be/aEao4m8meAI"> YouTube </a>
rlm@249 517 </center>
rlm@249 518 <p>A simple creature with evenly distributed touch sensors.</p>
rlm@249 519 </div>
rlm@249 520 #+end_html
rlm@249 521
rlm@249 522 The tactile-sensor-profile image for this simple creature looks like
rlm@249 523 this:
rlm@249 524
rlm@249 525 #+attr_html: width=500
rlm@249 526 #+caption: The distribution of feelers along the touch-cube. The colors of the faces are irrelevant; only the white pixels specify feelers.
rlm@249 527 [[../images/touch-profile.png]]
rlm@249 528
rlm@253 529 #+name: test-touch-1
rlm@249 530 #+begin_src clojure
rlm@249 531 (in-ns 'cortex.test.touch)
rlm@249 532
rlm@249 533 (defn test-basic-touch
rlm@321 534 "Testing touch:
rlm@321 535 You should see a cube fall onto a table. There is a cross-shaped
rlm@321 536 display which reports the cube's sensation of touch. This display
rlm@321 537 should change when the cube hits the table, and whenever you hit
rlm@321 538 the cube with balls.
rlm@321 539
rlm@321 540 Keys:
rlm@321 541 <space> : fire ball"
rlm@249 542 ([] (test-basic-touch false))
rlm@249 543 ([record?]
rlm@249 544 (let [the-cube (doto (touch-cube) (body!))
rlm@249 545 touch (touch! the-cube)
rlm@249 546 touch-display (view-touch)]
rlm@250 547 (world
rlm@250 548 (nodify [the-cube
rlm@250 549 (box 10 1 10 :position (Vector3f. 0 -10 0)
rlm@250 550 :color ColorRGBA/Gray :mass 0)])
rlm@250 551
rlm@250 552 standard-debug-controls
rlm@250 553
rlm@250 554 (fn [world]
rlm@250 555 (if record?
rlm@250 556 (Capture/captureVideo
rlm@250 557 world
rlm@250 558 (File. "/home/r/proj/cortex/render/touch-cube/main-view/")))
rlm@250 559 (speed-up world)
rlm@250 560 (light-up-everything world))
rlm@250 561
rlm@250 562 (fn [world tpf]
rlm@250 563 (touch-display
rlm@250 564 (map #(% (.getRootNode world)) touch)
rlm@250 565 (if record?
rlm@250 566 (File. "/home/r/proj/cortex/render/touch-cube/touch/"))))))))
rlm@250 567 #+end_src
rlm@249 568
rlm@250 569 ** Basic Touch Demonstration
rlm@249 570
rlm@250 571 #+begin_html
rlm@250 572 <div class="figure">
rlm@250 573 <center>
rlm@250 574 <video controls="controls" width="755">
rlm@250 575 <source src="../video/basic-touch.ogg" type="video/ogg"
rlm@250 576 preload="none" poster="../images/aurellem-1280x480.png" />
rlm@250 577 </video>
rlm@309 578 <br> <a href="http://youtu.be/8xNEtD-a8f0"> YouTube </a>
rlm@250 579 </center>
rlm@250 580 <p>The simple creature responds to touch.</p>
rlm@250 581 </div>
rlm@250 582 #+end_html
rlm@249 583
rlm@250 584 ** Generating the Basic Touch Video
rlm@253 585 #+name: magick4
rlm@250 586 #+begin_src clojure
rlm@250 587 (ns cortex.video.magick4
rlm@250 588 (:import java.io.File)
rlm@316 589 (:use clojure.java.shell))
rlm@250 590
rlm@250 591 (defn images [path]
rlm@250 592 (sort (rest (file-seq (File. path)))))
rlm@250 593
rlm@250 594 (def base "/home/r/proj/cortex/render/touch-cube/")
rlm@250 595
rlm@250 596 (defn pics [file]
rlm@250 597 (images (str base file)))
rlm@250 598
rlm@250 599 (defn combine-images []
rlm@250 600 (let [main-view (pics "main-view")
rlm@250 601 touch (pics "touch/0")
rlm@250 602 background (repeat 9001 (File. (str base "background.png")))
rlm@250 603 targets (map
rlm@250 604 #(File. (str base "out/" (format "%07d.png" %)))
rlm@250 605 (range 0 (count main-view)))]
rlm@250 606 (dorun
rlm@250 607 (pmap
rlm@250 608 (comp
rlm@250 609 (fn [[background main-view touch target]]
rlm@250 610 (println target)
rlm@250 611 (sh "convert"
rlm@250 612 touch
rlm@250 613 "-resize" "x300"
rlm@250 614 "-rotate" "180"
rlm@250 615 background
rlm@250 616 "-swap" "0,1"
rlm@250 617 "-geometry" "+776+129"
rlm@250 618 "-composite"
rlm@250 619 main-view "-geometry" "+66+21" "-composite"
rlm@250 620 target))
rlm@250 621 (fn [& args] (map #(.getCanonicalPath %) args)))
rlm@250 622 background main-view touch targets))))
rlm@249 623 #+end_src
rlm@249 624
rlm@252 625 #+begin_src sh :results silent
rlm@312 626 cd ~/proj/cortex/render/touch-cube/
rlm@252 627 ffmpeg -r 60 -i out/%07d.png -b:v 9000k -c:v libtheora basic-touch.ogg
rlm@252 628 #+end_src
rlm@250 629
rlm@312 630 #+begin_src sh :results silent
rlm@312 631 cd ~/proj/cortex/render/touch-cube/
rlm@312 632 ffmpeg -r 30 -i blender-intro/%07d.png -b:v 9000k -c:v libtheora touch-cube.ogg
rlm@312 633 #+end_src
rlm@312 634
rlm@232 635 * Adding Touch to the Worm
rlm@232 636
rlm@253 637 #+name: test-touch-2
rlm@232 638 #+begin_src clojure
rlm@253 639 (in-ns 'cortex.test.touch)
rlm@232 640
rlm@283 641 (defn test-worm-touch
rlm@321 642 "Testing touch:
rlm@321 643 You will see the worm fall onto a table. There is a display which
rlm@321 644 reports the worm's sense of touch. It should change when the worm
rlm@321 645 hits the table and when you hit it with balls.
rlm@321 646
rlm@321 647 Keys:
rlm@321 648 <space> : fire ball"
rlm@283 649 ([] (test-worm-touch false))
rlm@253 650 ([record?]
rlm@253 651 (let [the-worm (doto (worm) (body!))
rlm@253 652 touch (touch! the-worm)
rlm@253 653 touch-display (view-touch)]
rlm@253 654 (world
rlm@253 655 (nodify [the-worm (floor)])
rlm@253 656 standard-debug-controls
rlm@253 657
rlm@253 658 (fn [world]
rlm@253 659 (if record?
rlm@253 660 (Capture/captureVideo
rlm@253 661 world
rlm@253 662 (File. "/home/r/proj/cortex/render/worm-touch/main-view/")))
rlm@253 663 (speed-up world)
rlm@253 664 (light-up-everything world))
rlm@232 665
rlm@253 666 (fn [world tpf]
rlm@253 667 (touch-display
rlm@253 668 (map #(% (.getRootNode world)) touch)
rlm@253 669 (if record?
rlm@253 670 (File. "/home/r/proj/cortex/render/worm-touch/touch/"))))))))
rlm@232 671 #+end_src
rlm@247 672
rlm@253 673 ** Worm Touch Demonstration
rlm@253 674 #+begin_html
rlm@253 675 <div class="figure">
rlm@253 676 <center>
rlm@253 677 <video controls="controls" width="550">
rlm@253 678 <source src="../video/worm-touch.ogg" type="video/ogg"
rlm@253 679 preload="none" poster="../images/aurellem-1280x480.png" />
rlm@253 680 </video>
rlm@309 681 <br> <a href="http://youtu.be/RHx2wqzNVcU"> YouTube </a>
rlm@253 682 </center>
rlm@253 683 <p>The worm responds to touch.</p>
rlm@253 684 </div>
rlm@253 685 #+end_html
rlm@252 686
rlm@252 687
rlm@253 688 ** Generating the Worm Touch Video
rlm@253 689 #+name: magick5
rlm@253 690 #+begin_src clojure
rlm@253 691 (ns cortex.video.magick5
rlm@253 692 (:import java.io.File)
rlm@316 693 (:use clojure.java.shell))
rlm@253 694
rlm@253 695 (defn images [path]
rlm@253 696 (sort (rest (file-seq (File. path)))))
rlm@253 697
rlm@253 698 (def base "/home/r/proj/cortex/render/worm-touch/")
rlm@253 699
rlm@253 700 (defn pics [file]
rlm@253 701 (images (str base file)))
rlm@253 702
rlm@253 703 (defn combine-images []
rlm@253 704 (let [main-view (pics "main-view")
rlm@253 705 touch (pics "touch/0")
rlm@253 706 targets (map
rlm@253 707 #(File. (str base "out/" (format "%07d.png" %)))
rlm@253 708 (range 0 (count main-view)))]
rlm@253 709 (dorun
rlm@253 710 (pmap
rlm@253 711 (comp
rlm@253 712 (fn [[ main-view touch target]]
rlm@253 713 (println target)
rlm@253 714 (sh "convert"
rlm@253 715 main-view
rlm@253 716 touch "-geometry" "+0+0" "-composite"
rlm@253 717 target))
rlm@253 718 (fn [& args] (map #(.getCanonicalPath %) args)))
rlm@253 719 main-view touch targets))))
rlm@253 720 #+end_src
rlm@252 721
rlm@312 722 #+begin_src sh :results silent
rlm@312 723 cd ~/proj/cortex/render/worm-touch
rlm@312 724 ffmpeg -r 60 -i out/%07d.png -b:v 9000k -c:v libtheora worm-touch.ogg
rlm@312 725 #+end_src
rlm@312 726
rlm@247 727 * Headers
rlm@247 728
rlm@247 729 #+name: touch-header
rlm@247 730 #+begin_src clojure
rlm@247 731 (ns cortex.touch
rlm@247 732 "Simulate the sense of touch in jMonkeyEngine3. Enables any Geometry
rlm@247 733 to be outfitted with touch sensors with density determined by a UV
rlm@247 734 image. In this way a Geometry can know what parts of itself are
rlm@247 735 touching nearby objects. Reads specially prepared blender files to
rlm@247 736 construct this sense automatically."
rlm@247 737 {:author "Robert McIntyre"}
rlm@247 738 (:use (cortex world util sense))
rlm@247 739 (:import (com.jme3.scene Geometry Node Mesh))
rlm@247 740 (:import com.jme3.collision.CollisionResults)
rlm@247 741 (:import com.jme3.scene.VertexBuffer$Type)
rlm@247 742 (:import (com.jme3.math Triangle Vector3f Vector2f Ray Matrix4f)))
rlm@247 743 #+end_src
rlm@247 744
rlm@253 745 #+name: test-touch-header
rlm@253 746 #+begin_src clojure
rlm@253 747 (ns cortex.test.touch
rlm@253 748 (:use (cortex world util sense body touch))
rlm@253 749 (:use cortex.test.body)
rlm@253 750 (:import com.aurellem.capture.Capture)
rlm@253 751 (:import java.io.File)
rlm@253 752 (:import (com.jme3.math Vector3f ColorRGBA)))
rlm@253 753 #+end_src
rlm@253 754
rlm@228 755 * Source Listing
rlm@253 756 - [[../src/cortex/touch.clj][cortex.touch]]
rlm@253 757 - [[../src/cortex/test/touch.clj][cortex.test.touch]]
rlm@253 758 - [[../src/cortex/video/magick4.clj][cortex.video.magick4]]
rlm@253 759 - [[../src/cortex/video/magick5.clj][cortex.video.magick5]]
rlm@283 760 - [[../assets/Models/test-touch/touch-cube.blend][touch-cube.blend]]
rlm@253 761 #+html: <ul> <li> <a href="../org/touch.org">This org file</a> </li> </ul>
rlm@253 762 - [[http://hg.bortreb.com ][source-repository]]
rlm@253 763
rlm@254 764 * Next
rlm@332 765 So far I've implemented simulated Vision, Hearing, and
rlm@332 766 Touch, the most obvious and prominent senses that humans
rlm@332 767 have. Smell and Taste shall remain unimplemented for
rlm@332 768 now. This accounts for the "five senses" that feature so
rlm@332 769 prominently in our lives. But humans have far more than the
rlm@332 770 five main senses. There are internal chemical senses, pain
rlm@332 771 (which is *not* the same as touch), heat sensitivity, and
rlm@332 772 our sense of balance, among others. One extra sense is so
rlm@332 773 important that I must implement it to have a hope of making
rlm@332 774 creatures that can gracefully control their own bodies. It
rlm@332 775 is Proprioception, which is the sense of the location of
rlm@332 776 each body part in relation to the other body parts.
rlm@253 777
rlm@332 778 Close your eyes, and touch your nose with your right index
rlm@332 779 finger. How did you do it? You could not see your hand, and
rlm@332 780 neither your hand nor your nose could use the sense of touch
rlm@332 781 to guide the path of your hand. There are no sound cues,
rlm@332 782 and Taste and Smell certainly don't provide any help. You
rlm@332 783 know where your hand is without your other senses because of
rlm@332 784 Proprioception.
rlm@228 785
rlm@254 786 Onward to [[./proprioception.org][proprioception]]!
rlm@228 787
rlm@226 788 * COMMENT Code Generation
rlm@39 789 #+begin_src clojure :tangle ../src/cortex/touch.clj
rlm@231 790 <<touch-header>>
rlm@231 791 <<meta-data>>
rlm@231 792 <<triangles-1>>
rlm@247 793 <<triangles-2>>
rlm@231 794 <<triangles-3>>
rlm@231 795 <<triangles-4>>
rlm@231 796 <<sensors>>
rlm@231 797 <<kernel>>
rlm@231 798 <<visualization>>
rlm@0 799 #+end_src
rlm@0 800
rlm@253 801 #+begin_src clojure :tangle ../src/cortex/test/touch.clj
rlm@253 802 <<test-touch-header>>
rlm@253 803 <<test-touch-0>>
rlm@253 804 <<test-touch-1>>
rlm@253 805 <<test-touch-2>>
rlm@253 806 #+end_src
rlm@232 807
rlm@253 808 #+begin_src clojure :tangle ../src/cortex/video/magick4.clj
rlm@253 809 <<magick4>>
rlm@39 810 #+end_src
rlm@39 811
rlm@253 812 #+begin_src clojure :tangle ../src/cortex/video/magick5.clj
rlm@253 813 <<magick5>>
rlm@253 814 #+end_src