org/touch.org @ 450:432f2c4646cb
author Robert McIntyre <rlm@mit.edu>
date Wed, 26 Mar 2014 03:18:57 -0400
#+title: Simulated Sense of Touch
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulated touch for AI research using JMonkeyEngine and clojure.
#+keywords: simulation, tactile sense, jMonkeyEngine3, clojure
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
* Touch

Touch is critical to navigation and spatial reasoning, and as such I
need a simulated version of it to give to my AI creatures.

Human skin has a wide array of touch sensors, each of which
specializes in detecting different vibrational modes and pressures.
These sensors can integrate a vast expanse of skin (i.e. your entire
palm), or a tiny patch of skin at the tip of your finger. The hairs of
the skin help detect objects before they even come into contact with
the skin proper.

However, touch in my simulated world cannot correspond exactly to
human touch, because my creatures are made out of completely rigid
segments that don't deform like human skin.

Instead of measuring deformation or vibration, I surround each rigid
part with a plenitude of hair-like objects (/feelers/) which do not
interact with the physical world. Physical objects can pass through
them with no effect. The feelers are able to tell when other objects
pass through them, and they constantly report how much of their extent
is covered. So even though the creature's body parts do not deform,
the feelers create a margin around those body parts which achieves a
sense of touch which is a hybrid between a human's sense of
deformation and sense from hairs.

Implementing touch in jMonkeyEngine follows a different technical
route than vision and hearing. Those two senses piggybacked off
jMonkeyEngine's 3D audio and video rendering subsystems. To simulate
touch, I use jMonkeyEngine's physics system to execute many small
collision detections, one for each feeler. The placement of the
feelers is determined by a UV-mapped image which shows where each
feeler should be on the 3D surface of the body.
* Defining Touch Meta-Data in Blender

Each geometry can have a single UV map which describes the position of
the feelers which will constitute its sense of touch. The path to this
image is stored under the "touch" key. The image itself is black and
white, with black meaning a feeler length of 0 (no feeler is present)
and white meaning a feeler length of =scale=, a float stored under the
key "scale".
#+name: meta-data
#+begin_src clojure
(defn tactile-sensor-profile
  "Return the touch-sensor distribution image in BufferedImage format,
   or nil if it does not exist."
  [#^Geometry obj]
  (if-let [image-path (meta-data obj "touch")]
    (load-image image-path)))

(defn tactile-scale
  "Return the length of each feeler. Default scale is 0.1
   jMonkeyEngine units."
  [#^Geometry obj]
  (if-let [scale (meta-data obj "scale")]
    scale 0.1))
#+end_src
Here is an example of a UV-map which specifies the position of touch
sensors along the surface of the upper segment of the worm.

#+attr_html: width=755
#+caption: This is the tactile-sensor-profile for the upper segment of the worm. It defines regions of high touch sensitivity (where there are many white pixels) and regions of low sensitivity (where white pixels are sparse).
[[../images/finger-UV.png]]
* Implementation Summary

To simulate touch there are three conceptual steps. For each solid
object in the creature, you first have to get the UV image and scale
parameter which define the position and length of the feelers. Then,
you use the triangles which comprise the mesh and the UV data stored
in the mesh to determine the world-space position and orientation of
each feeler. Finally, once every frame, update these positions and
orientations to match the current position and orientation of the
object, and use physics collision detection to gather tactile data.

Extracting the meta-data has already been described. The third step,
physics collision detection, is handled in =touch-kernel=.
Translating the positions and orientations of the feelers from the
UV-map to world-space is itself a three-step process.

- Find the triangles which make up the mesh in pixel-space and in
  world-space: =triangles= and =pixel-triangles=.

- Find the coordinates of each feeler in world-space. These are the
  origins of the feelers: =feeler-origins=.

- Calculate the normals of the triangles in world space, and add
  them to each of the origins of the feelers. These are the
  coordinates of the tips of the feelers: =feeler-tips=.
* Triangle Math
** Shrapnel Conversion Functions
#+name: triangles-1
#+begin_src clojure
(defn vector3f-seq [#^Vector3f v]
  [(.getX v) (.getY v) (.getZ v)])

(defn triangle-seq [#^Triangle tri]
  [(vector3f-seq (.get1 tri))
   (vector3f-seq (.get2 tri))
   (vector3f-seq (.get3 tri))])

(defn ->vector3f
  ([coords] (Vector3f. (nth coords 0 0)
                       (nth coords 1 0)
                       (nth coords 2 0))))

(defn ->triangle [points]
  (apply #(Triangle. %1 %2 %3) (map ->vector3f points)))
#+end_src
It is convenient to treat a =Triangle= as a vector of vectors, and a
=Vector2f= or =Vector3f= as a vector of floats. =->vector3f= and
=->triangle= undo the operations of =vector3f-seq= and
=triangle-seq=. If these classes implemented =Iterable= then =seq=
would work on them automatically.
** Decomposing a 3D shape into Triangles

The rigid objects which make up a creature have an underlying
=Geometry=, which is a =Mesh= plus a =Material= and other important
data involved with displaying the object.

A =Mesh= is composed of =Triangles=, and each =Triangle= has three
vertices which have coordinates in world space and UV space.

Here, =triangles= gets all the world-space triangles which comprise a
mesh, while =pixel-triangles= gets those same triangles expressed in
pixel coordinates (which are UV coordinates scaled to fit the height
and width of the UV image).
#+name: triangles-2
#+begin_src clojure
(in-ns 'cortex.touch)

(defn triangle
  "Get the triangle specified by triangle-index from the mesh."
  [#^Geometry geo triangle-index]
  (triangle-seq
   (let [scratch (Triangle.)]
     (.getTriangle (.getMesh geo) triangle-index scratch) scratch)))

(defn triangles
  "Return a sequence of all the Triangles which comprise a given
   Geometry."
  [#^Geometry geo]
  (map (partial triangle geo) (range (.getTriangleCount (.getMesh geo)))))

(defn triangle-vertex-indices
  "Get the triangle vertex indices of a given triangle from a given
   mesh."
  [#^Mesh mesh triangle-index]
  (let [indices (int-array 3)]
    (.getTriangle mesh triangle-index indices)
    (vec indices)))

(defn vertex-UV-coord
  "Get the UV-coordinates of the vertex named by vertex-index."
  [#^Mesh mesh vertex-index]
  (let [UV-buffer
        (.getData
         (.getBuffer
          mesh
          VertexBuffer$Type/TexCoord))]
    [(.get UV-buffer (* vertex-index 2))
     (.get UV-buffer (+ 1 (* vertex-index 2)))]))

(defn pixel-triangle [#^Geometry geo image index]
  (let [mesh (.getMesh geo)
        width (.getWidth image)
        height (.getHeight image)]
    (vec (map (fn [[u v]] (vector (* width u) (* height v)))
              (map (partial vertex-UV-coord mesh)
                   (triangle-vertex-indices mesh index))))))

(defn pixel-triangles
  "The pixel-space triangles of the Geometry, in the same order as
   (triangles geo)."
  [#^Geometry geo image]
  (let [height (.getHeight image)
        width (.getWidth image)]
    (map (partial pixel-triangle geo image)
         (range (.getTriangleCount (.getMesh geo))))))
#+end_src
** The Affine Transform from one Triangle to Another

=pixel-triangles= gives us the mesh triangles expressed in pixel
coordinates and =triangles= gives us the mesh triangles expressed in
world coordinates. The tactile-sensor-profile gives the position of
each feeler in pixel-space. In order to convert pixel-space
coordinates into world-space coordinates we need something that takes
coordinates on the surface of one triangle and gives the corresponding
coordinates on the surface of another triangle.

Any two triangles are related by an [[http://mathworld.wolfram.com/AffineTransformation.html][affine transformation]]: any
triangle can be transformed into any other by a combination of
translation, rotation, scaling, and shearing. The affine
transformation from one triangle to another is readily computable if
each triangle is expressed as a $4 \times 4$ matrix.

\begin{bmatrix}
x_1 & x_2 & x_3 & n_x \\
y_1 & y_2 & y_3 & n_y \\
z_1 & z_2 & z_3 & n_z \\
1 & 1 & 1 & 1
\end{bmatrix}

Here, the first three columns of the matrix are the vertices of the
triangle. The last column is the right-handed unit normal of the
triangle.

With two triangles $T_{1}$ and $T_{2}$ each expressed as a matrix like
the one above, the affine transform from $T_{1}$ to $T_{2}$ is

$T_{2}T_{1}^{-1}$
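As a numeric sanity check on this formula (in Python rather than the
document's Clojure, with made-up illustration triangles): build $T_1$
and $T_2$, invert $T_1$ by Gauss-Jordan elimination, and confirm that
$T_{2}T_{1}^{-1}$ carries each vertex of the first triangle onto the
corresponding vertex of the second.

#+begin_src python
def mat_inv(m):
    """Invert a 4x4 matrix by Gauss-Jordan elimination."""
    n = 4
    a = [list(m[i]) + [1.0 if i == j else 0.0 for j in range(n)]
         for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))  # partial pivot
        a[col], a[piv] = a[piv], a[col]
        p = a[col][col]
        a[col] = [x / p for x in a[col]]
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [row[n:] for row in a]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def tri_matrix(verts, normal):
    # Columns: the three vertices, then the unit normal; bottom row 1s.
    cols = verts + [normal]
    return [[cols[j][i] for j in range(4)] for i in range(3)] + [[1.0] * 4]

# Illustration data: triangle 2 is triangle 1 scaled by 2 and
# translated by (2, 3, 0); both lie in the z = 0 plane.
t1 = tri_matrix([[0, 0, 0], [1, 0, 0], [0, 1, 0]], [0, 0, 1])
t2 = tri_matrix([[2, 3, 0], [4, 3, 0], [2, 5, 0]], [0, 0, 1])

A = mat_mul(t2, mat_inv(t1))

def apply_affine(A, p):
    # Treat p as a homogeneous column vector [x y z 1].
    v = list(p) + [1.0]
    return [sum(A[i][j] * v[j] for j in range(4)) for i in range(3)]

# The second vertex of triangle 1 should land on the second vertex of
# triangle 2, i.e. near [4.0, 3.0, 0.0].
print(apply_affine(A, [1, 0, 0]))
#+end_src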
The Clojure code below recapitulates the formulas above, using
jMonkeyEngine's =Matrix4f= objects, which can describe any affine
transformation.
#+name: triangles-3
#+begin_src clojure
(in-ns 'cortex.touch)

(defn triangle->matrix4f
  "Converts the triangle into a 4x4 matrix: The first three columns
   contain the vertices of the triangle; the last contains the unit
   normal of the triangle. The bottom row is filled with 1s."
  [#^Triangle t]
  (let [mat (Matrix4f.)
        [vert-1 vert-2 vert-3]
        (mapv #(.get t %) (range 3))
        unit-normal (do (.calculateNormal t) (.getNormal t))
        vertices [vert-1 vert-2 vert-3 unit-normal]]
    ;; The column index selects the vertex (or normal); the row index
    ;; selects its x, y, or z component.
    (dorun
     (for [col (range 4) row (range 3)]
       (do
         (.set mat row col (.get (vertices col) row))
         (.set mat 3 col 1))))
    mat))

(defn triangles->affine-transform
  "Returns the affine transformation that converts each vertex in the
   first triangle into the corresponding vertex in the second
   triangle."
  [#^Triangle tri-1 #^Triangle tri-2]
  (.mult
   (triangle->matrix4f tri-2)
   (.invert (triangle->matrix4f tri-1))))
#+end_src
** Triangle Boundaries

For efficiency's sake I will divide the tactile-profile image into
small rectangles which bound each pixel-triangle, then extract the
points which lie inside the triangle and map them to 3D-space using
=triangles->affine-transform= above. To do this I need a function,
=convex-bounds=, which finds the smallest axis-aligned box containing
a 2D triangle.

=inside-triangle?= determines whether a point is inside a triangle
in 2D pixel-space.
#+name: triangles-4
#+begin_src clojure
(defn convex-bounds
  "Returns the smallest axis-aligned rectangle containing the given
   vertices, as a vector of integers [left top width height]."
  [verts]
  (let [xs (map first verts)
        ys (map second verts)
        x0 (int (Math/floor (apply min xs)))
        y0 (int (Math/floor (apply min ys)))
        x1 (int (Math/ceil (apply max xs)))
        y1 (int (Math/ceil (apply max ys)))]
    [x0 y0 (- x1 x0) (- y1 y0)]))

(defn same-side?
  "Given the points p1 and p2 and the reference point ref, is point p
   on the same side of the line that goes through p1 and p2 as ref is?"
  [p1 p2 ref p]
  (<=
   0
   (.dot
    (.cross (.subtract p2 p1) (.subtract p p1))
    (.cross (.subtract p2 p1) (.subtract ref p1)))))

(defn inside-triangle?
  "Is the point inside the triangle?"
  {:author "Dylan Holmes"}
  [#^Triangle tri #^Vector3f p]
  (let [[vert-1 vert-2 vert-3] [(.get1 tri) (.get2 tri) (.get3 tri)]]
    (and
     (same-side? vert-1 vert-2 vert-3 p)
     (same-side? vert-2 vert-3 vert-1 p)
     (same-side? vert-3 vert-1 vert-2 p))))
#+end_src
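To see the cross-product trick at work on concrete numbers, here is a
direct Python transliteration of =same-side?= and =inside-triangle?=
(plain tuples stand in for =Vector3f=; the triangle is illustration
data only):

#+begin_src python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def same_side(p1, p2, ref, p):
    # p lies on ref's side of the line through p1 and p2 when both
    # cross products point the same way.
    edge = sub(p2, p1)
    return dot(cross(edge, sub(p, p1)), cross(edge, sub(ref, p1))) >= 0

def inside_triangle(v1, v2, v3, p):
    # Inside means on the interior side of all three edges.
    return (same_side(v1, v2, v3, p) and
            same_side(v2, v3, v1, p) and
            same_side(v3, v1, v2, p))

tri = ((0, 0, 0), (4, 0, 0), (0, 4, 0))
print(inside_triangle(*tri, (1, 1, 0)))  # True: interior point
print(inside_triangle(*tri, (5, 5, 0)))  # False: outside every edge test
#+end_src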
* Feeler Coordinates

The triangle-related functions above make short work of calculating
the positions and orientations of each feeler in world-space.
#+name: sensors
#+begin_src clojure
(in-ns 'cortex.touch)

(defn feeler-pixel-coords
  "Returns the coordinates of the feelers in pixel space in lists, one
   list for each triangle, ordered in the same way as (triangles) and
   (pixel-triangles)."
  [#^Geometry geo image]
  (map
   (fn [pixel-triangle]
     (filter
      (fn [coord]
        (inside-triangle? (->triangle pixel-triangle)
                          (->vector3f coord)))
      (white-coordinates image (convex-bounds pixel-triangle))))
   (pixel-triangles geo image)))

(defn feeler-world-coords
  "Returns the coordinates of the feelers in world space in lists, one
   list for each triangle, ordered in the same way as (triangles) and
   (pixel-triangles)."
  [#^Geometry geo image]
  (let [transforms
        (map #(triangles->affine-transform
               (->triangle %1) (->triangle %2))
             (pixel-triangles geo image)
             (triangles geo))]
    (map (fn [transform coords]
           (map #(.mult transform (->vector3f %)) coords))
         transforms (feeler-pixel-coords geo image))))

(defn feeler-origins
  "The world space coordinates of the root of each feeler."
  [#^Geometry geo image]
  (reduce concat (feeler-world-coords geo image)))

(defn feeler-tips
  "The world space coordinates of the tip of each feeler."
  [#^Geometry geo image]
  (let [world-coords (feeler-world-coords geo image)
        normals
        (map
         (fn [triangle]
           (.calculateNormal triangle)
           (.clone (.getNormal triangle)))
         (map ->triangle (triangles geo)))]
    (mapcat (fn [origins normal]
              (map #(.add % normal) origins))
            world-coords normals)))

(defn touch-topology
  "The touch-sensor topology: the pixel coordinates of every feeler,
   collapsed into a contiguous 2D region."
  [#^Geometry geo image]
  (collapse (reduce concat (feeler-pixel-coords geo image))))
#+end_src
* Simulated Touch

=touch-kernel= generates functions to be called from within a
simulation that perform the necessary physics collisions to collect
tactile data, and =touch!= recursively applies it to every node in
the creature.
#+name: kernel
#+begin_src clojure
(in-ns 'cortex.touch)

(defn set-ray [#^Ray ray #^Matrix4f transform
               #^Vector3f origin #^Vector3f tip]
  ;; Doing everything locally reduces garbage collection by enough to
  ;; be worth it.
  (.mult transform origin (.getOrigin ray))
  (.mult transform tip (.getDirection ray))
  (.subtractLocal (.getDirection ray) (.getOrigin ray))
  (.normalizeLocal (.getDirection ray)))

(import com.jme3.math.FastMath)

(defn touch-kernel
  "Constructs a function which will return tactile sensory data from
   'geo when called from inside a running simulation."
  [#^Geometry geo]
  (if-let [profile (tactile-sensor-profile geo)]
    (let [ray-reference-origins (feeler-origins geo profile)
          ray-reference-tips (feeler-tips geo profile)
          ray-length (tactile-scale geo)
          current-rays (map (fn [_] (Ray.)) ray-reference-origins)
          topology (touch-topology geo profile)
          correction (float (* ray-length -0.2))]
      ;; slight tolerance for very close collisions.
      (dorun
       (map (fn [origin tip]
              (.addLocal origin (.mult (.subtract tip origin)
                                       correction)))
            ray-reference-origins ray-reference-tips))
      (dorun (map #(.setLimit % ray-length) current-rays))
      (fn [node]
        (let [transform (.getWorldMatrix geo)]
          (dorun
           (map (fn [ray ref-origin ref-tip]
                  (set-ray ray transform ref-origin ref-tip))
                current-rays ray-reference-origins
                ray-reference-tips))
          (vector
           topology
           (vec
            (for [ray current-rays]
              (do
                (let [results (CollisionResults.)]
                  (.collideWith node ray results)
                  (let [touch-objects
                        (filter #(not (= geo (.getGeometry %)))
                                results)
                        limit (.getLimit ray)]
                    [(if (empty? touch-objects)
                       limit
                       (let [response
                             (apply min (map #(.getDistance %)
                                             touch-objects))]
                         (FastMath/clamp
                          (float
                           (if (> response limit) (float 0.0)
                               (+ response correction)))
                          (float 0.0)
                          limit)))
                     limit])))))))))))

(defn touch!
  "Endow the creature with the sense of touch. Returns a sequence of
   functions, one for each body part with a tactile-sensor-profile,
   each of which when called returns sensory data for that body part."
  [#^Node creature]
  (filter
   (comp not nil?)
   (map touch-kernel
        (filter #(isa? (class %) Geometry)
                (node-seq creature)))))
#+end_src
#+results: kernel
: #'cortex.touch/touch!
* Visualizing Touch

Each feeler is represented in the image as a single pixel. The
greyscale value of each pixel represents how deep the feeler
represented by that pixel is inside another object. Black means that
nothing is touching the feeler, while white means that the feeler is
completely inside another object, which is presumably flush with the
surface of the triangle from which the feeler originates.
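This black-to-white mapping is easy to check in isolation. A quick
Python sketch (illustration only; the real conversion is the
=touch->gray= function in the block below):

#+begin_src python
# A feeler reports [distance, max-distance]: distance equals the ray
# limit when nothing is touching, and shrinks toward 0 as the feeler
# is covered.
def touch_to_gray(distance, max_distance):
    return 255 - (int(255 * distance / max_distance) % 256)

print(touch_to_gray(0.1, 0.1))  # untouched feeler -> 0 (black)
print(touch_to_gray(0.0, 0.1))  # fully covered feeler -> 255 (white)
#+end_src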
#+name: visualization
#+begin_src clojure
(in-ns 'cortex.touch)

(defn touch->gray
  "Convert a pair of [distance, max-distance] into a gray-scale pixel."
  [distance max-distance]
  (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256))))

(defn view-touch
  "Creates a function which accepts a list of touch sensor-data and
   displays each element to the screen."
  []
  (view-sense
   (fn [[coords sensor-data]]
     (let [image (points->image coords)]
       (dorun
        (for [i (range (count coords))]
          (.setRGB image ((coords i) 0) ((coords i) 1)
                   (apply touch->gray (sensor-data i)))))
       image))))
#+end_src
#+results: visualization
: #'cortex.touch/view-touch
* Basic Test of Touch

The worm's sense of touch is a bit complicated, so for this basic test
I'll use a new creature --- a simple cube which has touch sensors
evenly distributed along each of its sides.

#+name: test-touch-0
#+begin_src clojure
(in-ns 'cortex.test.touch)

(defn touch-cube []
  (load-blender-model "Models/test-touch/touch-cube.blend"))
#+end_src
** The Touch Cube
#+begin_html
<div class="figure">
<center>
<video controls="controls" width="500">
  <source src="../video/touch-cube.ogg" type="video/ogg"
          preload="none" poster="../images/aurellem-1280x480.png" />
</video>
<br> <a href="http://youtu.be/aEao4m8meAI"> YouTube </a>
</center>
<p>A simple creature with evenly distributed touch sensors.</p>
</div>
#+end_html

The tactile-sensor-profile image for this simple creature looks like
this:

#+attr_html: width=500
#+caption: The distribution of feelers along the touch-cube. The colors of the faces are irrelevant; only the white pixels specify feelers.
[[../images/touch-profile.png]]
#+name: test-touch-1
#+begin_src clojure
(in-ns 'cortex.test.touch)

(defn test-basic-touch
  "Testing touch:
   You should see a cube fall onto a table. There is a cross-shaped
   display which reports the cube's sensation of touch. This display
   should change when the cube hits the table, and whenever you hit
   the cube with balls.

   Keys:
     <space> : fire ball"
  ([] (test-basic-touch false))
  ([record?]
   (let [the-cube (doto (touch-cube) (body!))
         touch (touch! the-cube)
         touch-display (view-touch)]
     (world
      (nodify [the-cube
               (box 10 1 10 :position (Vector3f. 0 -10 0)
                    :color ColorRGBA/Gray :mass 0)])

      standard-debug-controls

      (fn [world]
        (let [timer (IsoTimer. 60)]
          (.setTimer world timer)
          (display-dilated-time world timer))
        (if record?
          (Capture/captureVideo
           world
           (File. "/home/r/proj/cortex/render/touch-cube/main-view/")))
        (speed-up world)
        (light-up-everything world))

      (fn [world tpf]
        (touch-display
         (map #(% (.getRootNode world)) touch)
         (if record?
           (File. "/home/r/proj/cortex/render/touch-cube/touch/"))))))))
#+end_src

#+results: test-touch-1
: #'cortex.test.touch/test-basic-touch
** Basic Touch Demonstration

#+begin_html
<div class="figure">
<center>
<video controls="controls" width="755">
  <source src="../video/basic-touch.ogg" type="video/ogg"
          preload="none" poster="../images/aurellem-1280x480.png" />
</video>
<br> <a href="http://youtu.be/8xNEtD-a8f0"> YouTube </a>
</center>
<p>The simple creature responds to touch.</p>
</div>
#+end_html
** Generating the Basic Touch Video
#+name: magick4
#+begin_src clojure
(ns cortex.video.magick4
  (:import java.io.File)
  (:use clojure.java.shell))

(defn images [path]
  (sort (rest (file-seq (File. path)))))

(def base "/home/r/proj/cortex/render/touch-cube/")

(defn pics [file]
  (images (str base file)))

(defn combine-images []
  (let [main-view (pics "main-view")
        touch (pics "touch/0")
        background (repeat 9001 (File. (str base "background.png")))
        targets (map
                 #(File. (str base "out/" (format "%07d.png" %)))
                 (range 0 (count main-view)))]
    (dorun
     (pmap
      (comp
       (fn [[background main-view touch target]]
         (println target)
         (sh "convert"
             touch
             "-resize" "x300"
             "-rotate" "180"
             background
             "-swap" "0,1"
             "-geometry" "+776+129"
             "-composite"
             main-view "-geometry" "+66+21" "-composite"
             target))
       (fn [& args] (map #(.getCanonicalPath %) args)))
      background main-view touch targets))))
#+end_src

#+begin_src sh :results silent
cd ~/proj/cortex/render/touch-cube/
ffmpeg -r 60 -i out/%07d.png -b:v 9000k -c:v libtheora basic-touch.ogg
#+end_src

#+begin_src sh :results silent
cd ~/proj/cortex/render/touch-cube/
ffmpeg -r 30 -i blender-intro/%07d.png -b:v 9000k -c:v libtheora touch-cube.ogg
#+end_src
* Adding Touch to the Worm

#+name: test-touch-2
#+begin_src clojure
(in-ns 'cortex.test.touch)

(defn test-worm-touch
  "Testing touch:
   You will see the worm fall onto a table. There is a display which
   reports the worm's sense of touch. It should change when the worm
   hits the table and when you hit it with balls.

   Keys:
     <space> : fire ball"
  ([] (test-worm-touch false))
  ([record?]
   (let [the-worm (doto (worm) (body!))
         touch (touch! the-worm)
         touch-display (view-touch)]
     (world
      (nodify [the-worm (floor)])
      standard-debug-controls

      (fn [world]
        (let [timer (IsoTimer. 60)]
          (.setTimer world timer)
          (display-dilated-time world timer))
        (if record?
          (Capture/captureVideo
           world
           (File. "/home/r/proj/cortex/render/worm-touch/main-view/")))
        (speed-up world)
        (light-up-everything world))

      (fn [world tpf]
        (touch-display
         (map #(% (.getRootNode world)) touch)
         (if record?
           (File. "/home/r/proj/cortex/render/worm-touch/touch/"))))))))
#+end_src

#+results: test-touch-2
: #'cortex.test.touch/test-worm-touch

** Worm Touch Demonstration
#+begin_html
<div class="figure">
<center>
<video controls="controls" width="550">
  <source src="../video/worm-touch.ogg" type="video/ogg"
          preload="none" poster="../images/aurellem-1280x480.png" />
</video>
<br> <a href="http://youtu.be/RHx2wqzNVcU"> YouTube </a>
</center>
<p>The worm responds to touch.</p>
</div>
#+end_html
** Generating the Worm Touch Video
#+name: magick5
#+begin_src clojure
(ns cortex.video.magick5
  (:import java.io.File)
  (:use clojure.java.shell))

(defn images [path]
  (sort (rest (file-seq (File. path)))))

(def base "/home/r/proj/cortex/render/worm-touch/")

(defn pics [file]
  (images (str base file)))

(defn combine-images []
  (let [main-view (pics "main-view")
        touch (pics "touch/0")
        targets (map
                 #(File. (str base "out/" (format "%07d.png" %)))
                 (range 0 (count main-view)))]
    (dorun
     (pmap
      (comp
       (fn [[main-view touch target]]
         (println target)
         (sh "convert"
             main-view
             touch "-geometry" "+0+0" "-composite"
             target))
       (fn [& args] (map #(.getCanonicalPath %) args)))
      main-view touch targets))))
#+end_src

#+begin_src sh :results silent
cd ~/proj/cortex/render/worm-touch
ffmpeg -r 60 -i out/%07d.png -b:v 9000k -c:v libtheora worm-touch.ogg
#+end_src
* Headers

#+name: touch-header
#+begin_src clojure
(ns cortex.touch
  "Simulate the sense of touch in jMonkeyEngine3. Enables any Geometry
   to be outfitted with touch sensors with density determined by a UV
   image. In this way a Geometry can know what parts of itself are
   touching nearby objects. Reads specially prepared blender files to
   construct this sense automatically."
  {:author "Robert McIntyre"}
  (:use (cortex world util sense))
  (:import (com.jme3.scene Geometry Node Mesh))
  (:import com.jme3.collision.CollisionResults)
  (:import com.jme3.scene.VertexBuffer$Type)
  (:import (com.jme3.math Triangle Vector3f Vector2f Ray Matrix4f)))
#+end_src

#+name: test-touch-header
#+begin_src clojure
(ns cortex.test.touch
  (:use (cortex world util sense body touch))
  (:use cortex.test.body)
  (:import (com.aurellem.capture Capture IsoTimer))
  (:import java.io.File)
  (:import (com.jme3.math Vector3f ColorRGBA)))
#+end_src

#+results: test-touch-header
: com.jme3.math.ColorRGBA
* Source Listing
- [[../src/cortex/touch.clj][cortex.touch]]
- [[../src/cortex/test/touch.clj][cortex.test.touch]]
- [[../src/cortex/video/magick4.clj][cortex.video.magick4]]
- [[../src/cortex/video/magick5.clj][cortex.video.magick5]]
- [[../assets/Models/test-touch/touch-cube.blend][touch-cube.blend]]
#+html: <ul> <li> <a href="../org/touch.org">This org file</a> </li> </ul>
- [[http://hg.bortreb.com][source-repository]]
* Next

So far I've implemented simulated Vision, Hearing, and Touch, the
most obvious and prominent senses that humans have. Smell and Taste
shall remain unimplemented for now. This accounts for the "five
senses" that feature so prominently in our lives. But humans have far
more than the five main senses. There are internal chemical senses,
pain (which is *not* the same as touch), heat sensitivity, and our
sense of balance, among others. One extra sense is so important that
I must implement it to have a hope of making creatures that can
gracefully control their own bodies. It is Proprioception, which is
the sense of the location of each body part in relation to the other
body parts.

Close your eyes, and touch your nose with your right index finger.
How did you do it? You could not see your hand, and neither your hand
nor your nose could use the sense of touch to guide the path of your
hand. There are no sound cues, and Taste and Smell certainly don't
provide any help. You know where your hand is without your other
senses because of Proprioception.

Onward to [[./proprioception.org][proprioception]]!
* COMMENT Code Generation
#+begin_src clojure :tangle ../src/cortex/touch.clj
<<touch-header>>
<<meta-data>>
<<triangles-1>>
<<triangles-2>>
<<triangles-3>>
<<triangles-4>>
<<sensors>>
<<kernel>>
<<visualization>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/test/touch.clj
<<test-touch-header>>
<<test-touch-0>>
<<test-touch-1>>
<<test-touch-2>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/video/magick4.clj
<<magick4>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/video/magick5.clj
<<magick5>>
#+end_src