#+title: Simulated Sense of Touch
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulated touch for AI research using JMonkeyEngine and clojure.
#+keywords: simulation, tactile sense, jMonkeyEngine3, clojure
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org

* Touch

Touch is critical to navigation and spatial reasoning, and as such I
need a simulated version of it to give to my AI creatures.

Human skin has a wide array of touch sensors, each of which
specializes in detecting different vibrational modes and pressures.
These sensors can integrate a vast expanse of skin (e.g. your entire
palm), or a tiny patch of skin at the tip of your finger. The hairs
of the skin help detect objects before they even come into contact
with the skin proper.

However, touch in my simulated world cannot exactly correspond to
human touch because my creatures are made out of completely rigid
segments that don't deform like human skin.

Instead of measuring deformation or vibration, I surround each rigid
part with a plenitude of hair-like objects (/feelers/) which do not
interact with the physical world. Physical objects can pass through
them with no effect. The feelers are able to tell when other objects
pass through them, and they constantly report how much of their
extent is covered. So even though the creature's body parts do not
deform, the feelers create a margin around those body parts which
achieves a sense of touch that is a hybrid between a human's sense of
deformation and sense from hairs.

Implementing touch in jMonkeyEngine follows a different technical
route than vision and hearing. Those two senses piggybacked off
jMonkeyEngine's 3D audio and video rendering subsystems. To simulate
touch, I use jMonkeyEngine's physics system to execute many small
collision detections, one for each feeler. The placement of the
feelers is determined by a UV-mapped image which shows where each
feeler should be on the 3D surface of the body.

* Defining Touch Meta-Data in Blender

Each geometry can have a single UV map which describes the position
of the feelers which will constitute its sense of touch. The path to
this image is stored under the "touch" key. The image itself is black
and white, with black meaning a feeler length of 0 (no feeler is
present) and white meaning a feeler length of =scale=, which is a
float stored under the key "scale".

#+name: meta-data
#+begin_src clojure
(defn tactile-sensor-profile
  "Return the touch-sensor distribution image in BufferedImage format,
   or nil if it does not exist."
  [#^Geometry obj]
  (if-let [image-path (meta-data obj "touch")]
    (load-image image-path)))

(defn tactile-scale
  "Return the length of each feeler. Default scale is 0.1
   jMonkeyEngine units."
  [#^Geometry obj]
  (if-let [scale (meta-data obj "scale")]
    scale 0.1))
#+end_src
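For concreteness, here is a sketch of how these two accessors behave.
The geometry, image path, and scale value are hypothetical, and the
forms are wrapped in =comment= so they are never evaluated when the
file is loaded.

#+begin_src clojure
(comment
  ;; Hypothetical example: `finger` is a Geometry whose blender
  ;; meta-data contains "touch" -> "Models/worm/touch-profile.png"
  ;; and "scale" -> 0.05.
  (tactile-sensor-profile finger) ;=> a BufferedImage (the UV profile)
  (tactile-scale finger)          ;=> 0.05
  ;; With no "scale" entry, tactile-scale falls back to 0.1.
  )
#+end_src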
Here is an example of a UV-map which specifies the position of touch
sensors along the surface of the upper segment of the worm.

#+attr_html: width=755
#+caption: This is the tactile-sensor-profile for the upper segment of the worm. It defines regions of high touch sensitivity (where there are many white pixels) and regions of low sensitivity (where white pixels are sparse).
[[../images/finger-UV.png]]

* Implementation Summary

To simulate touch there are three conceptual steps. For each solid
object in the creature, you first have to get the UV image and scale
parameter which define the position and length of the feelers. Then,
you use the triangles which comprise the mesh and the UV data stored
in the mesh to determine the world-space position and orientation of
each feeler. Then, once every frame, you update these positions and
orientations to match the current position and orientation of the
object, and use physics collision detection to gather tactile data.

Extracting the meta-data has already been described. The third step,
physics collision detection, is handled in =touch-kernel=.
Translating the positions and orientations of the feelers from the
UV-map to world-space is itself a three-step process.

- Find the triangles which make up the mesh in pixel-space and in
  world-space (=triangles=, =pixel-triangles=).

- Find the coordinates of each feeler in world-space. These are the
  origins of the feelers (=feeler-origins=).

- Calculate the normals of the triangles in world space, and add
  them to each of the origins of the feelers. These are the
  coordinates of the tips of the feelers, each one unit away from
  its origin (=feeler-tips=).

* Triangle Math
** Shrapnel Conversion Functions

#+name: triangles-1
#+begin_src clojure
(defn vector3f-seq [#^Vector3f v]
  [(.getX v) (.getY v) (.getZ v)])

(defn triangle-seq [#^Triangle tri]
  [(vector3f-seq (.get1 tri))
   (vector3f-seq (.get2 tri))
   (vector3f-seq (.get3 tri))])

(defn ->vector3f
  ([coords] (Vector3f. (nth coords 0 0)
                       (nth coords 1 0)
                       (nth coords 2 0))))

(defn ->triangle [points]
  (apply #(Triangle. %1 %2 %3) (map ->vector3f points)))
#+end_src

It is convenient to treat a =Triangle= as a vector of vectors, and a
=Vector2f= or =Vector3f= as a vector of floats. =->vector3f= and
=->triangle= undo the operations of =vector3f-seq= and
=triangle-seq=. If these classes implemented =Iterable= then =seq=
would work on them automatically.
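As a quick sanity check (a sketch; the vertex values are arbitrary),
the conversions round-trip:

#+begin_src clojure
(comment
  ;; nested vectors -> jME Triangle -> nested vectors
  (triangle-seq (->triangle [[0 0 0] [1 0 0] [0 1 0]]))
  ;;=> [[0.0 0.0 0.0] [1.0 0.0 0.0] [0.0 1.0 0.0]]
  )
#+end_src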
** Decomposing a 3D shape into Triangles

The rigid objects which make up a creature have an underlying
=Geometry=, which is a =Mesh= plus a =Material= and other important
data involved with displaying the object.

A =Mesh= is composed of =Triangles=, and each =Triangle= has three
vertices which have coordinates in world space and UV space.

Here, =triangles= gets all the world-space triangles which comprise a
mesh, while =pixel-triangles= gets those same triangles expressed in
pixel coordinates (which are UV coordinates scaled to fit the height
and width of the UV image).

#+name: triangles-2
#+begin_src clojure
(in-ns 'cortex.touch)
(defn triangle
  "Get the triangle specified by triangle-index from the mesh."
  [#^Geometry geo triangle-index]
  (triangle-seq
   (let [scratch (Triangle.)]
     (.getTriangle (.getMesh geo) triangle-index scratch) scratch)))

(defn triangles
  "Return a sequence of all the Triangles which comprise a given
   Geometry."
  [#^Geometry geo]
  (map (partial triangle geo) (range (.getTriangleCount (.getMesh geo)))))

(defn triangle-vertex-indices
  "Get the triangle vertex indices of a given triangle from a given
   mesh."
  [#^Mesh mesh triangle-index]
  (let [indices (int-array 3)]
    (.getTriangle mesh triangle-index indices)
    (vec indices)))

(defn vertex-UV-coord
  "Get the UV-coordinates of the vertex named by vertex-index"
  [#^Mesh mesh vertex-index]
  (let [UV-buffer
        (.getData
         (.getBuffer
          mesh
          VertexBuffer$Type/TexCoord))]
    [(.get UV-buffer (* vertex-index 2))
     (.get UV-buffer (+ 1 (* vertex-index 2)))]))

(defn pixel-triangle
  "The vertices of the mesh triangle at the given index, projected
   into the pixel space of the UV image."
  [#^Geometry geo image index]
  (let [mesh (.getMesh geo)
        width (.getWidth image)
        height (.getHeight image)]
    (vec (map (fn [[u v]] (vector (* width u) (* height v)))
              (map (partial vertex-UV-coord mesh)
                   (triangle-vertex-indices mesh index))))))

(defn pixel-triangles
  "The pixel-space triangles of the Geometry, in the same order as
   (triangles geo)"
  [#^Geometry geo image]
  (let [height (.getHeight image)
        width (.getWidth image)]
    (map (partial pixel-triangle geo image)
         (range (.getTriangleCount (.getMesh geo))))))
#+end_src
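To make the UV-to-pixel scaling concrete: assuming a hypothetical
512x512 profile image, a vertex with UV coordinates (0.25, 0.5) lands
at pixel coordinates (128, 256), since =pixel-triangle= multiplies u
by the image width and v by the image height.

#+begin_src clojure
(comment
  ;; the scaling performed inside pixel-triangle, in isolation
  (map * [0.25 0.5] [512 512]) ;=> (128.0 256.0)
  )
#+end_src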
** The Affine Transform from one Triangle to Another

=pixel-triangles= gives us the mesh triangles expressed in pixel
coordinates and =triangles= gives us the mesh triangles expressed in
world coordinates. The tactile-sensor-profile gives the position of
each feeler in pixel-space. In order to convert pixel-space
coordinates into world-space coordinates we need something that takes
coordinates on the surface of one triangle and gives the
corresponding coordinates on the surface of another triangle.

Triangles are [[http://mathworld.wolfram.com/AffineTransformation.html][affine]] to one another, which means any triangle can be
transformed into any other by a combination of translation, rotation,
scaling, and shearing. The affine transformation from one triangle to
another is readily computable if each triangle is expressed as a
$4 \times 4$ matrix.

\begin{bmatrix}
x_1 & x_2 & x_3 & n_x \\
y_1 & y_2 & y_3 & n_y \\
z_1 & z_2 & z_3 & n_z \\
1 & 1 & 1 & 1
\end{bmatrix}

Here, the first three columns of the matrix are the vertices of the
triangle. The last column is the right-handed unit normal of the
triangle.

With two triangles $T_{1}$ and $T_{2}$ each expressed as a matrix
like above, the affine transform from $T_{1}$ to $T_{2}$ is

$T_{2}T_{1}^{-1}$

The Clojure code below recapitulates the formulas above, using
jMonkeyEngine's =Matrix4f= objects, which can describe any affine
transformation.

#+name: triangles-3
#+begin_src clojure
(in-ns 'cortex.touch)

(defn triangle->matrix4f
  "Converts the triangle into a 4x4 matrix: The first three columns
   contain the vertices of the triangle; the last contains the unit
   normal of the triangle. The bottom row is filled with 1s."
  [#^Triangle t]
  (let [mat (Matrix4f.)
        [vert-1 vert-2 vert-3]
        (mapv #(.get t %) (range 3))
        unit-normal (do (.calculateNormal t) (.getNormal t))
        vertices [vert-1 vert-2 vert-3 unit-normal]]
    (dorun
     (for [row (range 4) col (range 3)]
       (do
         (.set mat col row (.get (vertices row) col))
         (.set mat 3 row 1))))
    mat))

(defn triangles->affine-transform
  "Returns the affine transformation that converts each vertex in the
   first triangle into the corresponding vertex in the second
   triangle."
  [#^Triangle tri-1 #^Triangle tri-2]
  (.mult
   (triangle->matrix4f tri-2)
   (.invert (triangle->matrix4f tri-1))))
#+end_src
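Here is a small worked example (a sketch with made-up triangles): the
transform built by =triangles->affine-transform= sends each vertex of
the first triangle to the corresponding vertex of the second, so
applying it to the first vertex of the pixel-space triangle must
return the first vertex of the world-space triangle.

#+begin_src clojure
(comment
  (let [pixel (->triangle [[0 0 0] [1 0 0] [0 1 0]])
        world (->triangle [[2 0 0] [4 0 0] [2 2 0]])
        xform (triangles->affine-transform pixel world)]
    (.mult xform (Vector3f. 0 0 0)))
  ;;=> (2.0, 0.0, 0.0), the first vertex of the world triangle
  )
#+end_src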
** Triangle Boundaries

For efficiency's sake I will divide the tactile-profile image into
small rectangles which circumscribe each pixel-triangle, then extract
the points which lie inside the triangle and map them to 3D-space
using =triangles->affine-transform= above. To do this I need a
function, =convex-bounds=, which finds the smallest box that encloses
a 2D triangle.

=inside-triangle?= determines whether a point is inside a triangle
in 2D pixel-space.

#+name: triangles-4
#+begin_src clojure
(defn convex-bounds
  "Returns the smallest axis-aligned rectangle containing the given
   vertices, as a vector [left top width height]."
  [verts]
  (let [xs (map first verts)
        ys (map second verts)
        x0 (Math/floor (apply min xs))
        y0 (Math/floor (apply min ys))
        x1 (Math/ceil (apply max xs))
        y1 (Math/ceil (apply max ys))]
    [x0 y0 (- x1 x0) (- y1 y0)]))

(defn same-side?
  "Given the points p1 and p2 and the reference point ref, is point p
   on the same side of the line that goes through p1 and p2 as ref is?"
  [p1 p2 ref p]
  (<=
   0
   (.dot
    (.cross (.subtract p2 p1) (.subtract p p1))
    (.cross (.subtract p2 p1) (.subtract ref p1)))))

(defn inside-triangle?
  "Is the point inside the triangle?"
  {:author "Dylan Holmes"}
  [#^Triangle tri #^Vector3f p]
  (let [[vert-1 vert-2 vert-3] [(.get1 tri) (.get2 tri) (.get3 tri)]]
    (and
     (same-side? vert-1 vert-2 vert-3 p)
     (same-side? vert-2 vert-3 vert-1 p)
     (same-side? vert-3 vert-1 vert-2 p))))
#+end_src

* Feeler Coordinates

The triangle-related functions above make short work of calculating
the positions and orientations of each feeler in world-space.

#+name: sensors
#+begin_src clojure
(in-ns 'cortex.touch)

(defn feeler-pixel-coords
  "Returns the coordinates of the feelers in pixel space in lists, one
   list for each triangle, ordered in the same way as (triangles) and
   (pixel-triangles)."
  [#^Geometry geo image]
  (map
   (fn [pixel-triangle]
     (filter
      (fn [coord]
        (inside-triangle? (->triangle pixel-triangle)
                          (->vector3f coord)))
      (white-coordinates image (convex-bounds pixel-triangle))))
   (pixel-triangles geo image)))

(defn feeler-world-coords
  "Returns the coordinates of the feelers in world space in lists, one
   list for each triangle, ordered in the same way as (triangles) and
   (pixel-triangles)."
  [#^Geometry geo image]
  (let [transforms
        (map #(triangles->affine-transform
               (->triangle %1) (->triangle %2))
             (pixel-triangles geo image)
             (triangles geo))]
    (map (fn [transform coords]
           (map #(.mult transform (->vector3f %)) coords))
         transforms (feeler-pixel-coords geo image))))

(defn feeler-origins
  "The world space coordinates of the root of each feeler."
  [#^Geometry geo image]
  (reduce concat (feeler-world-coords geo image)))

(defn feeler-tips
  "The world space coordinates of the tip of each feeler."
  [#^Geometry geo image]
  (let [world-coords (feeler-world-coords geo image)
        normals
        (map
         (fn [triangle]
           (.calculateNormal triangle)
           (.clone (.getNormal triangle)))
         (map ->triangle (triangles geo)))]
    (mapcat (fn [origins normal]
              (map #(.add % normal) origins))
            world-coords normals)))

(defn touch-topology
  "Return the pixel coordinates of every feeler on the Geometry,
   collapsed into a contiguous 2D region suitable for use as a touch
   topology."
  [#^Geometry geo image]
  (collapse (reduce concat (feeler-pixel-coords geo image))))
#+end_src
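The shape of the data flowing through these functions is easy to lose
track of, so here is a sketch (all coordinate values are invented) of
what each stage returns for a hypothetical geometry =geo= with
profile image =img=:

#+begin_src clojure
(comment
  (feeler-pixel-coords geo img)
  ;;=> one list of [x y] white-pixel coordinates per triangle, e.g.
  ;;   (([14 22] [15 22] ...) ([40 7] ...) ...)

  (feeler-world-coords geo img)
  ;;=> the same nesting, but each entry is a Vector3f lying on the
  ;;   surface of the mesh

  (feeler-origins geo img)  ;=> flat seq of feeler roots (Vector3f)
  (feeler-tips geo img)     ;=> flat seq of roots + unit normals
  (touch-topology geo img)  ;=> collapsed 2D layout of the feelers
  )
#+end_src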
* Simulated Touch

=touch-kernel= generates functions to be called from within a
simulation that perform the necessary physics collisions to collect
tactile data, and =touch!= applies it to every geometry in the
creature.

#+name: kernel
#+begin_src clojure
(in-ns 'cortex.touch)

(defn set-ray [#^Ray ray #^Matrix4f transform
               #^Vector3f origin #^Vector3f tip]
  ;; Doing everything locally reduces garbage collection by enough to
  ;; be worth it.
  (.mult transform origin (.getOrigin ray))
  (.mult transform tip (.getDirection ray))
  (.subtractLocal (.getDirection ray) (.getOrigin ray))
  (.normalizeLocal (.getDirection ray)))

(import com.jme3.math.FastMath)

(defn touch-kernel
  "Constructs a function which will return tactile sensory data from
   'geo when called from inside a running simulation."
  [#^Geometry geo]
  (if-let
      [profile (tactile-sensor-profile geo)]
    (let [ray-reference-origins (feeler-origins geo profile)
          ray-reference-tips (feeler-tips geo profile)
          ray-length (tactile-scale geo)
          current-rays (map (fn [_] (Ray.)) ray-reference-origins)
          topology (touch-topology geo profile)
          correction (float (* ray-length -0.2))]
      ;; slight tolerance for very close collisions.
      (dorun
       (map (fn [origin tip]
              (.addLocal origin (.mult (.subtract tip origin)
                                       correction)))
            ray-reference-origins ray-reference-tips))
      (dorun (map #(.setLimit % ray-length) current-rays))
      (fn [node]
        (let [transform (.getWorldMatrix geo)]
          (dorun
           (map (fn [ray ref-origin ref-tip]
                  (set-ray ray transform ref-origin ref-tip))
                current-rays ray-reference-origins
                ray-reference-tips))
          (vector
           topology
           (vec
            (for [ray current-rays]
              (do
                (let [results (CollisionResults.)]
                  (.collideWith node ray results)
                  (let [touch-objects
                        (filter #(not (= geo (.getGeometry %)))
                                results)
                        limit (.getLimit ray)]
                    [(if (empty? touch-objects)
                       limit
                       (let [response
                             (apply min (map #(.getDistance %)
                                             touch-objects))]
                         (FastMath/clamp
                          (float
                           (if (> response limit) (float 0.0)
                               (+ response correction)))
                          (float 0.0)
                          limit)))
                     limit])))))))))))

(defn touch!
  "Endow the creature with the sense of touch. Returns a sequence of
   functions, one for each body part with a tactile-sensor-profile,
   each of which when called returns sensory data for that body part."
  [#^Node creature]
  (filter
   (comp not nil?)
   (map touch-kernel
        (filter #(isa? (class %) Geometry)
                (node-seq creature)))))
#+end_src

#+results: kernel
: #'cortex.touch/touch!
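Each function returned by =touch-kernel= produces a vector of
=[topology feeler-data]=. Here is a sketch (all values invented,
assuming a feeler length of 0.1) of what one reading might look like:

#+begin_src clojure
(comment
  [;; the 2D topology: one [x y] entry per feeler
   [[0 0] [1 0] [2 0] ...]
   ;; one [reading limit] pair per feeler:
   ;;   reading = limit      -> nothing is touching this feeler
   ;;   reading close to 0.0 -> the feeler is deeply buried
   [[0.1 0.1] [0.034 0.1] [0.0 0.1] ...]]
  )
#+end_src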
* Visualizing Touch

Each feeler is represented in the touch display as a single pixel.
The grey-scale value of each pixel shows how deeply the feeler
represented by that pixel is embedded in another object. Black means
that nothing is touching the feeler, while white means that the
feeler is completely inside another object, which is presumably flush
with the surface of the triangle from which the feeler originates.

#+name: visualization
#+begin_src clojure
(in-ns 'cortex.touch)

(defn touch->gray
  "Convert a pair of [distance, max-distance] into a gray-scale pixel."
  [distance max-distance]
  (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256))))

(defn view-touch
  "Creates a function which accepts a list of touch sensor-data and
   displays each element to the screen."
  []
  (view-sense
   (fn [[coords sensor-data]]
     (let [image (points->image coords)]
       (dorun
        (for [i (range (count coords))]
          (.setRGB image ((coords i) 0) ((coords i) 1)
                   (apply touch->gray (sensor-data i)))))
       image))))
#+end_src

#+results: visualization
: #'cortex.touch/view-touch
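A quick worked example of the grey-scale mapping, assuming a feeler
length of 0.1:

#+begin_src clojure
(comment
  (touch->gray 0.1  0.1) ; black    -- nothing is touching the feeler
  (touch->gray 0.05 0.1) ; mid-gray -- the feeler is about half covered
  (touch->gray 0.0  0.1) ; white    -- the feeler is completely covered
  )
#+end_src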
* Basic Test of Touch

The worm's sense of touch is a bit complicated, so for this basic
test I'll use a new creature --- a simple cube which has touch
sensors evenly distributed along each of its sides.

#+name: test-touch-0
#+begin_src clojure
(in-ns 'cortex.test.touch)

(defn touch-cube []
  (load-blender-model "Models/test-touch/touch-cube.blend"))
#+end_src

** The Touch Cube
#+begin_html
<p> A simple creature with evenly distributed touch sensors. </p>
#+end_html
The tactile-sensor-profile image for this simple creature looks like
this:

#+attr_html: width=500
#+caption: The distribution of feelers along the touch-cube. The colors of the faces are irrelevant; only the white pixels specify feelers.
[[../images/touch-profile.png]]

#+name: test-touch-1
#+begin_src clojure
(in-ns 'cortex.test.touch)

(defn test-basic-touch
  "Testing touch:
   You should see a cube fall onto a table. There is a cross-shaped
   display which reports the cube's sensation of touch. This display
   should change when the cube hits the table, and whenever you hit
   the cube with balls.

   Keys:
     <space> : fire ball"
  ([] (test-basic-touch false))
  ([record?]
     (let [the-cube (doto (touch-cube) (body!))
           touch (touch! the-cube)
           touch-display (view-touch)]
       (world
        (nodify [the-cube
                 (box 10 1 10 :position (Vector3f. 0 -10 0)
                      :color ColorRGBA/Gray :mass 0)])

        standard-debug-controls

        (fn [world]
          (let [timer (IsoTimer. 60)]
            (.setTimer world timer)
            (display-dilated-time world timer))
          (if record?
            (Capture/captureVideo
             world
             (File. "/home/r/proj/cortex/render/touch-cube/main-view/")))
          (speed-up world)
          (light-up-everything world))

        (fn [world tpf]
          (touch-display
           (map #(% (.getRootNode world)) touch)
           (if record?
             (File. "/home/r/proj/cortex/render/touch-cube/touch/"))))))))
#+end_src

#+results: test-touch-1
: #'cortex.test.touch/test-basic-touch
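To try this out from a REPL (a sketch; it needs a working
jMonkeyEngine display and the touch-cube model on the asset path):

#+begin_src clojure
(comment
  (in-ns 'cortex.test.touch)
  (test-basic-touch)       ; interactive run
  (test-basic-touch true)  ; also record video frames to disk
  )
#+end_src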
** Basic Touch Demonstration

#+begin_html
<p> The simple creature responds to touch. </p>
#+end_html
** Generating the Basic Touch Video
#+name: magick4
#+begin_src clojure
(ns cortex.video.magick4
  (:import java.io.File)
  (:use clojure.java.shell))

(defn images
  "All the image files under path, in sorted order, skipping the
   directory entry itself."
  [path]
  (sort (rest (file-seq (File. path)))))

(def base "/home/r/proj/cortex/render/touch-cube/")

(defn pics [file]
  (images (str base file)))

(defn combine-images
  "Composite each touch-display frame and main-view frame over the
   background image using ImageMagick's convert."
  []
  (let [main-view (pics "main-view")
        touch (pics "touch/0")
        background (repeat 9001 (File. (str base "background.png")))
        targets (map
                 #(File. (str base "out/" (format "%07d.png" %)))
                 (range 0 (count main-view)))]
    (dorun
     (pmap
      (comp
       (fn [[background main-view touch target]]
         (println target)
         (sh "convert"
             touch
             "-resize" "x300"
             "-rotate" "180"
             background
             "-swap" "0,1"
             "-geometry" "+776+129"
             "-composite"
             main-view "-geometry" "+66+21" "-composite"
             target))
       (fn [& args] (map #(.getCanonicalPath %) args)))
      background main-view touch targets))))
#+end_src

#+begin_src sh :results silent
cd ~/proj/cortex/render/touch-cube/
ffmpeg -r 60 -i out/%07d.png -b:v 9000k -c:v libtheora basic-touch.ogg
#+end_src

#+begin_src sh :results silent
cd ~/proj/cortex/render/touch-cube/
ffmpeg -r 30 -i blender-intro/%07d.png -b:v 9000k -c:v libtheora touch-cube.ogg
#+end_src

* Adding Touch to the Worm

#+name: test-touch-2
#+begin_src clojure
(in-ns 'cortex.test.touch)

(defn test-worm-touch
  "Testing touch:
   You will see the worm fall onto a table. There is a display which
   reports the worm's sense of touch. It should change when the worm
   hits the table and when you hit it with balls.

   Keys:
     <space> : fire ball"
  ([] (test-worm-touch false))
  ([record?]
     (let [the-worm (doto (worm) (body!))
           touch (touch! the-worm)
           touch-display (view-touch)]
       (world
        (nodify [the-worm (floor)])
        standard-debug-controls

        (fn [world]
          (let [timer (IsoTimer. 60)]
            (.setTimer world timer)
            (display-dilated-time world timer))
          (if record?
            (Capture/captureVideo
             world
             (File. "/home/r/proj/cortex/render/worm-touch/main-view/")))
          (speed-up world)
          (light-up-everything world))

        (fn [world tpf]
          (touch-display
           (map #(% (.getRootNode world)) touch)
           (if record?
             (File. "/home/r/proj/cortex/render/worm-touch/touch/"))))))))
#+end_src

#+results: test-touch-2
: #'cortex.test.touch/test-worm-touch
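As a quick sanity check from the REPL (a sketch), =touch!= produces
one function per geometry that carries a "touch" profile, so counting
them tells you how many of the worm's segments are touch-sensitive:

#+begin_src clojure
(comment
  (let [the-worm (doto (worm) (body!))]
    (count (touch! the-worm)))
  ;;=> the number of touch-sensitive segments in the worm
  )
#+end_src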
** Worm Touch Demonstration
#+begin_html
<p> The worm responds to touch. </p>
#+end_html
** Generating the Worm Touch Video
#+name: magick5
#+begin_src clojure
(ns cortex.video.magick5
  (:import java.io.File)
  (:use clojure.java.shell))

(defn images [path]
  (sort (rest (file-seq (File. path)))))

(def base "/home/r/proj/cortex/render/worm-touch/")

(defn pics [file]
  (images (str base file)))

(defn combine-images
  "Composite each touch-display frame over its main-view frame using
   ImageMagick's convert."
  []
  (let [main-view (pics "main-view")
        touch (pics "touch/0")
        targets (map
                 #(File. (str base "out/" (format "%07d.png" %)))
                 (range 0 (count main-view)))]
    (dorun
     (pmap
      (comp
       (fn [[main-view touch target]]
         (println target)
         (sh "convert"
             main-view
             touch "-geometry" "+0+0" "-composite"
             target))
       (fn [& args] (map #(.getCanonicalPath %) args)))
      main-view touch targets))))
#+end_src

#+begin_src sh :results silent
cd ~/proj/cortex/render/worm-touch
ffmpeg -r 60 -i out/%07d.png -b:v 9000k -c:v libtheora worm-touch.ogg
#+end_src

* Headers

#+name: touch-header
#+begin_src clojure
(ns cortex.touch
  "Simulate the sense of touch in jMonkeyEngine3. Enables any Geometry
   to be outfitted with touch sensors with density determined by a UV
   image. In this way a Geometry can know what parts of itself are
   touching nearby objects. Reads specially prepared blender files to
   construct this sense automatically."
  {:author "Robert McIntyre"}
  (:use (cortex world util sense))
  (:import (com.jme3.scene Geometry Node Mesh))
  (:import com.jme3.collision.CollisionResults)
  (:import com.jme3.scene.VertexBuffer$Type)
  (:import (com.jme3.math Triangle Vector3f Vector2f Ray Matrix4f)))
#+end_src

#+name: test-touch-header
#+begin_src clojure
(ns cortex.test.touch
  (:use (cortex world util sense body touch))
  (:use cortex.test.body)
  (:import (com.aurellem.capture Capture IsoTimer))
  (:import java.io.File)
  (:import (com.jme3.math Vector3f ColorRGBA)))
#+end_src

#+results: test-touch-header
: com.jme3.math.ColorRGBA

* Source Listing
  - [[../src/cortex/touch.clj][cortex.touch]]
  - [[../src/cortex/test/touch.clj][cortex.test.touch]]
  - [[../src/cortex/video/magick4.clj][cortex.video.magick4]]
  - [[../src/cortex/video/magick5.clj][cortex.video.magick5]]
  - [[../assets/Models/test-touch/touch-cube.blend][touch-cube.blend]]
#+html: <hr>
  - [[http://hg.bortreb.com][source-repository]]
* Next

So far I've implemented simulated Vision, Hearing, and Touch, the
most obvious and prominent senses that humans have. Smell and Taste
shall remain unimplemented for now. This accounts for the "five
senses" that feature so prominently in our lives. But humans have far
more than the five main senses. There are internal chemical senses,
pain (which is *not* the same as touch), heat sensitivity, and our
sense of balance, among others. One extra sense is so important that
I must implement it to have a hope of making creatures that can
gracefully control their own bodies. It is Proprioception, which is
the sense of the location of each body part in relation to the other
body parts.

Close your eyes, and touch your nose with your right index finger.
How did you do it? You could not see your hand, and neither your hand
nor your nose could use the sense of touch to guide the path of your
hand. There are no sound cues, and Taste and Smell certainly don't
provide any help. You know where your hand is without your other
senses because of Proprioception.

Onward to [[./proprioception.org][proprioception]]!

* COMMENT Code Generation
#+begin_src clojure :tangle ../src/cortex/touch.clj
<<touch-header>>
<<meta-data>>
<<triangles-1>>
<<triangles-2>>
<<triangles-3>>
<<triangles-4>>
<<sensors>>
<<kernel>>
<<visualization>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/test/touch.clj
<<test-touch-header>>
<<test-touch-0>>
<<test-touch-1>>
<<test-touch-2>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/video/magick4.clj
<<magick4>>
#+end_src

#+begin_src clojure :tangle ../src/cortex/video/magick5.clj
<<magick5>>
#+end_src