changeset 474:57c7d5aec8d5

mix in touch; need to clean it up.
author Robert McIntyre <rlm@mit.edu>
date Fri, 28 Mar 2014 21:05:12 -0400
parents 486ce07f5545
children 3ec428e096e5
files thesis/cortex.org
diffstat 1 files changed, 429 insertions(+), 1 deletions(-)
line diff
     1.1 --- a/thesis/cortex.org	Fri Mar 28 20:49:13 2014 -0400
     1.2 +++ b/thesis/cortex.org	Fri Mar 28 21:05:12 2014 -0400
     1.3 @@ -1554,6 +1554,434 @@
     1.4  
     1.5  ** Touch uses hundreds of hair-like elements
     1.6  
     1.7 +   Touch is critical to navigation and spatial reasoning, and as such
     1.8 +   I need a simulated version of it to give to my AI creatures.
     1.9 +   
     1.10 +   Human skin has a wide array of touch sensors, each of which
     1.11 +   specializes in detecting different vibrational modes and pressures.
     1.12 +   These sensors can integrate a vast expanse of skin (e.g. your
     1.13 +   entire palm) or a tiny patch of skin at the tip of your finger.
     1.14 +   The hairs of the skin help detect objects before they even come
     1.15 +   into contact with the skin proper.
    1.16 +   
     1.17 +   However, touch in my simulated world cannot exactly correspond to
    1.18 +   human touch because my creatures are made out of completely rigid
    1.19 +   segments that don't deform like human skin.
    1.20 +   
    1.21 +   Instead of measuring deformation or vibration, I surround each
    1.22 +   rigid part with a plenitude of hair-like objects (/feelers/) which
    1.23 +   do not interact with the physical world. Physical objects can pass
    1.24 +   through them with no effect. The feelers are able to tell when
    1.25 +   other objects pass through them, and they constantly report how
    1.26 +   much of their extent is covered. So even though the creature's body
    1.27 +   parts do not deform, the feelers create a margin around those body
     1.28 +   parts that provides a sense of touch which is a hybrid of a
     1.29 +   human's sense of deformation and sense of touch from hairs.
    1.30 +   
    1.31 +   Implementing touch in jMonkeyEngine follows a different technical
    1.32 +   route than vision and hearing. Those two senses piggybacked off
    1.33 +   jMonkeyEngine's 3D audio and video rendering subsystems. To
    1.34 +   simulate touch, I use jMonkeyEngine's physics system to execute
    1.35 +   many small collision detections, one for each feeler. The placement
    1.36 +   of the feelers is determined by a UV-mapped image which shows where
    1.37 +   each feeler should be on the 3D surface of the body.
    1.38 +
    1.39 +*** Defining Touch Meta-Data in Blender
    1.40 +
     1.41 +    Each geometry can have a single UV map which describes the
     1.42 +    position of the feelers which will constitute its sense of touch.
     1.43 +    The path of this image is stored under the "touch" key. The image
     1.44 +    itself is black and white, with black meaning a feeler length of 0
     1.45 +    (no feeler is present) and white meaning a feeler length of
     1.46 +    =scale=, which is a float stored under the key "scale".
    1.47 +
    1.48 +#+name: meta-data
    1.49 +#+begin_src clojure
    1.50 +(defn tactile-sensor-profile
    1.51 +  "Return the touch-sensor distribution image in BufferedImage format,
    1.52 +   or nil if it does not exist."
    1.53 +  [#^Geometry obj]
    1.54 +  (if-let [image-path (meta-data obj "touch")]
    1.55 +    (load-image image-path)))
    1.56 +
    1.57 +(defn tactile-scale
     1.58 +  "Return the length of each feeler. Default scale is 0.1
    1.59 +  jMonkeyEngine units."
    1.60 +  [#^Geometry obj]
    1.61 +  (if-let [scale (meta-data obj "scale")]
    1.62 +    scale 0.1))
    1.63 +#+end_src
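          +
          +    For illustration, here is a short sketch of how these accessors
          +    might be used at the REPL. =worm-segment= stands for a
          +    hypothetical =Geometry= whose Blender node carries "touch" and
          +    "scale" meta-data; it is not defined in this section.
          +
          +#+begin_src clojure
          +(in-ns 'cortex.touch)
          +
          +(comment
          +  ;; `worm-segment` is a hypothetical Geometry with touch meta-data.
          +  (when-let [profile (tactile-sensor-profile worm-segment)]
          +    ;; dimensions of the sensor image, and the feeler length
          +    [(.getWidth profile) (.getHeight profile)
          +     (tactile-scale worm-segment)]))
          +#+end_src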
    1.64 +
    1.65 +    Here is an example of a UV-map which specifies the position of touch
    1.66 +    sensors along the surface of the upper segment of the worm.
    1.67 +
    1.68 +    #+attr_html: width=755
    1.69 +    #+caption: This is the tactile-sensor-profile for the upper segment of the worm. It defines regions of high touch sensitivity (where there are many white pixels) and regions of low sensitivity (where white pixels are sparse).
    1.70 +    [[../images/finger-UV.png]]
    1.71 +
    1.72 +*** Implementation Summary
    1.73 +  
     1.74 +    To simulate touch there are three conceptual steps. For each
     1.75 +    solid object in the creature, you first have to get the UV image
     1.76 +    and scale parameter which define the position and length of the
     1.77 +    feelers. Next, you use the triangles which comprise the mesh and
     1.78 +    the UV data stored in the mesh to determine the world-space
     1.79 +    position and orientation of each feeler. Finally, once every
     1.80 +    frame, you update these positions and orientations to match the
     1.81 +    current position and orientation of the object and use physics
     1.82 +    collision detection to gather tactile data.
     1.83 +    
     1.84 +    Extracting the meta-data has already been described. The third
     1.85 +    step, physics collision detection, is handled in =touch-kernel=.
     1.86 +    Translating the positions and orientations of the feelers from
     1.87 +    the UV-map to world-space is itself a three-step process.
    1.88 +
     1.89 +  - Find the triangles which make up the mesh in pixel-space and in
     1.90 +    world-space (=triangles=, =pixel-triangles=).
     1.91 +
     1.92 +  - Find the coordinates of each feeler in world-space. These are the
     1.93 +    origins of the feelers (=feeler-origins=).
     1.94 +    
     1.95 +  - Calculate the unit normals of the triangles in world space, and
     1.96 +    add them to each of the origins of the feelers. These are the
     1.97 +    coordinates of the tips of the feelers (=feeler-tips=). A sketch
          +    of how these steps fit together follows this list.
    1.98 +
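          +    As a rough sketch of how these pieces compose, =geo= below stands
          +    for any =Geometry= that has a tactile-sensor-profile; the helper
          +    functions themselves are defined in the sections that follow, and
          +    this block is only an illustration of the data flow.
          +
          +#+begin_src clojure
          +(in-ns 'cortex.touch)
          +
          +(comment
          +  ;; `geo` is a placeholder for a Geometry with touch meta-data.
          +  (let [profile (tactile-sensor-profile geo)  ; the UV sensor image
          +        origins (feeler-origins geo profile)  ; feeler roots (world space)
          +        tips    (feeler-tips geo profile)]    ; roots plus unit normals
          +    ;; Each origin/tip pair defines one feeler in the rest position
          +    ;; of the geometry; touch-kernel later moves the feelers with the
          +    ;; object and casts a short ray along each one.
          +    (map vector origins tips)))
          +#+end_src
          +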
    1.99 +*** Triangle Math
   1.100 +
   1.101 +The rigid objects which make up a creature have an underlying
   1.102 +=Geometry=, which is a =Mesh= plus a =Material= and other important
   1.103 +data involved with displaying the object.
   1.104 +
   1.105 +A =Mesh= is composed of =Triangles=, and each =Triangle= has three
   1.106 +vertices which have coordinates in world space and UV space.
   1.107 + 
   1.108 +Here, =triangles= gets all the world-space triangles which comprise a
   1.109 +mesh, while =pixel-triangles= gets those same triangles expressed in
   1.110 +pixel coordinates (which are UV coordinates scaled to fit the height
   1.111 +and width of the UV image).
   1.112 +
   1.113 +#+name: triangles-2
   1.114 +#+begin_src clojure
   1.115 +(in-ns 'cortex.touch)
   1.116 +(defn triangle
   1.117 +  "Get the triangle specified by triangle-index from the mesh."
   1.118 +  [#^Geometry geo triangle-index]
   1.119 +  (triangle-seq
   1.120 +   (let [scratch (Triangle.)]
   1.121 +     (.getTriangle (.getMesh geo) triangle-index scratch) scratch)))
   1.122 +
   1.123 +(defn triangles
   1.124 +  "Return a sequence of all the Triangles which comprise a given
   1.125 +   Geometry." 
   1.126 +  [#^Geometry geo]
   1.127 +  (map (partial triangle geo) (range (.getTriangleCount (.getMesh geo)))))
   1.128 +
   1.129 +(defn triangle-vertex-indices
   1.130 +  "Get the triangle vertex indices of a given triangle from a given
   1.131 +   mesh."
   1.132 +  [#^Mesh mesh triangle-index]
   1.133 +  (let [indices (int-array 3)]
   1.134 +    (.getTriangle mesh triangle-index indices)
   1.135 +    (vec indices)))
   1.136 +
   1.137 +(defn vertex-UV-coord
   1.138 +  "Get the UV-coordinates of the vertex named by vertex-index"
   1.139 +  [#^Mesh mesh vertex-index]
   1.140 +  (let [UV-buffer
   1.141 +        (.getData
   1.142 +         (.getBuffer
   1.143 +          mesh
   1.144 +          VertexBuffer$Type/TexCoord))]
   1.145 +    [(.get UV-buffer (* vertex-index 2))
   1.146 +     (.get UV-buffer (+ 1 (* vertex-index 2)))]))
   1.147 +
    1.148 +(defn pixel-triangle
          +  "Return the triangle with the given index from the Geometry's mesh,
          +   with its vertices expressed in pixel coordinates of the UV image."
          +  [#^Geometry geo image index]
   1.149 +  (let [mesh (.getMesh geo)
   1.150 +        width (.getWidth image)
   1.151 +        height (.getHeight image)]
   1.152 +    (vec (map (fn [[u v]] (vector (* width u) (* height v)))
   1.153 +              (map (partial vertex-UV-coord mesh)
   1.154 +                   (triangle-vertex-indices mesh index))))))
   1.155 +
   1.156 +(defn pixel-triangles 
   1.157 +  "The pixel-space triangles of the Geometry, in the same order as
   1.158 +   (triangles geo)"
   1.159 +  [#^Geometry geo image]
   1.160 +  (let [height (.getHeight image)
   1.161 +        width (.getWidth image)]
   1.162 +    (map (partial pixel-triangle geo image)
   1.163 +         (range (.getTriangleCount (.getMesh geo))))))
   1.164 +#+end_src
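          +
          +As a quick illustration of the correspondence promised by
          +=pixel-triangles= (with =geo= again standing for a hypothetical
          +Geometry that has a touch UV image), the two sequences line up
          +one-to-one:
          +
          +#+begin_src clojure
          +(in-ns 'cortex.touch)
          +
          +(comment
          +  ;; `geo` is a hypothetical Geometry with a "touch" UV image.
          +  (let [image (tactile-sensor-profile geo)]
          +    ;; world-space and pixel-space triangles correspond pairwise
          +    (= (count (triangles geo))
          +       (count (pixel-triangles geo image)))))
          +#+end_src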
   1.165 +
   1.166 +*** The Affine Transform from one Triangle to Another
   1.167 +
   1.168 +=pixel-triangles= gives us the mesh triangles expressed in pixel
   1.169 +coordinates and =triangles= gives us the mesh triangles expressed in
   1.170 +world coordinates. The tactile-sensor-profile gives the position of
   1.171 +each feeler in pixel-space. In order to convert pixel-space
   1.172 +coordinates into world-space coordinates we need something that takes
   1.173 +coordinates on the surface of one triangle and gives the corresponding
   1.174 +coordinates on the surface of another triangle.
   1.175 +
    1.176 +Triangles are [[http://mathworld.wolfram.com/AffineTransformation.html][affine]] to one another, which means any triangle can be
    1.177 +transformed into any other by an affine transformation (a combination
    1.178 +of translation, rotation, scaling, and shearing). The affine
    1.179 +transformation from one triangle to another is readily computable if
    1.180 +each triangle is expressed in terms of a $4 \times 4$ matrix.
   1.181 +
    1.182 +\begin{equation*}
          +\begin{bmatrix}
    1.183 +x_1 & x_2 & x_3 & n_x \\
    1.184 +y_1 & y_2 & y_3 & n_y \\
    1.185 +z_1 & z_2 & z_3 & n_z \\
    1.186 +1 & 1 & 1 & 1
    1.187 +\end{bmatrix}
          +\end{equation*}
   1.188 +
   1.189 +Here, the first three columns of the matrix are the vertices of the
   1.190 +triangle. The last column is the right-handed unit normal of the
   1.191 +triangle.
   1.192 +
   1.193 +With two triangles $T_{1}$ and $T_{2}$ each expressed as a matrix like
   1.194 +above, the affine transform from $T_{1}$ to $T_{2}$ is 
   1.195 +
   1.196 +$T_{2}T_{1}^{-1}$
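          +
          +This is the right transform because applying it to each column of
          +$T_{1}$ reproduces the corresponding column of $T_{2}$: vertices map
          +onto vertices and the normal maps onto the normal, since
          +
          +\begin{equation*}
          +\left(T_{2}T_{1}^{-1}\right)T_{1} = T_{2}.
          +\end{equation*}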
   1.197 +
    1.198 +The Clojure code below recapitulates the formulas above, using
   1.199 +jMonkeyEngine's =Matrix4f= objects, which can describe any affine
   1.200 +transformation.
   1.201 +
   1.202 +#+name: triangles-3
   1.203 +#+begin_src clojure
   1.204 +(in-ns 'cortex.touch)
   1.205 +
   1.206 +(defn triangle->matrix4f
   1.207 +  "Converts the triangle into a 4x4 matrix: The first three columns
   1.208 +   contain the vertices of the triangle; the last contains the unit
   1.209 +   normal of the triangle. The bottom row is filled with 1s."
   1.210 +  [#^Triangle t]
   1.211 +  (let [mat (Matrix4f.)
   1.212 +        [vert-1 vert-2 vert-3]
   1.213 +        (mapv #(.get t %) (range 3))
   1.214 +        unit-normal (do (.calculateNormal t)(.getNormal t))
   1.215 +        vertices [vert-1 vert-2 vert-3 unit-normal]]
   1.216 +    (dorun 
   1.217 +     (for [row (range 4) col (range 3)]
   1.218 +       (do
   1.219 +         (.set mat col row (.get (vertices row) col))
   1.220 +         (.set mat 3 row 1)))) mat))
   1.221 +
   1.222 +(defn triangles->affine-transform
   1.223 +  "Returns the affine transformation that converts each vertex in the
   1.224 +   first triangle into the corresponding vertex in the second
   1.225 +   triangle."
   1.226 +  [#^Triangle tri-1 #^Triangle tri-2]
   1.227 +  (.mult 
   1.228 +   (triangle->matrix4f tri-2)
   1.229 +   (.invert (triangle->matrix4f tri-1))))
   1.230 +#+end_src
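          +
          +As a quick sanity check of =triangles->affine-transform= (the
          +triangles below are made up for the example), the resulting matrix
          +should map each vertex of the first triangle onto the corresponding
          +vertex of the second:
          +
          +#+begin_src clojure
          +(in-ns 'cortex.touch)
          +
          +(comment
          +  ;; two throwaway triangles used only for this check
          +  (let [tri-A (Triangle. (Vector3f. 0 0 0)
          +                         (Vector3f. 1 0 0)
          +                         (Vector3f. 0 1 0))
          +        tri-B (Triangle. (Vector3f. 2 0 0)
          +                         (Vector3f. 4 0 0)
          +                         (Vector3f. 2 2 0))
          +        T (triangles->affine-transform tri-A tri-B)]
          +    ;; should yield vectors (2,0,0), (4,0,0), and (2,2,0)
          +    (for [i (range 3)]
          +      (.mult T (.get tri-A i)))))
          +#+end_src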
   1.231 +
   1.232 +*** Triangle Boundaries
   1.233 +  
    1.234 +For efficiency's sake I will divide the tactile-profile image into
    1.235 +small rectangles which circumscribe each pixel-triangle, then extract
    1.236 +the points which lie inside the triangle and map them to 3D-space
    1.237 +using =triangles->affine-transform= above. To do this I need a
    1.238 +function, =convex-bounds=, which finds the smallest axis-aligned box
    1.239 +which circumscribes a 2D triangle.
    1.240 +
    1.241 +=inside-triangle?= determines whether a point is inside a triangle
    1.242 +in 2D pixel-space.
   1.243 +
   1.244 +#+name: triangles-4
   1.245 +#+begin_src clojure
   1.246 +(defn convex-bounds
    1.247 +  "Returns the smallest axis-aligned rectangle containing the given
    1.248 +   vertices, as a vector [left top width height]."
   1.249 +  [verts]
   1.250 +  (let [xs (map first verts)
   1.251 +        ys (map second verts)
   1.252 +        x0 (Math/floor (apply min xs))
   1.253 +        y0 (Math/floor (apply min ys))
   1.254 +        x1 (Math/ceil (apply max xs))
   1.255 +        y1 (Math/ceil (apply max ys))]
   1.256 +    [x0 y0 (- x1 x0) (- y1 y0)]))
   1.257 +
   1.258 +(defn same-side?
   1.259 +  "Given the points p1 and p2 and the reference point ref, is point p
   1.260 +  on the same side of the line that goes through p1 and p2 as ref is?" 
   1.261 +  [p1 p2 ref p]
   1.262 +  (<=
   1.263 +   0
   1.264 +   (.dot 
   1.265 +    (.cross (.subtract p2 p1) (.subtract p p1))
   1.266 +    (.cross (.subtract p2 p1) (.subtract ref p1)))))
   1.267 +
   1.268 +(defn inside-triangle?
   1.269 +  "Is the point inside the triangle?"
   1.270 +  {:author "Dylan Holmes"}
   1.271 +  [#^Triangle tri #^Vector3f p]
   1.272 +  (let [[vert-1 vert-2 vert-3] [(.get1 tri) (.get2 tri) (.get3 tri)]]
   1.273 +    (and
   1.274 +     (same-side? vert-1 vert-2 vert-3 p)
   1.275 +     (same-side? vert-2 vert-3 vert-1 p)
   1.276 +     (same-side? vert-3 vert-1 vert-2 p))))
   1.277 +#+end_src
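          +
          +For example, the bounding box of a made-up pixel-space triangle
          +comes out as follows (left, top, width, height):
          +
          +#+begin_src clojure
          +(in-ns 'cortex.touch)
          +
          +(comment
          +  (convex-bounds [[1.2 3.7] [4.5 0.2] [2.0 2.0]])
          +  ;; => [1.0 0.0 4.0 4.0]
          +  )
          +#+end_src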
   1.278 +
   1.279 +*** Feeler Coordinates
   1.280 +
   1.281 +The triangle-related functions above make short work of calculating
   1.282 +the positions and orientations of each feeler in world-space.
   1.283 +
   1.284 +#+name: sensors
   1.285 +#+begin_src clojure
   1.286 +(in-ns 'cortex.touch)
   1.287 +
   1.288 +(defn feeler-pixel-coords
   1.289 + "Returns the coordinates of the feelers in pixel space in lists, one
   1.290 +  list for each triangle, ordered in the same way as (triangles) and
   1.291 +  (pixel-triangles)."
   1.292 + [#^Geometry geo image]
   1.293 + (map 
   1.294 +  (fn [pixel-triangle]
   1.295 +    (filter
   1.296 +     (fn [coord]
   1.297 +       (inside-triangle? (->triangle pixel-triangle)
   1.298 +                         (->vector3f coord)))
    1.299 +     (white-coordinates image (convex-bounds pixel-triangle))))
   1.300 +  (pixel-triangles geo image)))
   1.301 +
   1.302 +(defn feeler-world-coords 
   1.303 + "Returns the coordinates of the feelers in world space in lists, one
   1.304 +  list for each triangle, ordered in the same way as (triangles) and
   1.305 +  (pixel-triangles)."
   1.306 + [#^Geometry geo image]
   1.307 + (let [transforms
   1.308 +       (map #(triangles->affine-transform
   1.309 +              (->triangle %1) (->triangle %2))
   1.310 +            (pixel-triangles geo image)
   1.311 +            (triangles geo))]
   1.312 +   (map (fn [transform coords]
   1.313 +          (map #(.mult transform (->vector3f %)) coords))
   1.314 +        transforms (feeler-pixel-coords geo image))))
   1.315 +
   1.316 +(defn feeler-origins
   1.317 +  "The world space coordinates of the root of each feeler."
   1.318 +  [#^Geometry geo image]
   1.319 +   (reduce concat (feeler-world-coords geo image)))
   1.320 +
   1.321 +(defn feeler-tips
   1.322 +  "The world space coordinates of the tip of each feeler."
   1.323 +  [#^Geometry geo image]
   1.324 +  (let [world-coords (feeler-world-coords geo image)
   1.325 +        normals
   1.326 +        (map
   1.327 +         (fn [triangle]
   1.328 +           (.calculateNormal triangle)
   1.329 +           (.clone (.getNormal triangle)))
   1.330 +         (map ->triangle (triangles geo)))]
   1.331 +
   1.332 +    (mapcat (fn [origins normal]
   1.333 +              (map #(.add % normal) origins))
   1.334 +            world-coords normals)))
   1.335 +
   1.336 +(defn touch-topology
    1.337 +  "Return the touch topology of the geometry: the pixel coordinates
          +   of every feeler, collapsed into a contiguous region."
   1.338 +  [#^Geometry geo image]
   1.339 +  (collapse (reduce concat (feeler-pixel-coords geo image))))
   1.340 +#+end_src
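          +
          +As a small check of the correspondence between these functions (with
          +=geo= once more a hypothetical Geometry carrying touch meta-data),
          +every feeler origin should have exactly one matching tip:
          +
          +#+begin_src clojure
          +(in-ns 'cortex.touch)
          +
          +(comment
          +  ;; `geo` is a hypothetical Geometry with touch meta-data.
          +  (let [image (tactile-sensor-profile geo)]
          +    ;; one tip for every origin
          +    (= (count (feeler-origins geo image))
          +       (count (feeler-tips geo image)))))
          +#+end_src
          +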
   1.341 +*** Simulated Touch
   1.342 +
    1.343 +=touch-kernel= generates a function, to be called from within a
    1.344 +simulation, that performs the physics collisions necessary to collect
    1.345 +tactile data for a single geometry, and =touch!= applies it to every
    1.346 +geometry in the creature.
   1.347 +
   1.348 +#+name: kernel
   1.349 +#+begin_src clojure
   1.350 +(in-ns 'cortex.touch)
   1.351 +
   1.352 +(defn set-ray [#^Ray ray #^Matrix4f transform
   1.353 +               #^Vector3f origin #^Vector3f tip]
   1.354 +  ;; Doing everything locally reduces garbage collection by enough to
   1.355 +  ;; be worth it.
   1.356 +  (.mult transform origin (.getOrigin ray))
   1.357 +  (.mult transform tip (.getDirection ray))
   1.358 +  (.subtractLocal (.getDirection ray) (.getOrigin ray))
   1.359 +  (.normalizeLocal (.getDirection ray)))
   1.360 +
   1.361 +(import com.jme3.math.FastMath)
   1.362 + 
   1.363 +(defn touch-kernel
   1.364 +  "Constructs a function which will return tactile sensory data from
   1.365 +   'geo when called from inside a running simulation"
   1.366 +  [#^Geometry geo]
   1.367 +  (if-let
   1.368 +      [profile (tactile-sensor-profile geo)]
   1.369 +    (let [ray-reference-origins (feeler-origins geo profile)
   1.370 +          ray-reference-tips (feeler-tips geo profile)
   1.371 +          ray-length (tactile-scale geo)
   1.372 +          current-rays (map (fn [_] (Ray.)) ray-reference-origins)
   1.373 +          topology (touch-topology geo profile)
   1.374 +          correction (float (* ray-length -0.2))]
   1.375 +
   1.376 +      ;; slight tolerance for very close collisions.
   1.377 +      (dorun
   1.378 +       (map (fn [origin tip]
   1.379 +              (.addLocal origin (.mult (.subtract tip origin)
   1.380 +                                       correction)))
   1.381 +            ray-reference-origins ray-reference-tips))
   1.382 +      (dorun (map #(.setLimit % ray-length) current-rays))
   1.383 +      (fn [node]
   1.384 +        (let [transform (.getWorldMatrix geo)]
   1.385 +          (dorun
   1.386 +           (map (fn [ray ref-origin ref-tip]
   1.387 +                  (set-ray ray transform ref-origin ref-tip))
   1.388 +                current-rays ray-reference-origins
   1.389 +                ray-reference-tips))
   1.390 +          (vector
   1.391 +           topology
   1.392 +           (vec
   1.393 +            (for [ray current-rays]
   1.394 +              (do
   1.395 +                (let [results (CollisionResults.)]
   1.396 +                  (.collideWith node ray results)
   1.397 +                  (let [touch-objects
   1.398 +                        (filter #(not (= geo (.getGeometry %)))
   1.399 +                                results)
   1.400 +                        limit (.getLimit ray)]
   1.401 +                    [(if (empty? touch-objects)
   1.402 +                       limit
   1.403 +                       (let [response
   1.404 +                             (apply min (map #(.getDistance %)
   1.405 +                                             touch-objects))]
   1.406 +                         (FastMath/clamp
   1.407 +                          (float 
   1.408 +                           (if (> response limit) (float 0.0)
   1.409 +                               (+ response correction)))
   1.410 +                           (float 0.0)
   1.411 +                           limit)))
   1.412 +                     limit])))))))))))
   1.413 +
   1.414 +(defn touch! 
   1.415 +  "Endow the creature with the sense of touch. Returns a sequence of
   1.416 +   functions, one for each body part with a tactile-sensor-profile,
   1.417 +   each of which when called returns sensory data for that body part."
   1.418 +  [#^Node creature]
   1.419 +  (filter
   1.420 +   (comp not nil?)
   1.421 +   (map touch-kernel
   1.422 +        (filter #(isa? (class %) Geometry)
   1.423 +                (node-seq creature)))))
   1.424 +#+end_src
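          +
          +To make the returned data format concrete, here is a hedged sketch
          +(not a =CORTEX= utility) of how the functions produced by =touch!=
          +might be polled inside a simulation loop; =creature= and =root-node=
          +stand for a loaded creature and the scene node to collide against.
          +
          +#+begin_src clojure
          +(in-ns 'cortex.touch)
          +
          +(comment
          +  ;; `creature` and `root-node` are hypothetical placeholders.
          +  (let [touch-fns (touch! creature)]
          +    (doseq [touch-fn touch-fns]
          +      (let [[topology sensor-data] (touch-fn root-node)]
          +        ;; each element of sensor-data is [activation limit]; an
          +        ;; activation equal to limit means the feeler reached its full
          +        ;; length without hitting anything.
          +        (println (count topology) "feelers,"
          +                 (count (remove (fn [[activation limit]]
          +                                  (= activation limit))
          +                                sensor-data))
          +                 "touching something")))))
          +#+end_src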
   1.425 +
   1.426 +
   1.427 +Armed with the =touch!= function, =CORTEX= becomes capable of giving
   1.428 +creatures a sense of touch. A simple test is to create a cube that is
    1.429 +outfitted with a uniform distribution of touch sensors. It can feel
   1.430 +the ground and any balls that it touches.
   1.431 +
   1.432 +# insert touch cube image; UV map
   1.433 +# insert video
   1.434 +
   1.435  ** Proprioception is the sense that makes everything ``real''
   1.436  
   1.437  ** Muscles are both effectors and sensors
   1.438 @@ -1561,7 +1989,7 @@
   1.439  ** =CORTEX= brings complex creatures to life!
   1.440  
    1.441  ** =CORTEX= enables many possibilities for further research
   1.442 -
   1.443 +   
   1.444  * COMMENT Empathy in a simulated worm
   1.445  
   1.446    Here I develop a computational model of empathy, using =CORTEX= as a