comparison org/touch.org @ 306:7e7f8d6d9ec5
massive spellchecking
| author   | Robert McIntyre <rlm@mit.edu>   |
|----------|---------------------------------|
| date     | Sat, 18 Feb 2012 10:59:41 -0700 |
| parents  | 23aadf376e9d                    |
| children | 5d448182c807                    |
| comparison | 305:19c43ec6958d to 306:7e7f8d6d9ec5 |
9 * Touch
10
11 Touch is critical to navigation and spatial reasoning, and as such I
12 need a simulated version of it to give to my AI creatures.
13
14 Human skin has a wide array of touch sensors, each of which specializes
15 in detecting different vibrational modes and pressures. These sensors
16 can integrate a vast expanse of skin (e.g. your entire palm), or a
17 tiny patch of skin at the tip of your finger. The hairs of the skin
18 help detect objects before they even come into contact with the skin
19 proper.
30 is covered. So even though the creature's body parts do not deform,
31 the feelers create a margin around those body parts that achieves a
32 sense of touch which is a hybrid between a human's sense of
33 deformation and the sense from hairs.
34
35 Implementing touch in jMonkeyEngine follows a different technical route
36 than vision and hearing. Those two senses piggybacked off
37 jMonkeyEngine's 3D audio and video rendering subsystems. To simulate
38 touch, I use jMonkeyEngine's physics system to execute many small
39 collision detections, one for each feeler. The placement of the
40 feelers is determined by a UV-mapped image which shows where each
75
76 * Implementation Summary
77
78 To simulate touch there are three conceptual steps. For each solid
79 object in the creature, you first have to get the UV image and scale
80 parameter which define the position and length of the feelers. Then,
81 you use the triangles which compose the mesh and the UV data stored in
82 the mesh to determine the world-space position and orientation of each
83 feeler. Finally, once every frame, update these positions and
84 orientations to match the current position and orientation of the
85 object, and use physics collision detection to gather tactile data.
98  - Calculate the normals of the triangles in world space, and add
99    them to each of the origins of the feelers. These are the
100   normalized coordinates of the tips of the feelers. =feeler-tips=.
101
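A minimal sketch of that last step, assuming the triangle and the feeler origins are already expressed in world coordinates; the helper name is illustrative and is not part of the original file:

#+begin_src clojure
(import '(com.jme3.math Triangle Vector3f))

;; Illustration only: displace each feeler origin by the triangle's
;; unit normal to get the corresponding feeler tip.
(defn feeler-tips-sketch [#^Triangle tri feeler-origins]
  (.calculateNormal tri)
  (let [normal (.getNormal tri)]
    (map (fn [#^Vector3f origin] (.add origin normal)) feeler-origins)))
#+end_src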
102 * Triangle Math
103 ** Shrapnel Conversion Functions
104
105 #+name: triangles-1
106 #+begin_src clojure
107 (defn vector3f-seq [#^Vector3f v]
108   [(.getX v) (.getY v) (.getZ v)])
119
120 (defn ->triangle [points]
121   (apply #(Triangle. %1 %2 %3) (map ->vector3f points)))
122 #+end_src
123
124 It is convenient to treat a =Triangle= as a vector of vectors, and a
125 =Vector2f= or =Vector3f= as a vector of floats. (->vector3f) and
126 (->triangle) undo the operations of =vector3f-seq= and
127 =triangle-seq=. If these classes implemented =Iterable= then =seq=
128 would work on them automatically.
129
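The definitions of =triangle-seq= and =->vector3f= fall in lines elided from this comparison. A minimal sketch of what such helpers could look like, for illustration only and not necessarily the original definitions:

#+begin_src clojure
(import '(com.jme3.math Triangle Vector3f))

;; Illustrative counterparts to vector3f-seq and ->triangle.
(defn triangle-seq [#^Triangle t]
  (map vector3f-seq [(.get1 t) (.get2 t) (.get3 t)]))

(defn ->vector3f [[x y z]]
  (Vector3f. (float x) (float y) (float z)))
#+end_src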
130 ** Decomposing a 3D shape into Triangles
131
132 The rigid objects which make up a creature have an underlying
133 =Geometry=, which is a =Mesh= plus a =Material= and other important
134 data involved with displaying the object.
135
136 A =Mesh= is composed of =Triangles=, and each =Triangle= has three
137 vertices which have coordinates in world space and UV space.
138
139 Here, =triangles= gets all the world-space triangles which compose a
140 mesh, while =pixel-triangles= gets those same triangles expressed in
141 pixel coordinates (which are UV coordinates scaled to fit the height
142 and width of the UV image).
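The actual definitions are elided from this comparison; as a rough sketch of the world-space half only, using jMonkeyEngine's =Mesh.getTriangle= and =Spatial.localToWorld= (the name =triangles-sketch= is illustrative, not the original function):

#+begin_src clojure
(import '(com.jme3.math Triangle)
        '(com.jme3.scene Geometry))

;; Illustration only: gather every triangle of a Geometry's mesh and
;; move its vertices from model space into world space.
(defn triangles-sketch [#^Geometry geo]
  (let [mesh (.getMesh geo)]
    (for [i (range (.getTriangleCount mesh))]
      (let [tri (Triangle.)]
        (.getTriangle mesh i tri)          ; vertices in model space
        (doseq [v [(.get1 tri) (.get2 tri) (.get3 tri)]]
          (.localToWorld geo v v))         ; transform each vertex in place
        tri))))
#+end_src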
214 y_1 & y_2 & y_3 & n_y \\
215 z_1 & z_2 & z_3 & n_z \\
216 1 & 1 & 1 & 1
217 \end{bmatrix}
218
219 Here, the first three columns of the matrix are the vertices of the
220 triangle. The last column is the right-handed unit normal of the
221 triangle.
222
223 With two triangles $T_{1}$ and $T_{2}$ each expressed as a matrix like
224 above, the affine transform from $T_{1}$ to $T_{2}$ is
225
226 $T_{2}T_{1}^{-1}$
227
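Applying this transform to the first matrix recovers the second exactly: $(T_{2}T_{1}^{-1})\,T_{1} = T_{2}$, so each vertex of the first triangle, together with its unit normal, is carried onto the corresponding vertex and normal of the second.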
228 The Clojure code below recapitulates the formulas above, using
229 jMonkeyEngine's =Matrix4f= objects, which can describe any affine
230 transformation.
231
232 #+name: triangles-3
233 #+begin_src clojure
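;; The body of the triangles-3 block falls in lines elided from this
;; comparison.  The following is only a sketch of the idea; the names
;; ending in "-sketch" are illustrative, not the actual cortex
;; definitions.
(import '(com.jme3.math Triangle Matrix4f))

(defn triangle->matrix4f-sketch
  "Pack a triangle's three vertices and its unit normal into the
   columns of a 4x4 matrix, as in the formula above."
  [#^Triangle t]
  (.calculateNormal t)
  (let [v1 (.get1 t) v2 (.get2 t) v3 (.get3 t)
        n  (.getNormal t)]
    (Matrix4f. (.getX v1) (.getX v2) (.getX v3) (.getX n)
               (.getY v1) (.getY v2) (.getY v3) (.getY n)
               (.getZ v1) (.getZ v2) (.getZ v3) (.getZ n)
               1          1          1          1)))

(defn triangles->affine-transform-sketch
  "Affine transform carrying tri-1 onto tri-2: T2 * T1^-1."
  [tri-1 tri-2]
  (.mult (triangle->matrix4f-sketch tri-2)
         (.invert (triangle->matrix4f-sketch tri-1))))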
378 #+begin_src clojure
379 (in-ns 'cortex.touch)
380
381 (defn set-ray [#^Ray ray #^Matrix4f transform
382                #^Vector3f origin #^Vector3f tip]
383   ;; Doing everything locally reduces garbage collection by enough to
384   ;; be worth it.
385   (.mult transform origin (.getOrigin ray))
386   (.mult transform tip (.getDirection ray))
387   (.subtractLocal (.getDirection ray) (.getOrigin ray))
388   (.normalizeLocal (.getDirection ray)))
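;; set-ray mutates the Ray in place: the feeler's origin and tip are
;; moved into world space by `transform`, and the ray's direction is
;; then set to the unit vector pointing from that origin toward the tip.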
441      limit)))
442     limit])))))))))))
443
444 (defn touch!
445   "Endow the creature with the sense of touch. Returns a sequence of
446    functions, one for each body part with a tactile-sensor-profile,
447    each of which when called returns sensory data for that body part."
448   [#^Node creature]
449   (filter
450    (comp not nil?)
451    (map touch-kernel
457 : #'cortex.touch/touch!
458
459 * Visualizing Touch
460
461 Each feeler is represented in the image as a single pixel. The
462 grayscale value of each pixel represents how deep the feeler
463 represented by that pixel is inside another object. Black means that
464 nothing is touching the feeler, while white means that the feeler is
465 completely inside another object, which is presumably flush with the
466 surface of the triangle from which the feeler originates.
467
468 #+name: visualization
469 #+begin_src clojure
470 (in-ns 'cortex.touch)
471
472 (defn touch->gray
473   "Convert a pair of [distance, max-distance] into a grayscale pixel."
474   [distance max-distance]
475   (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256))))
476
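;; For example (illustrative values, not from the original file): a
;; feeler whose ray travels its full length without hitting anything
;; has distance equal to max-distance, so (touch->gray 64 64) yields
;; black, while a feeler struck right at its base has distance 0, so
;; (touch->gray 0 64) yields white.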
477 (defn view-touch
478   "Creates a function which accepts a list of touch sensor-data and
734 #+html: <ul> <li> <a href="../org/touch.org">This org file</a> </li> </ul>
735 - [[http://hg.bortreb.com][source-repository]]
736
737 * Next
738 So far I've implemented simulated Vision, Hearing, and Touch, the most
739 obvious and prominent senses that humans have. Smell and Taste shall
740 remain unimplemented for now. This accounts for the "five senses" that
741 feature so prominently in our lives. But humans have far more than the
742 five main senses. There are internal chemical senses, pain (which is
743 *not* the same as touch), heat sensitivity, and our sense of balance,
744 among others. One extra sense is so important that I must implement it