diff org/touch.org @ 306:7e7f8d6d9ec5

massive spellchecking
author Robert McIntyre <rlm@mit.edu>
date Sat, 18 Feb 2012 10:59:41 -0700
parents 23aadf376e9d
children 5d448182c807
     1.1 --- a/org/touch.org	Sat Feb 18 10:28:14 2012 -0700
     1.2 +++ b/org/touch.org	Sat Feb 18 10:59:41 2012 -0700
     1.3 @@ -11,7 +11,7 @@
     1.4  Touch is critical to navigation and spatial reasoning and as such I
     1.5  need a simulated version of it to give to my AI creatures.
     1.6  
     1.7 -Human skin has a wide array of touch sensors, each of which speciliaze
     1.8 +Human skin has a wide array of touch sensors, each of which specializes
     1.9  in detecting different vibrational modes and pressures. These sensors
    1.10  can integrate a vast expanse of skin (i.e. your entire palm), or a
    1.11  tiny patch of skin at the tip of your finger. The hairs of the skin
    1.12 @@ -32,7 +32,7 @@
    1.13  sense of touch which is a hybrid between a human's sense of
    1.14  deformation and sense from hairs.
    1.15  
    1.16 -Implementing touch in jMonkeyEngine follows a different techinal route
    1.17 +Implementing touch in jMonkeyEngine follows a different technical route
    1.18  than vision and hearing. Those two senses piggybacked off
    1.19  jMonkeyEngine's 3D audio and video rendering subsystems. To simulate
    1.20  touch, I use jMonkeyEngine's physics system to execute many small
    1.21 @@ -77,7 +77,7 @@
    1.22    
    1.23  To simulate touch there are three conceptual steps. For each solid
    1.24  object in the creature, you first have to get UV image and scale
    1.25 -paramater which define the position and length of the feelers. Then,
    1.26 +parameters which define the position and length of the feelers. Then,
    1.27  you use the triangles which compose the mesh and the UV data stored in
    1.28  the mesh to determine the world-space position and orientation of each
    1.29  feeler. Then once every frame, update these positions and orientations
    1.30 @@ -100,7 +100,7 @@
    1.31      normalized coordinates of the tips of the feelers. =feeler-tips=.
    1.32  
    1.33  * Triangle Math
    1.34 -** Schrapnel Conversion Functions
    1.35 +** Shrapnel Conversion Functions
    1.36  
    1.37  #+name: triangles-1
    1.38  #+begin_src clojure
    1.39 @@ -121,11 +121,11 @@
    1.40    (apply #(Triangle. %1 %2 %3) (map ->vector3f points)))
    1.41  #+end_src
    1.42  
    1.43 -It is convienent to treat a =Triangle= as a vector of vectors, and a
    1.44 +It is convenient to treat a =Triangle= as a vector of vectors, and a
    1.45  =Vector2f= or =Vector3f= as vectors of floats. (->vector3f) and
    1.46  (->triangle) undo the operations of =vector3f-seq= and
    1.47  =triangle-seq=. If these classes implemented =Iterable= then =seq=
    1.48 -would work on them automitacally.
    1.49 +would work on them automatically.
    1.50  
    1.51  ** Decomposing a 3D shape into Triangles
    1.52  
    1.53 @@ -134,7 +134,7 @@
    1.54  data involved with displaying the object.
    1.55  
    1.56  A =Mesh= is composed of =Triangles=, and each =Triangle= has three
    1.57 -verticies which have coordinates in world space and UV space.
    1.58 +vertices which have coordinates in world space and UV space.
    1.59   
    1.60  Here, =triangles= gets all the world-space triangles which compose a
    1.61  mesh, while =pixel-triangles= gets those same triangles expressed in
    1.62 @@ -216,7 +216,7 @@
    1.63  1 & 1 & 1 & 1
    1.64  \end{bmatrix}
    1.65  
    1.66 -Here, the first three columns of the matrix are the verticies of the
    1.67 +Here, the first three columns of the matrix are the vertices of the
    1.68  triangle. The last column is the right-handed unit normal of the
    1.69  triangle.
    1.70  
    1.71 @@ -225,7 +225,7 @@
    1.72  
    1.73  $T_{2}T_{1}^{-1}$
    1.74  
    1.75 -The clojure code below recaptiulates the formulas above, using
    1.76 +The clojure code below recapitulates the formulas above, using
    1.77  jMonkeyEngine's =Matrix4f= objects, which can describe any affine
    1.78  transformation.
    1.79  
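The $T_{2}T_{1}^{-1}$ construction quoted in this hunk can be sketched outside jMonkeyEngine. A minimal NumPy version (hypothetical names, not from the Clojure source) builds each triangle's 4x4 matrix from its three vertices plus its right-handed unit normal, exactly as in the matrix shown above, and composes the two:

```python
import numpy as np

def triangle_matrix(verts):
    """4x4 matrix whose first three columns are the triangle's vertices
    and whose fourth column is its right-handed unit normal, each with a
    homogeneous 1 appended (mirroring the matrix in the text above)."""
    a, b, c = (np.asarray(v, dtype=float) for v in verts)
    n = np.cross(b - a, c - a)          # right-handed normal
    n /= np.linalg.norm(n)              # unit length
    return np.column_stack([np.append(col, 1.0) for col in (a, b, c, n)])

def triangle_transform(tri1, tri2):
    """Affine transform sending triangle 1 onto triangle 2: T2 * T1^-1."""
    return triangle_matrix(tri2) @ np.linalg.inv(triangle_matrix(tri1))
```

By construction the result maps each homogeneous vertex of triangle 1 onto the corresponding vertex of triangle 2, which is the property the Clojure code relies on.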
    1.80 @@ -380,7 +380,7 @@
    1.81  
    1.82  (defn set-ray [#^Ray ray #^Matrix4f transform
    1.83                 #^Vector3f origin #^Vector3f tip]
    1.84 -  ;; Doing everything locally recduces garbage collection by enough to
    1.85 +  ;; Doing everything locally reduces garbage collection by enough to
    1.86    ;; be worth it.
    1.87    (.mult transform origin (.getOrigin ray))
    1.88    (.mult transform tip (.getDirection ray))
    1.89 @@ -443,7 +443,7 @@
    1.90  
    1.91  (defn touch! 
    1.92    "Endow the creature with the sense of touch. Returns a sequence of
    1.93 -   functions, one for each body part with a tactile-sensor-proile,
    1.94 +   functions, one for each body part with a tactile-sensor-profile,
    1.95     each of which when called returns sensory data for that body part."
    1.96    [#^Node creature]
    1.97    (filter
    1.98 @@ -459,7 +459,7 @@
    1.99  * Visualizing Touch
   1.100  
   1.101  Each feeler is represented in the image as a single pixel. The
   1.102 -grayscale value of each pixel represents how deep the feeler
   1.103 +greyscale value of each pixel represents how deep the feeler
   1.104  represented by that pixel is inside another object.  Black means that
   1.105  nothing is touching the feeler, while white means that the feeler is
   1.106  completely inside another object, which is presumably flush with the
   1.107 @@ -470,7 +470,7 @@
   1.108  (in-ns 'cortex.touch)
   1.109  
   1.110  (defn touch->gray
   1.111 -  "Convert a pair of [distance, max-distance] into a grayscale pixel." 
   1.112 +  "Convert a pair of [distance, max-distance] into a gray-scale pixel." 
   1.113    [distance max-distance]
   1.114    (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256))))
   1.115  
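Read outside the diff, the =touch->gray= formula maps the pair [distance, max-distance] onto 0-255 so that an untouched feeler (collision distance equal to the feeler's full length) renders black and a feeler penetrated at its base renders white, matching the description in the hunk above. A plain-Python transliteration (illustrative only, not part of this changeset):

```python
def touch_to_gray(distance, max_distance):
    """Transliteration of the touch->gray formula quoted above:
    255 - (int(255 * distance / max-distance) mod 256)."""
    return 255 - (int(255 * distance / max_distance) % 256)
```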
   1.116 @@ -736,7 +736,7 @@
   1.117  
   1.118  * Next
   1.119  So far I've implemented simulated Vision, Hearing, and Touch, the most
   1.120 -obvious and promiment senses that humans have.  Smell and Taste shall
   1.121 +obvious and prominent senses that humans have.  Smell and Taste shall
   1.122  remain unimplemented for now. This accounts for the "five senses" that
   1.123  feature so prominently in our lives. But humans have far more than the
   1.124  five main senses. There are internal chemical senses, pain (which is