changeset 306:7e7f8d6d9ec5

massive spellchecking
author Robert McIntyre <rlm@mit.edu>
date Sat, 18 Feb 2012 10:59:41 -0700
parents 19c43ec6958d
children 3fdbd58bb2af
files org/body.org org/capture-video.org org/games.org org/hearing.org org/ideas.org org/integration.org org/intro.org org/movement.org org/proprioception.org org/sense.org org/test.org org/touch.org org/util.org org/vision.org org/world.org org/youtube.org
diffstat 16 files changed, 165 insertions(+), 159 deletions(-)
line diff
     1.1 --- a/org/body.org	Sat Feb 18 10:28:14 2012 -0700
     1.2 +++ b/org/body.org	Sat Feb 18 10:59:41 2012 -0700
     1.3 @@ -1,7 +1,7 @@
     1.4  #+title: Building a Body
     1.5  #+author: Robert McIntyre
     1.6  #+email: rlm@mit.edu 
     1.7 -#+description: Simulating a body (movement, touch, propioception) in jMonkeyEngine3.
     1.8 +#+description: Simulating a body (movement, touch, proprioception) in jMonkeyEngine3.
     1.9  #+SETUPFILE: ../../aurellem/org/setup.org
    1.10  #+INCLUDE: ../../aurellem/org/level-0.org
    1.11  
    1.12 @@ -22,7 +22,7 @@
    1.13  to model humans and create realistic animations. It is hard to use for
    1.14  my purposes because it is difficult to update the creature's Physics
    1.15  Collision Mesh in tandem with its Geometric Mesh under the influence
    1.16 -of the armature. Withouth this the creature will not be able to grab
    1.17 +of the armature. Without this the creature will not be able to grab
    1.18  things in its environment, and it won't be able to tell where its
    1.19  physical body is by using its eyes. Also, armatures do not specify
    1.20  any rotational limits for a joint, making it hard to model elbows,
    1.21 @@ -43,12 +43,12 @@
    1.22  presence of its body. Each individual section is simulated by a
    1.23  separate rigid body that corresponds exactly with its visual
    1.24  representation and does not change. Sections are connected by
    1.25 -invisible joints that are well supported in jMonkyeEngine. Bullet, the
    1.26 +invisible joints that are well supported in jMonkeyEngine. Bullet, the
    1.27  physics backend for jMonkeyEngine, can efficiently simulate hundreds
    1.28  of rigid bodies connected by joints. Sections do not have to stay as
    1.29  one piece forever; they can be dynamically replaced with multiple
    1.30  sections to simulate splitting in two. This could be used to simulate
    1.31 -retractable claws or EVE's hands, which are able to coalece into one
    1.32 +retractable claws or EVE's hands, which are able to coalesce into one
    1.33  object in the movie.
    1.34  
    1.35  * Solidifying the Body
    1.36 @@ -107,7 +107,7 @@
    1.37  </video>
    1.38  </center>
    1.39  <p>The hand model directly loaded from blender. It has no physical
    1.40 -  presense in the simulation. </p>
    1.41 +  presence in the simulation. </p>
    1.42  </div>
    1.43  #+end_html
    1.44  
    1.45 @@ -218,7 +218,7 @@
    1.46  [[../images/hand-screenshot1.png]]
    1.47  
    1.48  The empty node in the upper right, highlighted in yellow, is the
    1.49 -parent node of all the emptys which represent joints. The following
    1.50 +parent node of all the empties which represent joints. The following
    1.51  functions must do three things to translate these into real joints:
    1.52  
    1.53   - Find the children of the "joints" node.
    1.54 @@ -241,7 +241,7 @@
    1.55  
    1.56  ** Joint Targets and Orientation
    1.57  
    1.58 -This technique for finding a joint's targets is very similiar to
    1.59 +This technique for finding a joint's targets is very similar to
    1.60  =cortex.sense/closest-node=.  A small cube, centered around the
    1.61  empty-node, grows exponentially until it intersects two /physical/
    1.62  objects. The objects are ordered according to the joint's rotation,
    1.63 @@ -283,7 +283,7 @@
    1.64  
    1.65  This section of code iterates through all the different ways of
    1.66  specifying joints using blender meta-data and converts each one to the
    1.67 -appropriate jMonkyeEngine joint.
    1.68 +appropriate jMonkeyEngine joint.
    1.69  
    1.70  #+name: joints-4
    1.71  #+begin_src clojure
    1.72 @@ -307,7 +307,7 @@
    1.73       pivot-b))
    1.74  
    1.75    ;; but instead we must do this:
    1.76 -  (println-repl "substuting 6DOF joint for POINT2POINT joint!")
    1.77 +  (println-repl "substituting 6DOF joint for POINT2POINT joint!")
    1.78    (doto
    1.79        (SixDofJoint.
    1.80         control-a
    1.81 @@ -475,14 +475,14 @@
    1.82  
    1.83  * Wrap-Up!
    1.84  
    1.85 -It is convienent to combine =physical!= and =joints!= into one
    1.86 +It is convenient to combine =physical!= and =joints!= into one
    1.87  function that completely creates the creature's physical body.
    1.88  
    1.89  #+name: joints-6
    1.90  #+begin_src clojure
    1.91  (defn body!
    1.92    "Endow the creature with a physical body connected with joints.  The
    1.93 -   particulars of the joints and the masses of each pody part are
    1.94 +   particulars of the joints and the masses of each body part are
    1.95     determined in blender."
    1.96    [#^Node creature]
    1.97    (physical! creature)
    1.98 @@ -544,7 +544,7 @@
    1.99  (ns cortex.body
   1.100    "Assemble a physical creature using the definitions found in a
   1.101     specially prepared blender file. Creates rigid bodies and joints so
   1.102 -   that a creature can have a physical presense in the simulation."
   1.103 +   that a creature can have a physical presence in the simulation."
   1.104    {:author "Robert McIntyre"}
   1.105    (:use (cortex world util sense))
   1.106    (:use clojure.contrib.def)
     2.1 --- a/org/capture-video.org	Sat Feb 18 10:28:14 2012 -0700
     2.2 +++ b/org/capture-video.org	Sat Feb 18 10:59:41 2012 -0700
     2.3 @@ -2,7 +2,7 @@
     2.4  #+author: Robert McIntyre
     2.5  #+email: rlm@mit.edu
     2.6  #+description: Capture video from a JMonkeyEngine3 Application with Xuggle, and use gstreamer to compress the video to upload to YouTube.
     2.7 -#+keywords: JME3, video, Xuggle, JMonkeyEngine, youtube, capture video, Java
     2.8 +#+keywords: JME3, video, Xuggle, JMonkeyEngine, YouTube, capture video, Java
     2.9  #+SETUPFILE: ../../aurellem/org/setup.org
    2.10  #+INCLUDE: ../../aurellem/org/level-0.org
    2.11  
    2.12 @@ -70,7 +70,7 @@
    2.13  # throttles its physics engine and graphics display.  If the
    2.14  # computations involved in running the game are too intense, then the
    2.15  # game will first skip frames, then sacrifice physics accuracy.  If
    2.16 -# there are particuraly demanding computations, then you may only get
    2.17 +# there are particularly demanding computations, then you may only get
    2.18  # 1 fps, and the ball may tunnel through the floor or obstacles due to
    2.19  # inaccurate physics simulation, but after the end of one user-hour,
    2.20  # that ball will have traveled one game-mile.
    2.21 @@ -134,7 +134,7 @@
    2.22  first step the game forward by up to four physics ticks before
    2.23  rendering to the screen. If it still isn't fast enough then it will
    2.24  decrease the accuracy of the physics engine until game-time and user
    2.25 -time are synched or a certain threshold is reached, at which point the
    2.26 +time are synced or a certain threshold is reached, at which point the
    2.27  game visibly slows down. In this case, JME3 continuously repeat a
    2.28  cycle of two physics ticks, and one screen render. For every
    2.29  user-second that passes, one game-second will pass, but the game will
    2.30 @@ -198,7 +198,7 @@
    2.31  where it finalizes the recording (e.g. by writing headers for a video
    2.32  file).
    2.33  
    2.34 -An appropiate interface describing this behaviour could look like
    2.35 +An appropriate interface describing this behavior could look like
    2.36  this:
    2.37  
    2.38  =./src/com/aurellem/capture/video/VideoRecorder.java=
    2.39 @@ -214,7 +214,7 @@
    2.40  =.cleanup()= method, it is only called when the =SceneProcessor= is
    2.41  removed from the =RenderManager=, not when the game is shutting down
    2.42  when the user pressed ESC, for example. To obtain reliable shutdown
    2.43 -behaviour, we also have to implement =AppState=, which provides a
    2.44 +behavior, we also have to implement =AppState=, which provides a
    2.45  =.cleanup()= method that /is/ called on shutdown.
    2.46  
    2.47  Here is an AbstractVideoRecorder class that takes care of the details
    2.48 @@ -311,7 +311,7 @@
    2.49  
    2.50    - Establish an =Isotimer= and set its framerate :: For example, if
    2.51         you want to record video with a framerate of 30 fps, include
    2.52 -       the following line of code somewhere in the initializtion of
    2.53 +       the following line of code somewhere in the initialization of
    2.54         your application: 
    2.55  #+begin_src java :exports code
    2.56  this.setTimer(new IsoTimer(30));
    2.57 @@ -362,7 +362,7 @@
    2.58  : 932K	hello-video/hello-video-moving.flv
    2.59  : 640K	hello-video/hello-video-static.flv
    2.60  
    2.61 -And can be immediately uploaded to youtube
    2.62 +And can be immediately uploaded to YouTube
    2.63  
    2.64  - [[http://www.youtube.com/watch?v=C8gxVAySaPg][hello-video-moving.flv]]
    2.65  #+BEGIN_HTML
    2.66 @@ -471,11 +471,11 @@
    2.67  : 180M	physics-videos/helloPhysics.flv
    2.68  
    2.69  
    2.70 -Thats a terribly large size!
    2.71 +That's a terribly large size!
    2.72  Let's compress it:
    2.73  
    2.74  ** COMMENT Compressing the HelloPhysics Video
    2.75 -First, we'll scale the video, then, we'll decrease it's bitrate. The
    2.76 +First, we'll scale the video, then we'll decrease its bit-rate. The
    2.77  end result will be perfect for upload to YouTube.
    2.78  
    2.79  #+begin_src sh :results silent
     3.1 --- a/org/games.org	Sat Feb 18 10:28:14 2012 -0700
     3.2 +++ b/org/games.org	Sat Feb 18 10:59:41 2012 -0700
     3.3 @@ -17,7 +17,7 @@
     3.4  less direct translation from the java source [[http://jmonkeyengine.org/wiki/doku.php/jme3:beginner:hello_simpleapplication][here]].
     3.5  
     3.6  Of note is the fact that since we don't have access to the
     3.7 -=AssetManager= via extendig =SimpleApplication=, we have to build one
     3.8 +=AssetManager= via extending =SimpleApplication=, we have to build one
     3.9  ourselves. 
    3.10  
    3.11  #+name: hello-simple-app
     4.1 --- a/org/hearing.org	Sat Feb 18 10:28:14 2012 -0700
     4.2 +++ b/org/hearing.org	Sat Feb 18 10:59:41 2012 -0700
     4.3 @@ -91,8 +91,8 @@
     4.4   - Get access to the rendered sound data for further processing from
     4.5     clojure.
     4.6  
     4.7 -I named it the "Multiple Audio Send" Deives, or =Send= Device for
     4.8 -short, since it sends audio data back to the callig application like
     4.9 +I named it the "Multiple Audio Send" Device, or =Send= Device for
    4.10 +short, since it sends audio data back to the calling application like
    4.11  an Aux-Send cable on a mixing board.
    4.12  
    4.13  Onward to the actual Device!
    4.14 @@ -762,8 +762,8 @@
    4.15  
    4.16  ** SoundProcessors are like SceneProcessors
    4.17  
    4.18 -A =SoundProcessor= is analgous to a =SceneProcessor=. Every frame, the
    4.19 -=SoundProcessor= registered with a given =Listener= recieves the
    4.20 +A =SoundProcessor= is analogous to a =SceneProcessor=. Every frame, the
    4.21 +=SoundProcessor= registered with a given =Listener= receives the
    4.22  rendered sound data and can do whatever processing it wants with it.
    4.23  
    4.24  #+include "../../jmeCapture/src/com/aurellem/capture/audio/SoundProcessor.java" src java  
    4.25 @@ -777,9 +777,9 @@
    4.26  ** Hearing Pipeline
    4.27  
    4.28  All sound rendering is done in the CPU, so =hearing-pipeline= is
    4.29 -much less complicated than =vision-pipelie= The bytes available in
    4.30 +much less complicated than =vision-pipeline=. The bytes available in
    4.31  the ByteBuffer obtained from the =send= Device have different meanings
    4.32 -dependant upon the particular hardware or your system.  That is why
    4.33 +dependent upon the particular hardware of your system.  That is why
    4.34  the =AudioFormat= object is necessary to provide the meaning that the
    4.35  raw bytes lack. =byteBuffer->pulse-vector= uses the excellent
    4.36  conversion facilities from [[http://www.tritonus.org/ ][tritonus]] ([[http://tritonus.sourceforge.net/apidoc/org/tritonus/share/sampled/FloatSampleTools.html#byte2floatInterleaved%2528byte%5B%5D,%2520int,%2520float%5B%5D,%2520int,%2520int,%2520javax.sound.sampled.AudioFormat%2529][javadoc]]) to generate a clojure vector of
    4.37 @@ -796,7 +796,7 @@
    4.38    "Creates a SoundProcessor which wraps a sound processing
    4.39    continuation function. The continuation is a function that takes
    4.40    [#^ByteBuffer b #^Integer int numSamples #^AudioFormat af ], each of which
    4.41 -  has already been apprpiately sized."
    4.42 +  has already been appropriately sized."
    4.43    [continuation]
    4.44    (proxy [SoundProcessor] []
    4.45      (cleanup [])
    4.46 @@ -828,7 +828,7 @@
    4.47  position of an "ear" node to the closest physical object in the
    4.48  creature. That =Listener= will stay in the same orientation to the
    4.49  object with which it is bound, just as the camera in the [[http://aurellem.localhost/cortex/html/sense.html#sec-4-1][sense binding
    4.50 -demonstration]].  =OpenAL= simulates the doppler effect for moving
    4.51 +demonstration]].  =OpenAL= simulates the Doppler effect for moving
    4.52  listeners, =update-listener-velocity!= ensures that this velocity
    4.53  information is always up-to-date.
    4.54  
    4.55 @@ -878,7 +878,7 @@
    4.56  #+name: hearing-kernel
    4.57  #+begin_src clojure
    4.58  (defn hearing-kernel
    4.59 -  "Returns a functon which returns auditory sensory data when called
    4.60 +  "Returns a function which returns auditory sensory data when called
    4.61     inside a running simulation."
    4.62    [#^Node creature #^Spatial ear]
    4.63    (let [hearing-data (atom [])
    4.64 @@ -916,7 +916,7 @@
    4.65  
    4.66  ** Visualizing Hearing
    4.67  
    4.68 -This is a simple visualization function which displaye the waveform
    4.69 +This is a simple visualization function which displays the waveform
    4.70  reported by the simulated sense of hearing. It converts the values
    4.71  reported in the vector returned by the hearing function from the range
    4.72  [-1.0, 1.0] to the range [0 255], converts to integer, and displays
    4.73 @@ -1077,9 +1077,9 @@
    4.74  </center>
    4.75  <p>The worm can now hear the sound pulses produced from the
    4.76    hymn. Notice the strikingly different pattern that human speech
    4.77 -  makes compared to the insturments. Once the worm is pushed off the
    4.78 +  makes compared to the instruments. Once the worm is pushed off the
    4.79    floor, the sound it hears is attenuated, and the display of the
    4.80 -  sound it hears beomes fainter. This shows the 3D localization of
    4.81 +  sound it hears becomes fainter. This shows the 3D localization of
    4.82    sound in this world.</p>
    4.83  </div>
    4.84  #+end_html
     5.1 --- a/org/ideas.org	Sat Feb 18 10:28:14 2012 -0700
     5.2 +++ b/org/ideas.org	Sat Feb 18 10:59:41 2012 -0700
     5.3 @@ -9,7 +9,7 @@
     5.4  | Vision                       | Variable Coloration             |
     5.5  | Hearing                      | Speech                          |
     5.6  | Proprioception               | Movement                        |
     5.7 -| Smell/Taste (Chemoreception) | Pheremones                      |
     5.8 +| Smell/Taste (Chemoreception) | Pheromones                      |
     5.9  | Touch                        | Movement / Controllable Texture |
    5.10  | Acceleration                 | Movement                        |
    5.11  | Balance (sense gravity)      | Movement                        |
    5.12 @@ -17,7 +17,7 @@
    5.13  
    5.14  - New Senses/Effectors
    5.15  - Levitation
    5.16 -- Telekenesis
    5.17 +- Telekinesis
    5.18  - control of gravity within a certain radius
    5.19  - speed up/slow time
    5.20  - object creation/destruction
    5.21 @@ -42,8 +42,8 @@
    5.22     "extend all your fingers" or "move your hand into the area with
    5.23     blue light" or "decrease the angle of this joint".  It would be
    5.24     like Sussman's HACKER, except it would operate with much more data
    5.25 -   in a more realistic world.  Start off with "calestanthics" to
    5.26 -   develop subrouitines over the motor control API.  This would be the
    5.27 +   in a more realistic world.  Start off with "calisthenics" to
    5.28 +   develop subroutines over the motor control API.  This would be the
    5.29     "spinal chord" of a more intelligent creature.  The low level
    5.30     programming code might be a turning machine that could develop
    5.31     programs to iterate over a "tape" where each entry in the tape
    5.32 @@ -52,7 +52,7 @@
    5.33     creature interacts using its fingers to press keys on a virtual
    5.34     keyboard.  The creature can access the internet, watch videos, take
    5.35     over the world, anything it wants.
    5.36 - - Make virtual insturments like pianos, drumbs, etc that it learns to
    5.37 + - Make virtual instruments like pianos, drums, etc that it learns to
    5.38     play.
    5.39   - make a joint that figures out what type of joint it is (range of
    5.40     motion)
    5.41 @@ -63,17 +63,17 @@
    5.42  
    5.43  * goals
    5.44  
    5.45 -** have to get done before winston
    5.46 - - [X] write an explination for why greyscale bitmaps for senses is
    5.47 -       appropiate -- 1/2 day
    5.48 +** have to get done before Winston
    5.49 + - [X] write an explanation for why greyscale bitmaps for senses are
    5.50 +       appropriate -- 1/2 day
    5.51   - [X] muscle control -- day
    5.52   - [X] proprioception sensor map in the style of the other senses -- day
    5.53   - [X] refactor integration code to distribute to each of the senses
    5.54         -- day
    5.55   - [X] create video showing all the senses for Winston -- 2 days
    5.56 - - [ ] spellchecking !!!!!
    5.57 + - [ ] spell checking !!!!!
    5.58   - [ ] send package to friends for critiques -- 2 days
    5.59 - - [X] fix videos that were encoded wrong, test on InterNet Explorer.
    5.60 + - [X] fix videos that were encoded wrong, test on Internet Explorer.
    5.61   - [X] redo videos vision with new collapse code
    5.62   - [X] find a topology that looks good. (maybe nil topology?)
    5.63   - [X] fix red part of touch cube in video and image
    5.64 @@ -82,14 +82,14 @@
    5.65   - [ ] additional senses to be implemented for Winston   |  -- 2 days 
    5.66   - [ ] send Winston package                             /
    5.67  
    5.68 -** would be cool to get done before winston
    5.69 +** would be cool to get done before Winston
    5.70   - [X] enable greyscale bitmaps for touch -- 2 hours 
    5.71   - [X] use sawfish to auto-tile sense windows -- 6 hours
    5.72   - [X] sawfish keybinding to automatically delete all sense windows
    5.73   - [ ] proof of concept C sense manipulation  -- 2 days 
    5.74   - [ ] proof of concept GPU sense manipulation -- week
    5.75 - - [ ] fourier view of sound -- 2 or 3 days 
    5.76 - - [ ] dancing music listener -- 1 day, depends on fourier
    5.77 + - [ ] Fourier view of sound -- 2 or 3 days 
    5.78 + - [ ] dancing music listener -- 1 day, depends on Fourier
    5.79   - [ ] uberjar cortex demo
    5.80  
    5.81  ** don't have to get done before winston
    5.82 @@ -105,7 +105,7 @@
    5.83  * jMonkeyEngine ideas
    5.84   - [ ] video showing bullet joints problem
    5.85   - [ ] add mult for Matrix to Ray
    5.86 - - [ ] add iteraterator constructs to Vector3f, Vector2f, Triangle,
    5.87 + - [ ] add iterator constructs to Vector3f, Vector2f, Triangle,
    5.88     Matrix3f, Matrix4f, etc
    5.89  
    5.90  ;;In the elder days of Art,
     6.1 --- a/org/integration.org	Sat Feb 18 10:28:14 2012 -0700
     6.2 +++ b/org/integration.org	Sat Feb 18 10:59:41 2012 -0700
     6.3 @@ -1,4 +1,4 @@
     6.4 -#+title: 
     6.5 +#+title: Integration
     6.6  #+author: Robert McIntyre
     6.7  #+email: rlm@mit.edu
     6.8  #+description: 
     6.9 @@ -103,7 +103,7 @@
    6.10  (def full 9001)
    6.11  
    6.12  
    6.13 -;; Coreography:
    6.14 +;; Choreography:
    6.15  
    6.16  ;; Let the hand fall palm-up
    6.17  
    6.18 @@ -117,7 +117,7 @@
    6.19  ;; hand FORCEFULLY catapults the  block so that it hits the camera.
    6.20  
    6.21  
    6.22 -;; the systax here is [keyframe body-part force]
    6.23 +;; the syntax here is [keyframe body-part force]
    6.24  (def move-fingers
    6.25    [[300 :pinky-3-f 50]
    6.26     [320 :pinky-2-f 80]
    6.27 @@ -867,6 +867,7 @@
    6.28  #+end_src
    6.29    
    6.30  
    6.31 +
    6.32  * COMMENT generate source
    6.33  #+begin_src clojure :tangle ../src/cortex/test/integration.clj
    6.34  <<integration>>
     7.1 --- a/org/intro.org	Sat Feb 18 10:28:14 2012 -0700
     7.2 +++ b/org/intro.org	Sat Feb 18 10:59:41 2012 -0700
     7.3 @@ -2,7 +2,7 @@
     7.4  #+author: Robert McIntyre
     7.5  #+email: rlm@mit.edu
     7.6  #+description: Simulating senses for AI research using JMonkeyEngine3
     7.7 -#+keywords: Alan Turing, AI, sinulated senses, jMonkeyEngine3, virtual world
     7.8 +#+keywords: Alan Turing, AI, simulated senses, jMonkeyEngine3, virtual world
     7.9  #+SETUPFILE: ../../aurellem/org/setup.org
    7.10  #+INCLUDE: ../../aurellem/org/level-0.org
    7.11  #+babel: :mkdirp yes :noweb yes
     8.1 --- a/org/movement.org	Sat Feb 18 10:28:14 2012 -0700
     8.2 +++ b/org/movement.org	Sat Feb 18 10:59:41 2012 -0700
     8.3 @@ -9,7 +9,7 @@
     8.4  
     8.5  * Muscles
     8.6  
     8.7 -Surprisingly enough, terristerial creatures only move by using torque
     8.8 +Surprisingly enough, terrestrial creatures only move by using torque
     8.9  applied about their joints.  There's not a single straight line of
    8.10  force in the human body at all!  (A straight line of force would
    8.11  correspond to some sort of jet or rocket propulsion.)
    8.12 @@ -17,23 +17,23 @@
    8.13  In humans, muscles are composed of muscle fibers which can contract to
    8.14  exert force. The muscle fibers which compose a muscle are partitioned
    8.15  into discrete groups which are each controlled by a single alpha motor
    8.16 -neuton. A single alpha motor neuron might control as little as three
    8.17 +neuron. A single alpha motor neuron might control as few as three
    8.18  or as many as one thousand muscle fibers. When the alpha motor neuron
    8.19  is engaged by the spinal cord, it activates all of the muscle fibers
    8.20  to which it is attached. The spinal cord generally engages the alpha
    8.21  motor neurons which control few muscle fibers before the motor neurons
    8.22 -which control many muscle fibers.  This recruitment stragety allows
    8.23 -for percise movements at low strength. The collection of all motor
    8.24 +which control many muscle fibers.  This recruitment strategy allows
    8.25 +for precise movements at low strength. The collection of all motor
    8.26  neurons that control a muscle is called the motor pool. The brain
    8.27  essentially says "activate 30% of the motor pool" and the spinal cord
    8.28 -recruits motor neurons untill 30% are activated. Since the
    8.29 +recruits motor neurons until 30% are activated. Since the
    8.30  distribution of power among motor neurons is unequal and recruitment
    8.31  goes from weakest to strongest, the first 30% of the motor pool might
    8.32  be 5% of the strength of the muscle.
    8.33  
    8.34 -My simulated muscles follow a similiar design: Each muscle is defined
    8.35 +My simulated muscles follow a similar design: Each muscle is defined
    8.36  by a 1-D array of numbers (the "motor pool"). Each entry in the array
    8.37 -represents a motor neuron which controlls a number of muscle fibers
    8.38 +represents a motor neuron which controls a number of muscle fibers
    8.39  equal to the value of the entry. Each muscle has a scalar strength
    8.40  factor which determines the total force the muscle can exert when all
    8.41  motor neurons are activated.  The effector function for a muscle takes
    8.42 @@ -42,9 +42,9 @@
    8.43  motor-neuron will apply force in proportion to its value in the array.
    8.44  Lower values cause less force.  The lower values can be put at the
    8.45  "beginning" of the 1-D array to simulate the layout of actual human
    8.46 -muscles, which are capable of more percise movements when exerting
    8.47 -less force. Or, the motor pool can simulate more exoitic recruitment
    8.48 -strageties which do not correspond to human muscles.
    8.49 +muscles, which are capable of more precise movements when exerting
    8.50 +less force. Or, the motor pool can simulate more exotic recruitment
    8.51 +strategies which do not correspond to human muscles.
    8.52  
    8.53  This 1D array is defined in an image file for ease of
    8.54  creation/visualization. Here is an example muscle profile image.
    8.55 @@ -140,7 +140,7 @@
    8.56  
    8.57  =movement-kernel= creates a function that will move the nearest
    8.58  physical object to the muscle node. The muscle exerts a rotational
    8.59 -force dependant on it's orientation to the object in the blender
    8.60 +force dependent on its orientation to the object in the blender
    8.61  file. The function returned by =movement-kernel= is also a sense
    8.62  function: it returns the percent of the total muscle strength that is
    8.63  currently being employed. This is analogous to muscle tension in
    8.64 @@ -148,7 +148,7 @@
    8.65  post. 
    8.66  
    8.67  * Visualizing Muscle Tension
    8.68 -Muscle exertion is a percent of a total, so the visulazation is just a
    8.69 +Muscle exertion is a percent of a total, so the visualization is just a
    8.70  simple percent bar.
    8.71  
    8.72  #+name: visualization
    8.73 @@ -240,7 +240,7 @@
    8.74  <p>The worm is now able to move. The bar in the lower right displays
    8.75    the power output of the muscle . Each jump causes 20 more motor neurons to
    8.76    be recruited.  Notice that the power output increases non-linearly
    8.77 -  with motror neuron recruitement, similiar to a human muscle.</p>
    8.78 +  with motor neuron recruitment, similar to a human muscle.</p>
    8.79  </div>
    8.80  #+end_html
    8.81  
     9.1 --- a/org/proprioception.org	Sat Feb 18 10:28:14 2012 -0700
     9.2 +++ b/org/proprioception.org	Sat Feb 18 10:59:41 2012 -0700
     9.3 @@ -25,10 +25,10 @@
     9.4  
     9.5  Proprioception in humans is mediated by [[http://en.wikipedia.org/wiki/Articular_capsule][joint capsules]], [[http://en.wikipedia.org/wiki/Muscle_spindle][muscle
     9.6  spindles]], and the [[http://en.wikipedia.org/wiki/Golgi_tendon_organ][Golgi tendon organs]]. These measure the relative
     9.7 -positions of each pody part by monitoring muscle strain and length.
     9.8 +positions of each body part by monitoring muscle strain and length.
     9.9  
    9.10 -It's clear that this is a vital sense for fulid, graceful
    9.11 -movement. It's also particurally easy to implement in jMonkeyEngine.
    9.12 +It's clear that this is a vital sense for fluid, graceful
    9.13 +movement. It's also particularly easy to implement in jMonkeyEngine.
    9.14  
    9.15  My simulated proprioception calculates the relative angles of each
    9.16  joint from the rest position defined in the blender file. This
    9.17 @@ -83,7 +83,7 @@
    9.18  * Proprioception Kernel
    9.19  
    9.20  Given a joint, =proprioception-kernel= produces a function that
    9.21 -calculates the euler angles between the the objects the joint
    9.22 +calculates the Euler angles between the objects the joint
    9.23  connects. 
    9.24  
    9.25  #+name: proprioception
    9.26 @@ -142,7 +142,7 @@
    9.27  
    9.28  Proprioception has the lowest bandwidth of all the senses so far, and
    9.29  it doesn't lend itself as readily to visual representation like
    9.30 -vision, hearing, or touch. This visualization code creates a "guage"
    9.31 +vision, hearing, or touch. This visualization code creates a "gauge"
    9.32  to view each of the three relative angles along a circle.
    9.33  
    9.34  #+name: visualize
    9.35 @@ -217,7 +217,7 @@
    9.36  
    9.37  (defn test-proprioception
    9.38    "Testing proprioception:
    9.39 -   You should see two foating bars, and a printout of pitch, yaw, and
    9.40 +   You should see two floating bars, and a printout of pitch, yaw, and
    9.41     roll. Pressing key-r/key-t should move the blue bar up and down and
    9.42     change only the value of pitch. key-f/key-g moves it side to side
    9.43     and changes yaw. key-v/key-b will spin the blue segment clockwise
    9.44 @@ -338,7 +338,7 @@
    9.45  #+begin_src clojure
    9.46  (ns cortex.proprioception
    9.47    "Simulate the sense of proprioception (ability to detect the
    9.48 -  relative positions of body parts with repsect to other body parts)
    9.49 +  relative positions of body parts with respect to other body parts)
    9.50    in jMonkeyEngine3. Reads specially prepared blender files to
    9.51    automatically generate proprioceptive senses."
    9.52    (:use (cortex world util sense body))
    10.1 --- a/org/sense.org	Sat Feb 18 10:28:14 2012 -0700
    10.2 +++ b/org/sense.org	Sat Feb 18 10:59:41 2012 -0700
    10.3 @@ -7,9 +7,9 @@
    10.4  #+INCLUDE: ../../aurellem/org/level-0.org
    10.5  
    10.6  * Blender Utilities
    10.7 -In blender, any object can be assigned an arbitray number of key-value
    10.8 -pairs which are called "Custom Properties". These are accessable in
    10.9 -jMonkyeEngine when blender files are imported with the
   10.10 +In blender, any object can be assigned an arbitrary number of key-value
   10.11 +pairs which are called "Custom Properties". These are accessible in
   10.12 +jMonkeyEngine when blender files are imported with the
   10.13  =BlenderLoader=. =meta-data= extracts these properties.
   10.14  
   10.15  #+name: blender-1
   10.16 @@ -55,7 +55,7 @@
   10.17  circular two-dimensional image which is the cross section of the
   10.18  spinal cord.  Points on this image that are close together in this
   10.19  circle represent touch sensors that are /probably/ close together on
   10.20 -the skin, although there is of course some cutting and rerangement
   10.21 +the skin, although there is of course some cutting and rearrangement
   10.22  that has to be done to transfer the complicated surface of the skin
   10.23  onto a two dimensional image.
   10.24  
   10.25 @@ -84,13 +84,13 @@
   10.26  of that senses sensors. To get different types of sensors, you can
   10.27  either use a different color for each type of sensor, or use multiple
   10.28  UV-maps, each labeled with that sensor type. I generally use a white
   10.29 -pixel to mean the presense of a sensor and a black pixel to mean the
   10.30 -absense of a sensor, and use one UV-map for each sensor-type within a
   10.31 +pixel to mean the presence of a sensor and a black pixel to mean the
   10.32 +absence of a sensor, and use one UV-map for each sensor-type within a
   10.33  given sense.  The paths to the images are not stored as the actual
   10.34  UV-map of the blender object but are instead referenced in the
   10.35  meta-data of the node.
   10.36  
   10.37 -#+CAPTION: The UV-map for an enlongated icososphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
   10.38 +#+CAPTION: The UV-map for an elongated icosphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
   10.39  #+ATTR_HTML: width="300"
   10.40  [[../images/finger-UV.png]]
   10.41  
   10.42 @@ -156,11 +156,11 @@
   10.43  
   10.44  Information from the senses is transmitted to the brain via bundles of
   10.45  axons, whether it be the optic nerve or the spinal cord. While these
   10.46 -bundles more or less perserve the overall topology of a sense's
   10.47 -two-dimensional surface, they do not perserve the percise euclidean
   10.48 +bundles more or less preserve the overall topology of a sense's
   10.49 +two-dimensional surface, they do not preserve the precise Euclidean
   10.50  distances between every sensor. =collapse= is here to smoosh the
   10.51 -sensors described by a UV-map into a contigous region that still
   10.52 -perserves the topology of the original sense.
   10.53 +sensors described by a UV-map into a contiguous region that still
   10.54 +preserves the topology of the original sense.
   10.55  
   10.56  #+name: topology-2
   10.57  #+begin_src clojure
   10.58 @@ -180,7 +180,7 @@
   10.59  
   10.60  (defn collapse
   10.61    "Take a sequence of pairs of integers and collapse them into a
   10.62 -   contigous bitmap with no \"holes\" or negative entries, as close to
   10.63 +   contiguous bitmap with no \"holes\" or negative entries, as close to
   10.64     the origin [0 0] as the shape permits. The order of the points is
   10.65     preserved.
   10.66  
   10.67 @@ -224,11 +224,11 @@
   10.68                      [(- x min-x)
   10.69                       (- y min-y)])
   10.70                    squeezed))
   10.71 -           point-correspondance
   10.72 +           point-correspondence
   10.73             (zipmap  (sort points) (sort relocated))
   10.74  
   10.75             original-order
   10.76 -           (vec (map point-correspondance points))]
   10.77 +           (vec (map point-correspondence points))]
   10.78          original-order)))
   10.79  #+end_src
   10.80  * Viewing Sense Data
   10.81 @@ -245,7 +245,7 @@
   10.82  (in-ns 'cortex.sense)
   10.83  
   10.84  (defn view-image
   10.85 -  "Initailizes a JPanel on which you may draw a BufferedImage.
   10.86 +  "Initializes a JPanel on which you may draw a BufferedImage.
   10.87     Returns a function that accepts a BufferedImage and draws it to the
   10.88     JPanel. If given a directory it will save the images as png files
   10.89     starting at 0000000.png and incrementing from there."
   10.90 @@ -309,7 +309,7 @@
   10.91           
   10.92  
   10.93  (defn points->image
   10.94 -  "Take a collection of points and visuliaze it as a BufferedImage."
   10.95 +  "Take a collection of points and visualize it as a BufferedImage."
   10.96    [points]
   10.97    (if (empty? points)
   10.98      (BufferedImage. 1 1 BufferedImage/TYPE_BYTE_BINARY)
   10.99 @@ -348,7 +348,7 @@
  10.100   - Create a single top-level empty node whose name is the name of the sense
  10.101   - Add empty nodes which each contain meta-data relevant
  10.102     to the sense, including a UV-map describing the number/distribution
  10.103 -   of sensors if applicipable.
  10.104 +   of sensors if applicable.
  10.105   - Make each empty-node the child of the top-level
  10.106     node. =sense-nodes= below generates functions to find these children.
  10.107  
  10.108 @@ -538,7 +538,7 @@
  10.109  (ns cortex.sense
  10.110    "Here are functions useful in the construction of two or more
  10.111     sensors/effectors."
  10.112 -  {:author "Robert McInytre"}
  10.113 +  {:author "Robert McIntyre"}
  10.114    (:use (cortex world util))
  10.115    (:import ij.process.ImageProcessor)
  10.116    (:import jme3tools.converters.ImageToAwt)
    11.1 --- a/org/test.org	Sat Feb 18 10:28:14 2012 -0700
    11.2 +++ b/org/test.org	Sat Feb 18 10:59:41 2012 -0700
    11.3 @@ -1,7 +1,7 @@
    11.4  #+title: Test Suite
    11.5  #+author: Robert McIntyre
    11.6  #+email: rlm@mit.edu
    11.7 -#+description: Simulating a body (movement, touch, propioception) in jMonkeyEngine3.
    11.8 +#+description: Simulating a body (movement, touch, proprioception) in jMonkeyEngine3.
    11.9  #+SETUPFILE: ../../aurellem/org/setup.org
   11.10  #+INCLUDE: ../../aurellem/org/level-0.org
   11.11  
   11.12 @@ -17,7 +17,7 @@
   11.13  	   com.jme3.system.AppSettings))
   11.14  
   11.15  (defn run-world
   11.16 -  "run the simulation and wait until it closes proprely."
   11.17 +  "run the simulation and wait until it closes properly."
   11.18    [world]
   11.19    (let [lock (promise)]
   11.20      (.enqueue
    12.1 --- a/org/touch.org	Sat Feb 18 10:28:14 2012 -0700
    12.2 +++ b/org/touch.org	Sat Feb 18 10:59:41 2012 -0700
    12.3 @@ -11,7 +11,7 @@
    12.4  Touch is critical to navigation and spatial reasoning and as such I
    12.5  need a simulated version of it to give to my AI creatures.
    12.6  
    12.7 -Human skin has a wide array of touch sensors, each of which speciliaze
    12.8 +Human skin has a wide array of touch sensors, each of which specializes
    12.9  in detecting different vibrational modes and pressures. These sensors
   12.10  can integrate a vast expanse of skin (i.e. your entire palm), or a
   12.11  tiny patch of skin at the tip of your finger. The hairs of the skin
   12.12 @@ -32,7 +32,7 @@
   12.13  sense of touch which is a hybrid between a human's sense of
   12.14  deformation and sense from hairs.
   12.15  
   12.16 -Implementing touch in jMonkeyEngine follows a different techinal route
   12.17 +Implementing touch in jMonkeyEngine follows a different technical route
   12.18  than vision and hearing. Those two senses piggybacked off
   12.19  jMonkeyEngine's 3D audio and video rendering subsystems. To simulate
   12.20  touch, I use jMonkeyEngine's physics system to execute many small
   12.21 @@ -77,7 +77,7 @@
   12.22    
   12.23  To simulate touch there are three conceptual steps. For each solid
   12.24  object in the creature, you first have to get UV image and scale
   12.25 -paramater which define the position and length of the feelers. Then,
   12.26 +parameter which define the position and length of the feelers. Then,
   12.27  you use the triangles which compose the mesh and the UV data stored in
   12.28  the mesh to determine the world-space position and orientation of each
   12.29  feeler. Then once every frame, update these positions and orientations
   12.30 @@ -100,7 +100,7 @@
   12.31      normalized coordinates of the tips of the feelers. =feeler-tips=.
   12.32  
   12.33  * Triangle Math
   12.34 -** Schrapnel Conversion Functions
   12.35 +** Shrapnel Conversion Functions
   12.36  
   12.37  #+name: triangles-1
   12.38  #+begin_src clojure
   12.39 @@ -121,11 +121,11 @@
   12.40    (apply #(Triangle. %1 %2 %3) (map ->vector3f points)))
   12.41  #+end_src
   12.42  
   12.43 -It is convienent to treat a =Triangle= as a vector of vectors, and a
   12.44 +It is convenient to treat a =Triangle= as a vector of vectors, and a
   12.45  =Vector2f= or =Vector3f= as vectors of floats. (->vector3f) and
   12.46  (->triangle) undo the operations of =vector3f-seq= and
   12.47  =triangle-seq=. If these classes implemented =Iterable= then =seq=
   12.48 -would work on them automitacally.
   12.49 +would work on them automatically.
   12.50  
   12.51  ** Decomposing a 3D shape into Triangles
   12.52  
   12.53 @@ -134,7 +134,7 @@
   12.54  data involved with displaying the object.
   12.55  
   12.56  A =Mesh= is composed of =Triangles=, and each =Triangle= has three
   12.57 -verticies which have coordinates in world space and UV space.
   12.58 +vertices which have coordinates in world space and UV space.
   12.59   
   12.60  Here, =triangles= gets all the world-space triangles which compose a
   12.61  mesh, while =pixel-triangles= gets those same triangles expressed in
   12.62 @@ -216,7 +216,7 @@
   12.63  1 & 1 & 1 & 1
   12.64  \end{bmatrix}
   12.65  
   12.66 -Here, the first three columns of the matrix are the verticies of the
   12.67 +Here, the first three columns of the matrix are the vertices of the
   12.68  triangle. The last column is the right-handed unit normal of the
   12.69  triangle.
   12.70  
   12.71 @@ -225,7 +225,7 @@
   12.72  
   12.73  $T_{2}T_{1}^{-1}$
   12.74  
   12.75 -The clojure code below recaptiulates the formulas above, using
   12.76 +The clojure code below recapitulates the formulas above, using
   12.77  jMonkeyEngine's =Matrix4f= objects, which can describe any affine
   12.78  transformation.
   12.79  
   12.80 @@ -380,7 +380,7 @@
   12.81  
   12.82  (defn set-ray [#^Ray ray #^Matrix4f transform
   12.83                 #^Vector3f origin #^Vector3f tip]
   12.84 -  ;; Doing everything locally recduces garbage collection by enough to
   12.85 +  ;; Doing everything locally reduces garbage collection by enough to
   12.86    ;; be worth it.
   12.87    (.mult transform origin (.getOrigin ray))
   12.88    (.mult transform tip (.getDirection ray))
   12.89 @@ -443,7 +443,7 @@
   12.90  
   12.91  (defn touch! 
   12.92    "Endow the creature with the sense of touch. Returns a sequence of
   12.93 -   functions, one for each body part with a tactile-sensor-proile,
   12.94 +   functions, one for each body part with a tactile-sensor-profile,
   12.95     each of which when called returns sensory data for that body part."
   12.96    [#^Node creature]
   12.97    (filter
   12.98 @@ -459,7 +459,7 @@
   12.99  * Visualizing Touch
  12.100  
  12.101  Each feeler is represented in the image as a single pixel. The
  12.102 -grayscale value of each pixel represents how deep the feeler
  12.103 +greyscale value of each pixel represents how deep the feeler
  12.104  represented by that pixel is inside another object.  Black means that
  12.105  nothing is touching the feeler, while white means that the feeler is
  12.106  completely inside another object, which is presumably flush with the
  12.107 @@ -470,7 +470,7 @@
  12.108  (in-ns 'cortex.touch)
  12.109  
  12.110  (defn touch->gray
  12.111 -  "Convert a pair of [distance, max-distance] into a grayscale pixel." 
  12.112 +  "Convert a pair of [distance, max-distance] into a gray-scale pixel." 
  12.113    [distance max-distance]
  12.114    (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256))))
  12.115  
  12.116 @@ -736,7 +736,7 @@
  12.117  
  12.118  * Next
  12.119  So far I've implemented simulated Vision, Hearing, and Touch, the most
  12.120 -obvious and promiment senses that humans have.  Smell and Taste shall
  12.121 +obvious and prominent senses that humans have.  Smell and Taste shall
  12.122  remain unimplemented for now. This accounts for the "five senses" that
  12.123  feature so prominently in our lives. But humans have far more than the
  12.124  five main senses. There are internal chemical senses, pain (which is
    13.1 --- a/org/util.org	Sat Feb 18 10:28:14 2012 -0700
    13.2 +++ b/org/util.org	Sat Feb 18 10:59:41 2012 -0700
    13.3 @@ -28,7 +28,7 @@
    13.4  (defn jme-class? [classname]
    13.5    (and
    13.6     (.startsWith classname "com.jme3.")
    13.7 -   ;; Don't import the Lwjgl stuff since it can throw exceptions
    13.8 +   ;; Don't import the LWJGL stuff since it can throw exceptions
    13.9     ;;  upon being loaded.
   13.10     (not (re-matches #".*Lwjgl.*" classname))))
   13.11  
   13.12 @@ -52,10 +52,10 @@
   13.13  happy with the general structure of a namespace I can deal with
   13.14  importing only the classes it actually needs.
   13.15  
   13.16 -The =mega-import-jme3= is quite usefull for debugging purposes since
   13.17 +The =mega-import-jme3= is quite useful for debugging purposes since
   13.18  it allows completion for almost all of JME's classes from the REPL.
   13.19  
   13.20 -Out of curiousity, let's see just how many classes =mega-import-jme3=
   13.21 +Out of curiosity, let's see just how many classes =mega-import-jme3=
   13.22  imports:
   13.23  
   13.24  #+begin_src clojure :exports both :results output
   13.25 @@ -226,7 +226,7 @@
   13.26  (in-ns 'cortex.util)
   13.27  
   13.28  (defn load-bullet 
   13.29 -  "Runnig this function unpacks the native bullet libraries and makes
   13.30 +  "Running this function unpacks the native bullet libraries and makes
   13.31     them available."
   13.32    []
   13.33    (let [sim (world (Node.) {} no-op no-op)]
   13.34 @@ -316,8 +316,8 @@
   13.35    ([] (sphere 0.5)))
   13.36  
   13.37  (defn x-ray 
   13.38 -   "A usefull material for debuging -- it can be seen no matter what
   13.39 -    object occuldes it."
   13.40 +   "A useful material for debugging -- it can be seen no matter what
   13.41 +    object occludes it."
   13.42     [#^ColorRGBA color]
   13.43     (doto (Material. (asset-manager)
   13.44                      "Common/MatDefs/Misc/Unshaded.j3md")
   13.45 @@ -359,7 +359,7 @@
   13.46  (in-ns 'cortex.util)
   13.47  
   13.48  (defn basic-light-setup
   13.49 -  "returns a sequence of lights appropiate for fully lighting a scene"
   13.50 +  "returns a sequence of lights appropriate for fully lighting a scene"
   13.51    []
   13.52    (conj
   13.53     (doall
   13.54 @@ -379,7 +379,7 @@
   13.55       (.setColor ColorRGBA/White))))
   13.56  
   13.57  (defn light-up-everything
   13.58 -  "Add lights to a world appropiate for quickly seeing everything
   13.59 +  "Add lights to a world appropriate for quickly seeing everything
   13.60    in the scene.  Adds six DirectionalLights facing in orthogonal
   13.61    directions, and one AmbientLight to provide overall lighting
   13.62    coverage."
   13.63 @@ -433,7 +433,7 @@
   13.64     forces
   13.65     [root-node
   13.66      keymap
   13.67 -    intilization
   13.68 +    initialization
   13.69      world-loop]]
   13.70    (let [add-keypress
   13.71          (fn [state keymap key]
   13.72 @@ -472,14 +472,14 @@
   13.73                         (splice-loop))]
   13.74      [root-node
   13.75       keymap*
   13.76 -     intilization
   13.77 +     initialization
   13.78       world-loop*]))
   13.79  
   13.80  (import com.jme3.font.BitmapText)
   13.81  (import com.jme3.scene.control.AbstractControl)
   13.82  (import com.aurellem.capture.IsoTimer)
   13.83  
   13.84 -(defn display-dialated-time
   13.85 +(defn display-dilated-time
   13.86    "Shows the time as it is flowing in the simulation on a HUD display.
   13.87     Useful for making videos."
   13.88    [world timer]
    14.1 --- a/org/vision.org	Sat Feb 18 10:28:14 2012 -0700
    14.2 +++ b/org/vision.org	Sat Feb 18 10:59:41 2012 -0700
    14.3 @@ -11,11 +11,11 @@
    14.4   
    14.5  Vision is one of the most important senses for humans, so I need to
    14.6  build a simulated sense of vision for my AI. I will do this with
    14.7 -simulated eyes. Each eye can be independely moved and should see its
    14.8 +simulated eyes. Each eye can be independently moved and should see its
    14.9  own version of the world depending on where it is.
   14.10  
   14.11 -Making these simulated eyes a reality is simple bacause jMonkeyEngine
   14.12 -already conatains extensive support for multiple views of the same 3D
   14.13 +Making these simulated eyes a reality is simple because jMonkeyEngine
   14.14 +already contains extensive support for multiple views of the same 3D
   14.15  simulated world. The reason jMonkeyEngine has this support is because
   14.16  the support is necessary to create games with split-screen
   14.17  views. Multiple views are also used to create efficient
   14.18 @@ -26,7 +26,7 @@
   14.19  [[../images/goldeneye-4-player.png]]
   14.20  
   14.21  ** =ViewPorts=, =SceneProcessors=, and the =RenderManager=. 
   14.22 -# =Viewports= are cameras; =RenderManger= takes snapshots each frame. 
   14.23 +# =ViewPorts= are cameras; =RenderManager= takes snapshots each frame. 
   14.24  #* A Brief Description of jMonkeyEngine's Rendering Pipeline
   14.25  
   14.26  jMonkeyEngine allows you to create a =ViewPort=, which represents a
   14.27 @@ -35,13 +35,13 @@
   14.28  =ViewPort=, rendering the scene in the GPU. For each =ViewPort= there
   14.29  is a =FrameBuffer= which represents the rendered image in the GPU.
   14.30  
   14.31 -#+caption: =ViewPorts= are cameras in the world. During each frame, the =Rendermanager= records a snapshot of what each view is currently seeing; these snapshots are =Framebuffer= objects.
   14.32 +#+caption: =ViewPorts= are cameras in the world. During each frame, the =RenderManager= records a snapshot of what each view is currently seeing; these snapshots are =FrameBuffer= objects.
   14.33  #+ATTR_HTML: width="400"
   14.34  [[../images/diagram_rendermanager2.png]]
   14.35  
   14.36  Each =ViewPort= can have any number of attached =SceneProcessor=
   14.37  objects, which are called every time a new frame is rendered. A
   14.38 -=SceneProcessor= recieves its =ViewPort's= =FrameBuffer= and can do
   14.39 +=SceneProcessor= receives its =ViewPort's= =FrameBuffer= and can do
   14.40  whatever it wants to the data.  Often this consists of invoking GPU
   14.41  specific operations on the rendered image.  The =SceneProcessor= can
   14.42  also copy the GPU image data to RAM and process it with the CPU.
   14.43 @@ -51,11 +51,11 @@
   14.44  
   14.45  Each eye in the simulated creature needs its own =ViewPort= so that
   14.46  it can see the world from its own perspective. To this =ViewPort=, I
   14.47 -add a =SceneProcessor= that feeds the visual data to any arbitray
   14.48 +add a =SceneProcessor= that feeds the visual data to any arbitrary
   14.49  continuation function for further processing.  That continuation
   14.50  function may perform both CPU and GPU operations on the data. To make
   14.51  this easy for the continuation function, the =SceneProcessor=
   14.52 -maintains appropriatly sized buffers in RAM to hold the data.  It does
   14.53 +maintains appropriately sized buffers in RAM to hold the data.  It does
   14.54  not do any copying from the GPU to the CPU itself because it is a slow
   14.55  operation.
   14.56  
   14.57 @@ -65,7 +65,7 @@
   14.58    "Create a SceneProcessor object which wraps a vision processing
   14.59    continuation function. The continuation is a function that takes 
   14.60    [#^Renderer r #^FrameBuffer fb #^ByteBuffer b #^BufferedImage bi],
   14.61 -  each of which has already been appropiately sized."
   14.62 +  each of which has already been appropriately sized."
   14.63    [continuation]
   14.64    (let [byte-buffer (atom nil)
   14.65  	renderer (atom nil)
   14.66 @@ -99,7 +99,7 @@
   14.67  =FrameBuffer= references the GPU image data, but the pixel data can
   14.68  not be used directly on the CPU.  The =ByteBuffer= and =BufferedImage=
   14.69  are initially "empty" but are sized to hold the data in the
   14.70 -=FrameBuffer=. I call transfering the GPU image data to the CPU
   14.71 +=FrameBuffer=. I call transferring the GPU image data to the CPU
   14.72  structures "mixing" the image data. I have provided three functions to
   14.73  do this mixing.
   14.74  
   14.75 @@ -129,7 +129,7 @@
   14.76  entirely in terms of =BufferedImage= inputs. Just compose that
   14.77  =BufferedImage= algorithm with =BufferedImage!=. However, a vision
   14.78  processing algorithm that is entirely hosted on the GPU does not have
   14.79 -to pay for this convienence.
   14.80 +to pay for this convenience.
   14.81  
   14.82  * Optical sensor arrays are described with images and referenced with metadata
   14.83  The vision pipeline described above handles the flow of rendered
   14.84 @@ -139,7 +139,7 @@
   14.85  An eye is described in blender in the same way as a joint. They are
   14.86  zero dimensional empty objects with no geometry whose local coordinate
   14.87  system determines the orientation of the resulting eye. All eyes are
   14.88 -childern of a parent node named "eyes" just as all joints have a
   14.89 +children of a parent node named "eyes" just as all joints have a
   14.90  parent named "joints". An eye binds to the nearest physical object
   14.91  with =bind-sense=.
   14.92  
   14.93 @@ -184,8 +184,8 @@
   14.94  
   14.95  I want to be able to model any retinal configuration, so my eye-nodes
   14.96  in blender contain metadata pointing to images that describe the
   14.97 -percise position of the individual sensors using white pixels. The
   14.98 -meta-data also describes the percise sensitivity to light that the
   14.99 +precise position of the individual sensors using white pixels. The
  14.100 +meta-data also describes the precise sensitivity to light that the
  14.101  sensors described in the image have.  An eye can contain any number of
  14.102  these images. For example, the metadata for an eye might look like
  14.103  this:
  14.104 @@ -215,11 +215,11 @@
  14.105  relative sensitivity to the channels red, green, and blue. These
  14.106  sensitivity values are packed into an integer in the order =|_|R|G|B|=
  14.107  in 8-bit fields. The RGB values of a pixel in the image are added
  14.108 -together with these sensitivities as linear weights. Therfore,
  14.109 +together with these sensitivities as linear weights. Therefore,
  14.110  0xFF0000 means sensitive to red only while 0xFFFFFF means sensitive to
  14.111  all colors equally (gray).
  14.112  
  14.113 -For convienence I've defined a few symbols for the more common
  14.114 +For convenience I've defined a few symbols for the more common
  14.115  sensitivity values.
  14.116  
  14.117  #+name: sensitivity
  14.118 @@ -383,10 +383,10 @@
  14.119  only once the first time any of the functions from the list returned
  14.120  by =vision-kernel= is called.  Each of the functions returned by
  14.121  =vision-kernel= also allows access to the =Viewport= through which
  14.122 -it recieves images.
  14.123 +it receives images.
  14.124  
  14.125 -The in-game display can be disrupted by all the viewports that the
  14.126 -functions greated by =vision-kernel= add. This doesn't affect the
  14.127 +The in-game display can be disrupted by all the ViewPorts that the
  14.128 +functions generated by =vision-kernel= add. This doesn't affect the
  14.129  simulation or the simulated senses, but can be annoying.
  14.130  =gen-fix-display= restores the in-simulation display.
  14.131  
  14.132 @@ -572,7 +572,7 @@
  14.133                  (light-up-everything world)
  14.134                  (speed-up world)
  14.135                  (.setTimer world timer)
  14.136 -                (display-dialated-time world timer)
  14.137 +                (display-dilated-time world timer)
  14.138                  ;; add a view from the worm's perspective
  14.139                  (if record?
  14.140                    (Capture/captureVideo
  14.141 @@ -685,7 +685,7 @@
  14.142  (ns cortex.vision
  14.143    "Simulate the sense of vision in jMonkeyEngine3. Enables multiple
  14.144    eyes from different positions to observe the same world, and pass
  14.145 -  the observed data to any arbitray function. Automatically reads
  14.146 +  the observed data to any arbitrary function. Automatically reads
  14.147    eye-nodes from specially prepared blender files and instantiates
  14.148    them in the world as actual eyes."
  14.149    {:author "Robert McIntyre"}
    15.1 --- a/org/world.org	Sat Feb 18 10:28:14 2012 -0700
    15.2 +++ b/org/world.org	Sat Feb 18 10:59:41 2012 -0700
    15.3 @@ -49,7 +49,7 @@
    15.4  #+name: header
    15.5  #+begin_src clojure :results silent    
    15.6  (ns cortex.world
    15.7 -  "World Creation, abstracion over jme3's input system, and REPL
    15.8 +  "World Creation, abstraction over jme3's input system, and REPL
    15.9    driven exception handling"
   15.10    {:author "Robert McIntyre"}
   15.11    
   15.12 @@ -85,7 +85,7 @@
   15.13    (doto (AppSettings. true)
   15.14      (.setFullscreen false)
   15.15      (.setTitle "Aurellem.")
   15.16 -    ;; The "Send" AudioRenderer supports sumulated hearing.
   15.17 +    ;; The "Send" AudioRenderer supports simulated hearing.
   15.18      (.setAudioRenderer "Send"))
   15.19    "These settings control how the game is displayed on the screen for
   15.20     debugging purposes.  Use binding forms to change this if desired.
    16.1 --- a/org/youtube.org	Sat Feb 18 10:28:14 2012 -0700
    16.2 +++ b/org/youtube.org	Sat Feb 18 10:59:41 2012 -0700
    16.3 @@ -1,21 +1,26 @@
    16.4 -| local-file              | youtube-url |
    16.5 -|-------------------------+-------------|
    16.6 -| basic-touch.ogg         |             |
    16.7 -| hand.ogg                |             |
    16.8 -| worm-hearing.ogg        |             |
    16.9 -| bind-sense.ogg          |             |
   16.10 -| java-hearing-test.ogg   |             |
   16.11 -| worm-muscles.ogg        |             |
   16.12 -| crumbly-hand.ogg        |             |
   16.13 -| spinning-cube.ogg       |             |
   16.14 -| worm-touch.ogg          |             |
   16.15 -| cube.ogg                |             |
   16.16 -| test-proprioception.ogg |             |
   16.17 -| worm-vision.ogg         |             |
   16.18 -| full-hand.ogg           |             |
   16.19 -| touch-cube.ogg          |             |
   16.20 -| ghost-hand.ogg          |             |
   16.21 -| worm-1.ogg              |             |
   16.22 -|-------------------------+-------------|
   16.23 +| local-file              | youtube-url                 |
   16.24 +|-------------------------+-----------------------------|
   16.25 +| basic-touch.ogg         | http://youtu.be/8xNEtD-a8f0 |
   16.26 +| hand.ogg                | http://youtu.be/DvoN2wWQ_6o |
   16.27 +| worm-hearing.ogg        |                             |
   16.28 +| bind-sense.ogg          |                             |
   16.29 +| java-hearing-test.ogg   |                             |
   16.30 +| worm-muscles.ogg        |                             |
   16.31 +| crumbly-hand.ogg        |                             |
   16.32 +| spinning-cube.ogg       |                             |
   16.33 +| worm-touch.ogg          |                             |
   16.34 +| cube.ogg                |                             |
   16.35 +| test-proprioception.ogg |                             |
   16.36 +| worm-vision.ogg         |                             |
   16.37 +| full-hand.ogg           |                             |
   16.38 +| touch-cube.ogg          |                             |
   16.39 +| ghost-hand.ogg          |                             |
   16.40 +| worm-1.ogg              |                             |
   16.41 +|-------------------------+-----------------------------|
   16.42  
   16.43  
   16.44 +
   16.45 +
   16.46 +
   16.47 +Simulated senses in jMonkeyEngine3.
   16.48 +See http://aurellem.org