# HG changeset patch # User Robert McIntyre # Date 1329587981 25200 # Node ID 7e7f8d6d9ec50a732356415660517402b02675b5 # Parent 19c43ec6958d9d3cb6d9dae1d8179fa025b11ad9 massive spellchecking diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/body.org --- a/org/body.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/body.org Sat Feb 18 10:59:41 2012 -0700 @@ -1,7 +1,7 @@ #+title: Building a Body #+author: Robert McIntyre #+email: rlm@mit.edu -#+description: Simulating a body (movement, touch, propioception) in jMonkeyEngine3. +#+description: Simulating a body (movement, touch, proprioception) in jMonkeyEngine3. #+SETUPFILE: ../../aurellem/org/setup.org #+INCLUDE: ../../aurellem/org/level-0.org @@ -22,7 +22,7 @@ to model humans and create realistic animations. It is hard to use for my purposes because it is difficult to update the creature's Physics Collision Mesh in tandem with its Geometric Mesh under the influence -of the armature. Withouth this the creature will not be able to grab +of the armature. Without this the creature will not be able to grab things in its environment, and it won't be able to tell where its physical body is by using its eyes. Also, armatures do not specify any rotational limits for a joint, making it hard to model elbows, @@ -43,12 +43,12 @@ presence of its body. Each individual section is simulated by a separate rigid body that corresponds exactly with its visual representation and does not change. Sections are connected by -invisible joints that are well supported in jMonkyeEngine. Bullet, the +invisible joints that are well supported in jMonkeyEngine. Bullet, the physics backend for jMonkeyEngine, can efficiently simulate hundreds of rigid bodies connected by joints. Sections do not have to stay as one piece forever; they can be dynamically replaced with multiple sections to simulate splitting in two. This could be used to simulate -retractable claws or EVE's hands, which are able to coalece into one +retractable claws or EVE's hands, which are able to coalesce into one object in the movie. * Solidifying the Body @@ -107,7 +107,7 @@

The hand model directly loaded from blender. It has no physical
- presense in the simulation.
+ presence in the simulation.

#+end_html @@ -218,7 +218,7 @@ [[../images/hand-screenshot1.png]] The empty node in the upper right, highlighted in yellow, is the -parent node of all the emptys which represent joints. The following +parent node of all the empties which represent joints. The following functions must do three things to translate these into real joints: - Find the children of the "joints" node. @@ -241,7 +241,7 @@ ** Joint Targets and Orientation -This technique for finding a joint's targets is very similiar to +This technique for finding a joint's targets is very similar to =cortex.sense/closest-node=. A small cube, centered around the empty-node, grows exponentially until it intersects two /physical/ objects. The objects are ordered according to the joint's rotation, @@ -283,7 +283,7 @@ This section of code iterates through all the different ways of specifying joints using blender meta-data and converts each one to the -appropriate jMonkyeEngine joint. +appropriate jMonkeyEngine joint. #+name: joints-4 #+begin_src clojure @@ -307,7 +307,7 @@ pivot-b)) ;; but instead we must do this: - (println-repl "substuting 6DOF joint for POINT2POINT joint!") + (println-repl "substituting 6DOF joint for POINT2POINT joint!") (doto (SixDofJoint. control-a @@ -475,14 +475,14 @@ * Wrap-Up! -It is convienent to combine =physical!= and =joints!= into one +It is convenient to combine =physical!= and =joints!= into one function that completely creates the creature's physical body. #+name: joints-6 #+begin_src clojure (defn body! "Endow the creature with a physical body connected with joints. The - particulars of the joints and the masses of each pody part are + particulars of the joints and the masses of each body part are determined in blender." [#^Node creature] (physical! creature) @@ -544,7 +544,7 @@ (ns cortex.body "Assemble a physical creature using the definitions found in a specially prepared blender file. Creates rigid bodies and joints so - that a creature can have a physical presense in the simulation." + that a creature can have a physical presence in the simulation." {:author "Robert McIntyre"} (:use (cortex world util sense)) (:use clojure.contrib.def) diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/capture-video.org --- a/org/capture-video.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/capture-video.org Sat Feb 18 10:59:41 2012 -0700 @@ -2,7 +2,7 @@ #+author: Robert McIntyre #+email: rlm@mit.edu #+description: Capture video from a JMonkeyEngine3 Application with Xuggle, and use gstreamer to compress the video to upload to YouTube. -#+keywords: JME3, video, Xuggle, JMonkeyEngine, youtube, capture video, Java +#+keywords: JME3, video, Xuggle, JMonkeyEngine, YouTube, capture video, Java #+SETUPFILE: ../../aurellem/org/setup.org #+INCLUDE: ../../aurellem/org/level-0.org @@ -70,7 +70,7 @@ # throttles its physics engine and graphics display. If the # computations involved in running the game are too intense, then the # game will first skip frames, then sacrifice physics accuracy. If -# there are particuraly demanding computations, then you may only get +# there are particularly demanding computations, then you may only get # 1 fps, and the ball may tunnel through the floor or obstacles due to # inaccurate physics simulation, but after the end of one user-hour, # that ball will have traveled one game-mile. @@ -134,7 +134,7 @@ first step the game forward by up to four physics ticks before rendering to the screen. 
If it still isn't fast enough then it will decrease the accuracy of the physics engine until game-time and user -time are synched or a certain threshold is reached, at which point the +time are synced or a certain threshold is reached, at which point the game visibly slows down. In this case, JME3 continuously repeat a cycle of two physics ticks, and one screen render. For every user-second that passes, one game-second will pass, but the game will @@ -198,7 +198,7 @@ where it finalizes the recording (e.g. by writing headers for a video file). -An appropiate interface describing this behaviour could look like +An appropriate interface describing this behavior could look like this: =./src/com/aurellem/capture/video/VideoRecorder.java= @@ -214,7 +214,7 @@ =.cleanup()= method, it is only called when the =SceneProcessor= is removed from the =RenderManager=, not when the game is shutting down when the user pressed ESC, for example. To obtain reliable shutdown -behaviour, we also have to implement =AppState=, which provides a +behavior, we also have to implement =AppState=, which provides a =.cleanup()= method that /is/ called on shutdown. Here is an AbstractVideoRecorder class that takes care of the details @@ -311,7 +311,7 @@ - Establish an =Isotimer= and set its framerate :: For example, if you want to record video with a framerate of 30 fps, include - the following line of code somewhere in the initializtion of + the following line of code somewhere in the initialization of your application: #+begin_src java :exports code this.setTimer(new IsoTimer(30)); @@ -362,7 +362,7 @@ : 932K hello-video/hello-video-moving.flv : 640K hello-video/hello-video-static.flv -And can be immediately uploaded to youtube +And can be immediately uploaded to YouTube - [[http://www.youtube.com/watch?v=C8gxVAySaPg][hello-video-moving.flv]] #+BEGIN_HTML @@ -471,11 +471,11 @@ : 180M physics-videos/helloPhysics.flv -Thats a terribly large size! +That's a terribly large size! Let's compress it: ** COMMENT Compressing the HelloPhysics Video -First, we'll scale the video, then, we'll decrease it's bitrate. The +First, we'll scale the video, then, we'll decrease it's bit-rate. The end result will be perfect for upload to YouTube. #+begin_src sh :results silent diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/games.org --- a/org/games.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/games.org Sat Feb 18 10:59:41 2012 -0700 @@ -17,7 +17,7 @@ less direct translation from the java source [[http://jmonkeyengine.org/wiki/doku.php/jme3:beginner:hello_simpleapplication][here]]. Of note is the fact that since we don't have access to the -=AssetManager= via extendig =SimpleApplication=, we have to build one +=AssetManager= via extending =SimpleApplication=, we have to build one ourselves. #+name: hello-simple-app diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/hearing.org --- a/org/hearing.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/hearing.org Sat Feb 18 10:59:41 2012 -0700 @@ -91,8 +91,8 @@ - Get access to the rendered sound data for further processing from clojure. -I named it the "Multiple Audio Send" Deives, or =Send= Device for -short, since it sends audio data back to the callig application like +I named it the "Multiple Audio Send" Device, or =Send= Device for +short, since it sends audio data back to the calling application like an Aux-Send cable on a mixing board. Onward to the actual Device! @@ -762,8 +762,8 @@ ** SoundProcessors are like SceneProcessors -A =SoundProcessor= is analgous to a =SceneProcessor=. 
Every frame, the -=SoundProcessor= registered with a given =Listener= recieves the +A =SoundProcessor= is analogous to a =SceneProcessor=. Every frame, the +=SoundProcessor= registered with a given =Listener= receives the rendered sound data and can do whatever processing it wants with it. #+include "../../jmeCapture/src/com/aurellem/capture/audio/SoundProcessor.java" src java @@ -777,9 +777,9 @@ ** Hearing Pipeline All sound rendering is done in the CPU, so =hearing-pipeline= is -much less complicated than =vision-pipelie= The bytes available in +much less complicated than =vision-pipeline= The bytes available in the ByteBuffer obtained from the =send= Device have different meanings -dependant upon the particular hardware or your system. That is why +dependent upon the particular hardware or your system. That is why the =AudioFormat= object is necessary to provide the meaning that the raw bytes lack. =byteBuffer->pulse-vector= uses the excellent conversion facilities from [[http://www.tritonus.org/ ][tritonus]] ([[http://tritonus.sourceforge.net/apidoc/org/tritonus/share/sampled/FloatSampleTools.html#byte2floatInterleaved%2528byte%5B%5D,%2520int,%2520float%5B%5D,%2520int,%2520int,%2520javax.sound.sampled.AudioFormat%2529][javadoc]]) to generate a clojure vector of @@ -796,7 +796,7 @@ "Creates a SoundProcessor which wraps a sound processing continuation function. The continuation is a function that takes [#^ByteBuffer b #^Integer int numSamples #^AudioFormat af ], each of which - has already been apprpiately sized." + has already been appropriately sized." [continuation] (proxy [SoundProcessor] [] (cleanup []) @@ -828,7 +828,7 @@ position of an "ear" node to the closest physical object in the creature. That =Listener= will stay in the same orientation to the object with which it is bound, just as the camera in the [[http://aurellem.localhost/cortex/html/sense.html#sec-4-1][sense binding -demonstration]]. =OpenAL= simulates the doppler effect for moving +demonstration]]. =OpenAL= simulates the Doppler effect for moving listeners, =update-listener-velocity!= ensures that this velocity information is always up-to-date. @@ -878,7 +878,7 @@ #+name: hearing-kernel #+begin_src clojure (defn hearing-kernel - "Returns a functon which returns auditory sensory data when called + "Returns a function which returns auditory sensory data when called inside a running simulation." [#^Node creature #^Spatial ear] (let [hearing-data (atom []) @@ -916,7 +916,7 @@ ** Visualizing Hearing -This is a simple visualization function which displaye the waveform +This is a simple visualization function which displays the waveform reported by the simulated sense of hearing. It converts the values reported in the vector returned by the hearing function from the range [-1.0, 1.0] to the range [0 255], converts to integer, and displays @@ -1077,9 +1077,9 @@

The worm can now hear the sound pulses produced from the hymn.
 Notice the strikingly different pattern that human speech
- makes compared to the insturments. Once the worm is pushed off the
+ makes compared to the instruments. Once the worm is pushed off the
 floor, the sound it hears is attenuated, and the display of the
- sound it hears beomes fainter. This shows the 3D localization of
+ sound it hears becomes fainter. This shows the 3D localization of
 sound in this world.

#+end_html diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/ideas.org --- a/org/ideas.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/ideas.org Sat Feb 18 10:59:41 2012 -0700 @@ -9,7 +9,7 @@ | Vision | Variable Coloration | | Hearing | Speech | | Proprioception | Movement | -| Smell/Taste (Chemoreception) | Pheremones | +| Smell/Taste (Chemoreception) | Pheromones | | Touch | Movement / Controllable Texture | | Acceleration | Movement | | Balance (sense gravity) | Movement | @@ -17,7 +17,7 @@ - New Senses/Effectors - Levitation -- Telekenesis +- Telekinesis - control of gravity within a certain radius - speed up/slow time - object creation/destruction @@ -42,8 +42,8 @@ "extend all your fingers" or "move your hand into the area with blue light" or "decrease the angle of this joint". It would be like Sussman's HACKER, except it would operate with much more data - in a more realistic world. Start off with "calestanthics" to - develop subrouitines over the motor control API. This would be the + in a more realistic world. Start off with "calisthenics" to + develop subroutines over the motor control API. This would be the "spinal chord" of a more intelligent creature. The low level programming code might be a turning machine that could develop programs to iterate over a "tape" where each entry in the tape @@ -52,7 +52,7 @@ creature interacts using its fingers to press keys on a virtual keyboard. The creature can access the internet, watch videos, take over the world, anything it wants. - - Make virtual insturments like pianos, drumbs, etc that it learns to + - Make virtual instruments like pianos, drums, etc that it learns to play. - make a joint that figures out what type of joint it is (range of motion) @@ -63,17 +63,17 @@ * goals -** have to get done before winston - - [X] write an explination for why greyscale bitmaps for senses is - appropiate -- 1/2 day +** have to get done before Winston + - [X] write an explanation for why greyscale bitmaps for senses is + appropriate -- 1/2 day - [X] muscle control -- day - [X] proprioception sensor map in the style of the other senses -- day - [X] refactor integration code to distribute to each of the senses -- day - [X] create video showing all the senses for Winston -- 2 days - - [ ] spellchecking !!!!! + - [ ] spell checking !!!!! - [ ] send package to friends for critiques -- 2 days - - [X] fix videos that were encoded wrong, test on InterNet Explorer. + - [X] fix videos that were encoded wrong, test on Internet Explorer. - [X] redo videos vision with new collapse code - [X] find a topology that looks good. (maybe nil topology?) 
- [X] fix red part of touch cube in video and image @@ -82,14 +82,14 @@ - [ ] additional senses to be implemented for Winston | -- 2 days - [ ] send Winston package / -** would be cool to get done before winston +** would be cool to get done before Winston - [X] enable greyscale bitmaps for touch -- 2 hours - [X] use sawfish to auto-tile sense windows -- 6 hours - [X] sawfish keybinding to automatically delete all sense windows - [ ] proof of concept C sense manipulation -- 2 days - [ ] proof of concept GPU sense manipulation -- week - - [ ] fourier view of sound -- 2 or 3 days - - [ ] dancing music listener -- 1 day, depends on fourier + - [ ] Fourier view of sound -- 2 or 3 days + - [ ] dancing music listener -- 1 day, depends on Fourier - [ ] uberjar cortex demo ** don't have to get done before winston @@ -105,7 +105,7 @@ * jMonkeyEngine ideas - [ ] video showing bullet joints problem - [ ] add mult for Matrix to Ray - - [ ] add iteraterator constructs to Vector3f, Vector2f, Triangle, + - [ ] add iterator constructs to Vector3f, Vector2f, Triangle, Matrix3f, Matrix4f, etc ;;In the elder days of Art, diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/integration.org --- a/org/integration.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/integration.org Sat Feb 18 10:59:41 2012 -0700 @@ -1,4 +1,4 @@ -#+title: +#+title: Integration #+author: Robert McIntyre #+email: rlm@mit.edu #+description: @@ -103,7 +103,7 @@ (def full 9001) -;; Coreography: +;; Choreography: ;; Let the hand fall palm-up @@ -117,7 +117,7 @@ ;; hand FORCEFULLY catapults the block so that it hits the camera. -;; the systax here is [keyframe body-part force] +;; the syntax here is [keyframe body-part force] (def move-fingers [[300 :pinky-3-f 50] [320 :pinky-2-f 80] @@ -867,6 +867,7 @@ #+end_src + * COMMENT generate source #+begin_src clojure :tangle ../src/cortex/test/integration.clj <> diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/intro.org --- a/org/intro.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/intro.org Sat Feb 18 10:59:41 2012 -0700 @@ -2,7 +2,7 @@ #+author: Robert McIntyre #+email: rlm@mit.edu #+description: Simulating senses for AI research using JMonkeyEngine3 -#+keywords: Alan Turing, AI, sinulated senses, jMonkeyEngine3, virtual world +#+keywords: Alan Turing, AI, simulated senses, jMonkeyEngine3, virtual world #+SETUPFILE: ../../aurellem/org/setup.org #+INCLUDE: ../../aurellem/org/level-0.org #+babel: :mkdirp yes :noweb yes diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/movement.org --- a/org/movement.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/movement.org Sat Feb 18 10:59:41 2012 -0700 @@ -9,7 +9,7 @@ * Muscles -Surprisingly enough, terristerial creatures only move by using torque +Surprisingly enough, terrestrial creatures only move by using torque applied about their joints. There's not a single straight line of force in the human body at all! (A straight line of force would correspond to some sort of jet or rocket propulsion.) @@ -17,23 +17,23 @@ In humans, muscles are composed of muscle fibers which can contract to exert force. The muscle fibers which compose a muscle are partitioned into discrete groups which are each controlled by a single alpha motor -neuton. A single alpha motor neuron might control as little as three +neuron. A single alpha motor neuron might control as little as three or as many as one thousand muscle fibers. When the alpha motor neuron is engaged by the spinal cord, it activates all of the muscle fibers to which it is attached. 
The spinal cord generally engages the alpha motor neurons which control few muscle fibers before the motor neurons -which control many muscle fibers. This recruitment stragety allows -for percise movements at low strength. The collection of all motor +which control many muscle fibers. This recruitment strategy allows +for precise movements at low strength. The collection of all motor neurons that control a muscle is called the motor pool. The brain essentially says "activate 30% of the motor pool" and the spinal cord -recruits motor neurons untill 30% are activated. Since the +recruits motor neurons until 30% are activated. Since the distribution of power among motor neurons is unequal and recruitment goes from weakest to strongest, the first 30% of the motor pool might be 5% of the strength of the muscle. -My simulated muscles follow a similiar design: Each muscle is defined +My simulated muscles follow a similar design: Each muscle is defined by a 1-D array of numbers (the "motor pool"). Each entry in the array -represents a motor neuron which controlls a number of muscle fibers +represents a motor neuron which controls a number of muscle fibers equal to the value of the entry. Each muscle has a scalar strength factor which determines the total force the muscle can exert when all motor neurons are activated. The effector function for a muscle takes @@ -42,9 +42,9 @@ motor-neuron will apply force in proportion to its value in the array. Lower values cause less force. The lower values can be put at the "beginning" of the 1-D array to simulate the layout of actual human -muscles, which are capable of more percise movements when exerting -less force. Or, the motor pool can simulate more exoitic recruitment -strageties which do not correspond to human muscles. +muscles, which are capable of more precise movements when exerting +less force. Or, the motor pool can simulate more exotic recruitment +strategies which do not correspond to human muscles. This 1D array is defined in an image file for ease of creation/visualization. Here is an example muscle profile image. @@ -140,7 +140,7 @@ =movement-kernel= creates a function that will move the nearest physical object to the muscle node. The muscle exerts a rotational -force dependant on it's orientation to the object in the blender +force dependent on it's orientation to the object in the blender file. The function returned by =movement-kernel= is also a sense function: it returns the percent of the total muscle strength that is currently being employed. This is analogous to muscle tension in @@ -148,7 +148,7 @@ post. * Visualizing Muscle Tension -Muscle exertion is a percent of a total, so the visulazation is just a +Muscle exertion is a percent of a total, so the visualization is just a simple percent bar. #+name: visualization @@ -240,7 +240,7 @@

The worm is now able to move. The bar in the lower right displays
 the power output of the muscle. Each jump causes 20 more motor
 neurons to be recruited. Notice that the power output increases non-linearly
- with motror neuron recruitement, similiar to a human muscle.
+ with motor neuron recruitment, similar to a human muscle.

#+end_html diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/proprioception.org --- a/org/proprioception.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/proprioception.org Sat Feb 18 10:59:41 2012 -0700 @@ -25,10 +25,10 @@ Proprioception in humans is mediated by [[http://en.wikipedia.org/wiki/Articular_capsule][joint capsules]], [[http://en.wikipedia.org/wiki/Muscle_spindle][muscle spindles]], and the [[http://en.wikipedia.org/wiki/Golgi_tendon_organ][Golgi tendon organs]]. These measure the relative -positions of each pody part by monitoring muscle strain and length. +positions of each body part by monitoring muscle strain and length. -It's clear that this is a vital sense for fulid, graceful -movement. It's also particurally easy to implement in jMonkeyEngine. +It's clear that this is a vital sense for fluid, graceful +movement. It's also particularly easy to implement in jMonkeyEngine. My simulated proprioception calculates the relative angles of each joint from the rest position defined in the blender file. This @@ -83,7 +83,7 @@ * Proprioception Kernel Given a joint, =proprioception-kernel= produces a function that -calculates the euler angles between the the objects the joint +calculates the Euler angles between the the objects the joint connects. #+name: proprioception @@ -142,7 +142,7 @@ Proprioception has the lowest bandwidth of all the senses so far, and it doesn't lend itself as readily to visual representation like -vision, hearing, or touch. This visualization code creates a "guage" +vision, hearing, or touch. This visualization code creates a "gauge" to view each of the three relative angles along a circle. #+name: visualize @@ -217,7 +217,7 @@ (defn test-proprioception "Testing proprioception: - You should see two foating bars, and a printout of pitch, yaw, and + You should see two floating bars, and a printout of pitch, yaw, and roll. Pressing key-r/key-t should move the blue bar up and down and change only the value of pitch. key-f/key-g moves it side to side and changes yaw. key-v/key-b will spin the blue segment clockwise @@ -338,7 +338,7 @@ #+begin_src clojure (ns cortex.proprioception "Simulate the sense of proprioception (ability to detect the - relative positions of body parts with repsect to other body parts) + relative positions of body parts with respect to other body parts) in jMonkeyEngine3. Reads specially prepared blender files to automatically generate proprioceptive senses." (:use (cortex world util sense body)) diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/sense.org --- a/org/sense.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/sense.org Sat Feb 18 10:59:41 2012 -0700 @@ -7,9 +7,9 @@ #+INCLUDE: ../../aurellem/org/level-0.org * Blender Utilities -In blender, any object can be assigned an arbitray number of key-value -pairs which are called "Custom Properties". These are accessable in -jMonkyeEngine when blender files are imported with the +In blender, any object can be assigned an arbitrary number of key-value +pairs which are called "Custom Properties". These are accessible in +jMonkeyEngine when blender files are imported with the =BlenderLoader=. =meta-data= extracts these properties. #+name: blender-1 @@ -55,7 +55,7 @@ circular two-dimensional image which is the cross section of the spinal cord. 
Points on this image that are close together in this circle represent touch sensors that are /probably/ close together on -the skin, although there is of course some cutting and rerangement +the skin, although there is of course some cutting and rearrangement that has to be done to transfer the complicated surface of the skin onto a two dimensional image. @@ -84,13 +84,13 @@ of that senses sensors. To get different types of sensors, you can either use a different color for each type of sensor, or use multiple UV-maps, each labeled with that sensor type. I generally use a white -pixel to mean the presense of a sensor and a black pixel to mean the -absense of a sensor, and use one UV-map for each sensor-type within a +pixel to mean the presence of a sensor and a black pixel to mean the +absence of a sensor, and use one UV-map for each sensor-type within a given sense. The paths to the images are not stored as the actual UV-map of the blender object but are instead referenced in the meta-data of the node. -#+CAPTION: The UV-map for an enlongated icososphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip. +#+CAPTION: The UV-map for an elongated icososphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip. #+ATTR_HTML: width="300" [[../images/finger-UV.png]] @@ -156,11 +156,11 @@ Information from the senses is transmitted to the brain via bundles of axons, whether it be the optic nerve or the spinal cord. While these -bundles more or less perserve the overall topology of a sense's -two-dimensional surface, they do not perserve the percise euclidean +bundles more or less preserve the overall topology of a sense's +two-dimensional surface, they do not preserve the precise euclidean distances between every sensor. =collapse= is here to smoosh the -sensors described by a UV-map into a contigous region that still -perserves the topology of the original sense. +sensors described by a UV-map into a contiguous region that still +preserves the topology of the original sense. #+name: topology-2 #+begin_src clojure @@ -180,7 +180,7 @@ (defn collapse "Take a sequence of pairs of integers and collapse them into a - contigous bitmap with no \"holes\" or negative entries, as close to + contiguous bitmap with no \"holes\" or negative entries, as close to the origin [0 0] as the shape permits. The order of the points is preserved. @@ -224,11 +224,11 @@ [(- x min-x) (- y min-y)]) squeezed)) - point-correspondance + point-correspondence (zipmap (sort points) (sort relocated)) original-order - (vec (map point-correspondance points))] + (vec (map point-correspondence points))] original-order))) #+end_src * Viewing Sense Data @@ -245,7 +245,7 @@ (in-ns 'cortex.sense) (defn view-image - "Initailizes a JPanel on which you may draw a BufferedImage. + "Initializes a JPanel on which you may draw a BufferedImage. Returns a function that accepts a BufferedImage and draws it to the JPanel. If given a directory it will save the images as png files starting at 0000000.png and incrementing from there." @@ -309,7 +309,7 @@ (defn points->image - "Take a collection of points and visuliaze it as a BufferedImage." + "Take a collection of points and visualize it as a BufferedImage." [points] (if (empty? points) (BufferedImage. 
1 1 BufferedImage/TYPE_BYTE_BINARY) @@ -348,7 +348,7 @@ - Create a single top-level empty node whose name is the name of the sense - Add empty nodes which each contain meta-data relevant to the sense, including a UV-map describing the number/distribution - of sensors if applicipable. + of sensors if applicable. - Make each empty-node the child of the top-level node. =sense-nodes= below generates functions to find these children. @@ -538,7 +538,7 @@ (ns cortex.sense "Here are functions useful in the construction of two or more sensors/effectors." - {:author "Robert McInytre"} + {:author "Robert McIntyre"} (:use (cortex world util)) (:import ij.process.ImageProcessor) (:import jme3tools.converters.ImageToAwt) diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/test.org --- a/org/test.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/test.org Sat Feb 18 10:59:41 2012 -0700 @@ -1,7 +1,7 @@ #+title: Test Suite #+author: Robert McIntyre #+email: rlm@mit.edu -#+description: Simulating a body (movement, touch, propioception) in jMonkeyEngine3. +#+description: Simulating a body (movement, touch, proprioception) in jMonkeyEngine3. #+SETUPFILE: ../../aurellem/org/setup.org #+INCLUDE: ../../aurellem/org/level-0.org @@ -17,7 +17,7 @@ com.jme3.system.AppSettings)) (defn run-world - "run the simulation and wait until it closes proprely." + "run the simulation and wait until it closes properly." [world] (let [lock (promise)] (.enqueue diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/touch.org --- a/org/touch.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/touch.org Sat Feb 18 10:59:41 2012 -0700 @@ -11,7 +11,7 @@ Touch is critical to navigation and spatial reasoning and as such I need a simulated version of it to give to my AI creatures. -Human skin has a wide array of touch sensors, each of which speciliaze +Human skin has a wide array of touch sensors, each of which specialize in detecting different vibrational modes and pressures. These sensors can integrate a vast expanse of skin (i.e. your entire palm), or a tiny patch of skin at the tip of your finger. The hairs of the skin @@ -32,7 +32,7 @@ sense of touch which is a hybrid between a human's sense of deformation and sense from hairs. -Implementing touch in jMonkeyEngine follows a different techinal route +Implementing touch in jMonkeyEngine follows a different technical route than vision and hearing. Those two senses piggybacked off jMonkeyEngine's 3D audio and video rendering subsystems. To simulate touch, I use jMonkeyEngine's physics system to execute many small @@ -77,7 +77,7 @@ To simulate touch there are three conceptual steps. For each solid object in the creature, you first have to get UV image and scale -paramater which define the position and length of the feelers. Then, +parameter which define the position and length of the feelers. Then, you use the triangles which compose the mesh and the UV data stored in the mesh to determine the world-space position and orientation of each feeler. Then once every frame, update these positions and orientations @@ -100,7 +100,7 @@ normalized coordinates of the tips of the feelers. =feeler-tips=. * Triangle Math -** Schrapnel Conversion Functions +** Shrapnel Conversion Functions #+name: triangles-1 #+begin_src clojure @@ -121,11 +121,11 @@ (apply #(Triangle. %1 %2 %3) (map ->vector3f points))) #+end_src -It is convienent to treat a =Triangle= as a vector of vectors, and a +It is convenient to treat a =Triangle= as a vector of vectors, and a =Vector2f= or =Vector3f= as vectors of floats. 
(->vector3f) and (->triangle) undo the operations of =vector3f-seq= and =triangle-seq=. If these classes implemented =Iterable= then =seq= -would work on them automitacally. +would work on them automatically. ** Decomposing a 3D shape into Triangles @@ -134,7 +134,7 @@ data involved with displaying the object. A =Mesh= is composed of =Triangles=, and each =Triangle= has three -verticies which have coordinates in world space and UV space. +vertices which have coordinates in world space and UV space. Here, =triangles= gets all the world-space triangles which compose a mesh, while =pixel-triangles= gets those same triangles expressed in @@ -216,7 +216,7 @@ 1 & 1 & 1 & 1 \end{bmatrix} -Here, the first three columns of the matrix are the verticies of the +Here, the first three columns of the matrix are the vertices of the triangle. The last column is the right-handed unit normal of the triangle. @@ -225,7 +225,7 @@ $T_{2}T_{1}^{-1}$ -The clojure code below recaptiulates the formulas above, using +The clojure code below recapitulates the formulas above, using jMonkeyEngine's =Matrix4f= objects, which can describe any affine transformation. @@ -380,7 +380,7 @@ (defn set-ray [#^Ray ray #^Matrix4f transform #^Vector3f origin #^Vector3f tip] - ;; Doing everything locally recduces garbage collection by enough to + ;; Doing everything locally reduces garbage collection by enough to ;; be worth it. (.mult transform origin (.getOrigin ray)) (.mult transform tip (.getDirection ray)) @@ -443,7 +443,7 @@ (defn touch! "Endow the creature with the sense of touch. Returns a sequence of - functions, one for each body part with a tactile-sensor-proile, + functions, one for each body part with a tactile-sensor-profile, each of which when called returns sensory data for that body part." [#^Node creature] (filter @@ -459,7 +459,7 @@ * Visualizing Touch Each feeler is represented in the image as a single pixel. The -grayscale value of each pixel represents how deep the feeler +greyscale value of each pixel represents how deep the feeler represented by that pixel is inside another object. Black means that nothing is touching the feeler, while white means that the feeler is completely inside another object, which is presumably flush with the @@ -470,7 +470,7 @@ (in-ns 'cortex.touch) (defn touch->gray - "Convert a pair of [distance, max-distance] into a grayscale pixel." + "Convert a pair of [distance, max-distance] into a gray-scale pixel." [distance max-distance] (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256)))) @@ -736,7 +736,7 @@ * Next So far I've implemented simulated Vision, Hearing, and Touch, the most -obvious and promiment senses that humans have. Smell and Taste shall +obvious and prominent senses that humans have. Smell and Taste shall remain unimplemented for now. This accounts for the "five senses" that feature so prominently in our lives. But humans have far more than the five main senses. There are internal chemical senses, pain (which is diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/util.org --- a/org/util.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/util.org Sat Feb 18 10:59:41 2012 -0700 @@ -28,7 +28,7 @@ (defn jme-class? [classname] (and (.startsWith classname "com.jme3.") - ;; Don't import the Lwjgl stuff since it can throw exceptions + ;; Don't import the LWJGL stuff since it can throw exceptions ;; upon being loaded. 
(not (re-matches #".*Lwjgl.*" classname)))) @@ -52,10 +52,10 @@ happy with the general structure of a namespace I can deal with importing only the classes it actually needs. -The =mega-import-jme3= is quite usefull for debugging purposes since +The =mega-import-jme3= is quite useful for debugging purposes since it allows completion for almost all of JME's classes from the REPL. -Out of curiousity, let's see just how many classes =mega-import-jme3= +Out of curiosity, let's see just how many classes =mega-import-jme3= imports: #+begin_src clojure :exports both :results output @@ -226,7 +226,7 @@ (in-ns 'cortex.util) (defn load-bullet - "Runnig this function unpacks the native bullet libraries and makes + "Running this function unpacks the native bullet libraries and makes them available." [] (let [sim (world (Node.) {} no-op no-op)] @@ -316,8 +316,8 @@ ([] (sphere 0.5))) (defn x-ray - "A usefull material for debuging -- it can be seen no matter what - object occuldes it." + "A useful material for debugging -- it can be seen no matter what + object occludes it." [#^ColorRGBA color] (doto (Material. (asset-manager) "Common/MatDefs/Misc/Unshaded.j3md") @@ -359,7 +359,7 @@ (in-ns 'cortex.util) (defn basic-light-setup - "returns a sequence of lights appropiate for fully lighting a scene" + "returns a sequence of lights appropriate for fully lighting a scene" [] (conj (doall @@ -379,7 +379,7 @@ (.setColor ColorRGBA/White)))) (defn light-up-everything - "Add lights to a world appropiate for quickly seeing everything + "Add lights to a world appropriate for quickly seeing everything in the scene. Adds six DirectionalLights facing in orthogonal directions, and one AmbientLight to provide overall lighting coverage." @@ -433,7 +433,7 @@ forces [root-node keymap - intilization + initialization world-loop]] (let [add-keypress (fn [state keymap key] @@ -472,14 +472,14 @@ (splice-loop))] [root-node keymap* - intilization + initialization world-loop*])) (import com.jme3.font.BitmapText) (import com.jme3.scene.control.AbstractControl) (import com.aurellem.capture.IsoTimer) -(defn display-dialated-time +(defn display-dilated-time "Shows the time as it is flowing in the simulation on a HUD display. Useful for making videos." [world timer] diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/vision.org --- a/org/vision.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/vision.org Sat Feb 18 10:59:41 2012 -0700 @@ -11,11 +11,11 @@ Vision is one of the most important senses for humans, so I need to build a simulated sense of vision for my AI. I will do this with -simulated eyes. Each eye can be independely moved and should see its +simulated eyes. Each eye can be independently moved and should see its own version of the world depending on where it is. -Making these simulated eyes a reality is simple bacause jMonkeyEngine -already conatains extensive support for multiple views of the same 3D +Making these simulated eyes a reality is simple because jMonkeyEngine +already contains extensive support for multiple views of the same 3D simulated world. The reason jMonkeyEngine has this support is because the support is necessary to create games with split-screen views. Multiple views are also used to create efficient @@ -26,7 +26,7 @@ [[../images/goldeneye-4-player.png]] ** =ViewPorts=, =SceneProcessors=, and the =RenderManager=. -# =Viewports= are cameras; =RenderManger= takes snapshots each frame. +# =ViewPorts= are cameras; =RenderManger= takes snapshots each frame. 
#* A Brief Description of jMonkeyEngine's Rendering Pipeline jMonkeyEngine allows you to create a =ViewPort=, which represents a @@ -35,13 +35,13 @@ =ViewPort=, rendering the scene in the GPU. For each =ViewPort= there is a =FrameBuffer= which represents the rendered image in the GPU. -#+caption: =ViewPorts= are cameras in the world. During each frame, the =Rendermanager= records a snapshot of what each view is currently seeing; these snapshots are =Framebuffer= objects. +#+caption: =ViewPorts= are cameras in the world. During each frame, the =RenderManager= records a snapshot of what each view is currently seeing; these snapshots are =FrameBuffer= objects. #+ATTR_HTML: width="400" [[../images/diagram_rendermanager2.png]] Each =ViewPort= can have any number of attached =SceneProcessor= objects, which are called every time a new frame is rendered. A -=SceneProcessor= recieves its =ViewPort's= =FrameBuffer= and can do +=SceneProcessor= receives its =ViewPort's= =FrameBuffer= and can do whatever it wants to the data. Often this consists of invoking GPU specific operations on the rendered image. The =SceneProcessor= can also copy the GPU image data to RAM and process it with the CPU. @@ -51,11 +51,11 @@ Each eye in the simulated creature needs its own =ViewPort= so that it can see the world from its own perspective. To this =ViewPort=, I -add a =SceneProcessor= that feeds the visual data to any arbitray +add a =SceneProcessor= that feeds the visual data to any arbitrary continuation function for further processing. That continuation function may perform both CPU and GPU operations on the data. To make this easy for the continuation function, the =SceneProcessor= -maintains appropriatly sized buffers in RAM to hold the data. It does +maintains appropriately sized buffers in RAM to hold the data. It does not do any copying from the GPU to the CPU itself because it is a slow operation. @@ -65,7 +65,7 @@ "Create a SceneProcessor object which wraps a vision processing continuation function. The continuation is a function that takes [#^Renderer r #^FrameBuffer fb #^ByteBuffer b #^BufferedImage bi], - each of which has already been appropiately sized." + each of which has already been appropriately sized." [continuation] (let [byte-buffer (atom nil) renderer (atom nil) @@ -99,7 +99,7 @@ =FrameBuffer= references the GPU image data, but the pixel data can not be used directly on the CPU. The =ByteBuffer= and =BufferedImage= are initially "empty" but are sized to hold the data in the -=FrameBuffer=. I call transfering the GPU image data to the CPU +=FrameBuffer=. I call transferring the GPU image data to the CPU structures "mixing" the image data. I have provided three functions to do this mixing. @@ -129,7 +129,7 @@ entirely in terms of =BufferedImage= inputs. Just compose that =BufferedImage= algorithm with =BufferedImage!=. However, a vision processing algorithm that is entirely hosted on the GPU does not have -to pay for this convienence. +to pay for this convenience. * Optical sensor arrays are described with images and referenced with metadata The vision pipeline described above handles the flow of rendered @@ -139,7 +139,7 @@ An eye is described in blender in the same way as a joint. They are zero dimensional empty objects with no geometry whose local coordinate system determines the orientation of the resulting eye. All eyes are -childern of a parent node named "eyes" just as all joints have a +children of a parent node named "eyes" just as all joints have a parent named "joints". 
An eye binds to the nearest physical object with =bind-sense=. @@ -184,8 +184,8 @@ I want to be able to model any retinal configuration, so my eye-nodes in blender contain metadata pointing to images that describe the -percise position of the individual sensors using white pixels. The -meta-data also describes the percise sensitivity to light that the +precise position of the individual sensors using white pixels. The +meta-data also describes the precise sensitivity to light that the sensors described in the image have. An eye can contain any number of these images. For example, the metadata for an eye might look like this: @@ -215,11 +215,11 @@ relative sensitivity to the channels red, green, and blue. These sensitivity values are packed into an integer in the order =|_|R|G|B|= in 8-bit fields. The RGB values of a pixel in the image are added -together with these sensitivities as linear weights. Therfore, +together with these sensitivities as linear weights. Therefore, 0xFF0000 means sensitive to red only while 0xFFFFFF means sensitive to all colors equally (gray). -For convienence I've defined a few symbols for the more common +For convenience I've defined a few symbols for the more common sensitivity values. #+name: sensitivity @@ -383,10 +383,10 @@ only once the first time any of the functions from the list returned by =vision-kernel= is called. Each of the functions returned by =vision-kernel= also allows access to the =Viewport= through which -it recieves images. +it receives images. -The in-game display can be disrupted by all the viewports that the -functions greated by =vision-kernel= add. This doesn't affect the +The in-game display can be disrupted by all the ViewPorts that the +functions generated by =vision-kernel= add. This doesn't affect the simulation or the simulated senses, but can be annoying. =gen-fix-display= restores the in-simulation display. @@ -572,7 +572,7 @@ (light-up-everything world) (speed-up world) (.setTimer world timer) - (display-dialated-time world timer) + (display-dilated-time world timer) ;; add a view from the worm's perspective (if record? (Capture/captureVideo @@ -685,7 +685,7 @@ (ns cortex.vision "Simulate the sense of vision in jMonkeyEngine3. Enables multiple eyes from different positions to observe the same world, and pass - the observed data to any arbitray function. Automatically reads + the observed data to any arbitrary function. Automatically reads eye-nodes from specially prepared blender files and instantiates them in the world as actual eyes." {:author "Robert McIntyre"} diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/world.org --- a/org/world.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/world.org Sat Feb 18 10:59:41 2012 -0700 @@ -49,7 +49,7 @@ #+name: header #+begin_src clojure :results silent (ns cortex.world - "World Creation, abstracion over jme3's input system, and REPL + "World Creation, abstraction over jme3's input system, and REPL driven exception handling" {:author "Robert McIntyre"} @@ -85,7 +85,7 @@ (doto (AppSettings. true) (.setFullscreen false) (.setTitle "Aurellem.") - ;; The "Send" AudioRenderer supports sumulated hearing. + ;; The "Send" AudioRenderer supports simulated hearing. (.setAudioRenderer "Send")) "These settings control how the game is displayed on the screen for debugging purposes. Use binding forms to change this if desired. 
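The docstring above suggests rebinding the default settings rather than editing them in place. A minimal sketch of such a binding form, assuming the settings are held in a dynamic var (the name =*app-settings*= is illustrative, not taken from this patch) and swapping in jME3's stock =AppSettings/LWJGL_OPENAL= renderer for "Send":

#+begin_src clojure
(in-ns 'cortex.util)

;; Sketch only: *app-settings* is a hypothetical dynamic var holding
;; the AppSettings constructed above; using the stock OpenAL renderer
;; disables the "Send" device (and thus simulated hearing) for this run.
(binding [*app-settings*
          (doto (AppSettings. true)
            (.setFullscreen false)
            (.setTitle "Aurellem (no simulated hearing)")
            (.setAudioRenderer AppSettings/LWJGL_OPENAL))]
  (.start (world (Node.) {} no-op no-op)))
#+end_src

Rebinding like this keeps "Send" as the default for every test that needs hearing while letting individual runs opt out.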
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/youtube.org --- a/org/youtube.org Sat Feb 18 10:28:14 2012 -0700 +++ b/org/youtube.org Sat Feb 18 10:59:41 2012 -0700 @@ -1,21 +1,26 @@ -| local-file | youtube-url | -|-------------------------+-------------| -| basic-touch.ogg | | -| hand.ogg | | -| worm-hearing.ogg | | -| bind-sense.ogg | | -| java-hearing-test.ogg | | -| worm-muscles.ogg | | -| crumbly-hand.ogg | | -| spinning-cube.ogg | | -| worm-touch.ogg | | -| cube.ogg | | -| test-proprioception.ogg | | -| worm-vision.ogg | | -| full-hand.ogg | | -| touch-cube.ogg | | -| ghost-hand.ogg | | -| worm-1.ogg | | -|-------------------------+-------------| +| local-file | youtube-url | +|-------------------------+-----------------------------| +| basic-touch.ogg | http://youtu.be/8xNEtD-a8f0 | +| hand.ogg | http://youtu.be/DvoN2wWQ_6o | +| worm-hearing.ogg | | +| bind-sense.ogg | | +| java-hearing-test.ogg | | +| worm-muscles.ogg | | +| crumbly-hand.ogg | | +| spinning-cube.ogg | | +| worm-touch.ogg | | +| cube.ogg | | +| test-proprioception.ogg | | +| worm-vision.ogg | | +| full-hand.ogg | | +| touch-cube.ogg | | +| ghost-hand.ogg | | +| worm-1.ogg | | +|-------------------------+-----------------------------| + + + +Simulated senses in jMonkeyEngine3. +See http://aurellem.org
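The videos in the table above come out of the capture machinery from =capture-video.org=: an =IsoTimer= locks game-time to a fixed framerate so the encoder sees evenly spaced frames, and =Capture/captureVideo= attaches the recorder. A minimal sketch, assuming =Capture/captureVideo= takes the application and an output =File= as the (truncated) vision test code above suggests; the output path is hypothetical:

#+begin_src clojure
(in-ns 'cortex.util)

(import '(com.aurellem.capture Capture IsoTimer))
(import 'java.io.File)

(defn record-world
  "Run `world` while recording it to out-file.
   Sketch only: the 30 fps rate follows the IsoTimer example in
   capture-video.org; the output path is up to the caller."
  [world out-file]
  (.setTimer world (IsoTimer. 30))              ; fixed-framerate game-time
  (Capture/captureVideo world (File. out-file)) ; attach the video recorder
  (.start world))

;; e.g. (record-world (world (Node.) {} no-op no-op) "/tmp/hello.flv")
#+end_src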