# HG changeset patch
# User Robert McIntyre

 The hand model directly loaded from blender. It has no physical
-presense in the simulation.
+presence in the simulation.
 The worm can now hear the sound pulses produced from the hymn. Notice
 the strikingly different pattern that human speech
-makes compared to the insturments. Once the worm is pushed off the
+makes compared to the instruments. Once the worm is pushed off the
 floor, the sound it hears is attenuated, and the display of the
-sound it hears beomes fainter. This shows the 3D localization of
+sound it hears becomes fainter. This shows the 3D localization of
 sound in this world.
 #+end_html

diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/ideas.org
--- a/org/ideas.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/ideas.org	Sat Feb 18 10:59:41 2012 -0700
@@ -9,7 +9,7 @@
 | Vision                       | Variable Coloration             |
 | Hearing                      | Speech                          |
 | Proprioception               | Movement                        |
-| Smell/Taste (Chemoreception) | Pheremones                      |
+| Smell/Taste (Chemoreception) | Pheromones                      |
 | Touch                        | Movement / Controllable Texture |
 | Acceleration                 | Movement                        |
 | Balance (sense gravity)      | Movement                        |
@@ -17,7 +17,7 @@
 - New Senses/Effectors
 - Levitation
-- Telekenesis
+- Telekinesis
 - control of gravity within a certain radius
 - speed up/slow time
 - object creation/destruction
@@ -42,8 +42,8 @@
   "extend all your fingers" or "move your hand into the area with
   blue light" or "decrease the angle of this joint". It would be like
   Sussman's HACKER, except it would operate with much more data
-  in a more realistic world. Start off with "calestanthics" to
-  develop subrouitines over the motor control API. This would be the
+  in a more realistic world. Start off with "calisthenics" to
+  develop subroutines over the motor control API. This would be the
   "spinal chord" of a more intelligent creature. The low level
   programming code might be a turning machine that could develop
   programs to iterate over a "tape" where each entry in the tape
@@ -52,7 +52,7 @@
   creature interacts using its fingers to press keys on a virtual
   keyboard. The creature can access the internet, watch videos,
   take over the world, anything it wants.
-  - Make virtual insturments like pianos, drumbs, etc that it learns to
+  - Make virtual instruments like pianos, drums, etc that it learns to
    play.
  - make a joint that figures out what type of joint it is (range of
    motion)
@@ -63,17 +63,17 @@

 * goals

-** have to get done before winston
-  - [X] write an explination for why greyscale bitmaps for senses is
-    appropiate -- 1/2 day
+** have to get done before Winston
+  - [X] write an explanation for why greyscale bitmaps for senses is
+    appropriate -- 1/2 day
   - [X] muscle control -- day
   - [X] proprioception sensor map in the style of the other senses -- day
   - [X] refactor integration code to distribute to each of the senses -- day
   - [X] create video showing all the senses for Winston -- 2 days
-  - [ ] spellchecking !!!!!
+  - [ ] spell checking !!!!!
   - [ ] send package to friends for critiques -- 2 days
-  - [X] fix videos that were encoded wrong, test on InterNet Explorer.
+  - [X] fix videos that were encoded wrong, test on Internet Explorer.
   - [X] redo videos vision with new collapse code
   - [X] find a topology that looks good. (maybe nil topology?)
   - [X] fix red part of touch cube in video and image
@@ -82,14 +82,14 @@
   - [ ] additional senses to be implemented for Winston | -- 2 days
   - [ ] send Winston package /
-** would be cool to get done before winston
+** would be cool to get done before Winston
   - [X] enable greyscale bitmaps for touch -- 2 hours
   - [X] use sawfish to auto-tile sense windows -- 6 hours
   - [X] sawfish keybinding to automatically delete all sense windows
   - [ ] proof of concept C sense manipulation -- 2 days
   - [ ] proof of concept GPU sense manipulation -- week
-  - [ ] fourier view of sound -- 2 or 3 days
-  - [ ] dancing music listener -- 1 day, depends on fourier
+  - [ ] Fourier view of sound -- 2 or 3 days
+  - [ ] dancing music listener -- 1 day, depends on Fourier
   - [ ] uberjar cortex demo

 ** don't have to get done before winston
@@ -105,7 +105,7 @@
 * jMonkeyEngine ideas
   - [ ] video showing bullet joints problem
   - [ ] add mult for Matrix to Ray
-  - [ ] add iteraterator constructs to Vector3f, Vector2f, Triangle,
+  - [ ] add iterator constructs to Vector3f, Vector2f, Triangle,
     Matrix3f, Matrix4f, etc

 ;;In the elder days of Art,
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/integration.org
--- a/org/integration.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/integration.org	Sat Feb 18 10:59:41 2012 -0700
@@ -1,4 +1,4 @@
-#+title:
+#+title: Integration
 #+author: Robert McIntyre
 #+email: rlm@mit.edu
 #+description:
@@ -103,7 +103,7 @@
 (def full 9001)

-;; Coreography:
+;; Choreography:

 ;; Let the hand fall palm-up
@@ -117,7 +117,7 @@
 ;; hand FORCEFULLY catapults the block so that it hits the camera.

-;; the systax here is [keyframe body-part force]
+;; the syntax here is [keyframe body-part force]
 (def move-fingers
   [[300 :pinky-3-f 50]
    [320 :pinky-2-f 80]
@@ -867,6 +867,7 @@
 #+end_src

+
 * COMMENT generate source
 #+begin_src clojure :tangle ../src/cortex/test/integration.clj
 <
 The worm is now able to move. The bar in the lower right displays the
 power output of the muscle. Each jump causes 20 more motor neurons to
 be recruited. Notice that the power output increases non-linearly
-with motror neuron recruitement, similiar to a human muscle.
+with motor neuron recruitment, similar to a human muscle.
 #+end_html

diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/proprioception.org
--- a/org/proprioception.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/proprioception.org	Sat Feb 18 10:59:41 2012 -0700
@@ -25,10 +25,10 @@
 Proprioception in humans is mediated by [[http://en.wikipedia.org/wiki/Articular_capsule][joint capsules]], [[http://en.wikipedia.org/wiki/Muscle_spindle][muscle
 spindles]], and the [[http://en.wikipedia.org/wiki/Golgi_tendon_organ][Golgi tendon organs]]. These measure the relative
-positions of each pody part by monitoring muscle strain and length.
+positions of each body part by monitoring muscle strain and length.

-It's clear that this is a vital sense for fulid, graceful
-movement. It's also particurally easy to implement in jMonkeyEngine.
+It's clear that this is a vital sense for fluid, graceful
+movement. It's also particularly easy to implement in jMonkeyEngine.

 My simulated proprioception calculates the relative angles of each
 joint from the rest position defined in the blender file. This
@@ -83,7 +83,7 @@
 * Proprioception Kernel

 Given a joint, =proprioception-kernel= produces a function that
-calculates the euler angles between the the objects the joint
+calculates the Euler angles between the the objects the joint
 connects.

 #+name: proprioception
@@ -142,7 +142,7 @@
 Proprioception has the lowest bandwidth of all the senses so far, and
 it doesn't lend itself as readily to visual representation like
-vision, hearing, or touch. This visualization code creates a "guage"
+vision, hearing, or touch. This visualization code creates a "gauge"
 to view each of the three relative angles along a circle.

 #+name: visualize
@@ -217,7 +217,7 @@
 (defn test-proprioception
   "Testing proprioception:
-   You should see two foating bars, and a printout of pitch, yaw, and
+   You should see two floating bars, and a printout of pitch, yaw, and
   roll. Pressing key-r/key-t should move the blue bar up and down and
   change only the value of pitch. key-f/key-g moves it side to side
   and changes yaw. key-v/key-b will spin the blue segment clockwise
@@ -338,7 +338,7 @@
 (ns cortex.proprioception
   "Simulate the sense of proprioception (ability to detect the
-  relative positions of body parts with repsect to other body parts)
+  relative positions of body parts with respect to other body parts)
   in jMonkeyEngine3. Reads specially prepared blender files to
   automatically generate proprioceptive senses."
   (:use (cortex world util sense body))
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/sense.org
--- a/org/sense.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/sense.org	Sat Feb 18 10:59:41 2012 -0700
@@ -7,9 +7,9 @@
 #+INCLUDE: ../../aurellem/org/level-0.org

 * Blender Utilities
-In blender, any object can be assigned an arbitray number of key-value
-pairs which are called "Custom Properties". These are accessable in
-jMonkyeEngine when blender files are imported with the
+In blender, any object can be assigned an arbitrary number of key-value
+pairs which are called "Custom Properties". These are accessible in
+jMonkeyEngine when blender files are imported with the
 =BlenderLoader=. =meta-data= extracts these properties.

 #+name: blender-1
@@ -55,7 +55,7 @@
 circular two-dimensional image which is the cross section of the
 spinal cord.
 Points on this image that are close together in this circle represent
 touch sensors that are /probably/ close together on
-the skin, although there is of course some cutting and rerangement
+the skin, although there is of course some cutting and rearrangement
 that has to be done to transfer the complicated surface of the skin
 onto a two dimensional image.
@@ -84,13 +84,13 @@
 of that senses sensors. To get different types of sensors, you can
 either use a different color for each type of sensor, or use multiple
 UV-maps, each labeled with that sensor type. I generally use a white
-pixel to mean the presense of a sensor and a black pixel to mean the
-absense of a sensor, and use one UV-map for each sensor-type within a
+pixel to mean the presence of a sensor and a black pixel to mean the
+absence of a sensor, and use one UV-map for each sensor-type within a
 given sense. The paths to the images are not stored as the actual
 UV-map of the blender object but are instead referenced in the
 meta-data of the node.

-#+CAPTION: The UV-map for an enlongated icososphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
+#+CAPTION: The UV-map for an elongated icososphere. The white dots each represent a touch sensor. They are dense in the regions that describe the tip of the finger, and less dense along the dorsal side of the finger opposite the tip.
 #+ATTR_HTML: width="300"
 [[../images/finger-UV.png]]

@@ -156,11 +156,11 @@
 Information from the senses is transmitted to the brain via bundles
 of axons, whether it be the optic nerve or the spinal cord. While these
-bundles more or less perserve the overall topology of a sense's
-two-dimensional surface, they do not perserve the percise euclidean
+bundles more or less preserve the overall topology of a sense's
+two-dimensional surface, they do not preserve the precise euclidean
 distances between every sensor. =collapse= is here to smoosh the
-sensors described by a UV-map into a contigous region that still
-perserves the topology of the original sense.
+sensors described by a UV-map into a contiguous region that still
+preserves the topology of the original sense.

 #+name: topology-2
 #+begin_src clojure
@@ -180,7 +180,7 @@
 (defn collapse
   "Take a sequence of pairs of integers and collapse them into a
-   contigous bitmap with no \"holes\" or negative entries, as close to
+   contiguous bitmap with no \"holes\" or negative entries, as close to
   the origin [0 0] as the shape permits. The order of the points is
   preserved.

@@ -224,11 +224,11 @@
            [(- x min-x) (- y min-y)])
          squeezed))
-        point-correspondance
+        point-correspondence
         (zipmap (sort points) (sort relocated))

         original-order
-        (vec (map point-correspondance points))]
+        (vec (map point-correspondence points))]
     original-order)))
 #+end_src

 * Viewing Sense Data
@@ -245,7 +245,7 @@
 (in-ns 'cortex.sense)

 (defn view-image
-  "Initailizes a JPanel on which you may draw a BufferedImage.
+  "Initializes a JPanel on which you may draw a BufferedImage.
   Returns a function that accepts a BufferedImage and draws it to the
   JPanel. If given a directory it will save the images as png files
   starting at 0000000.png and incrementing from there."
@@ -309,7 +309,7 @@
 (defn points->image
-  "Take a collection of points and visuliaze it as a BufferedImage."
+  "Take a collection of points and visualize it as a BufferedImage."
   [points]
   (if (empty? points)
     (BufferedImage.
      1 1 BufferedImage/TYPE_BYTE_BINARY)
@@ -348,7 +348,7 @@
   - Create a single top-level empty node whose name is the name of
     the sense
   - Add empty nodes which each contain meta-data relevant to the
     sense, including a UV-map describing the number/distribution
-    of sensors if applicipable.
+    of sensors if applicable.
   - Make each empty-node the child of the top-level node.

 =sense-nodes= below generates functions to find these children.
@@ -538,7 +538,7 @@
 (ns cortex.sense
   "Here are functions useful in the construction of two or more
   sensors/effectors."
-  {:author "Robert McInytre"}
+  {:author "Robert McIntyre"}
   (:use (cortex world util))
   (:import ij.process.ImageProcessor)
   (:import jme3tools.converters.ImageToAwt)
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/test.org
--- a/org/test.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/test.org	Sat Feb 18 10:59:41 2012 -0700
@@ -1,7 +1,7 @@
 #+title: Test Suite
 #+author: Robert McIntyre
 #+email: rlm@mit.edu
-#+description: Simulating a body (movement, touch, propioception) in jMonkeyEngine3.
+#+description: Simulating a body (movement, touch, proprioception) in jMonkeyEngine3.
 #+SETUPFILE: ../../aurellem/org/setup.org
 #+INCLUDE: ../../aurellem/org/level-0.org

@@ -17,7 +17,7 @@
   com.jme3.system.AppSettings))

 (defn run-world
-  "run the simulation and wait until it closes proprely."
+  "run the simulation and wait until it closes properly."
   [world]
   (let [lock (promise)]
     (.enqueue
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/touch.org
--- a/org/touch.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/touch.org	Sat Feb 18 10:59:41 2012 -0700
@@ -11,7 +11,7 @@
 Touch is critical to navigation and spatial reasoning and as such I
 need a simulated version of it to give to my AI creatures.

-Human skin has a wide array of touch sensors, each of which speciliaze
+Human skin has a wide array of touch sensors, each of which specialize
 in detecting different vibrational modes and pressures. These sensors
 can integrate a vast expanse of skin (i.e. your entire palm), or a
 tiny patch of skin at the tip of your finger. The hairs of the skin
@@ -32,7 +32,7 @@
 sense of touch which is a hybrid between a human's sense of
 deformation and sense from hairs.

-Implementing touch in jMonkeyEngine follows a different techinal route
+Implementing touch in jMonkeyEngine follows a different technical route
 than vision and hearing. Those two senses piggybacked off
 jMonkeyEngine's 3D audio and video rendering subsystems. To simulate
 touch, I use jMonkeyEngine's physics system to execute many small
@@ -77,7 +77,7 @@
 To simulate touch there are three conceptual steps. For each solid
 object in the creature, you first have to get UV image and scale
-paramater which define the position and length of the feelers. Then,
+parameter which define the position and length of the feelers. Then,
 you use the triangles which compose the mesh and the UV data stored in
 the mesh to determine the world-space position and orientation of each
 feeler. Then once every frame, update these positions and orientations
@@ -100,7 +100,7 @@
 normalized coordinates of the tips of the feelers. =feeler-tips=.

 * Triangle Math
-** Schrapnel Conversion Functions
+** Shrapnel Conversion Functions

 #+name: triangles-1
 #+begin_src clojure
@@ -121,11 +121,11 @@
   (apply #(Triangle. %1 %2 %3) (map ->vector3f points)))
 #+end_src

-It is convienent to treat a =Triangle= as a vector of vectors, and a
+It is convenient to treat a =Triangle= as a vector of vectors, and a
 =Vector2f= or =Vector3f= as vectors of floats.
 (->vector3f) and (->triangle) undo the operations of =vector3f-seq=
 and =triangle-seq=. If these classes implemented =Iterable= then =seq=
-would work on them automitacally.
+would work on them automatically.

 ** Decomposing a 3D shape into Triangles

@@ -134,7 +134,7 @@
 data involved with displaying the object.

 A =Mesh= is composed of =Triangles=, and each =Triangle= has three
-verticies which have coordinates in world space and UV space.
+vertices which have coordinates in world space and UV space.

 Here, =triangles= gets all the world-space triangles which compose a
 mesh, while =pixel-triangles= gets those same triangles expressed in
@@ -216,7 +216,7 @@
 1 & 1 & 1 & 1
 \end{bmatrix}

-Here, the first three columns of the matrix are the verticies of the
+Here, the first three columns of the matrix are the vertices of the
 triangle. The last column is the right-handed unit normal of the
 triangle.
@@ -225,7 +225,7 @@

 $T_{2}T_{1}^{-1}$

-The clojure code below recaptiulates the formulas above, using
+The clojure code below recapitulates the formulas above, using
 jMonkeyEngine's =Matrix4f= objects, which can describe any affine
 transformation.

@@ -380,7 +380,7 @@
 (defn set-ray [#^Ray ray #^Matrix4f transform
                #^Vector3f origin #^Vector3f tip]
-  ;; Doing everything locally recduces garbage collection by enough to
+  ;; Doing everything locally reduces garbage collection by enough to
   ;; be worth it.
   (.mult transform origin (.getOrigin ray))
   (.mult transform tip (.getDirection ray))
@@ -443,7 +443,7 @@
 (defn touch!
   "Endow the creature with the sense of touch. Returns a sequence of
-   functions, one for each body part with a tactile-sensor-proile,
+   functions, one for each body part with a tactile-sensor-profile,
   each of which when called returns sensory data for that body part."
   [#^Node creature]
   (filter
@@ -459,7 +459,7 @@
 * Visualizing Touch

 Each feeler is represented in the image as a single pixel. The
-grayscale value of each pixel represents how deep the feeler
+greyscale value of each pixel represents how deep the feeler
 represented by that pixel is inside another object. Black means that
 nothing is touching the feeler, while white means that the feeler is
 completely inside another object, which is presumably flush with the
@@ -470,7 +470,7 @@
 (in-ns 'cortex.touch)

 (defn touch->gray
-  "Convert a pair of [distance, max-distance] into a grayscale pixel."
+  "Convert a pair of [distance, max-distance] into a gray-scale pixel."
   [distance max-distance]
   (gray (- 255 (rem (int (* 255 (/ distance max-distance))) 256))))
@@ -736,7 +736,7 @@
 * Next

 So far I've implemented simulated Vision, Hearing, and Touch, the most
-obvious and promiment senses that humans have. Smell and Taste shall
+obvious and prominent senses that humans have. Smell and Taste shall
 remain unimplemented for now. This accounts for the "five senses" that
 feature so prominently in our lives. But humans have far more than the
 five main senses. There are internal chemical senses, pain (which is
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/util.org
--- a/org/util.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/util.org	Sat Feb 18 10:59:41 2012 -0700
@@ -28,7 +28,7 @@
 (defn jme-class? [classname]
   (and
    (.startsWith classname "com.jme3.")
-   ;; Don't import the Lwjgl stuff since it can throw exceptions
+   ;; Don't import the LWJGL stuff since it can throw exceptions
   ;; upon being loaded.
    (not (re-matches #".*Lwjgl.*" classname))))
@@ -52,10 +52,10 @@
 happy with the general structure of a namespace I can deal with
 importing only the classes it actually needs.

-The =mega-import-jme3= is quite usefull for debugging purposes since
+The =mega-import-jme3= is quite useful for debugging purposes since
 it allows completion for almost all of JME's classes from the REPL.

-Out of curiousity, let's see just how many classes =mega-import-jme3=
+Out of curiosity, let's see just how many classes =mega-import-jme3=
 imports:

 #+begin_src clojure :exports both :results output
@@ -226,7 +226,7 @@
 (in-ns 'cortex.util)

 (defn load-bullet
-  "Runnig this function unpacks the native bullet libraries and makes
+  "Running this function unpacks the native bullet libraries and makes
   them available."
   []
   (let [sim (world (Node.) {} no-op no-op)]
@@ -316,8 +316,8 @@
   ([] (sphere 0.5)))

 (defn x-ray
-  "A usefull material for debuging -- it can be seen no matter what
-   object occuldes it."
+  "A useful material for debugging -- it can be seen no matter what
+   object occludes it."
   [#^ColorRGBA color]
   (doto (Material. (asset-manager)
                    "Common/MatDefs/Misc/Unshaded.j3md")
@@ -359,7 +359,7 @@
 (in-ns 'cortex.util)

 (defn basic-light-setup
-  "returns a sequence of lights appropiate for fully lighting a scene"
+  "returns a sequence of lights appropriate for fully lighting a scene"
   []
   (conj
    (doall
@@ -379,7 +379,7 @@
      (.setColor ColorRGBA/White))))

 (defn light-up-everything
-  "Add lights to a world appropiate for quickly seeing everything
+  "Add lights to a world appropriate for quickly seeing everything
   in the scene. Adds six DirectionalLights facing in orthogonal
   directions, and one AmbientLight to provide overall lighting
   coverage."
@@ -433,7 +433,7 @@
    forces
    [root-node
     keymap
-    intilization
+    initialization
     world-loop]]
   (let [add-keypress
         (fn [state keymap key]
@@ -472,14 +472,14 @@
           (splice-loop))]
     [root-node
      keymap*
-     intilization
+     initialization
      world-loop*]))

 (import com.jme3.font.BitmapText)
 (import com.jme3.scene.control.AbstractControl)
 (import com.aurellem.capture.IsoTimer)

-(defn display-dialated-time
+(defn display-dilated-time
   "Shows the time as it is flowing in the simulation on a HUD display.
   Useful for making videos."
   [world timer]
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/vision.org
--- a/org/vision.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/vision.org	Sat Feb 18 10:59:41 2012 -0700
@@ -11,11 +11,11 @@

 Vision is one of the most important senses for humans, so I need to
 build a simulated sense of vision for my AI. I will do this with
-simulated eyes. Each eye can be independely moved and should see its
+simulated eyes. Each eye can be independently moved and should see its
 own version of the world depending on where it is.

-Making these simulated eyes a reality is simple bacause jMonkeyEngine
-already conatains extensive support for multiple views of the same 3D
+Making these simulated eyes a reality is simple because jMonkeyEngine
+already contains extensive support for multiple views of the same 3D
 simulated world. The reason jMonkeyEngine has this support is because
 the support is necessary to create games with split-screen
 views. Multiple views are also used to create efficient
@@ -26,7 +26,7 @@
 [[../images/goldeneye-4-player.png]]

 ** =ViewPorts=, =SceneProcessors=, and the =RenderManager=.
-# =Viewports= are cameras; =RenderManger= takes snapshots each frame.
+# =ViewPorts= are cameras; =RenderManger= takes snapshots each frame.
 #* A Brief Description of jMonkeyEngine's Rendering Pipeline

 jMonkeyEngine allows you to create a =ViewPort=, which represents a
 view of the simulated world. You can create as many of these as you
 want. Every frame, the =RenderManager= iterates through each
 =ViewPort=, rendering the scene in the GPU. For each =ViewPort= there
 is a =FrameBuffer= which represents the rendered image in the GPU.

-#+caption: =ViewPorts= are cameras in the world. During each frame, the =Rendermanager= records a snapshot of what each view is currently seeing; these snapshots are =Framebuffer= objects.
+#+caption: =ViewPorts= are cameras in the world. During each frame, the =RenderManager= records a snapshot of what each view is currently seeing; these snapshots are =FrameBuffer= objects.
 #+ATTR_HTML: width="400"
 [[../images/diagram_rendermanager2.png]]

 Each =ViewPort= can have any number of attached =SceneProcessor=
 objects, which are called every time a new frame is rendered. A
-=SceneProcessor= recieves its =ViewPort's= =FrameBuffer= and can do
+=SceneProcessor= receives its =ViewPort's= =FrameBuffer= and can do
 whatever it wants to the data. Often this consists of invoking GPU
 specific operations on the rendered image. The =SceneProcessor= can
 also copy the GPU image data to RAM and process it with the CPU.
@@ -51,11 +51,11 @@

 Each eye in the simulated creature needs its own =ViewPort= so that
 it can see the world from its own perspective. To this =ViewPort=, I
-add a =SceneProcessor= that feeds the visual data to any arbitray
+add a =SceneProcessor= that feeds the visual data to any arbitrary
 continuation function for further processing. That continuation
 function may perform both CPU and GPU operations on the data. To make
 this easy for the continuation function, the =SceneProcessor=
-maintains appropriatly sized buffers in RAM to hold the data. It does
+maintains appropriately sized buffers in RAM to hold the data. It does
 not do any copying from the GPU to the CPU itself because it is a
 slow operation.

@@ -65,7 +65,7 @@
   "Create a SceneProcessor object which wraps a vision processing
   continuation function. The continuation is a function that takes
   [#^Renderer r #^FrameBuffer fb #^ByteBuffer b #^BufferedImage bi],
-   each of which has already been appropiately sized."
+   each of which has already been appropriately sized."
   [continuation]
   (let [byte-buffer (atom nil)
         renderer (atom nil)
@@ -99,7 +99,7 @@
 =FrameBuffer= references the GPU image data, but the pixel data can
 not be used directly on the CPU. The =ByteBuffer= and =BufferedImage=
 are initially "empty" but are sized to hold the data in the
-=FrameBuffer=. I call transfering the GPU image data to the CPU
+=FrameBuffer=. I call transferring the GPU image data to the CPU
 structures "mixing" the image data. I have provided three functions to
 do this mixing.

@@ -129,7 +129,7 @@
 entirely in terms of =BufferedImage= inputs. Just compose that
 =BufferedImage= algorithm with =BufferedImage!=. However, a vision
 processing algorithm that is entirely hosted on the GPU does not have
-to pay for this convienence.
+to pay for this convenience.

 * Optical sensor arrays are described with images and referenced with metadata

 The vision pipeline described above handles the flow of rendered
@@ -139,7 +139,7 @@
 An eye is described in blender in the same way as a joint. They are
 zero dimensional empty objects with no geometry whose local coordinate
 system determines the orientation of the resulting eye. All eyes are
-childern of a parent node named "eyes" just as all joints have a
+children of a parent node named "eyes" just as all joints have a
 parent named "joints".
 An eye binds to the nearest physical object with =bind-sense=.
@@ -184,8 +184,8 @@

 I want to be able to model any retinal configuration, so my eye-nodes
 in blender contain metadata pointing to images that describe the
-percise position of the individual sensors using white pixels. The
-meta-data also describes the percise sensitivity to light that the
+precise position of the individual sensors using white pixels. The
+meta-data also describes the precise sensitivity to light that the
 sensors described in the image have. An eye can contain any number of
 these images. For example, the metadata for an eye might look like
 this:

@@ -215,11 +215,11 @@
 relative sensitivity to the channels red, green, and blue. These
 sensitivity values are packed into an integer in the order =|_|R|G|B|=
 in 8-bit fields. The RGB values of a pixel in the image are added
-together with these sensitivities as linear weights. Therfore,
+together with these sensitivities as linear weights. Therefore,
 0xFF0000 means sensitive to red only while 0xFFFFFF means sensitive
 to all colors equally (gray).

-For convienence I've defined a few symbols for the more common
+For convenience I've defined a few symbols for the more common
 sensitivity values.

 #+name: sensitivity
@@ -383,10 +383,10 @@
 only once the first time any of the functions from the list returned
 by =vision-kernel= is called. Each of the functions returned by
 =vision-kernel= also allows access to the =Viewport= through which
-it recieves images.
+it receives images.

-The in-game display can be disrupted by all the viewports that the
-functions greated by =vision-kernel= add. This doesn't affect the
+The in-game display can be disrupted by all the ViewPorts that the
+functions generated by =vision-kernel= add. This doesn't affect the
 simulation or the simulated senses, but can be annoying.
 =gen-fix-display= restores the in-simulation display.

@@ -572,7 +572,7 @@
      (light-up-everything world)
      (speed-up world)
      (.setTimer world timer)
-     (display-dialated-time world timer)
+     (display-dilated-time world timer)
      ;; add a view from the worm's perspective
      (if record?
        (Capture/captureVideo
@@ -685,7 +685,7 @@
 (ns cortex.vision
   "Simulate the sense of vision in jMonkeyEngine3. Enables multiple
   eyes from different positions to observe the same world, and pass
-   the observed data to any arbitray function. Automatically reads
+   the observed data to any arbitrary function. Automatically reads
   eye-nodes from specially prepared blender files and instantiates
   them in the world as actual eyes."
   {:author "Robert McIntyre"}
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/world.org
--- a/org/world.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/world.org	Sat Feb 18 10:59:41 2012 -0700
@@ -49,7 +49,7 @@
 #+name: header
 #+begin_src clojure :results silent
 (ns cortex.world
-  "World Creation, abstracion over jme3's input system, and REPL
+  "World Creation, abstraction over jme3's input system, and REPL
   driven exception handling"
   {:author "Robert McIntyre"}
@@ -85,7 +85,7 @@
   (doto (AppSettings. true)
     (.setFullscreen false)
     (.setTitle "Aurellem.")
-    ;; The "Send" AudioRenderer supports sumulated hearing.
+    ;; The "Send" AudioRenderer supports simulated hearing.
     (.setAudioRenderer "Send"))
   "These settings control how the game is displayed on the screen for
   debugging purposes. Use binding forms to change this if desired.
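As an aside (not part of the changeset): the world.org hunk above shows the =AppSettings= block whose "Send" AudioRenderer enables simulated hearing. A minimal standalone sketch of using those same settings to launch a jME3 application follows; the calls mirror the hunk, while =launch-with-settings= and its argument are hypothetical illustrations, not code from the patched files.

#+begin_src clojure
(import com.jme3.system.AppSettings)

;; the same settings the hunk above constructs
(def debug-settings
  (doto (AppSettings. true)
    (.setFullscreen false)
    (.setTitle "Aurellem.")
    ;; the "Send" AudioRenderer is what enables simulated hearing
    (.setAudioRenderer "Send")))

(defn launch-with-settings
  "Apply the settings above to a jME3 Application and start it."
  [app]
  (doto app
    (.setSettings debug-settings)
    (.setShowSettings false) ;; skip the display-settings dialog
    (.start)))
#+end_src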
diff -r 19c43ec6958d -r 7e7f8d6d9ec5 org/youtube.org
--- a/org/youtube.org	Sat Feb 18 10:28:14 2012 -0700
+++ b/org/youtube.org	Sat Feb 18 10:59:41 2012 -0700
@@ -1,21 +1,26 @@
-| local-file              | youtube-url |
-|-------------------------+-------------|
-| basic-touch.ogg         |             |
-| hand.ogg                |             |
-| worm-hearing.ogg        |             |
-| bind-sense.ogg          |             |
-| java-hearing-test.ogg   |             |
-| worm-muscles.ogg        |             |
-| crumbly-hand.ogg        |             |
-| spinning-cube.ogg       |             |
-| worm-touch.ogg          |             |
-| cube.ogg                |             |
-| test-proprioception.ogg |             |
-| worm-vision.ogg         |             |
-| full-hand.ogg           |             |
-| touch-cube.ogg          |             |
-| ghost-hand.ogg          |             |
-| worm-1.ogg              |             |
-|-------------------------+-------------|
+| local-file              | youtube-url                 |
+|-------------------------+-----------------------------|
+| basic-touch.ogg         | http://youtu.be/8xNEtD-a8f0 |
+| hand.ogg                | http://youtu.be/DvoN2wWQ_6o |
+| worm-hearing.ogg        |                             |
+| bind-sense.ogg          |                             |
+| java-hearing-test.ogg   |                             |
+| worm-muscles.ogg        |                             |
+| crumbly-hand.ogg        |                             |
+| spinning-cube.ogg       |                             |
+| worm-touch.ogg          |                             |
+| cube.ogg                |                             |
+| test-proprioception.ogg |                             |
+| worm-vision.ogg         |                             |
+| full-hand.ogg           |                             |
+| touch-cube.ogg          |                             |
+| ghost-hand.ogg          |                             |
+| worm-1.ogg              |                             |
+|-------------------------+-----------------------------|
+
+
+
+Simulated senses in jMonkeyEngine3.
+See http://aurellem.org
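For readers skimming the patch, the proprioception hunks above refer to computing Euler angles between the objects a joint connects. The sketch below illustrates that idea using jME3's =Quaternion=; it is an illustration only, not the =proprioception-kernel= from the patched file, which also accounts for each joint's rest position as defined in the blender file.

#+begin_src clojure
(import com.jme3.math.Quaternion)

(defn relative-angles
  "Return [x y z] rotation angles (radians, roughly pitch/yaw/roll)
   of object B's rotation relative to object A's rotation -- the raw
   quantity a proprioceptive sense reports for the joint connecting
   them."
  [#^Quaternion rot-a #^Quaternion rot-b]
  (let [relative (.mult (.inverse rot-a) rot-b)]
    ;; toAngles fills and returns a float[3] of x, y, z rotations
    (vec (.toAngles relative (float-array 3)))))
#+end_src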