# HG changeset patch
# User Robert McIntyre
# Date 1399903456 14400
# Node ID 6299450f96182f35a66c1c219114be6bf83f9227
# Parent  6a61b637a4c5ecfaf31da1470242a878954269a2
corrections for Jessica Noss.

diff -r 6a61b637a4c5 -r 6299450f9618 thesis/cortex.org
--- a/thesis/cortex.org	Sun May 04 15:38:11 2014 -0400
+++ b/thesis/cortex.org	Mon May 12 10:04:16 2014 -0400
@@ -261,7 +261,7 @@
    it gathers experiences that satisfy the embodied action
    definitions.

-   - Posture imitation  :: When trying to interpret a video or image,
+   - Posture Imitation  :: When trying to interpret a video or image,
        the creature takes a model of itself and aligns it with
        whatever it sees. This alignment might even cross species, as
        when humans try to align themselves with things like ponies,
@@ -285,7 +285,7 @@
    retrieved, and if it is analogous enough to the scene, then
    the creature will correctly identify the action in the scene.

-   My program, =EMPATH= uses this empathic problem solving technique
+   My program =EMPATH= uses this empathic problem solving technique
    to interpret the actions of a simple, worm-like creature.

    #+caption: The worm performs many actions during free play such as
@@ -324,7 +324,7 @@
    structure of =EMPATH= and =CORTEX= will make future work to enable
    video analysis much easier than it would otherwise be.

-** =EMPATH= is built on =CORTEX=, a creature builder.
+** COMMENT =EMPATH= is built on =CORTEX=, a creature builder.

    I built =CORTEX= to be a general AI research platform for doing
    experiments involving multiple rich senses and a wide variety and
@@ -347,7 +347,7 @@
    - You can design new creatures using Blender (\cite{blender}), a
      popular, free 3D modeling program. Each sense can be specified
-     using special blender nodes with biologically inspired
+     using special Blender nodes with biologically inspired
      parameters. You need not write any code to create a creature, and
      can use a wide library of pre-existing blender models as a base
      for your own creatures.
@@ -412,7 +412,8 @@
    I modeled my own right hand in Blender and rigged it with all the
    senses that {\tt CORTEX} supports. My simulated hand has a
    biologically inspired distribution of touch sensors. The senses are
-   displayed on the right, and the simulation is displayed on the
+   displayed on the right (the red/black squares are raw sensory output),
+   and the simulation is displayed on the
    left. Notice that my hand is curling its fingers, that it can see
    its own finger from the eye in its palm, and that it can feel its
    own thumb touching its palm.}
@@ -764,8 +765,8 @@
*** Solidifying/Connecting a body

    =CORTEX= creates a creature in two steps: first, it traverses the
-   nodes in the blender file and creates physical representations for
-   any of them that have mass defined in their blender meta-data.
+   nodes in the Blender file and creates physical representations for
+   any of them that have mass defined in their Blender meta-data.

    #+caption: Program for iterating through the nodes in a blender file
    #+caption: and generating physical jMonkeyEngine3 objects with mass
@@ -830,7 +831,7 @@
    #+begin_listing clojure
    #+begin_src clojure
(defn sense-nodes
-  "For some senses there is a special empty blender node whose
+  "For some senses there is a special empty Blender node whose
   children are considered markers for an instance of that sense. This
   function generates functions to find those children, given the name
   of the special parent node."
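To make the idea in =sense-nodes= concrete, here is a small self-contained sketch. Plain Clojure maps stand in for jMonkeyEngine scene-graph nodes, and the node names and tree shape are hypothetical; this illustrates the behavior, and is not the actual =CORTEX= implementation.

#+begin_src clojure
;; Sketch of the sense-nodes idea. Plain Clojure maps stand in for
;; jMonkeyEngine scene-graph nodes; the names here are hypothetical.
(defn node-seq
  "Depth-first traversal of a node tree."
  [node]
  (tree-seq :children :children node))

(defn sense-nodes-sketch
  "Return a function that finds the children of the special empty
  node named parent-name."
  [parent-name]
  (fn [creature]
    (if-let [parent (->> (node-seq creature)
                         (filter #(= (:name %) parent-name))
                         first)]
      (:children parent)
      [])))

;; Example: find the eye markers in a toy creature.
(def eyes-sketch (sense-nodes-sketch "eyes"))

(eyes-sketch
 {:name "creature"
  :children [{:name "eyes"
              :children [{:name "left-eye"} {:name "right-eye"}]}]})
;; => [{:name "left-eye"} {:name "right-eye"}]
#+end_src

The real version walks a jMonkeyEngine =Node= tree rather than maps, but the lookup idea is the same: find the special empty parent by name and return its children.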
@@ -891,13 +892,13 @@
    Once =CORTEX= finds all joints and targets, it creates them using
    a dispatch on the metadata of each joint node.

-   #+caption: Program to dispatch on blender metadata and create joints
+   #+caption: Program to dispatch on Blender metadata and create joints
    #+caption: suitable for physical simulation.
    #+name: joint-dispatch
    #+begin_listing clojure
    #+begin_src clojure
(defmulti joint-dispatch
-  "Translate blender pseudo-joints into real JME joints."
+  "Translate Blender pseudo-joints into real JME joints."
  (fn [constraints & _]
    (:type constraints)))

@@ -930,10 +931,10 @@
    All that is left for joints is to combine the above pieces into
    something that can operate on the collection of nodes that a
-   blender file represents.
+   Blender file represents.

    #+caption: Program to completely create a joint given information
-   #+caption: from a blender file.
+   #+caption: from a Blender file.
    #+name: connect
    #+begin_listing clojure
    #+begin_src clojure
@@ -948,7 +949,7 @@
   {:type :cone :limit-xz 0
                :limit-xy 0
-               :twist 0}   (use XZY rotation mode in blender!)"
+               :twist 0}   (use XZY rotation mode in Blender!)"
  [#^Node obj-a #^Node obj-b #^Node joint]
  (let [control-a (.getControl obj-a RigidBodyControl)
        control-b (.getControl obj-b RigidBodyControl)
@@ -993,7 +994,7 @@
(defn body!
  "Endow the creature with a physical body connected with joints. The
   particulars of the joints and the masses of each body part are
-  determined in blender."
+  determined in Blender."
  [#^Node creature]
  (physical! creature)
  (joints! creature))
@@ -1008,7 +1009,7 @@
    my own right hand, can now be given joints and simulated as a
    creature.

-   #+caption: With the ability to create physical creatures from blender,
+   #+caption: With the ability to create physical creatures from Blender,
    #+caption: =CORTEX= gets one step closer to becoming a full creature
    #+caption: simulation environment.
    #+name: physical-hand
@@ -1125,7 +1126,7 @@
    images. Now, =CORTEX= needs simulated eyes to serve as the source
    of these images.

-   An eye is described in blender in the same way as a joint. They
+   An eye is described in Blender in the same way as a joint. They
    are zero dimensional empty objects with no geometry whose local
    coordinate system determines the orientation of the resulting eye. All
    eyes are children of a parent node named "eyes" just as all
@@ -1142,7 +1143,7 @@
  "Create a Camera centered on the current position of 'eye which
   follows the closest physical node in 'creature. The camera will
   point in the X direction and use the Z vector as up as determined
-  by the rotation of these vectors in blender coordinate space. Use
+  by the rotation of these vectors in Blender coordinate space. Use
   XZY rotation for the node in blender."
  [#^Node creature #^Spatial eye]
  (let [target (closest-node creature eye)
@@ -1156,7 +1157,7 @@
    (.lookAtDirection
     cam                          ; this part is not a mistake and
     (.mult rot Vector3f/UNIT_X)  ; is consistent with using Z in
-    (.mult rot Vector3f/UNIT_Y)) ; blender as the UP vector.
+    (.mult rot Vector3f/UNIT_Y)) ; Blender as the UP vector.
    (.setFrustumPerspective
     cam (float 45)
     (float (/ (.getWidth cam) (.getHeight cam)))
@@ -1179,7 +1180,7 @@
    distance from the fovea.

    I want to be able to model any retinal configuration, so my
-   eye-nodes in blender contain metadata pointing to images that
+   eye-nodes in Blender contain metadata pointing to images that
    describe the precise position of the individual sensors using
    white pixels.
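To make that concrete, here is a minimal sketch of decoding such a sensor-position image. It uses the standard Java imaging classes; the file name is a made-up placeholder, and this is an illustration of the idea rather than the actual =CORTEX= retina loader.

#+begin_src clojure
;; Sketch: collect the [x y] coordinates of every pure-white pixel in
;; an image file. Illustration only; not the actual CORTEX code.
(import '(javax.imageio ImageIO)
        '(java.io File))

(defn white-pixel-coordinates
  "Return a vector of [x y] pairs, one per pure-white pixel."
  [image-path]
  (let [^java.awt.image.BufferedImage image
        (ImageIO/read (File. image-path))]
    (vec
     (for [y (range (.getHeight image))
           x (range (.getWidth image))
           ;; mask off the alpha channel; 0xFFFFFF is pure white
           :when (= 0xFFFFFF (bit-and (.getRGB image x y) 0xFFFFFF))]
       [x y]))))

;; (white-pixel-coordinates "retina.png")
;; => [[12 7] [13 7] ...]
#+end_src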
    The meta-data also describes the precise sensitivity to light
    that the sensors described in the image have. An eye can
@@ -1322,7 +1323,7 @@
    At the end of this chapter I will have simulated ears that work the
    same way as the simulated eyes in the last chapter. I will be able to
-   place any number of ear-nodes in a blender file, and they will bind to
+   place any number of ear-nodes in a Blender file, and they will bind to
    the closest physical object and follow it as it moves around. Each ear
    will provide access to the sound data it picks up between every frame.
@@ -2146,7 +2147,7 @@
    It's also particularly easy to implement in jMonkeyEngine.

    My simulated proprioception calculates the relative angles of each
-   joint from the rest position defined in the blender file. This
+   joint from the rest position defined in the Blender file. This
    simulates the muscle-spindles and joint capsules. I will deal with
    Golgi tendon organs, which calculate muscle strain, in the next
    chapter.
@@ -2309,7 +2310,7 @@
*** Muscle meta-data

-   #+caption: Program to deal with loading muscle data from a blender
+   #+caption: Program to deal with loading muscle data from a Blender
    #+caption: file's metadata.
    #+name: motor-pool
    #+begin_listing clojure
@@ -2390,7 +2391,7 @@
    =movement-kernel= creates a function that controls the movement of
    the nearest physical node to the muscle node. The muscle exerts a
    rotational force dependent on its orientation to the object in
-   the blender file. The function returned by =movement-kernel= is
+   the Blender file. The function returned by =movement-kernel= is
    also a sense function: it returns the percent of the total muscle
    strength that is currently being employed. This is analogous to
    muscle tension in humans and completes the sense of proprioception
@@ -2534,10 +2535,10 @@
    #+ATTR_LaTeX: :width 10cm
    [[./images/basic-worm-view.png]]

-   #+caption: Program for reading a worm from a blender file and
+   #+caption: Program for reading a worm from a Blender file and
    #+caption: outfitting it with the senses of proprioception,
    #+caption: touch, and the ability to move, as specified in the
-   #+caption: blender file.
+   #+caption: Blender file.
    #+name: get-worm
    #+begin_listing clojure
    #+begin_src clojure
@@ -2829,19 +2830,19 @@
    that defines the worm's life. It is a time-stamp for each set of
    sensations the worm has experienced.

-   First, group the experience-indices into bins according to the
+   First, I group the experience-indices into bins according to the
    similarity of their proprioceptive data. I organize my bins into a
    3 level hierarchy. The smallest bins have an approximate size of
    0.001 radians in all proprioceptive dimensions. Each higher level
    is 10x bigger than the level below it. The bins serve as a
    hashing function for proprioceptive data. Given
-   a single piece of proprioceptive experience, the bins allow us to
+   a single piece of proprioceptive experience, the bins allow me to
    rapidly find all other similar experience-indices of past
    experience that had a very similar proprioceptive configuration.

    When looking up a proprioceptive experience, if the smallest bin
-   does not match any previous experience, then successively larger
-   bins are used until a match is found or we reach the largest bin.
+   does not match any previous experience, then I use successively
+   larger bins until a match is found or I reach the largest bin.
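Here is a minimal sketch of that three-level lookup. The bin sizes (0.001 radians, growing 10x per level) come from the text above; the data layout and function names are illustrative assumptions, not the actual =EMPATH= code.

#+begin_src clojure
;; Sketch of the three-level bin hashing described above. Each level
;; rounds every proprioceptive angle to a multiple of its bin size,
;; so similar configurations collide into the same key.
(def bin-sizes [0.001 0.01 0.1])

(defn bin-key
  "Round each joint angle to a multiple of bin-size."
  [bin-size angles]
  (mapv #(Math/round (/ % bin-size)) angles))

(defn bin-experiences
  "Index entries of the form {:index i :proprio [angles...]} at each
  level, smallest bins first."
  [experiences]
  (mapv (fn [size]
          (group-by #(bin-key size (:proprio %)) experiences))
        bin-sizes))

(defn similar-experiences
  "Try the smallest bin first; fall back to successively larger bins
  until something matches or the largest bin is exhausted."
  [levels angles]
  (some (fn [[size level]] (seq (level (bin-key size angles))))
        (map vector bin-sizes levels)))

;; Example: the second query angle pair lands in the same 0.001-radian
;; bin as experience 0, so it is found at the smallest level.
(def log [{:index 0 :proprio [0.0    1.570]}
          {:index 1 :proprio [0.0005 1.5703]}])
(similar-experiences (bin-experiences log) [0.0004 1.5701])
;; => ({:index 0, :proprio [0.0 1.57]})
#+end_src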
    Given a sequence of proprioceptive input, I use the bins to
    generate a set of similar experiences for each input using the
@@ -3529,7 +3530,7 @@
    the =CORTEX= system, a complete environment for creating simulated
    creatures. You have seen how to implement five senses: touch,
    proprioception, hearing, vision, and muscle tension. You have seen
-   how to create new creatures using blender, a 3D modeling tool.
+   how to create new creatures using Blender, a 3D modeling tool.

    As a minor digression, you also saw how I used =CORTEX= to enable a
    tiny worm to discover the topology of its skin simply by rolling on
@@ -3628,7 +3629,7 @@
   {:type :cone
        :limit-xz
        :limit-xy
-       :twist }      ;(use XZY rotation mode in blender!)
+       :twist }      ;(use XZY rotation mode in Blender!)
   #+END_SRC

*** Eyes
@@ -3719,7 +3720,7 @@
*** Proprioception

    Proprioception is tied to each joint node -- nothing special must
-   be done in a blender model to enable proprioception other than
+   be done in a Blender model to enable proprioception other than
    creating joint nodes.

*** Muscles
@@ -3820,7 +3821,7 @@
    Options are the same as in =box=.

    - =(load-blender-model file-name)= :: create a node structure
-        representing the model described in a blender file.
+        representing the model described in a Blender file.

    - =(light-up-everything world)= :: distribute a standard complement
         of lights throughout the simulation. Should be adequate for most
@@ -3855,7 +3856,7 @@
         during a simulation, return the vision data for the channel of
         one of the eyes. The functions are ordered depending on the
         alphabetical order of the names of the eye nodes in the
-        blender file. The data returned by the functions is a vector
+        Blender file. The data returned by the functions is a vector
         containing the eye's /topology/, a vector of coordinates, and
         the eye's /data/, a vector of RGB values filtered by the eye's
         sensitivity.
@@ -3864,7 +3865,7 @@
         Returns a list of functions, one for each ear, that when called
         will return a frame's worth of hearing data for that ear. The
         functions are ordered depending on the alphabetical
-        order of the names of the ear nodes in the blender file. The
+        order of the names of the ear nodes in the Blender file. The
         data returned by the functions is an array of PCM (pulse code
         modulated) wav data.

diff -r 6a61b637a4c5 -r 6299450f9618 thesis/org/roadmap.org
--- a/thesis/org/roadmap.org	Sun May 04 15:38:11 2014 -0400
+++ b/thesis/org/roadmap.org	Mon May 12 10:04:16 2014 -0400
@@ -21,7 +21,7 @@
   - complete final formatting and submit

* TODO Turn in CORTEX
-  DEADLINE: <2014-05-09 Fri>
+  DEADLINE: <2014-05-12 Mon>
  SHIT THAT'S IN 67 DAYS!!!

** program simple feature matching code for the worm's segments
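As a closing illustration of the PCM format mentioned in the description of =hearing!= above, here is a hedged sketch of decoding one frame of hearing data into floating-point samples. It assumes 16-bit little-endian samples for illustration; the actual encoding is determined by the simulation's wav settings.

#+begin_src clojure
;; Sketch: decode a frame of PCM bytes into doubles in [-1, 1),
;; assuming 16-bit little-endian samples. Illustration only.
(defn pcm->doubles
  [^bytes pcm]
  (vec
   (for [i (range 0 (alength pcm) 2)]
     (let [lo (bit-and (aget pcm i) 0xFF)   ; unsigned low byte
           hi (aget pcm (inc i))]           ; signed high byte
       (/ (+ lo (bit-shift-left hi 8)) 32768.0)))))

;; Example: two samples, 0x0000 and 0xFFFF (0 and -1 as signed 16-bit).
(pcm->doubles (byte-array [0 0 -1 -1]))
;; => [0.0 -3.0517578125E-5]
#+end_src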