comparison thesis/cortex.org @ 545:b2c66ea58c39
changes from athena.
author | Robert McIntyre <rlm@mit.edu> |
---|---|
date | Mon, 28 Apr 2014 12:59:08 -0400 |
parents | 97d45f796ad6 |
children | 5d89879fc894 |
544:431e6aedf67d | 545:b2c66ea58c39 |
---|---|
347 library of pre-existing blender models as a base for your own | 347 library of pre-existing blender models as a base for your own |
348 creatures. | 348 creatures. |
349 | 349 |
350 - =CORTEX= implements a wide variety of senses: touch, | 350 - =CORTEX= implements a wide variety of senses: touch, |
351 proprioception, vision, hearing, and muscle tension. Complicated | 351 proprioception, vision, hearing, and muscle tension. Complicated |
352 senses like touch, and vision involve multiple sensory elements | 352 senses like touch and vision involve multiple sensory elements |
353 embedded in a 2D surface. You have complete control over the | 353 embedded in a 2D surface. You have complete control over the |
354 distribution of these sensor elements through the use of simple | 354 distribution of these sensor elements through the use of simple |
355 png image files. In particular, =CORTEX= implements more | 355 png image files. In particular, =CORTEX= implements more |
356 comprehensive hearing than any other creature simulation system | 356 comprehensive hearing than any other creature simulation system |
357 available. | 357 available. |
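The png-based sensor layout can be sketched as follows. This is an illustrative reading of the idea only, not the actual =CORTEX= image-handling code: any non-black pixel in the image is taken as the location of one sensory element.

#+begin_src clojure
;; Illustrative sketch -- not the actual =CORTEX= implementation.
;; Interpret a png as a sensor-distribution map: every non-black
;; pixel marks the position of one sensory element on the 2D surface.
(import '(javax.imageio ImageIO) '(java.io File))

(defn sensor-coords
  "Return the [x y] coordinates of every non-black pixel in the png
  at 'path."
  [path]
  (let [img (ImageIO/read (File. path))]
    (for [x (range (.getWidth img))
          y (range (.getHeight img))
          :when (not= 0 (bit-and 0xFFFFFF (.getRGB img x y)))]
      [x y])))
#+end_src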
1130 #+caption: Here, the camera is created based on metadata on the | 1130 #+caption: Here, the camera is created based on metadata on the |
1131 #+caption: eye-node and attached to the nearest physical object | 1131 #+caption: eye-node and attached to the nearest physical object |
1132 #+caption: with =bind-sense= | 1132 #+caption: with =bind-sense= |
1133 #+name: add-eye | 1133 #+name: add-eye |
1134 #+begin_listing clojure | 1134 #+begin_listing clojure |
| 1135 #+begin_src clojure |
1135 (defn add-eye! | 1136 (defn add-eye! |
1136 "Create a Camera centered on the current position of 'eye which | 1137 "Create a Camera centered on the current position of 'eye which |
1137 follows the closest physical node in 'creature. The camera will | 1138 follows the closest physical node in 'creature. The camera will |
1138 point in the X direction and use the Z vector as up as determined | 1139 point in the X direction and use the Z vector as up as determined |
1139 by the rotation of these vectors in blender coordinate space. Use | 1140 by the rotation of these vectors in blender coordinate space. Use |
1155 cam (float 45) | 1156 cam (float 45) |
1156 (float (/ (.getWidth cam) (.getHeight cam))) | 1157 (float (/ (.getWidth cam) (.getHeight cam))) |
1157 (float 1) | 1158 (float 1) |
1158 (float 1000)) | 1159 (float 1000)) |
1159 (bind-sense target cam) cam)) | 1160 (bind-sense target cam) cam)) |
| 1161 #+end_src |
1160 #+end_listing | 1162 #+end_listing |
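A hypothetical usage sketch: here =creature= and =eye-node= stand for a loaded blender model and one of its eye metadata nodes; neither is defined in the listing above.

#+begin_src clojure
;; Hypothetical usage -- 'creature and 'eye-node are assumed to come
;; from a loaded blender model and its eye metadata, respectively.
(def eye-cam (add-eye! creature eye-node))
;; eye-cam is a Camera bound to the nearest physical node of the
;; creature; it can now be attached to a ViewPort for rendering.
#+end_src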
1161 | 1163 |
1162 *** Simulated Retina | 1164 *** Simulated Retina |
1163 | 1165 |
1164 An eye is a surface (the retina) which contains many discrete | 1166 An eye is a surface (the retina) which contains many discrete |
1189 #+caption: edges and is inspired by the human retina. | 1191 #+caption: edges and is inspired by the human retina. |
1190 #+name: retina | 1192 #+name: retina |
1191 #+ATTR_LaTeX: :width 7cm | 1193 #+ATTR_LaTeX: :width 7cm |
1192 [[./images/retina-small.png]] | 1194 [[./images/retina-small.png]] |
1193 | 1195 |
1194 Together, the number 0xFF0000 and the image image above describe | 1196 Together, the number 0xFF0000 and the image above describe the |
1195 the placement of red-sensitive sensory elements. | 1197 placement of red-sensitive sensory elements. |
1196 | 1198 |
1197 Meta-data to very crudely approximate a human eye might be | 1199 Meta-data to very crudely approximate a human eye might be |
1198 something like this: | 1200 something like this: |
1199 | 1201 |
1200 #+begin_src clojure | 1202 #+begin_src clojure |
2177 #+end_listing | 2179 #+end_listing |
2178 | 2180 |
2179 *** Proprioception Kernel | 2181 *** Proprioception Kernel |
2180 | 2182 |
2181 Given a joint, =proprioception-kernel= produces a function that | 2183 Given a joint, =proprioception-kernel= produces a function that |
2182 calculates the Euler angles between the the objects the joint | 2184 calculates the Euler angles between the objects the joint |
2183 connects. The only tricky part here is making the angles relative | 2185 connects. The only tricky part here is making the angles relative |
2184 to the joint's initial ``straightness''. | 2186 to the joint's initial ``straightness''. |
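The ``relative to initial straightness'' idea can be sketched with jMonkeyEngine quaternions. The helper below is illustrative, not the actual =proprioception-kernel=; it assumes the relative rotation between the two connected objects was captured once, when the joint was straight:

#+begin_src clojure
;; Illustrative sketch, not the actual =CORTEX= code. 'initial is the
;; relative rotation between the two objects captured when the joint
;; was straight; dividing it out makes the straight pose read as zero.
(import 'com.jme3.math.Quaternion)

(defn relative-angles
  "Euler angles of 'obj-b in 'obj-a's frame, measured relative to
  the joint's initial straightness."
  [^Quaternion initial obj-a obj-b]
  (let [current (.mult (.inverse (.getWorldRotation obj-a))
                       (.getWorldRotation obj-b))
        offset  (.mult (.inverse initial) current)]
    (seq (.toAngles offset nil))))
#+end_src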
2185 | 2187 |
2186 #+caption: Program to return biologically reasonable proprioceptive | 2188 #+caption: Program to return biologically reasonable proprioceptive |
2187 #+caption: data for each joint. | 2189 #+caption: data for each joint. |
2557 the senses to directly describe any action. | 2559 the senses to directly describe any action. |
2558 | 2560 |
2559 ** Action recognition is easy with a full gamut of senses | 2561 ** Action recognition is easy with a full gamut of senses |
2560 | 2562 |
2561 Embodied representations using multiple senses such as touch, | 2563 Embodied representations using multiple senses such as touch, |
2562 proprioception, and muscle tension turns out be be exceedingly | 2564 proprioception, and muscle tension turns out to be exceedingly |
2563 efficient at describing body-centered actions. It is the right | 2565 efficient at describing body-centered actions. It is the right |
2564 language for the job. For example, it takes only around 5 lines of | 2566 language for the job. For example, it takes only around 5 lines of |
2565 LISP code to describe the action of curling using embodied | 2567 LISP code to describe the action of curling using embodied |
2566 primitives. It takes about 10 lines to describe the seemingly | 2568 primitives. It takes about 10 lines to describe the seemingly |
2567 complicated action of wiggling. | 2569 complicated action of wiggling. |
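To give the flavor of such a predicate, here is a hedged sketch of what a curling detector can look like. The proprioception format (one [heading pitch bend] triple per joint) and the threshold are assumptions for illustration, not necessarily the thesis's exact definition:

#+begin_src clojure
;; Sketch of an embodied action predicate. Assumes each experience
;; frame stores proprioception as [heading pitch bend] triples.
(defn curled?
  "True when every joint in the most recent experience frame is bent
  past a threshold, i.e. the worm is curled up."
  [experiences]
  (every?
   (fn [[_ _ bend]] (> (Math/sin bend) 0.64))
   (:proprioception (peek experiences))))
#+end_src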
3047 | 3049 |
3048 To use =EMPATH= with the worm, I first need to gather a set of | 3050 To use =EMPATH= with the worm, I first need to gather a set of |
3049 experiences from the worm that includes the actions I want to | 3051 experiences from the worm that includes the actions I want to |
3050 recognize. The =generate-phi-space= program (listing | 3052 recognize. The =generate-phi-space= program (listing |
3051 \ref{generate-phi-space}) runs the worm through a series of | 3053 \ref{generate-phi-space}) runs the worm through a series of |
3052 exercises and gatherers those experiences into a vector. The | 3054 exercises and gathers those experiences into a vector. The |
3053 =do-all-the-things= program is a routine expressed in a simple | 3055 =do-all-the-things= program is a routine expressed in a simple |
3054 muscle contraction script language for automated worm control. It | 3056 muscle contraction script language for automated worm control. It |
3055 causes the worm to rest, curl, and wiggle over about 700 frames | 3057 causes the worm to rest, curl, and wiggle over about 700 frames |
3056 (approx. 11 seconds). | 3058 (approx. 11 seconds). |
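The gathering step can be sketched as follows. This is illustrative only; =step!= stands for whatever function advances the simulation one frame and returns that frame's experience map:

#+begin_src clojure
;; Illustrative sketch of gathering experiences into a vector.
;; 'step! is an assumed function: advance the simulation one frame
;; and return that frame's experience map.
(defn gather-experiences
  "Run the simulation for n frames, collecting each frame's
  experience into a vector (the \"phi-space\")."
  [step! n]
  (reduce (fn [phi _] (conj phi (step!))) [] (range n)))
#+end_src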
3057 | 3059 |