comparison thesis/cortex.org @ 562:6299450f9618

corrections for Jessica Noss.
author Robert McIntyre <rlm@mit.edu>
date Mon, 12 May 2014 10:04:16 -0400
parents 6a61b637a4c5
children 3c3dc4dbb973
259 - Guided Play :: The creature moves around and experiences the 259 - Guided Play :: The creature moves around and experiences the
260 world through its unique perspective. As the creature moves, 260 world through its unique perspective. As the creature moves,
261 it gathers experiences that satisfy the embodied action 261 it gathers experiences that satisfy the embodied action
262 definitions. 262 definitions.
263 263
264 - Posture imitation :: When trying to interpret a video or image, 264 - Posture Imitation :: When trying to interpret a video or image,
265 the creature takes a model of itself and aligns it with 265 the creature takes a model of itself and aligns it with
266 whatever it sees. This alignment might even cross species, as 266 whatever it sees. This alignment might even cross species, as
267 when humans try to align themselves with things like ponies, 267 when humans try to align themselves with things like ponies,
268 dogs, or other humans with a different body type. 268 dogs, or other humans with a different body type.
269 269
283 just as it would if it were actually experiencing the scene 283 just as it would if it were actually experiencing the scene
284 first-hand. If previous experience has been accurately 284 first-hand. If previous experience has been accurately
285 retrieved, and if it is analogous enough to the scene, then 285 retrieved, and if it is analogous enough to the scene, then
286 the creature will correctly identify the action in the scene. 286 the creature will correctly identify the action in the scene.
287 287
288 My program, =EMPATH= uses this empathic problem solving technique 288 My program =EMPATH= uses this empathic problem solving technique
289 to interpret the actions of a simple, worm-like creature. 289 to interpret the actions of a simple, worm-like creature.
290 290
291 #+caption: The worm performs many actions during free play such as 291 #+caption: The worm performs many actions during free play such as
292 #+caption: curling, wiggling, and resting. 292 #+caption: curling, wiggling, and resting.
293 #+name: worm-intro 293 #+name: worm-intro
322 - For expediency's sake, I relied on direct knowledge of joint 322 - For expediency's sake, I relied on direct knowledge of joint
323 positions in this proof of concept. However, I believe that the 323 positions in this proof of concept. However, I believe that the
324 structure of =EMPATH= and =CORTEX= will make future work to 324 structure of =EMPATH= and =CORTEX= will make future work to
325 enable video analysis much easier than it would otherwise be. 325 enable video analysis much easier than it would otherwise be.
326 326
327 ** =EMPATH= is built on =CORTEX=, a creature builder. 327 ** COMMENT =EMPATH= is built on =CORTEX=, a creature builder.
328 328
329 I built =CORTEX= to be a general AI research platform for doing 329 I built =CORTEX= to be a general AI research platform for doing
330 experiments involving multiple rich senses and a wide variety and 330 experiments involving multiple rich senses and a wide variety and
331 number of creatures. I intend it to be useful as a library for many 331 number of creatures. I intend it to be useful as a library for many
332 more projects than just this thesis. =CORTEX= was necessary to meet 332 more projects than just this thesis. =CORTEX= was necessary to meet
345 =CORTEX= is well suited as an environment for embodied AI research 345 =CORTEX= is well suited as an environment for embodied AI research
346 for three reasons: 346 for three reasons:
347 347
348 - You can design new creatures using Blender (\cite{blender}), a 348 - You can design new creatures using Blender (\cite{blender}), a
349 popular, free 3D modeling program. Each sense can be specified 349 popular, free 3D modeling program. Each sense can be specified
350 using special blender nodes with biologically inspired 350 using special Blender nodes with biologically inspired
351 parameters. You need not write any code to create a creature, and 351 parameters. You need not write any code to create a creature, and
352 can use a wide library of pre-existing blender models as a base 352 can use a wide library of pre-existing Blender models as a base
353 for your own creatures. 353 for your own creatures.
354 354
355 - =CORTEX= implements a wide variety of senses: touch, 355 - =CORTEX= implements a wide variety of senses: touch,
410 \includegraphics[width=9.5in]{images/full-hand.png} 410 \includegraphics[width=9.5in]{images/full-hand.png}
411 \caption{ 411 \caption{
412 I modeled my own right hand in Blender and rigged it with all the 412 I modeled my own right hand in Blender and rigged it with all the
413 senses that {\tt CORTEX} supports. My simulated hand has a 413 senses that {\tt CORTEX} supports. My simulated hand has a
414 biologically inspired distribution of touch sensors. The senses are 414 biologically inspired distribution of touch sensors. The senses are
415 displayed on the right, and the simulation is displayed on the 415 displayed on the right (the red/black squares are raw sensory output),
416 and the simulation is displayed on the
416 left. Notice that my hand is curling its fingers, that it can see 417 left. Notice that my hand is curling its fingers, that it can see
417 its own finger from the eye in its palm, and that it can feel its 418 its own finger from the eye in its palm, and that it can feel its
418 own thumb touching its palm.} 419 own thumb touching its palm.}
419 \end{sidewaysfigure} 420 \end{sidewaysfigure}
420 #+END_LaTeX 421 #+END_LaTeX
762 are able to coalesce into one object in the movie. 763 are able to coalesce into one object in the movie.
763 764
764 *** Solidifying/Connecting a body 765 *** Solidifying/Connecting a body
765 766
766 =CORTEX= creates a creature in two steps: first, it traverses the 767 =CORTEX= creates a creature in two steps: first, it traverses the
767 nodes in the blender file and creates physical representations for 768 nodes in the Blender file and creates physical representations for
768 any of them that have mass defined in their blender meta-data. 769 any of them that have mass defined in their Blender meta-data.
769 770
770 #+caption: Program for iterating through the nodes in a blender file 771 #+caption: Program for iterating through the nodes in a blender file
771 #+caption: and generating physical jMonkeyEngine3 objects with mass 772 #+caption: and generating physical jMonkeyEngine3 objects with mass
772 #+caption: and a matching physics shape. 773 #+caption: and a matching physics shape.
773 #+name: physical 774 #+name: physical
828 #+caption: will be omitted 829 #+caption: will be omitted
829 #+name: get-empty-nodes 830 #+name: get-empty-nodes
830 #+begin_listing clojure 831 #+begin_listing clojure
831 #+begin_src clojure 832 #+begin_src clojure
832 (defn sense-nodes 833 (defn sense-nodes
833 "For some senses there is a special empty blender node whose 834 "For some senses there is a special empty Blender node whose
834 children are considered markers for an instance of that sense. This 835 children are considered markers for an instance of that sense. This
835 function generates functions to find those children, given the name 836 function generates functions to find those children, given the name
836 of the special parent node." 837 of the special parent node."
837 [parent-name] 838 [parent-name]
838 (fn [#^Node creature] 839 (fn [#^Node creature]
889 #+end_listing 890 #+end_listing
890 891
891 Once =CORTEX= finds all joints and targets, it creates them using 892 Once =CORTEX= finds all joints and targets, it creates them using
892 a dispatch on the metadata of each joint node. 893 a dispatch on the metadata of each joint node.
893 894
894 #+caption: Program to dispatch on blender metadata and create joints 895 #+caption: Program to dispatch on Blender metadata and create joints
895 #+caption: suitable for physical simulation. 896 #+caption: suitable for physical simulation.
896 #+name: joint-dispatch 897 #+name: joint-dispatch
897 #+begin_listing clojure 898 #+begin_listing clojure
898 #+begin_src clojure 899 #+begin_src clojure
899 (defmulti joint-dispatch 900 (defmulti joint-dispatch
900 "Translate blender pseudo-joints into real JME joints." 901 "Translate Blender pseudo-joints into real JME joints."
901 (fn [constraints & _] 902 (fn [constraints & _]
902 (:type constraints))) 903 (:type constraints)))
903 904
904 (defmethod joint-dispatch :point 905 (defmethod joint-dispatch :point
905 [constraints control-a control-b pivot-a pivot-b rotation] 906 [constraints control-a control-b pivot-a pivot-b rotation]
928 #+end_src 929 #+end_src
929 #+end_listing 930 #+end_listing
930 931
931 All that is left for joints is to combine the above pieces into 932 All that is left for joints is to combine the above pieces into
932 something that can operate on the collection of nodes that a 933 something that can operate on the collection of nodes that a
933 blender file represents. 934 Blender file represents.
934 935
935 #+caption: Program to completely create a joint given information 936 #+caption: Program to completely create a joint given information
936 #+caption: from a blender file. 937 #+caption: from a Blender file.
937 #+name: connect 938 #+name: connect
938 #+begin_listing clojure 939 #+begin_listing clojure
939 #+begin_src clojure 940 #+begin_src clojure
940 (defn connect 941 (defn connect
941 "Create a joint between 'obj-a and 'obj-b at the location of 942 "Create a joint between 'obj-a and 'obj-b at the location of
946 {:type :hinge :limit [0 (/ Math/PI 2)] :axis (Vector3f. 0 1 0)} 947 {:type :hinge :limit [0 (/ Math/PI 2)] :axis (Vector3f. 0 1 0)}
947 (:axis defaults to (Vector3f. 1 0 0) if not provided for hinge joints) 948 (:axis defaults to (Vector3f. 1 0 0) if not provided for hinge joints)
948 949
949 {:type :cone :limit-xz 0 950 {:type :cone :limit-xz 0
950 :limit-xy 0 951 :limit-xy 0
951 :twist 0} (use XZY rotation mode in blender!)" 952 :twist 0} (use XZY rotation mode in Blender!)"
952 [#^Node obj-a #^Node obj-b #^Node joint] 953 [#^Node obj-a #^Node obj-b #^Node joint]
953 (let [control-a (.getControl obj-a RigidBodyControl) 954 (let [control-a (.getControl obj-a RigidBodyControl)
954 control-b (.getControl obj-b RigidBodyControl) 955 control-b (.getControl obj-b RigidBodyControl)
955 joint-center (.getWorldTranslation joint) 956 joint-center (.getWorldTranslation joint)
956 joint-rotation (.toRotationMatrix (.getWorldRotation joint)) 957 joint-rotation (.toRotationMatrix (.getWorldRotation joint))
991 (connect obj-a obj-b joint))) 992 (connect obj-a obj-b joint)))
992 (joints creature)))) 993 (joints creature))))
993 (defn body! 994 (defn body!
994 "Endow the creature with a physical body connected with joints. The 995 "Endow the creature with a physical body connected with joints. The
995 particulars of the joints and the masses of each body part are 996 particulars of the joints and the masses of each body part are
996 determined in blender." 997 determined in Blender."
997 [#^Node creature] 998 [#^Node creature]
998 (physical! creature) 999 (physical! creature)
999 (joints! creature)) 1000 (joints! creature))
1000 #+end_src 1001 #+end_src
1001 #+end_listing 1002 #+end_listing
1006 1007
1007 The hand from figure \ref{blender-hand}, which was modeled after 1008 The hand from figure \ref{blender-hand}, which was modeled after
1008 my own right hand, can now be given joints and simulated as a 1009 my own right hand, can now be given joints and simulated as a
1009 creature. 1010 creature.
1010 1011
1011 #+caption: With the ability to create physical creatures from blender, 1012 #+caption: With the ability to create physical creatures from Blender,
1012 #+caption: =CORTEX= gets one step closer to becoming a full creature 1013 #+caption: =CORTEX= gets one step closer to becoming a full creature
1013 #+caption: simulation environment. 1014 #+caption: simulation environment.
1014 #+name: physical-hand 1015 #+name: physical-hand
1015 #+ATTR_LaTeX: :width 15cm 1016 #+ATTR_LaTeX: :width 15cm
1016 [[./images/physical-hand.png]] 1017 [[./images/physical-hand.png]]
1123 1124
1124 The vision pipeline described above handles the flow of rendered 1125 The vision pipeline described above handles the flow of rendered
1125 images. Now, =CORTEX= needs simulated eyes to serve as the source 1126 images. Now, =CORTEX= needs simulated eyes to serve as the source
1126 of these images. 1127 of these images.
1127 1128
1128 An eye is described in blender in the same way as a joint: it 1129 An eye is described in Blender in the same way as a joint: it
1129 is a zero-dimensional empty object with no geometry whose local 1130 is a zero-dimensional empty object with no geometry whose local
1130 coordinate system determines the orientation of the resulting eye. 1131 coordinate system determines the orientation of the resulting eye.
1131 All eyes are children of a parent node named "eyes" just as all 1132 All eyes are children of a parent node named "eyes" just as all
1132 joints have a parent named "joints". An eye binds to the nearest 1133 joints have a parent named "joints". An eye binds to the nearest
1133 physical object with =bind-sense=. 1134 physical object with =bind-sense=.
1140 #+begin_src clojure 1141 #+begin_src clojure
1141 (defn add-eye! 1142 (defn add-eye!
1142 "Create a Camera centered on the current position of 'eye which 1143 "Create a Camera centered on the current position of 'eye which
1143 follows the closest physical node in 'creature. The camera will 1144 follows the closest physical node in 'creature. The camera will
1144 point in the X direction and use the Z vector as up as determined 1145 point in the X direction and use the Z vector as up as determined
1145 by the rotation of these vectors in blender coordinate space. Use 1146 by the rotation of these vectors in Blender coordinate space. Use
1146 XZY rotation for the node in blender." 1147 XZY rotation for the node in Blender."
1147 [#^Node creature #^Spatial eye] 1148 [#^Node creature #^Spatial eye]
1148 (let [target (closest-node creature eye) 1149 (let [target (closest-node creature eye)
1149 [cam-width cam-height] 1150 [cam-width cam-height]
1150 ;;[640 480] ;; graphics card on laptop doesn't support 1151 ;;[640 480] ;; graphics card on laptop doesn't support
1154 rot (.getWorldRotation eye)] 1155 rot (.getWorldRotation eye)]
1155 (.setLocation cam (.getWorldTranslation eye)) 1156 (.setLocation cam (.getWorldTranslation eye))
1156 (.lookAtDirection 1157 (.lookAtDirection
1157 cam ; this part is not a mistake and 1158 cam ; this part is not a mistake and
1158 (.mult rot Vector3f/UNIT_X) ; is consistent with using Z in 1159 (.mult rot Vector3f/UNIT_X) ; is consistent with using Z in
1159 (.mult rot Vector3f/UNIT_Y)) ; blender as the UP vector. 1160 (.mult rot Vector3f/UNIT_Y)) ; Blender as the UP vector.
1160 (.setFrustumPerspective 1161 (.setFrustumPerspective
1161 cam (float 45) 1162 cam (float 45)
1162 (float (/ (.getWidth cam) (.getHeight cam))) 1163 (float (/ (.getWidth cam) (.getHeight cam)))
1163 (float 1) 1164 (float 1)
1164 (float 1000)) 1165 (float 1000))
1177 a very high density of color sensors, and a blind spot which has 1178 a very high density of color sensors, and a blind spot which has
1178 no sensors at all. Sensor density decreases in proportion to 1179 no sensors at all. Sensor density decreases in proportion to
1179 distance from the fovea. 1180 distance from the fovea.
1180 1181
1181 I want to be able to model any retinal configuration, so my 1182 I want to be able to model any retinal configuration, so my
1182 eye-nodes in blender contain metadata pointing to images that 1183 eye-nodes in Blender contain metadata pointing to images that
1183 describe the precise position of the individual sensors using 1184 describe the precise position of the individual sensors using
1184 white pixels. The metadata also describes the precise sensitivity 1185 white pixels. The metadata also describes the precise sensitivity
1185 to light of the sensors marked in the image. An eye can 1186 to light of the sensors marked in the image. An eye can
1186 contain any number of these images. For example, the metadata for 1187 contain any number of these images. For example, the metadata for
1187 an eye might look like this: 1188 an eye might look like this:
1320 1321
1321 ** ...but hearing must be built from scratch 1322 ** ...but hearing must be built from scratch
1322 1323
1323 At the end of this chapter I will have simulated ears that work the 1324 At the end of this chapter I will have simulated ears that work the
1324 same way as the simulated eyes in the last chapter. I will be able to 1325 same way as the simulated eyes in the last chapter. I will be able to
1325 place any number of ear-nodes in a blender file, and they will bind to 1326 place any number of ear-nodes in a Blender file, and they will bind to
1326 the closest physical object and follow it as it moves around. Each ear 1327 the closest physical object and follow it as it moves around. Each ear
1327 will provide access to the sound data it picks up between frames. 1328 will provide access to the sound data it picks up between frames.
1328 1329
1329 Hearing is one of the more difficult senses to simulate, because there 1330 Hearing is one of the more difficult senses to simulate, because there
1330 is less support for obtaining the actual sound data that is processed 1331 is less support for obtaining the actual sound data that is processed
2144 2145
2145 It's clear that this is a vital sense for fluid, graceful movement. 2146 It's clear that this is a vital sense for fluid, graceful movement.
2146 It's also particularly easy to implement in jMonkeyEngine. 2147 It's also particularly easy to implement in jMonkeyEngine.
2147 2148
2148 My simulated proprioception calculates the relative angles of each 2149 My simulated proprioception calculates the relative angles of each
2149 joint from the rest position defined in the blender file. This 2150 joint from the rest position defined in the Blender file. This
2150 simulates the muscle-spindles and joint capsules. I will deal with 2151 simulates the muscle-spindles and joint capsules. I will deal with
2151 Golgi tendon organs, which calculate muscle strain, in the next 2152 Golgi tendon organs, which calculate muscle strain, in the next
2152 chapter. 2153 chapter.
2153 2154
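The relative-angle idea can be sketched without jMonkeyEngine. Below is a minimal 2-D stand-in (an illustrative simplification, not the thesis code, which works with the full 3-D rotations read from each joint):

```clojure
(defn relative-angle
  "Signed angle in radians from the rest direction [rx ry] to the
  current direction [cx cy] -- a 2-D stand-in for the proprioceptive
  joint-angle calculation."
  [[rx ry] [cx cy]]
  (Math/atan2 (- (* rx cy) (* ry cx))    ; cross product -> sine term
              (+ (* rx cx) (* ry cy))))  ; dot product   -> cosine term

;; A joint bent 90 degrees from rest reads as pi/2:
;; (relative-angle [1 0] [0 1]) => 1.5707963267948966
```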
2154 *** Helper functions 2155 *** Helper functions
2307 #+ATTR_LaTeX: :width 7cm 2308 #+ATTR_LaTeX: :width 7cm
2308 [[./images/basic-muscle.png]] 2309 [[./images/basic-muscle.png]]
2309 2310
2310 *** Muscle meta-data 2311 *** Muscle meta-data
2311 2312
2312 #+caption: Program to deal with loading muscle data from a blender 2313 #+caption: Program to deal with loading muscle data from a Blender
2313 #+caption: file's metadata. 2314 #+caption: file's metadata.
2314 #+name: motor-pool 2315 #+name: motor-pool
2315 #+begin_listing clojure 2316 #+begin_listing clojure
2316 #+BEGIN_SRC clojure 2317 #+BEGIN_SRC clojure
2317 (defn muscle-profile-image 2318 (defn muscle-profile-image
2388 2389
2389 2390
2390 =movement-kernel= creates a function that controls the movement 2391 =movement-kernel= creates a function that controls the movement
2391 of the nearest physical node to the muscle node. The muscle exerts 2392 of the nearest physical node to the muscle node. The muscle exerts
2392 a rotational force dependent on its orientation to the object in 2393 a rotational force dependent on its orientation to the object in
2393 the blender file. The function returned by =movement-kernel= is 2394 the Blender file. The function returned by =movement-kernel= is
2394 also a sense function: it returns the percent of the total muscle 2395 also a sense function: it returns the percent of the total muscle
2395 strength that is currently being employed. This is analogous to 2396 strength that is currently being employed. This is analogous to
2396 muscle tension in humans and completes the sense of proprioception 2397 muscle tension in humans and completes the sense of proprioception
2397 begun in the last chapter. 2398 begun in the last chapter.
2398 2399
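The force/tension pairing described above can be illustrated with a toy version of the kernel (a hedged sketch: the real =movement-kernel= applies torque through jMonkeyEngine and reads its strength profile from the muscle image; here the profile is just a vector):

```clojure
(defn make-muscle
  "Return a movement-kernel-style function for a muscle whose strength
  profile is a vector mapping activation index -> force. Calling the
  result with an activation level yields the force to exert and, as in
  the thesis, the fraction of total strength currently in use."
  [profile]
  (let [max-force (apply max profile)]
    (fn [activation]
      (let [force (profile (min activation (dec (count profile))))]
        {:force   force
         :tension (/ force (double max-force))}))))

;; ((make-muscle [0 5 10 20]) 1) => {:force 5, :tension 0.25}
```

Like the thesis version, the returned function is simultaneously an actuator (the force to apply) and a sense function (the tension reading).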
2532 #+caption: for easy manual motor control. 2533 #+caption: for easy manual motor control.
2533 #+name: basic-worm-view 2534 #+name: basic-worm-view
2534 #+ATTR_LaTeX: :width 10cm 2535 #+ATTR_LaTeX: :width 10cm
2535 [[./images/basic-worm-view.png]] 2536 [[./images/basic-worm-view.png]]
2536 2537
2537 #+caption: Program for reading a worm from a blender file and 2538 #+caption: Program for reading a worm from a Blender file and
2538 #+caption: outfitting it with the senses of proprioception, 2539 #+caption: outfitting it with the senses of proprioception,
2539 #+caption: touch, and the ability to move, as specified in the 2540 #+caption: touch, and the ability to move, as specified in the
2540 #+caption: blender file. 2541 #+caption: Blender file.
2541 #+name: get-worm 2542 #+name: get-worm
2542 #+begin_listing clojure 2543 #+begin_listing clojure
2543 #+begin_src clojure 2544 #+begin_src clojure
2544 (defn worm [] 2545 (defn worm []
2545 (let [model (load-blender-model "Models/worm/worm.blend")] 2546 (let [model (load-blender-model "Models/worm/worm.blend")]
2827 2828
2828 An /experience-index/ is an index into the grand experience vector 2829 An /experience-index/ is an index into the grand experience vector
2829 that defines the worm's life. It is a time-stamp for each set of 2830 that defines the worm's life. It is a time-stamp for each set of
2830 sensations the worm has experienced. 2831 sensations the worm has experienced.
2831 2832
2832 First, group the experience-indices into bins according to the 2833 First, I group the experience-indices into bins according to the
2833 similarity of their proprioceptive data. I organize my bins into a 2834 similarity of their proprioceptive data. I organize my bins into a
2834 3-level hierarchy. The smallest bins have an approximate size of 2835 3-level hierarchy. The smallest bins have an approximate size of
2835 0.001 radians in all proprioceptive dimensions. Each higher level 2836 0.001 radians in all proprioceptive dimensions. Each higher level
2836 is 10x bigger than the level below it. 2837 is 10x bigger than the level below it.
2837 2838
2838 The bins serve as a hashing function for proprioceptive data. Given 2839 The bins serve as a hashing function for proprioceptive data. Given
2839 a single piece of proprioceptive experience, the bins allow us to 2840 a single piece of proprioceptive experience, the bins allow me to
2840 rapidly find all other similar experience-indices of past 2841 rapidly find all other similar experience-indices of past
2841 experience that had a very similar proprioceptive configuration. 2842 experience that had a very similar proprioceptive configuration.
2842 When looking up a proprioceptive experience, if the smallest bin 2843 When looking up a proprioceptive experience, if the smallest bin
2843 does not match any previous experience, then successively larger 2844 does not match any previous experience, then I use successively
2844 bins are used until a match is found or we reach the largest bin. 2845 larger bins until a match is found or I reach the largest bin.
2845 2846
2846 Given a sequence of proprioceptive input, I use the bins to 2847 Given a sequence of proprioceptive input, I use the bins to
2847 generate a set of similar experiences for each input using the 2848 generate a set of similar experiences for each input using the
2848 tiered proprioceptive bins. 2849 tiered proprioceptive bins.
2849 2850
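The tiered binning scheme described above can be sketched in a few functions (an illustrative reconstruction, not the thesis code; the 0.001/0.01/0.1 radian bin widths follow from the stated 10x scaling):

```clojure
(def bin-sizes
  ;; three tiers of bin widths in radians; each tier is 10x coarser
  [0.001 0.01 0.1])

(defn bin-key
  "Quantize a proprioceptive vector (joint angles in radians) at the
  given bin width, yielding a hashable key."
  [bin-size angles]
  (mapv #(Math/round (/ (double %) bin-size)) angles))

(defn index-experiences
  "Build one hash-map per tier, from quantized proprioception to the
  set of experience-indices falling in that bin."
  [experiences]
  (vec (for [size bin-sizes]
         (reduce (fn [tier [idx angles]]
                   (update tier (bin-key size angles) (fnil conj #{}) idx))
                 {}
                 (map-indexed vector experiences)))))

(defn similar-experiences
  "Look up 'angles in successively coarser tiers, returning the first
  non-empty set of matching experience-indices."
  [tiers angles]
  (some (fn [[size tier]] (get tier (bin-key size angles)))
        (map vector bin-sizes tiers)))
```

Lookup falls through exactly as described: a query that misses the 0.001-radian tier is retried at 0.01 radians, then at 0.1 radians.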
3527 3528
3528 Another important contribution of this thesis is the development of 3529 Another important contribution of this thesis is the development of
3529 the =CORTEX= system, a complete environment for creating simulated 3530 the =CORTEX= system, a complete environment for creating simulated
3530 creatures. You have seen how to implement five senses: touch, 3531 creatures. You have seen how to implement five senses: touch,
3531 proprioception, hearing, vision, and muscle tension. You have seen 3532 proprioception, hearing, vision, and muscle tension. You have seen
3532 how to create new creatures using blender, a 3D modeling tool. 3533 how to create new creatures using Blender, a 3D modeling tool.
3533 3534
3534 As a minor digression, you also saw how I used =CORTEX= to enable a 3535 As a minor digression, you also saw how I used =CORTEX= to enable a
3535 tiny worm to discover the topology of its skin simply by rolling on 3536 tiny worm to discover the topology of its skin simply by rolling on
3536 the ground. You also saw how to detect objects using only embodied 3537 the ground. You also saw how to detect objects using only embodied
3537 predicates. 3538 predicates.
3626 ;; OR 3627 ;; OR
3627 3628
3628 {:type :cone 3629 {:type :cone
3629 :limit-xz <lim-xz> 3630 :limit-xz <lim-xz>
3630 :limit-xy <lim-xy> 3631 :limit-xy <lim-xy>
3631 :twist <lim-twist>} ;(use XZY rotation mode in blender!) 3632 :twist <lim-twist>} ;(use XZY rotation mode in Blender!)
3632 #+END_SRC 3633 #+END_SRC
3633 3634
3634 *** Eyes 3635 *** Eyes
3635 3636
3636 Eyes are created by creating an empty node named =eyes= and then 3637 Eyes are created by creating an empty node named =eyes= and then
3717 [[./images/finger-1.png]] 3718 [[./images/finger-1.png]]
3718 3719
3719 *** Proprioception 3720 *** Proprioception
3720 3721
3721 Proprioception is tied to each joint node -- nothing special must 3722 Proprioception is tied to each joint node -- nothing special must
3722 be done in a blender model to enable proprioception other than 3723 be done in a Blender model to enable proprioception other than
3723 creating joint nodes. 3724 creating joint nodes.
3724 3725
3725 *** Muscles 3726 *** Muscles
3726 3727
3727 Muscles are created by creating an empty node named =muscles= and 3728 Muscles are created by creating an empty node named =muscles= and
3818 3819
3819 - =(sphere radius & {options})= :: create a sphere in the simulation. 3820 - =(sphere radius & {options})= :: create a sphere in the simulation.
3820 Options are the same as in =box=. 3821 Options are the same as in =box=.
3821 3822
3822 - =(load-blender-model file-name)= :: create a node structure 3823 - =(load-blender-model file-name)= :: create a node structure
3823 representing the model described in a blender file. 3824 representing the model described in a Blender file.
3824 3825
3825 - =(light-up-everything world)= :: distribute a standard complement 3826 - =(light-up-everything world)= :: distribute a standard complement
3826 of lights throughout the simulation. Should be adequate for most 3827 of lights throughout the simulation. Should be adequate for most
3827 purposes. 3828 purposes.
3828 3829
3853 - =(vision! creature)= :: give the creature a sense of vision. 3854 - =(vision! creature)= :: give the creature a sense of vision.
3854 Returns a list of functions which will each, when called 3855 Returns a list of functions which will each, when called
3855 during a simulation, return the vision data for the channel of 3856 during a simulation, return the vision data for the channel of
3856 one of the eyes. The functions are ordered depending on the 3857 one of the eyes. The functions are ordered depending on the
3857 alphabetical order of the names of the eye nodes in the 3858 alphabetical order of the names of the eye nodes in the
3858 blender file. The data returned by the functions is a vector 3859 Blender file. The data returned by the functions is a vector
3859 containing the eye's /topology/, a vector of coordinates, and 3860 containing the eye's /topology/, a vector of coordinates, and
3860 the eye's /data/, a vector of RGB values filtered by the eye's 3861 the eye's /data/, a vector of RGB values filtered by the eye's
3861 sensitivity. 3862 sensitivity.
3862 3863
3863 - =(hearing! creature)= :: give the creature a sense of hearing. 3864 - =(hearing! creature)= :: give the creature a sense of hearing.
3864 Returns a list of functions, one for each ear, that when 3865 Returns a list of functions, one for each ear, that when
3865 called will return a frame's worth of hearing data for that 3866 called will return a frame's worth of hearing data for that
3866 ear. The functions are ordered depending on the alphabetical 3867 ear. The functions are ordered depending on the alphabetical
3867 order of the names of the ear nodes in the blender file. The 3868 order of the names of the ear nodes in the Blender file. The
3868 data returned by the functions is an array of PCM (pulse code 3869 data returned by the functions is an array of PCM (pulse code
3869 modulated) wav data. 3870 modulated) wav data.
3870 3871
3871 - =(touch! creature)= :: give the creature a sense of touch. Returns 3872 - =(touch! creature)= :: give the creature a sense of touch. Returns
3872 a single function that must be called with the /root node/ of 3873 a single function that must be called with the /root node/ of