1 #+title: =CORTEX=
2 #+author: Robert McIntyre
3 #+email: rlm@mit.edu
4 #+description: Using embodied AI to facilitate Artificial Imagination.
5 #+keywords: AI, clojure, embodiment
6 #+LaTeX_CLASS_OPTIONS: [nofloat]
8 * COMMENT templates
9 #+caption:
10 #+caption:
11 #+caption:
12 #+caption:
13 #+name: name
14 #+begin_listing clojure
15 #+BEGIN_SRC clojure
16 #+END_SRC
17 #+end_listing
19 #+caption:
20 #+caption:
21 #+caption:
22 #+name: name
23 #+ATTR_LaTeX: :width 10cm
24 [[./images/aurellem-gray.png]]
26 #+caption:
27 #+caption:
28 #+caption:
29 #+caption:
30 #+name: name
31 #+begin_listing clojure
32 #+BEGIN_SRC clojure
33 #+END_SRC
34 #+end_listing
36 #+caption:
37 #+caption:
38 #+caption:
39 #+name: name
40 #+ATTR_LaTeX: :width 10cm
41 [[./images/aurellem-gray.png]]
44 * Empathy \& Embodiment: problem solving strategies
46 By the end of this thesis, you will have seen a novel approach to
47 interpreting video using embodiment and empathy. You will have also
48 seen one way to efficiently implement empathy for embodied
49 creatures. Finally, you will become familiar with =CORTEX=, a system
50 for designing and simulating creatures with rich senses, which you
51 may choose to use in your own research.
53 This is the core vision of my thesis: That one of the important ways
54 in which we understand others is by imagining ourselves in their
55 position and empathically feeling experiences relative to our own
56 bodies. By understanding events in terms of our own previous
57 corporeal experience, we greatly constrain the possibilities of what
58 would otherwise be an unwieldy exponential search. This extra
59 constraint can be the difference between easily understanding what
60 is happening in a video and being completely lost in a sea of
61 incomprehensible color and movement.
64 ** The problem: recognizing actions in video is hard!
66 Examine the following image. What is happening? As you, and indeed
67 very young children, can easily determine, this is an image of
68 drinking.
70 #+caption: A cat drinking some water. Identifying this action is
71 #+caption: beyond the capabilities of existing computer vision systems.
72 #+ATTR_LaTeX: :width 7cm
73 [[./images/cat-drinking.jpg]]
75 Nevertheless, it is beyond the state of the art for a computer
76 vision program to describe what's happening in this image. Part of
77 the problem is that many computer vision systems focus on
78 pixel-level details or comparisons to example images (such as
79 \cite{volume-action-recognition}), but the 3D world is so variable
80 that it is hard to describe the world in terms of possible images.
82 In fact, the contents of a scene may have much less to do with pixel
83 probabilities than with recognizing various affordances: things you
84 can move, objects you can grasp, spaces that can be filled. For
85 example, what processes might enable you to see the chair in figure
86 \ref{hidden-chair}?
88 #+caption: The chair in this image is quite obvious to humans, but I
89 #+caption: doubt that any modern computer vision program can find it.
90 #+name: hidden-chair
91 #+ATTR_LaTeX: :width 10cm
92 [[./images/fat-person-sitting-at-desk.jpg]]
94 Finally, how is it that you can easily tell the difference in
95 how the girl's /muscles/ are working between the two images of figure \ref{girl}?
97 #+caption: The mysterious ``common sense'' appears here as you are able
98 #+caption: to discern the difference in how the girl's arm muscles
99 #+caption: are activated between the two images.
100 #+name: girl
101 #+ATTR_LaTeX: :width 7cm
102 [[./images/wall-push.png]]
104 Each of these examples tells us something about what might be going
105 on in our minds as we easily solve these recognition problems.
107 The hidden chair shows us that we are strongly triggered by cues
108 relating to the position of human bodies, and that we can determine
109 the overall physical configuration of a human body even if much of
110 that body is occluded.
112 The picture of the girl pushing against the wall tells us that we
113 have common sense knowledge about the kinetics of our own bodies.
114 We know well how our muscles would have to work to maintain us in
115 most positions, and we can easily project this self-knowledge to
116 imagined positions triggered by images of the human body.
118 ** A step forward: the sensorimotor-centered approach
120 In this thesis, I explore the idea that our knowledge of our own
121 bodies, combined with our own rich senses, enables us to recognize
122 the actions of others.
124 For example, I think humans are able to label the cat video as
125 ``drinking'' because they imagine /themselves/ as the cat, and
126 imagine putting their face up against a stream of water and
127 sticking out their tongue. In that imagined world, they can feel
128 the cool water hitting their tongue, and feel the water entering
129 their body, and are able to recognize that /feeling/ as drinking.
130 So, the label of the action is not really in the pixels of the
131 image, but is found clearly in a simulation inspired by those
132 pixels. An imaginative system, having been trained on drinking and
133 non-drinking examples and learning that the most important
134 component of drinking is the feeling of water sliding down one's
135 throat, would analyze a video of a cat drinking in the following
136 manner:
138 1. Create a physical model of the video by putting a ``fuzzy''
139 model of its own body in place of the cat. Possibly also create
140 a simulation of the stream of water.
142 2. Play out this simulated scene and generate imagined sensory
143 experience. This will include relevant muscle contractions, a
144 close up view of the stream from the cat's perspective, and most
145 importantly, the imagined feeling of water entering the
146 mouth. The imagined sensory experience can come from a
147 simulation of the event, but can also be pattern-matched from
148 previous, similar embodied experience.
150 3. The action is now easily identified as drinking by the sense of
151 taste alone. The other senses (such as the tongue moving in and
152 out) help to give plausibility to the simulated action. Note that
153 the sense of vision, while critical in creating the simulation,
154 is not critical for identifying the action from the simulation.
156 For the chair examples, the process is even easier:
158 1. Align a model of your body to the person in the image.
160 2. Generate proprioceptive sensory data from this alignment.
162 3. Use the imagined proprioceptive data as a key to look up related
163 sensory experience associated with that particular proprioceptive
164 feeling.
166 4. Retrieve the feeling of your bottom resting on a surface, your
167 knees bent, and your leg muscles relaxed.
169 5. This sensory information is consistent with your =sitting?=
170 sensory predicate, so you (and the entity in the image) must be
171 sitting.
173 6. There must be a chair-like object since you are sitting.
175 Empathy offers yet another alternative to the age-old AI
176 representation question: ``What is a chair?'' --- A chair is the
177 feeling of sitting!
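To make this concrete, the =sitting?= predicate mentioned above might look something like the following sketch, written in the same style as the worm predicates that appear later in this thesis. The helpers =bottom-contact?=, =knees-bent?=, and =legs-relaxed?= are hypothetical stand-ins for real sensory tests.
#+begin_src clojure
;; Hypothetical sketch of a body-centered action predicate. The helper
;; predicates are placeholders, declared only so the sketch compiles.
(declare bottom-contact? knees-bent? legs-relaxed?)

(defn sitting?
  "Does the most recent imagined sensory experience feel like sitting?"
  [experiences]
  (let [feeling (peek experiences)]
    (and (bottom-contact? feeling)
         (knees-bent? feeling)
         (legs-relaxed? feeling))))
#+end_src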
179 One powerful advantage of empathic problem solving is that it
180 factors the action recognition problem into two easier problems. To
181 use empathy, you need an /aligner/, which takes the video and a
182 model of your body, and aligns the model with the video. Then, you
183 need a /recognizer/, which uses the aligned model to interpret the
184 action. The power in this method lies in the fact that you describe
185 all actions from a body-centered viewpoint. You are less tied to
186 the particulars of any visual representation of the actions. If you
187 teach the system what ``running'' is, and you have a good enough
188 aligner, the system will from then on be able to recognize running
189 from any point of view, even strange points of view like above or
190 underneath the runner. This is in contrast to action recognition
191 schemes that try to identify actions using a non-embodied approach.
192 If these systems learn about running as viewed from the side, they
193 will not automatically be able to recognize running from any other
194 viewpoint.
196 Another powerful advantage is that using the language of multiple
197 body-centered rich senses to describe body-centered actions offers a
198 massive boost in descriptive capability. Consider how difficult it
199 would be to compose a set of HOG filters to describe the action of
200 a simple worm-creature ``curling'' so that its head touches its
201 tail, and then behold the simplicity of describing this action in a
202 language designed for the task (listing \ref{grand-circle-intro}):
204 #+caption: Body-centered actions are best expressed in a body-centered
205 #+caption: language. This code detects when the worm has curled into a
206 #+caption: full circle. Imagine how you would replicate this functionality
207 #+caption: using low-level pixel features such as HOG filters!
208 #+name: grand-circle-intro
209 #+begin_listing clojure
210 #+begin_src clojure
211 (defn grand-circle?
212 "Does the worm form a majestic circle (one end touching the other)?"
213 [experiences]
214 (and (curled? experiences)
215 (let [worm-touch (:touch (peek experiences))
216 tail-touch (worm-touch 0)
217 head-touch (worm-touch 4)]
218 (and (< 0.2 (contact worm-segment-bottom-tip tail-touch))
219 (< 0.2 (contact worm-segment-top-tip head-touch))))))
220 #+end_src
221 #+end_listing
223 ** =EMPATH= recognizes actions using empathy
225 First, I built a system for constructing virtual creatures with
226 physiologically plausible sensorimotor systems and detailed
227 environments. The result is =CORTEX=, which is described in section
228 \ref{sec-2}. (=CORTEX= was built to be flexible and useful to other
229 AI researchers; it is provided in full with detailed instructions
230 on the web [here].)
232 Next, I wrote routines which enabled a simple worm-like creature to
233 infer the actions of a second worm-like creature, using only its
234 own prior sensorimotor experiences and knowledge of the second
235 worm's joint positions. This program, =EMPATH=, is described in
236 section \ref{sec-3}, and the key results of this experiment are
237 summarized below.
239 I have built a system that can express these types of recognition
240 problems in a form amenable to computation. It is split into
241 four parts:
243 - Free/Guided Play :: The creature moves around and experiences the
244 world through its unique perspective. Many otherwise
245 complicated actions are easily described in the language of a
246 full suite of body-centered, rich senses. For example,
247 drinking is the feeling of water sliding down your throat, and
248 cooling your insides. It's often accompanied by bringing your
249 hand close to your face, or bringing your face close to water.
250 Sitting down is the feeling of bending your knees, activating
251 your quadriceps, then feeling a surface with your bottom and
252 relaxing your legs. These body-centered action descriptions
253 can be either learned or hard coded.
254 - Posture Imitation :: When trying to interpret a video or image,
255 the creature takes a model of itself and aligns it with
256 whatever it sees. This alignment can even cross species, as
257 when humans try to align themselves with things like ponies,
258 dogs, or other humans with a different body type.
259 - Empathy :: The alignment triggers associations with
260 sensory data from prior experiences. For example, the
261 alignment itself easily maps to proprioceptive data. Any
262 sounds or obvious skin contact in the video can to a lesser
263 extent trigger previous experience. Segments of previous
264 experiences are stitched together to form a coherent and
265 complete sensory portrait of the scene.
266 - Recognition :: With the scene described in terms of first
267 person sensory events, the creature can now run its
268 action-identification programs on this synthesized sensory
269 data, just as it would if it were actually experiencing the
270 scene first-hand. If previous experience has been accurately
271 retrieved, and if it is analogous enough to the scene, then
272 the creature will correctly identify the action in the scene.
275 My program, =EMPATH=, uses this empathic problem solving technique
276 to interpret the actions of a simple, worm-like creature.
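At a high level, these four stages might compose as in the following sketch. This is schematic rather than =EMPATH='s actual code; =align-model=, =infer-senses=, and =classify-action= are hypothetical placeholders for the stages described above.
#+begin_src clojure
;; Schematic sketch of the empathic recognition pipeline. The three
;; stage functions are placeholders, declared only so the sketch compiles.
(declare align-model infer-senses classify-action)

(defn empathic-recognition
  "Align a self-model to the video, imagine the corresponding sensory
  experience from prior embodied experience, then classify that
  imagined experience."
  [video self-model prior-experiences]
  (-> (align-model self-model video)
      (infer-senses prior-experiences)
      (classify-action)))
#+end_src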
278 #+caption: The worm performs many actions during free play such as
279 #+caption: curling, wiggling, and resting.
280 #+name: worm-intro
281 #+ATTR_LaTeX: :width 15cm
282 [[./images/worm-intro-white.png]]
284 #+caption: =EMPATH= recognized and classified each of these
285 #+caption: poses by inferring the complete sensory experience
286 #+caption: from proprioceptive data.
287 #+name: worm-recognition-intro
288 #+ATTR_LaTeX: :width 15cm
289 [[./images/worm-poses.png]]
291 #+caption: From only \emph{proprioceptive} data, =EMPATH= was able to infer
292 #+caption: the complete sensory experience and classify these four poses.
293 #+caption: The last image is a composite, depicting the intermediate stages
294 #+caption: of \emph{wriggling}.
295 #+name: worm-recognition-intro-2
296 #+ATTR_LaTeX: :width 15cm
297 [[./images/empathy-1.png]]
299 Next, I developed an experiment to test the power of =CORTEX='s
300 sensorimotor-centered language for solving recognition problems. As
301 a proof of concept, I wrote routines which enabled a simple
302 worm-like creature to infer the actions of a second worm-like
303 creature, using only its own previous sensorimotor experiences and
304 knowledge of the second worm's joints (figure
305 \ref{worm-recognition-intro-2}). The result of this proof of
306 concept was the program =EMPATH=, described in section \ref{sec-3}.
308 ** =EMPATH= is built on =CORTEX=, an environment for making creatures.
310 # =CORTEX= provides a language for describing the sensorimotor
311 # experiences of various creatures.
313 I built =CORTEX= to be a general AI research platform for doing
314 experiments involving multiple rich senses and a wide variety and
315 number of creatures. I intend it to be useful as a library for many
316 more projects than just this thesis. =CORTEX= addresses a need
317 among AI researchers at CSAIL and beyond: people often invent
318 neat ideas that are best expressed in the
319 language of creatures and senses, but in order to explore those
320 ideas they must first build a platform in which they can create
321 simulated creatures with rich senses! There are many ideas that
322 would be simple to execute (such as =EMPATH=), but attached to them
323 is the multi-month effort to make a good creature simulator. Often,
324 that initial investment of time proves to be too much, and the
325 project must make do with a lesser environment.
327 =CORTEX= is well suited as an environment for embodied AI research
328 for three reasons:
330 - You can create new creatures using Blender, a popular 3D modeling
331 program. Each sense can be specified using special blender nodes
332 with biologically inspired parameters. You need not write any
333 code to create a creature, and can use a wide library of
334 pre-existing blender models as a base for your own creatures.
336 - =CORTEX= implements a wide variety of senses: touch,
337 proprioception, vision, hearing, and muscle tension. Complicated
338 senses like touch and vision involve multiple sensory elements
339 embedded in a 2D surface. You have complete control over the
340 distribution of these sensor elements through the use of simple
341 png image files. In particular, =CORTEX= implements more
342 comprehensive hearing than any other creature simulation system
343 available.
345 - =CORTEX= supports any number of creatures and any number of
346 senses. Time in =CORTEX= dilates so that the simulated creatures
347 always perceive a perfectly smooth flow of time, regardless of
348 the actual computational load.
350 =CORTEX= is built on top of =jMonkeyEngine3=, which is a video game
351 engine designed to create cross-platform 3D desktop games. =CORTEX=
352 is mainly written in clojure, a dialect of =LISP= that runs on the
353 java virtual machine (JVM). The API for creating and simulating
354 creatures and senses is entirely expressed in clojure, though many
355 senses are implemented at the layer of jMonkeyEngine or below. For
356 example, for the sense of hearing I use a layer of clojure code on
357 top of a layer of java JNI bindings that drive a layer of =C++=
358 code which implements a modified version of =OpenAL= to support
359 multiple listeners. =CORTEX= is the only simulation environment
360 that I know of that can support multiple entities that can each
361 hear the world from their own perspective. Other senses also
362 require a small layer of Java code. =CORTEX= also uses =bullet=, a
363 physics simulator written in =C=.
365 #+caption: Here is the worm from figure \ref{worm-intro} modeled
366 #+caption: in Blender, a free 3D-modeling program. Senses and
367 #+caption: joints are described using special nodes in Blender.
368 #+name: worm-recognition-intro
369 #+ATTR_LaTeX: :width 12cm
370 [[./images/blender-worm.png]]
372 Here are some things I anticipate that =CORTEX= might be used for:
374 - exploring new ideas about sensory integration
375 - distributed communication among swarm creatures
376 - self-learning using free exploration
377 - evolutionary algorithms involving creature construction
378 - exploration of exotic senses and effectors that are not possible
379 in the real world (such as telekinesis or a semantic sense)
380 - imagination using subworlds
382 During one test with =CORTEX=, I created 3,000 creatures each with
383 their own independent senses and ran them all at only 1/80 real
384 time. In another test, I created a detailed model of my own hand,
385 equipped with a realistic distribution of touch (more sensitive at
386 the fingertips), as well as eyes and ears, and it ran at around 1/4
387 real time.
389 #+BEGIN_LaTeX
390 \begin{sidewaysfigure}
391 \includegraphics[width=9.5in]{images/full-hand.png}
392 \caption{
393 I modeled my own right hand in Blender and rigged it with all the
394 senses that {\tt CORTEX} supports. My simulated hand has a
395 biologically inspired distribution of touch sensors. The senses are
396 displayed on the right, and the simulation is displayed on the
397 left. Notice that my hand is curling its fingers, that it can see
398 its own finger from the eye in its palm, and that it can feel its
399 own thumb touching its palm.}
400 \end{sidewaysfigure}
401 #+END_LaTeX
403 ** Contributions
405 - I built =CORTEX=, a comprehensive platform for embodied AI
406 experiments. =CORTEX= supports many features lacking in other
407 systems, such as proper simulation of hearing. It is easy to create
408 new =CORTEX= creatures using Blender, a free 3D modeling program.
410 - I built =EMPATH=, which uses =CORTEX= to identify the actions of
411 a worm-like creature using a computational model of empathy.
413 - After one-shot supervised training, =EMPATH= was able to recognize a
414 wide variety of static poses and dynamic actions---ranging from
415 curling in a circle to wriggling with a particular frequency ---
416 with 95\% accuracy.
418 - These results were completely independent of viewing angle
419 because the underlying body-centered language is fundamentally
420 viewpoint-independent; once an action is learned, it can be recognized
421 equally well from any viewing angle.
423 - =EMPATH= is surprisingly short; the sensorimotor-centered
424 language provided by =CORTEX= resulted in extremely economical
425 recognition routines --- about 500 lines in all --- suggesting
426 that such representations are very powerful, and often
427 indispensable for the types of recognition tasks considered here.
429 - Although for expediency's sake, I relied on direct knowledge of
430 joint positions in this proof of concept, it would be
431 straightforward to extend =EMPATH= so that it (more
432 realistically) infers joint positions from its visual data.
434 * Designing =CORTEX=
436 In this section, I outline the design decisions that went into
437 making =CORTEX=, along with some details about its implementation.
438 (A practical guide to getting started with =CORTEX=, which skips
439 over the history and implementation details presented here, is
440 provided in an appendix at the end of this thesis.)
442 Throughout this project, I intended for =CORTEX= to be flexible and
443 extensible enough to be useful for other researchers who want to
444 test out ideas of their own. To this end, wherever I have had to make
445 architectural choices about =CORTEX=, I have chosen to give as much
446 freedom to the user as possible, so that =CORTEX= may be used for
447 things I have not foreseen.
449 ** Building in simulation versus reality
450 The most important architectural decision of all is the choice to
451 use a computer-simulated environment in the first place! The world
452 is a vast and rich place, and for now simulations are a very poor
453 reflection of its complexity. It may be that there is a significant
454 qualitative difference between dealing with senses in the real
455 world and dealing with pale facsimiles of them in a simulation
456 \cite{brooks-representation}. What are the advantages and
457 disadvantages of a simulation vs. reality?
459 *** Simulation
461 The advantages of virtual reality are that when everything is a
462 simulation, experiments in that simulation are absolutely
463 reproducible. It's also easier to change the character and world
464 to explore new situations and different sensory combinations.
466 If the world is to be simulated on a computer, then not only do
467 you have to worry about whether the character's senses are rich
468 enough to learn from the world, but whether the world itself is
469 rendered with enough detail and realism to give enough working
470 material to the character's senses. To name just a few
471 difficulties facing modern physics simulators: destructibility of
472 the environment, simulation of water/other fluids, large areas,
473 nonrigid bodies, lots of objects, smoke. I don't know of any
474 computer simulation that would allow a character to take a rock
475 and grind it into fine dust, then use that dust to make a clay
476 sculpture, at least not without spending years calculating the
477 interactions of every single small grain of dust. Maybe a
478 simulated world with today's limitations doesn't provide enough
479 richness for real intelligence to evolve.
481 *** Reality
483 The other approach for playing with senses is to hook your
484 software up to real cameras, microphones, robots, etc., and let it
485 loose in the real world. This has the advantage of eliminating
486 concerns about simulating the world at the expense of increasing
487 the complexity of implementing the senses. Instead of just
488 grabbing the current rendered frame for processing, you have to
489 use an actual camera with real lenses and interact with photons to
490 get an image. It is much harder to change the character, which is
491 now partly a physical robot of some sort, since doing so involves
492 changing things around in the real world instead of modifying
493 lines of code. While the real world is very rich and definitely
494 provides enough stimulation for intelligence to develop as
495 evidenced by our own existence, it is also uncontrollable in the
496 sense that a particular situation cannot be recreated perfectly or
497 saved for later use. It is harder to conduct science because it is
498 harder to repeat an experiment. The worst thing about using the
499 real world instead of a simulation is the matter of time. Instead
500 of simulated time you get the constant and unstoppable flow of
501 real time. This severely limits the sorts of software you can use
502 to program the AI because all sense inputs must be handled in real
503 time. Complicated ideas may have to be implemented in hardware or
504 may simply be impossible given the current speed of our
505 processors. Contrast this with a simulation, in which the flow of
506 time in the simulated world can be slowed down to accommodate the
507 limitations of the character's programming. In terms of cost,
508 doing everything in software is far cheaper than building custom
509 real-time hardware. All you need is a laptop and some patience.
511 ** Simulated time enables rapid prototyping \& simple programs
513 I envision =CORTEX= being used to support rapid prototyping and
514 iteration of ideas. Even if I could put together a well constructed
515 kit for creating robots, it would still not be enough because of
516 the scourge of real-time processing. Anyone who wants to test their
517 ideas in the real world must always worry about getting their
518 algorithms to run fast enough to process information in real time.
519 The need for real time processing only increases if multiple senses
520 are involved. In the extreme case, even simple algorithms will have
521 to be accelerated by ASIC chips or FPGAs, turning what would
522 otherwise be a few lines of code and a 10x speed penalty into a
523 multi-month ordeal. For this reason, =CORTEX= supports
524 /time-dilation/, which scales back the framerate of the
525 simulation in proportion to the amount of processing each frame requires.
526 From the perspective of the creatures inside the simulation, time
527 always appears to flow at a constant rate, regardless of how
528 complicated the environment becomes or how many creatures are in
529 the simulation. The cost is that =CORTEX= can sometimes run slower
530 than real time. This can also be an advantage, however ---
531 simulations of very simple creatures in =CORTEX= generally run at
532 40x real time on my machine!
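The core of the idea is that the simulation always advances by a fixed amount of simulated time per frame, no matter how long the frame takes to compute. Below is a minimal sketch of that idea, not =CORTEX='s actual implementation; =physics-step= and =sense-and-act= stand in for the real simulation and creature-control functions.
#+begin_src clojure
;; Minimal sketch of time-dilation: simulated time advances by a
;; constant dt each frame, regardless of how long the frame took.
(defn step-world
  "Advance the world by one frame of simulated time."
  [world physics-step sense-and-act]
  (let [dt (/ 1.0 60.0)]          ; simulated seconds per frame, always
    (-> world
        (physics-step dt)         ; physics sees a constant timestep...
        (sense-and-act dt))))     ; ...and so does the creature.
#+end_src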
534 ** All sense organs are two-dimensional surfaces
536 If =CORTEX= is to support a wide variety of senses, it would help
537 to have a better understanding of what a ``sense'' actually is!
538 While vision, touch, and hearing all seem like they are quite
539 different things, I was surprised to learn during the course of
540 this thesis that they (and all physical senses) can be expressed as
541 exactly the same mathematical object due to a dimensional argument!
543 Human beings are three-dimensional objects, and the nerves that
544 transmit data from our various sense organs to our brain are
545 essentially one-dimensional. This leaves up to two dimensions in
546 which our sensory information may flow. For example, imagine your
547 skin: it is a two-dimensional surface around a three-dimensional
548 object (your body). It has discrete touch sensors embedded at
549 various points, and the density of these sensors corresponds to the
550 sensitivity of that region of skin. Each touch sensor connects to a
551 nerve, all of which eventually are bundled together as they travel
552 up the spinal cord to the brain. Intersect the spinal nerves with a
553 guillotining plane and you will see all of the sensory data of the
554 skin revealed in a roughly circular two-dimensional image which is
555 the cross section of the spinal cord. Points on this image that are
556 close together in this circle represent touch sensors that are
557 /probably/ close together on the skin, although there is of course
558 some cutting and rearrangement that has to be done to transfer the
559 complicated surface of the skin onto a two dimensional image.
561 Most human senses consist of many discrete sensors of various
562 properties distributed along a surface at various densities. For
563 skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's
564 disks, and Ruffini's endings, which detect pressure and vibration
565 of various intensities. For ears, it is the stereocilia distributed
566 along the basilar membrane inside the cochlea; each one is
567 sensitive to a slightly different frequency of sound. For eyes, it
568 is rods and cones distributed along the surface of the retina. In
569 each case, we can describe the sense with a surface and a
570 distribution of sensors along that surface.
572 The neat idea is that every human sense can be effectively
573 described in terms of a surface containing embedded sensors. If the
574 sense had any more dimensions, then there wouldn't be enough room
575 in the spinal cord to transmit the information!
577 Therefore, =CORTEX= must support the ability to create objects and
578 then be able to ``paint'' points along their surfaces to describe
579 each sense.
581 Fortunately, this idea is already a well-known computer graphics
582 technique called /UV-mapping/. The three-dimensional surface
583 of a model is cut and smooshed until it fits on a two-dimensional
584 image. You paint whatever you want on that image, and when the
585 three-dimensional shape is rendered in a game the smooshing and
586 cutting is reversed and the image appears on the three-dimensional
587 object.
589 To make a sense, interpret the UV-image as describing the
590 distribution of that sense's sensors. To get different types of
591 sensors, you can either use a different color for each type of
592 sensor, or use multiple UV-maps, each labeled with that sensor
593 type. I generally use a white pixel to mean the presence of a
594 sensor and a black pixel to mean the absence of a sensor, and use
595 one UV-map for each sensor-type within a given sense.
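For illustration, extracting the sensor positions from such a UV-map is just a matter of collecting the coordinates of the white pixels. The following sketch uses plain Java imaging classes and a hypothetical file path; a similar helper appears later in the vision code.
#+begin_src clojure
(import '(javax.imageio ImageIO) '(java.io File))

;; Sketch: collect the [x y] coordinates of every pure-white pixel in a
;; UV-map image, i.e. the positions of one type of sensor.
(defn white-pixel-coordinates
  "Return the coordinates of all white pixels in the image at 'path."
  [path]
  (let [image (ImageIO/read (File. path))]
    (for [x (range (.getWidth image))
          y (range (.getHeight image))
          :when (= 0xFFFFFF (bit-and 0xFFFFFF (.getRGB image x y)))]
      [x y])))

;; (white-pixel-coordinates "images/finger-UV.png")
#+end_src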
597 #+CAPTION: The UV-map for an elongated icososphere. The white
598 #+caption: dots each represent a touch sensor. They are dense
599 #+caption: in the regions that describe the tip of the finger,
600 #+caption: and less dense along the dorsal side of the finger
601 #+caption: opposite the tip.
602 #+name: finger-UV
603 #+ATTR_latex: :width 10cm
604 [[./images/finger-UV.png]]
606 #+caption: Ventral side of the UV-mapped finger. Notice the
607 #+caption: density of touch sensors at the tip.
608 #+name: finger-side-view
609 #+ATTR_LaTeX: :width 10cm
610 [[./images/finger-1.png]]
612 ** Video game engines provide ready-made physics and shading
614 I did not need to write my own physics simulation code or shader to
615 build =CORTEX=. Doing so would lead to a system that is impossible
616 for anyone but myself to use anyway. Instead, I use a video game
617 engine as a base and modify it to accommodate the additional needs
618 of =CORTEX=. Video game engines are an ideal starting point to
619 build =CORTEX=, because they are not far from being creature
620 building systems themselves.
622 First off, general purpose video game engines come with a physics
623 engine and lighting / sound system. The physics system provides
624 tools that can be co-opted to serve as touch, proprioception, and
625 muscles. Since some games support split screen views, a good video
626 game engine will allow you to efficiently create multiple cameras
627 in the simulated world that can be used as eyes. Video game systems
628 offer integrated asset management for things like textures and
629 creature models, providing an avenue for defining creatures. They
630 also understand UV-mapping, since this technique is used to apply a
631 texture to a model. Finally, because video game engines support a
632 large number of users, as long as =CORTEX= doesn't stray too far
633 from the base system, other researchers can turn to this community
634 for help when doing their research.
636 ** =CORTEX= is based on jMonkeyEngine3
638 While preparing to build =CORTEX= I studied several video game
639 engines to see which would best serve as a base. The top contenders
640 were:
642 - [[http://www.idsoftware.com][Quake II]]/[[http://www.bytonic.de/html/jake2.html][Jake2]] :: The Quake II engine was designed by ID
643 software in 1997. All the source code was released by ID
644 software into the Public Domain several years ago, and as a
645 result it has been ported to many different languages. This
646 engine was famous for its advanced use of realistic shading
647 and had decent and fast physics simulation. The main advantage
648 of the Quake II engine is its simplicity, but I ultimately
649 rejected it because the engine is too tied to the concept of a
650 first-person shooter game. One of the problems I had was that
651 there does not seem to be any easy way to attach multiple
652 cameras to a single character. There are also several physics
653 clipping issues that are corrected in a way that only applies
654 to the main character and do not apply to arbitrary objects.
656 - [[http://source.valvesoftware.com/][Source Engine]] :: The Source Engine evolved from the Quake II
657 and Quake I engines and is used by Valve in the Half-Life
658 series of games. The physics simulation in the Source Engine
659 is quite accurate and probably the best out of all the engines
660 I investigated. There is also an extensive community actively
661 working with the engine. However, applications that use the
662 Source Engine must be written in C++, the code is not open, it
663 only runs on Windows, and the tools that come with the SDK to
664 handle models and textures are complicated and awkward to use.
666 - [[http://jmonkeyengine.com/][jMonkeyEngine3]] :: jMonkeyEngine3 is a new library for creating
667 games in Java. It uses OpenGL to render to the screen and uses
668 scene graphs to avoid drawing things that do not appear on the
669 screen. It has an active community and several games in the
670 pipeline. The engine was not built to serve any particular
671 game but is instead meant to be used for any 3D game.
673 I chose jMonkeyEngine3 because it had the most features
674 out of all the free projects I looked at, and because I could then
675 write my code in clojure, an implementation of =LISP= that runs on
676 the JVM.
678 ** =CORTEX= uses Blender to create creature models
680 For the simple worm-like creatures I will use later on in this
681 thesis, I could define a simple API in =CORTEX= that would allow
682 one to create boxes, spheres, etc., and leave that API as the sole
683 way to create creatures. However, for =CORTEX= to truly be useful
684 for other projects, it needs a way to construct complicated
685 creatures. If possible, it would be nice to leverage work that has
686 already been done by the community of 3D modelers, or at least
687 enable people who are talented at modeling but not programming to
688 design =CORTEX= creatures.
690 Therefore, I use Blender, a free 3D modeling program, as the main
691 way to create creatures in =CORTEX=. However, the creatures modeled
692 in Blender must also be simple to simulate in jMonkeyEngine3's game
693 engine, and must also be easy to rig with =CORTEX='s senses. I
694 accomplish this with extensive use of Blender's ``empty nodes.''
696 Empty nodes have no mass, physical presence, or appearance, but
697 they can hold metadata and have names. I use a tree structure of
698 empty nodes to specify senses in the following manner:
700 - Create a single top-level empty node whose name is the name of
701 the sense.
702 - Add empty nodes which each contain meta-data relevant to the
703 sense, including a UV-map describing the number/distribution of
704 sensors if applicable.
705 - Make each empty-node the child of the top-level node.
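Although these trees are normally built directly in Blender, the convention is easy to illustrate programmatically. The sketch below builds an equivalent tree out of jMonkeyEngine nodes; the node names and the ``profile'' metadata key are only illustrative, not =CORTEX='s actual schema.
#+begin_src clojure
(import '(com.jme3.scene Node))

;; Sketch of the empty-node convention: a top-level node named after the
;; sense, with children that carry per-instance metadata.
(let [touch      (Node. "touch")
      finger-tip (Node. "finger-tip")]
  (.setUserData finger-tip "profile" "Models/test-creature/finger-UV.png")
  (.attachChild touch finger-tip)
  touch)
#+end_src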
707 #+caption: An example of annotating a creature model with empty
708 #+caption: nodes to describe the layout of senses. There are
709 #+caption: multiple empty nodes which each describe the position
710 #+caption: of muscles, ears, eyes, or joints.
711 #+name: sense-nodes
712 #+ATTR_LaTeX: :width 10cm
713 [[./images/empty-sense-nodes.png]]
715 ** Bodies are composed of segments connected by joints
717 Blender is a general purpose animation tool, which has been used in
718 the past to create high quality movies such as Sintel
719 \cite{blender}. Though Blender can model and render even complicated
720 things like water, it is crucial to keep models that are meant to
721 be simulated as creatures simple. =Bullet=, which =CORTEX= uses
722 through jMonkeyEngine3, is a rigid-body physics system. This offers
723 a compromise between the expressiveness of a game level and the
724 speed at which it can be simulated, and it means that creatures
725 should be naturally expressed as rigid components held together by
726 joint constraints.
728 But humans are more like a squishy bag wrapped around some
729 hard bones which define the overall shape. When we move, our skin
730 bends and stretches to accommodate the new positions of our bones.
732 One way to make bodies composed of rigid pieces connected by joints
733 /seem/ more human-like is to use an /armature/ (or /rigging/)
734 system, which defines an overall ``body mesh'' and defines how the
735 mesh deforms as a function of the position of each ``bone'', which
736 is a standard rigid body. This technique is used extensively to
737 model humans and create realistic animations. It is not a good
738 technique for physical simulation, however, because it creates a lie
739 -- the skin is not a physical part of the simulation and does not
740 interact with any objects in the world or itself. Objects will pass
741 right through the skin until they come in contact with the
742 underlying bone, which is a physical object. Without simulating
743 the skin, the sense of touch has little meaning, and the creature's
744 own vision will lie to it about the true extent of its body.
745 Simulating the skin as a physical object requires some way to
746 continuously update the physical model of the skin along with the
747 movement of the bones, which is unacceptably slow compared to rigid
748 body simulation.
750 Therefore, instead of using the human-like ``deformable bag of
751 bones'' approach, I decided to base my body plans on multiple solid
752 objects that are connected by joints, inspired by the robot =EVE=
753 from the movie WALL-E.
755 #+caption: =EVE= from the movie WALL-E. This body plan turns
756 #+caption: out to be much better suited to my purposes than a more
757 #+caption: human-like one.
758 #+ATTR_LaTeX: :width 10cm
759 [[./images/Eve.jpg]]
761 =EVE='s body is composed of several rigid components that are held
762 together by invisible joint constraints. This is what I mean by
763 ``eve-like''. The main reason that I use eve-style bodies is for
764 efficiency, and so that there will be correspondence between the
765 AI's senses and the physical presence of its body. Each individual
766 section is simulated by a separate rigid body that corresponds
767 exactly with its visual representation and does not change.
768 Sections are connected by invisible joints that are well supported
769 in jMonkeyEngine3. Bullet, the physics backend for jMonkeyEngine3,
770 can efficiently simulate hundreds of rigid bodies connected by
771 joints. Just because sections are rigid does not mean they have to
772 stay as one piece forever; they can be dynamically replaced with
773 multiple sections to simulate splitting in two. This could be used
774 to simulate retractable claws or =EVE='s hands, which are able to
775 coalesce into one object in the movie.
777 *** Solidifying/Connecting a body
779 =CORTEX= creates a creature in two steps: first, it traverses the
780 nodes in the blender file and creates physical representations for
781 any of them that have mass defined in their blender meta-data.
783 #+caption: Program for iterating through the nodes in a blender file
784 #+caption: and generating physical jMonkeyEngine3 objects with mass
785 #+caption: and a matching physics shape.
786 #+name: name
787 #+begin_listing clojure
788 #+begin_src clojure
789 (defn physical!
790 "Iterate through the nodes in creature and make them real physical
791 objects in the simulation."
792 [#^Node creature]
793 (dorun
794 (map
795 (fn [geom]
796 (let [physics-control
797 (RigidBodyControl.
798 (HullCollisionShape.
799 (.getMesh geom))
800 (if-let [mass (meta-data geom "mass")]
801 (float mass) (float 1)))]
802 (.addControl geom physics-control)))
803 (filter #(isa? (class %) Geometry )
804 (node-seq creature)))))
805 #+end_src
806 #+end_listing
808 The next step to making a proper body is to connect those pieces
809 together with joints. jMonkeyEngine has a large array of joints
810 available via =bullet=, such as Point2Point, Cone, Hinge, and a
811 generic Six Degree of Freedom joint, with or without spring
812 restitution.
814 Joints are treated a lot like proper senses, in that there is a
815 top-level empty node named ``joints'' whose children each
816 represent a joint.
818 #+caption: View of the hand model in Blender showing the main ``joints''
819 #+caption: node (highlighted in yellow) and its children which each
820 #+caption: represent a joint in the hand. Each joint node has metadata
821 #+caption: specifying what sort of joint it is.
822 #+name: blender-hand
823 #+ATTR_LaTeX: :width 10cm
824 [[./images/hand-screenshot1.png]]
827 =CORTEX='s procedure for binding the creature together with joints
828 is as follows:
830 - Find the children of the ``joints'' node.
831 - Determine the two spatials the joint is meant to connect.
832 - Create the joint based on the meta-data of the empty node.
834 The higher order function =sense-nodes= from =cortex.sense=
835 simplifies finding the joints based on their parent ``joints''
836 node.
838 #+caption: Retrieving the children empty nodes from a single
839 #+caption: named empty node is a common pattern in =CORTEX=;
840 #+caption: further instances of this technique for the senses
841 #+caption: will be omitted.
842 #+name: get-empty-nodes
843 #+begin_listing clojure
844 #+begin_src clojure
845 (defn sense-nodes
846 "For some senses there is a special empty blender node whose
847 children are considered markers for an instance of that sense. This
848 function generates functions to find those children, given the name
849 of the special parent node."
850 [parent-name]
851 (fn [#^Node creature]
852 (if-let [sense-node (.getChild creature parent-name)]
853 (seq (.getChildren sense-node)) [])))
855 (def
856 ^{:doc "Return the children of the creature's \"joints\" node."
857 :arglists '([creature])}
858 joints
859 (sense-nodes "joints"))
860 #+end_src
861 #+end_listing
863 To find a joint's targets, =CORTEX= creates a small cube, centered
864 around the empty-node, and grows the cube exponentially until it
865 intersects two physical objects. The objects are ordered according
866 to the joint's rotation, with the first one being the object that
867 has more negative coordinates in the joint's reference frame.
868 Since only physical objects are considered, the empty node itself
869 escapes detection, and =joint-targets= must therefore be called
870 /after/ =physical!= is called.
872 #+caption: Program to find the targets of a joint node by
873 #+caption: exponential growth of a search cube.
874 #+name: joint-targets
875 #+begin_listing clojure
876 #+begin_src clojure
877 (defn joint-targets
878 "Return the two closest two objects to the joint object, ordered
879 from bottom to top according to the joint's rotation."
880 [#^Node parts #^Node joint]
881 (loop [radius (float 0.01)]
882 (let [results (CollisionResults.)]
883 (.collideWith
884 parts
885 (BoundingBox. (.getWorldTranslation joint)
886 radius radius radius) results)
887 (let [targets
888 (distinct
889 (map #(.getGeometry %) results))]
890 (if (>= (count targets) 2)
891 (sort-by
892 #(let [joint-ref-frame-position
893 (jme-to-blender
894 (.mult
895 (.inverse (.getWorldRotation joint))
896 (.subtract (.getWorldTranslation %)
897 (.getWorldTranslation joint))))]
898 (.dot (Vector3f. 1 1 1) joint-ref-frame-position))
899 (take 2 targets))
900 (recur (float (* radius 2))))))))
901 #+end_src
902 #+end_listing
904 Once =CORTEX= finds all joints and targets, it creates them using
905 a dispatch on the metadata of each joint node.
907 #+caption: Program to dispatch on blender metadata and create joints
908 #+caption: suitable for physical simulation.
909 #+name: joint-dispatch
910 #+begin_listing clojure
911 #+begin_src clojure
912 (defmulti joint-dispatch
913 "Translate blender pseudo-joints into real JME joints."
914 (fn [constraints & _]
915 (:type constraints)))
917 (defmethod joint-dispatch :point
918 [constraints control-a control-b pivot-a pivot-b rotation]
919 (doto (SixDofJoint. control-a control-b pivot-a pivot-b false)
920 (.setLinearLowerLimit Vector3f/ZERO)
921 (.setLinearUpperLimit Vector3f/ZERO)))
923 (defmethod joint-dispatch :hinge
924 [constraints control-a control-b pivot-a pivot-b rotation]
925 (let [axis (if-let [axis (:axis constraints)] axis Vector3f/UNIT_X)
926 [limit-1 limit-2] (:limit constraints)
927 hinge-axis (.mult rotation (blender-to-jme axis))]
928 (doto (HingeJoint. control-a control-b pivot-a pivot-b
929 hinge-axis hinge-axis)
930 (.setLimit limit-1 limit-2))))
932 (defmethod joint-dispatch :cone
933 [constraints control-a control-b pivot-a pivot-b rotation]
934 (let [limit-xz (:limit-xz constraints)
935 limit-xy (:limit-xy constraints)
936 twist (:twist constraints)]
937 (doto (ConeJoint. control-a control-b pivot-a pivot-b
938 rotation rotation)
939 (.setLimit (float limit-xz) (float limit-xy)
940 (float twist)))))
941 #+end_src
942 #+end_listing
944 All that is left for joints is to combine the above pieces into
945 something that can operate on the collection of nodes that a
946 blender file represents.
948 #+caption: Program to completely create a joint given information
949 #+caption: from a blender file.
950 #+name: connect
951 #+begin_listing clojure
952 #+begin_src clojure
953 (defn connect
954 "Create a joint between 'obj-a and 'obj-b at the location of
955 'joint. The type of joint is determined by the metadata on 'joint.
957 Here are some examples:
958 {:type :point}
959 {:type :hinge :limit [0 (/ Math/PI 2)] :axis (Vector3f. 0 1 0)}
960 (:axis defaults to (Vector3f. 1 0 0) if not provided for hinge joints)
962 {:type :cone :limit-xz 0
963 :limit-xy 0
964 :twist 0} (use XZY rotation mode in blender!)"
965 [#^Node obj-a #^Node obj-b #^Node joint]
966 (let [control-a (.getControl obj-a RigidBodyControl)
967 control-b (.getControl obj-b RigidBodyControl)
968 joint-center (.getWorldTranslation joint)
969 joint-rotation (.toRotationMatrix (.getWorldRotation joint))
970 pivot-a (world-to-local obj-a joint-center)
971 pivot-b (world-to-local obj-b joint-center)]
972 (if-let
973 [constraints (map-vals eval (read-string (meta-data joint "joint")))]
974 ;; A side-effect of creating a joint registers
975 ;; it with both physics objects which in turn
976 ;; will register the joint with the physics system
977 ;; when the simulation is started.
978 (joint-dispatch constraints
979 control-a control-b
980 pivot-a pivot-b
981 joint-rotation))))
982 #+end_src
983 #+end_listing
985 In general, whenever =CORTEX= exposes a sense (or in this case
986 physicality), it provides a function of the type =sense!=, which
987 takes in a collection of nodes and augments it to support that
988 sense. The function returns any controls necessary to use that
989 sense. In this case =body!= creates a physical body and returns no
990 control functions.
992 #+caption: Program to give joints to a creature.
993 #+name: name
994 #+begin_listing clojure
995 #+begin_src clojure
996 (defn joints!
997 "Connect the solid parts of the creature with physical joints. The
998 joints are taken from the \"joints\" node in the creature."
999 [#^Node creature]
1000 (dorun
1001 (map
1002 (fn [joint]
1003 (let [[obj-a obj-b] (joint-targets creature joint)]
1004 (connect obj-a obj-b joint)))
1005 (joints creature))))
1006 (defn body!
1007 "Endow the creature with a physical body connected with joints. The
1008 particulars of the joints and the masses of each body part are
1009 determined in blender."
1010 [#^Node creature]
1011 (physical! creature)
1012 (joints! creature))
1013 #+end_src
1014 #+end_listing
1016 All of the code you have just seen amounts to only 130 lines, yet
1017 because it builds on top of Blender and jMonkeyEngine3, those few
1018 lines pack quite a punch!
1020 The hand from figure \ref{blender-hand}, which was modeled after
1021 my own right hand, can now be given joints and simulated as a
1022 creature.
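Bringing the hand to life is then just a matter of calling the functions above. In the usage sketch below, =hand= is assumed to already be bound to the jMonkeyEngine =Node= loaded from the blender file of figure \ref{blender-hand}.
#+begin_src clojure
;; Usage sketch; 'hand is assumed to be the Node loaded from the
;; blender file.
(physical! hand)  ; give every segment mass and a collision shape
(joints! hand)    ; read the "joints" node and connect the segments
;; ...or, equivalently:
(body! hand)
#+end_src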
1024 #+caption: With the ability to create physical creatures from blender,
1025 #+caption: =CORTEX= gets one step closer to becoming a full creature
1026 #+caption: simulation environment.
1027 #+name: name
1028 #+ATTR_LaTeX: :width 15cm
1029 [[./images/physical-hand.png]]
1031 ** Sight reuses standard video game components...
1033 Vision is one of the most important senses for humans, so I need to
1034 build a simulated sense of vision for my AI. I will do this with
1035 simulated eyes. Each eye can be independently moved and should see
1036 its own version of the world depending on where it is.
1038 Making these simulated eyes a reality is simple because
1039 jMonkeyEngine already contains extensive support for multiple views
1040 of the same 3D simulated world. The reason jMonkeyEngine has this
1041 support is because the support is necessary to create games with
1042 split-screen views. Multiple views are also used to create
1043 efficient pseudo-reflections by rendering the scene from a certain
1044 perspective and then projecting it back onto a surface in the 3D
1045 world.
1047 #+caption: jMonkeyEngine supports multiple views to enable
1048 #+caption: split-screen games, like GoldenEye, which was one of
1049 #+caption: the first games to use split-screen views.
1050 #+name: name
1051 #+ATTR_LaTeX: :width 10cm
1052 [[./images/goldeneye-4-player.png]]
1054 *** A Brief Description of jMonkeyEngine's Rendering Pipeline
1056 jMonkeyEngine allows you to create a =ViewPort=, which represents a
1057 view of the simulated world. You can create as many of these as you
1058 want. Every frame, the =RenderManager= iterates through each
1059 =ViewPort=, rendering the scene in the GPU. For each =ViewPort= there
1060 is a =FrameBuffer= which represents the rendered image in the GPU.
1062 #+caption: =ViewPorts= are cameras in the world. During each frame,
1063 #+caption: the =RenderManager= records a snapshot of what each view
1064 #+caption: is currently seeing; these snapshots are =FrameBuffer= objects.
1065 #+name: rendermanagers
1066 #+ATTR_LaTeX: :width 10cm
1067 [[./images/diagram_rendermanager2.png]]
1069 Each =ViewPort= can have any number of attached =SceneProcessor=
1070 objects, which are called every time a new frame is rendered. A
1071 =SceneProcessor= receives its =ViewPort's= =FrameBuffer= and can do
1072 whatever it wants to the data. Often this consists of invoking GPU
1073 specific operations on the rendered image. The =SceneProcessor= can
1074 also copy the GPU image data to RAM and process it with the CPU.
1076 *** Appropriating Views for Vision
1078 Each eye in the simulated creature needs its own =ViewPort= so
1079 that it can see the world from its own perspective. To this
1080 =ViewPort=, I add a =SceneProcessor= that feeds the visual data to
1081 any arbitrary continuation function for further processing. That
1082 continuation function may perform both CPU and GPU operations on
1083 the data. To make this easy for the continuation function, the
1084 =SceneProcessor= maintains appropriately sized buffers in RAM to
1085 hold the data. It does not do any copying from the GPU to the CPU
1086 itself because it is a slow operation.
1088 #+caption: Function to make the rendered scene in jMonkeyEngine
1089 #+caption: available for further processing.
1090 #+name: pipeline-1
1091 #+begin_listing clojure
1092 #+begin_src clojure
1093 (defn vision-pipeline
1094 "Create a SceneProcessor object which wraps a vision processing
1095 continuation function. The continuation is a function that takes
1096 [#^Renderer r #^FrameBuffer fb #^ByteBuffer b #^BufferedImage bi],
1097 each of which has already been appropriately sized."
1098 [continuation]
1099 (let [byte-buffer (atom nil)
1100 renderer (atom nil)
1101 image (atom nil)]
1102 (proxy [SceneProcessor] []
1103 (initialize
1104 [renderManager viewPort]
1105 (let [cam (.getCamera viewPort)
1106 width (.getWidth cam)
1107 height (.getHeight cam)]
1108 (reset! renderer (.getRenderer renderManager))
1109 (reset! byte-buffer
1110 (BufferUtils/createByteBuffer
1111 (* width height 4)))
1112 (reset! image (BufferedImage.
1113 width height
1114 BufferedImage/TYPE_4BYTE_ABGR))))
1115 (isInitialized [] (not (nil? @byte-buffer)))
1116 (reshape [_ _ _])
1117 (preFrame [_])
1118 (postQueue [_])
1119 (postFrame
1120 [#^FrameBuffer fb]
1121 (.clear @byte-buffer)
1122 (continuation @renderer fb @byte-buffer @image))
1123 (cleanup []))))
1124 #+end_src
1125 #+end_listing
1127 The continuation function given to =vision-pipeline= above will be
1128 given a =Renderer= and three containers for image data. The
1129 =FrameBuffer= references the GPU image data, but the pixel data
1130 can not be used directly on the CPU. The =ByteBuffer= and
1131 =BufferedImage= are initially "empty" but are sized to hold the
1132 data in the =FrameBuffer=. I call transferring the GPU image data
1133 to the CPU structures "mixing" the image data.
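A sketch of this ``mixing'' step, assuming jMonkeyEngine3's standard =Renderer= and =Screenshots= utilities (the real =CORTEX= helper may differ in its details):
#+begin_src clojure
;; Sketch: copy the GPU frame into the ByteBuffer, then decode that
;; buffer into the BufferedImage so ordinary CPU code can read pixels.
(defn mix-image-data!
  [renderer framebuffer byte-buffer image]
  (.readFrameBuffer renderer framebuffer byte-buffer)
  (com.jme3.util.Screenshots/convertScreenShot byte-buffer image)
  image)
#+end_src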
1135 *** Optical sensor arrays are described with images and referenced with metadata
1137 The vision pipeline described above handles the flow of rendered
1138 images. Now, =CORTEX= needs simulated eyes to serve as the source
1139 of these images.
1141 Eyes are described in blender in the same way as joints. They
1142 are zero-dimensional empty objects with no geometry whose local
1143 coordinate system determines the orientation of the resulting eye.
1144 All eyes are children of a parent node named "eyes" just as all
1145 joints have a parent named "joints". An eye binds to the nearest
1146 physical object with =bind-sense=.
1148 #+caption: Here, the camera is created based on metadata on the
1149 #+caption: eye-node and attached to the nearest physical object
1150 #+caption: with =bind-sense=
1151 #+name: add-eye
1152 #+begin_listing clojure
#+begin_src clojure
1153 (defn add-eye!
1154 "Create a Camera centered on the current position of 'eye which
1155 follows the closest physical node in 'creature. The camera will
1156 point in the X direction and use the Z vector as up as determined
1157 by the rotation of these vectors in blender coordinate space. Use
1158 XZY rotation for the node in blender."
1159 [#^Node creature #^Spatial eye]
1160 (let [target (closest-node creature eye)
1161 [cam-width cam-height]
1162 ;;[640 480] ;; graphics card on laptop doesn't support
1163 ;; arbitrary dimensions.
1164 (eye-dimensions eye)
1165 cam (Camera. cam-width cam-height)
1166 rot (.getWorldRotation eye)]
1167 (.setLocation cam (.getWorldTranslation eye))
1168 (.lookAtDirection
1169 cam ; this part is not a mistake and
1170 (.mult rot Vector3f/UNIT_X) ; is consistent with using Z in
1171 (.mult rot Vector3f/UNIT_Y)) ; blender as the UP vector.
1172 (.setFrustumPerspective
1173 cam (float 45)
1174 (float (/ (.getWidth cam) (.getHeight cam)))
1175 (float 1)
1176 (float 1000))
1177 (bind-sense target cam) cam))
#+end_src
1178 #+end_listing
1180 *** Simulated Retina
1182 An eye is a surface (the retina) which contains many discrete
1183 sensors to detect light. These sensors can have different
1184 light-sensing properties. In humans, each discrete sensor is
1185 sensitive to red, blue, green, or gray. These different types of
1186 sensors can have different spatial distributions along the retina.
1187 In humans, there is a fovea in the center of the retina which has
1188 a very high density of color sensors, and a blind spot which has
1189 no sensors at all. Sensor density decreases in proportion to
1190 distance from the fovea.
1192 I want to be able to model any retinal configuration, so my
1193 eye-nodes in blender contain metadata pointing to images that
1194 describe the precise position of the individual sensors using
1195 white pixels. The meta-data also describes the precise sensitivity
1196 to light that the sensors described in the image have. An eye can
1197 contain any number of these images. For example, the metadata for
1198 an eye might look like this:
1200 #+begin_src clojure
1201 {0xFF0000 "Models/test-creature/retina-small.png"}
1202 #+end_src
1204 #+caption: An example retinal profile image. White pixels are
1205 #+caption: photo-sensitive elements. The distribution of white
1206 #+caption: pixels is denser in the middle and falls off at the
1207 #+caption: edges and is inspired by the human retina.
1208 #+name: retina
1209 #+ATTR_LaTeX: :width 7cm
1210 [[./images/retina-small.png]]
1212 Together, the number 0xFF0000 and the image above describe
1213 the placement of red-sensitive sensory elements.
1215 Meta-data to very crudely approximate a human eye might be
1216 something like this:
1218 #+begin_src clojure
1219 (let [retinal-profile "Models/test-creature/retina-small.png"]
1220 {0xFF0000 retinal-profile
1221 0x00FF00 retinal-profile
1222 0x0000FF retinal-profile
1223 0xFFFFFF retinal-profile})
1224 #+end_src
1226 The numbers that serve as keys in the map determine a sensor's
1227 relative sensitivity to the channels red, green, and blue. These
1228 sensitivity values are packed into an integer in the order
1229 =|_|R|G|B|= in 8-bit fields. The RGB values of a pixel in the
1230 image are added together with these sensitivities as linear
1231 weights. Therefore, 0xFF0000 means sensitive to red only while
1232 0xFFFFFF means sensitive to all colors equally (gray).
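
The following is a minimal sketch of how such a weighted sum might be
computed. It is not the =pixel-sense= function used in the listing
below, only an illustration of the packing scheme; the helper and its
name are hypothetical.

#+begin_src clojure
;; Hypothetical sketch of the weighted-sum idea described above; the
;; actual =pixel-sense= in =CORTEX= may differ in detail.
(defn pixel-sense-sketch
  "Weight the R, G, and B components of `pixel` by the corresponding
   8-bit fields of `sensitivity`, returning a number from 0.0 to 1.0."
  [sensitivity pixel]
  (let [field (fn [x shift] (bit-and 0xFF (bit-shift-right x shift)))
        weights [(field sensitivity 16) (field sensitivity 8) (field sensitivity 0)]
        values  [(field pixel 16) (field pixel 8) (field pixel 0)]]
    (float (/ (reduce + (map * weights values))
              (* 255 (max 1 (reduce + weights)))))))

;; (pixel-sense-sketch 0xFF0000 0xFF0000) => 1.0  ; red sensor, pure red pixel
;; (pixel-sense-sketch 0xFF0000 0x0000FF) => 0.0  ; red sensor ignores blue
#+end_src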
1234 #+caption: This is the core of vision in =CORTEX=. A given eye node
1235 #+caption: is converted into a function that returns visual
1236 #+caption: information from the simulation.
1237 #+name: vision-kernel
1238 #+begin_listing clojure
1239 #+BEGIN_SRC clojure
1240 (defn vision-kernel
1241 "Returns a list of functions, each of which will return a color
1242 channel's worth of visual information when called inside a running
1243 simulation."
1244 [#^Node creature #^Spatial eye & {skip :skip :or {skip 0}}]
1245 (let [retinal-map (retina-sensor-profile eye)
1246 camera (add-eye! creature eye)
1247 vision-image
1248 (atom
1249 (BufferedImage. (.getWidth camera)
1250 (.getHeight camera)
1251 BufferedImage/TYPE_BYTE_BINARY))
1252 register-eye!
1253 (runonce
1254 (fn [world]
1255 (add-camera!
1256 world camera
1257 (let [counter (atom 0)]
1258 (fn [r fb bb bi]
1259 (if (zero? (rem (swap! counter inc) (inc skip)))
1260 (reset! vision-image
1261 (BufferedImage! r fb bb bi))))))))]
1262 (vec
1263 (map
1264 (fn [[key image]]
1265 (let [whites (white-coordinates image)
1266 topology (vec (collapse whites))
1267 sensitivity (sensitivity-presets key key)]
1268 (attached-viewport.
1269 (fn [world]
1270 (register-eye! world)
1271 (vector
1272 topology
1273 (vec
1274 (for [[x y] whites]
1275 (pixel-sense
1276 sensitivity
1277 (.getRGB @vision-image x y))))))
1278 register-eye!)))
1279 retinal-map))))
1280 #+END_SRC
1281 #+end_listing
1283 Note that since each of the functions generated by =vision-kernel=
1284 shares the same =register-eye!= function, the eye will be
1285 registered only once, the first time any of the functions from the
1286 list returned by =vision-kernel= is called. Each of the functions
1287 returned by =vision-kernel= also allows access to the =Viewport=
1288 through which it receives images.
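
The =runonce= wrapper is what makes this single-registration behavior
possible. A minimal sketch of such a wrapper is shown below; the
version in the =CORTEX= source may differ in detail.

#+begin_src clojure
;; Hedged sketch of a run-at-most-once wrapper; not necessarily the
;; exact =runonce= used by =CORTEX=.
(defn runonce-sketch
  "Wrap `f` so that it executes at most once. Later calls return the
   value of the first call and ignore their arguments."
  [f]
  (let [unrun (Object.)        ; sentinel meaning "not yet called"
        result (atom unrun)]
    (fn [& args]
      (locking unrun
        (when (identical? @result unrun)
          (reset! result (apply f args)))
        @result))))
#+end_src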
1290 All the hard work has been done; all that remains is to apply
1291 =vision-kernel= to each eye in the creature and gather the results
1292 into one list of functions.
1295 #+caption: With =vision!=, =CORTEX= is already a fine simulation
1296 #+caption: environment for experimenting with different types of
1297 #+caption: eyes.
1298 #+name: vision!
1299 #+begin_listing clojure
1300 #+BEGIN_SRC clojure
1301 (defn vision!
1302 "Returns a list of functions, each of which returns visual sensory
1303 data when called inside a running simulation."
1304 [#^Node creature & {skip :skip :or {skip 0}}]
1305 (reduce
1306 concat
1307 (for [eye (eyes creature)]
1308 (vision-kernel creature eye))))
1309 #+END_SRC
1310 #+end_listing
1312 #+caption: Simulated vision with a test creature and the
1313 #+caption: human-like eye approximation. Notice how each channel
1314 #+caption: of the eye responds differently to the differently
1315 #+caption: colored balls.
1316 #+name: worm-vision-test.
1317 #+ATTR_LaTeX: :width 13cm
1318 [[./images/worm-vision.png]]
1320 The vision code is not much more complicated than the body code,
1321 and enables multiple further paths for simulated vision. For
1322 example, it is quite easy to create bifocal vision -- you just
1323 make two eyes next to each other in blender! It is also possible
1324 to encode vision transforms in the retinal files. For example, the
1325 human-like retina file in figure \ref{retina} approximates a
1326 log-polar transform.
1328 This vision code has already been absorbed by the jMonkeyEngine
1329 community and is now (in modified form) part of a system for
1330 capturing in-game video to a file.
1332 ** ...but hearing must be built from scratch
1334 At the end of this section I will have simulated ears that work the
1335 same way as the simulated eyes in the last section. I will be able to
1336 place any number of ear-nodes in a blender file, and they will bind to
1337 the closest physical object and follow it as it moves around. Each ear
1338 will provide access to the sound data it picks up between every frame.
1340 Hearing is one of the more difficult senses to simulate, because there
1341 is less support for obtaining the actual sound data that is processed
1342 by jMonkeyEngine3. There is no "split-screen" support for rendering
1343 sound from different points of view, and there is no way to directly
1344 access the rendered sound data.
1346 =CORTEX='s hearing is unique in that it places no limit on the
1347 number of simultaneous listeners. As far as I
1348 know, there is no other system that supports multiple listeners,
1349 and the sound demo at the end of this section is the first time
1350 this has been done in a video game environment.
1352 *** Brief Description of jMonkeyEngine's Sound System
1354 jMonkeyEngine's sound system works as follows:
1356 - jMonkeyEngine uses the =AppSettings= for the particular
1357 application to determine what sort of =AudioRenderer= should be
1358 used.
1359 - Although some support is provided for multiple AudioRendering
1360 backends, jMonkeyEngine at the time of this writing will either
1361 pick no =AudioRenderer= at all, or the =LwjglAudioRenderer=.
1362 - jMonkeyEngine tries to figure out what sort of system you're
1363 running and extracts the appropriate native libraries.
1364 - The =LwjglAudioRenderer= uses the [[http://lwjgl.org/][=LWJGL=]] (LightWeight Java Game
1365 Library) bindings to interface with a C library called [[http://kcat.strangesoft.net/openal.html][=OpenAL=]]
1366 - =OpenAL= renders the 3D sound and feeds the rendered sound
1367 directly to any of various sound output devices with which it
1368 knows how to communicate.
1370 A consequence of this is that there's no way to access the actual
1371 sound data produced by =OpenAL=. Even worse, =OpenAL= only supports
1372 one /listener/ (it renders sound data from only one perspective),
1373 which normally isn't a problem for games, but becomes a problem
1374 when trying to make multiple AI creatures that can each hear the
1375 world from a different perspective.
1377 To make many AI creatures in jMonkeyEngine that can each hear the
1378 world from their own perspective, or to make a single creature with
1379 many ears, it is necessary to go all the way back to =OpenAL= and
1380 implement support for simulated hearing there.
1382 *** Extending =OpenAL=
1384 Extending =OpenAL= to support multiple listeners requires 500
1385 lines of =C= code and is too hairy to mention here. Instead, I
1386 will show a small amount of extension code and go over the high
1387 level strategy. Full source is of course available with the
1388 =CORTEX= distribution if you're interested.
1390 =OpenAL= goes to great lengths to support many different systems,
1391 all with different sound capabilities and interfaces. It
1392 accomplishes this difficult task by providing code for many
1393 different sound backends in pseudo-objects called /Devices/.
1394 There's a device for the Linux Open Sound System and the Advanced
1395 Linux Sound Architecture, there's one for Direct Sound on Windows,
1396 and there's even one for Solaris. =OpenAL= solves the problem of
1397 platform independence by providing all these Devices.
1399 Wrapper libraries such as LWJGL are free to examine the system on
1400 which they are running and then select an appropriate device for
1401 that system.
1403 There are also a few "special" devices that don't interface with
1404 any particular system. These include the Null Device, which
1405 doesn't do anything, and the Wave Device, which writes whatever
1406 sound it receives to a file, if everything has been set up
1407 correctly when configuring =OpenAL=.
1409 Actual mixing (Doppler shift and distance/environment-based
1410 attenuation) of the sound data happens in the Devices, and they
1411 are the only point in the sound rendering process where this data
1412 is available.
1414 Therefore, in order to support multiple listeners, and get the
1415 sound data in a form that the AIs can use, it is necessary to
1416 create a new Device which supports this feature.
1418 Adding a device to OpenAL is rather tricky -- there are five
1419 separate files in the =OpenAL= source tree that must be modified
1420 to do so. I named my device the "Multiple Audio Send" Device, or
1421 =Send= Device for short, since it sends audio data back to the
1422 calling application like an Aux-Send cable on a mixing board.
1424 The main idea behind the Send device is to take advantage of the
1425 fact that LWJGL only manages one /context/ when using OpenAL. A
1426 /context/ is like a container that holds samples and keeps track
1427 of where the listener is. In order to support multiple listeners,
1428 the Send device identifies the LWJGL context as the master
1429 context, and creates any number of slave contexts to represent
1430 additional listeners. Every time the device renders sound, it
1431 synchronizes every source from the master LWJGL context to the
1432 slave contexts. Then, it renders each context separately, using a
1433 different listener for each one. The rendered sound is made
1434 available via JNI to jMonkeyEngine.
1436 Switching between contexts is not the normal operation of a
1437 Device, and one of the problems with doing so is that a Device
1438 normally keeps around a few pieces of state such as the
1439 =ClickRemoval= array, which will become corrupted if the
1440 contexts are not rendered in parallel. The solution is to create a
1441 copy of this normally global device state for each context, and
1442 copy it back and forth into and out of the actual device state
1443 whenever a context is rendered.
1445 The core of the =Send= device is the =syncSources= function, which
1446 does the job of copying all relevant data from one context to
1447 another.
1449 #+caption: Program for extending =OpenAL= to support multiple
1450 #+caption: listeners via context copying/switching.
1451 #+name: sync-openal-sources
1452 #+begin_listing c
1453 #+BEGIN_SRC c
1454 void syncSources(ALsource *masterSource, ALsource *slaveSource,
1455 ALCcontext *masterCtx, ALCcontext *slaveCtx){
1456 ALuint master = masterSource->source;
1457 ALuint slave = slaveSource->source;
1458 ALCcontext *current = alcGetCurrentContext();
1460 syncSourcef(master,slave,masterCtx,slaveCtx,AL_PITCH);
1461 syncSourcef(master,slave,masterCtx,slaveCtx,AL_GAIN);
1462 syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_DISTANCE);
1463 syncSourcef(master,slave,masterCtx,slaveCtx,AL_ROLLOFF_FACTOR);
1464 syncSourcef(master,slave,masterCtx,slaveCtx,AL_REFERENCE_DISTANCE);
1465 syncSourcef(master,slave,masterCtx,slaveCtx,AL_MIN_GAIN);
1466 syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_GAIN);
1467 syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_GAIN);
1468 syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_INNER_ANGLE);
1469 syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_ANGLE);
1470 syncSourcef(master,slave,masterCtx,slaveCtx,AL_SEC_OFFSET);
1471 syncSourcef(master,slave,masterCtx,slaveCtx,AL_SAMPLE_OFFSET);
1472 syncSourcef(master,slave,masterCtx,slaveCtx,AL_BYTE_OFFSET);
1474 syncSource3f(master,slave,masterCtx,slaveCtx,AL_POSITION);
1475 syncSource3f(master,slave,masterCtx,slaveCtx,AL_VELOCITY);
1476 syncSource3f(master,slave,masterCtx,slaveCtx,AL_DIRECTION);
1478 syncSourcei(master,slave,masterCtx,slaveCtx,AL_SOURCE_RELATIVE);
1479 syncSourcei(master,slave,masterCtx,slaveCtx,AL_LOOPING);
1481 alcMakeContextCurrent(masterCtx);
1482 ALint source_type;
1483 alGetSourcei(master, AL_SOURCE_TYPE, &source_type);
1485 // Only static sources are currently synchronized!
1486 if (AL_STATIC == source_type){
1487 ALint master_buffer;
1488 ALint slave_buffer;
1489 alGetSourcei(master, AL_BUFFER, &master_buffer);
1490 alcMakeContextCurrent(slaveCtx);
1491 alGetSourcei(slave, AL_BUFFER, &slave_buffer);
1492 if (master_buffer != slave_buffer){
1493 alSourcei(slave, AL_BUFFER, master_buffer);
1494 }
1495 }
1497 // Synchronize the state of the two sources.
1498 alcMakeContextCurrent(masterCtx);
1499 ALint masterState;
1500 ALint slaveState;
1502 alGetSourcei(master, AL_SOURCE_STATE, &masterState);
1503 alcMakeContextCurrent(slaveCtx);
1504 alGetSourcei(slave, AL_SOURCE_STATE, &slaveState);
1506 if (masterState != slaveState){
1507 switch (masterState){
1508 case AL_INITIAL : alSourceRewind(slave); break;
1509 case AL_PLAYING : alSourcePlay(slave); break;
1510 case AL_PAUSED : alSourcePause(slave); break;
1511 case AL_STOPPED : alSourceStop(slave); break;
1512 }
1513 }
1514 // Restore whatever context was previously active.
1515 alcMakeContextCurrent(current);
1516 }
1517 #+END_SRC
1518 #+end_listing
1520 With this special context-switching device, and some ugly JNI
1521 bindings that are not worth mentioning, =CORTEX= gains the ability
1522 to access multiple sound streams from =OpenAL=.
1524 #+caption: Program to create an ear from a blender empty node. The ear
1525 #+caption: follows around the nearest physical object and passes
1526 #+caption: all sensory data to a continuation function.
1527 #+name: add-ear
1528 #+begin_listing clojure
1529 #+BEGIN_SRC clojure
1530 (defn add-ear!
1531 "Create a Listener centered on the current position of 'ear
1532 which follows the closest physical node in 'creature and
1533 sends sound data to 'continuation."
1534 [#^Application world #^Node creature #^Spatial ear continuation]
1535 (let [target (closest-node creature ear)
1536 lis (Listener.)
1537 audio-renderer (.getAudioRenderer world)
1538 sp (hearing-pipeline continuation)]
1539 (.setLocation lis (.getWorldTranslation ear))
1540 (.setRotation lis (.getWorldRotation ear))
1541 (bind-sense target lis)
1542 (update-listener-velocity! target lis)
1543 (.addListener audio-renderer lis)
1544 (.registerSoundProcessor audio-renderer lis sp)))
1545 #+END_SRC
1546 #+end_listing
1548 The =Send= device, unlike most of the other devices in =OpenAL=,
1549 does not render sound unless asked. This enables the system to
1550 slow down or speed up depending on the needs of the AIs who are
1551 using it to listen. If the device tried to render samples in
1552 real-time, a complicated AI whose mind takes 100 seconds of
1553 computer time to simulate 1 second of AI-time would miss almost
1554 all of the sound in its environment!
1556 #+caption: Program to enable arbitrary hearing in =CORTEX=
1557 #+name: hearing
1558 #+begin_listing clojure
1559 #+BEGIN_SRC clojure
1560 (defn hearing-kernel
1561 "Returns a function which returns auditory sensory data when called
1562 inside a running simulation."
1563 [#^Node creature #^Spatial ear]
1564 (let [hearing-data (atom [])
1565 register-listener!
1566 (runonce
1567 (fn [#^Application world]
1568 (add-ear!
1569 world creature ear
1570 (comp #(reset! hearing-data %)
1571 byteBuffer->pulse-vector))))]
1572 (fn [#^Application world]
1573 (register-listener! world)
1574 (let [data @hearing-data
1575 topology
1576 (vec (map #(vector % 0) (range 0 (count data))))]
1577 [topology data]))))
1579 (defn hearing!
1580 "Endow the creature in a particular world with the sense of
1581 hearing. Will return a sequence of functions, one for each ear,
1582 which when called will return the auditory data from that ear."
1583 [#^Node creature]
1584 (for [ear (ears creature)]
1585 (hearing-kernel creature ear)))
1586 #+END_SRC
1587 #+end_listing
1589 Armed with these functions, =CORTEX= is able to test possibly the
1590 first ever instance of multiple listeners in a video game engine
1591 based simulation!
1593 #+caption: Here a simple creature responds to sound by changing
1594 #+caption: its color from gray to green when the total volume
1595 #+caption: goes over a threshold.
1596 #+name: sound-test
1597 #+begin_listing java
1598 #+BEGIN_SRC java
1599 /**
1600 * Respond to sound! This is the brain of an AI entity that
1601 * hears its surroundings and reacts to them.
1602 */
1603 public void process(ByteBuffer audioSamples,
1604 int numSamples, AudioFormat format) {
1605 audioSamples.clear();
1606 byte[] data = new byte[numSamples];
1607 float[] out = new float[numSamples];
1608 audioSamples.get(data);
1609 FloatSampleTools.
1610 byte2floatInterleaved
1611 (data, 0, out, 0, numSamples/format.getFrameSize(), format);
1613 float max = Float.NEGATIVE_INFINITY;
1614 for (float f : out){if (f > max) max = f;}
1615 audioSamples.clear();
1617 if (max > 0.1){
1618 entity.getMaterial().setColor("Color", ColorRGBA.Green);
1619 }
1620 else {
1621 entity.getMaterial().setColor("Color", ColorRGBA.Gray);
1622 }}
1623 #+END_SRC
1624 #+end_listing
1626 #+caption: First ever simulation of multiple listeners in =CORTEX=.
1627 #+caption: Each cube is a creature which processes sound data with
1628 #+caption: the =process= function from listing \ref{sound-test}.
1629 #+caption: The ball is constantly emitting a pure tone of
1630 #+caption: constant volume. As it approaches the cubes, they each
1631 #+caption: change color in response to the sound.
1632 #+name: sound-cubes.
1633 #+ATTR_LaTeX: :width 10cm
1634 [[./images/java-hearing-test.png]]
1636 This system of hearing has also been co-opted by the
1637 jMonkeyEngine3 community and is used to record audio for demo
1638 videos.
1640 ** Hundreds of hair-like elements provide a sense of touch
1642 Touch is critical to navigation and spatial reasoning and as such I
1643 need a simulated version of it to give to my AI creatures.
1645 Human skin has a wide array of touch sensors, each of which
1646 specialize in detecting different vibrational modes and pressures.
1647 These sensors can integrate a vast expanse of skin (e.g. your
1648 entire palm), or a tiny patch of skin at the tip of your finger.
1649 The hairs of the skin help detect objects before they even come
1650 into contact with the skin proper.
1652 However, touch in my simulated world can not exactly correspond to
1653 human touch because my creatures are made out of completely rigid
1654 segments that don't deform like human skin.
1656 Instead of measuring deformation or vibration, I surround each
1657 rigid part with a plenitude of hair-like objects (/feelers/) which
1658 do not interact with the physical world. Physical objects can pass
1659 through them with no effect. The feelers are able to tell when
1660 other objects pass through them, and they constantly report how
1661 much of their extent is covered. So even though the creature's body
1662 parts do not deform, the feelers create a margin around those body
1663 parts which achieves a sense of touch which is a hybrid between a
1664 human's sense of deformation and sense from hairs.
1666 Implementing touch in jMonkeyEngine follows a different technical
1667 route than vision and hearing. Those two senses piggybacked off
1668 jMonkeyEngine's 3D audio and video rendering subsystems. To
1669 simulate touch, I use jMonkeyEngine's physics system to execute
1670 many small collision detections, one for each feeler. The placement
1671 of the feelers is determined by a UV-mapped image which shows where
1672 each feeler should be on the 3D surface of the body.
1674 *** Defining Touch Meta-Data in Blender
1676 Each geometry can have a single UV map which describes the
1677 position of the feelers which will constitute its sense of touch.
1678 This image path is stored under the ``touch'' key. The image itself
1679 is black and white, with black meaning a feeler length of 0 (no
1680 feeler is present) and white meaning a feeler length of =scale=,
1681 which is a float stored under the key "scale".
1683 #+caption: Touch does not use empty nodes to store metadata,
1684 #+caption: because the metadata of each solid part of a
1685 #+caption: creature's body is sufficient.
1686 #+name: touch-meta-data
1687 #+begin_listing clojure
1688 #+BEGIN_SRC clojure
1689 (defn tactile-sensor-profile
1690 "Return the touch-sensor distribution image in BufferedImage format,
1691 or nil if it does not exist."
1692 [#^Geometry obj]
1693 (if-let [image-path (meta-data obj "touch")]
1694 (load-image image-path)))
1696 (defn tactile-scale
1697 "Return the length of each feeler. Default scale is 0.01
1698 jMonkeyEngine units."
1699 [#^Geometry obj]
1700 (if-let [scale (meta-data obj "scale")]
1701 scale 0.1))
1702 #+END_SRC
1703 #+end_listing
1705 Here is an example of a UV-map which specifies the position of
1706 touch sensors along the surface of the upper segment of a fingertip.
1708 #+caption: This is the tactile-sensor-profile for the upper segment
1709 #+caption: of a fingertip. It defines regions of high touch sensitivity
1710 #+caption: (where there are many white pixels) and regions of low
1711 #+caption: sensitivity (where white pixels are sparse).
1712 #+name: fingertip-UV
1713 #+ATTR_LaTeX: :width 13cm
1714 [[./images/finger-UV.png]]
1716 *** Implementation Summary
1718 To simulate touch there are three conceptual steps. For each solid
1719 object in the creature, you first have to get the UV image and scale
1720 parameter which define the position and length of the feelers.
1721 Then, you use the triangles which comprise the mesh and the UV
1722 data stored in the mesh to determine the world-space position and
1723 orientation of each feeler. Finally, once every frame, update these
1724 positions and orientations to match the current position and
1725 orientation of the object, and use physics collision detection to
1726 gather tactile data.
1728 Extracting the meta-data has already been described. The third
1729 step, physics collision detection, is handled in =touch-kernel=.
1730 Translating the positions and orientations of the feelers from the
1731 UV-map to world-space is itself a three-step process.
1733 - Find the triangles which make up the mesh in pixel-space and in
1734 world-space. \\(=triangles=, =pixel-triangles=).
1736 - Find the coordinates of each feeler in world-space. These are
1737 the origins of the feelers. (=feeler-origins=).
1739 - Calculate the normals of the triangles in world space, and add
1740 them to each of the origins of the feelers. The results are the
1741 coordinates of the tips of the feelers.
1742 (=feeler-tips=).
1744 *** Triangle Math
1746 The rigid objects which make up a creature have an underlying
1747 =Geometry=, which is a =Mesh= plus a =Material= and other
1748 important data involved with displaying the object.
1750 A =Mesh= is composed of =Triangles=, and each =Triangle= has three
1751 vertices which have coordinates in world space and UV space.
1753 Here, =triangles= gets all the world-space triangles which
1754 comprise a mesh, while =pixel-triangles= gets those same triangles
1755 expressed in pixel coordinates (which are UV coordinates scaled to
1756 fit the height and width of the UV image).
1758 #+caption: Programs to extract triangles from a geometry and get
1759 #+caption: their vertices in both world and UV-coordinates.
1760 #+name: get-triangles
1761 #+begin_listing clojure
1762 #+BEGIN_SRC clojure
1763 (defn triangle
1764 "Get the triangle specified by triangle-index from the mesh."
1765 [#^Geometry geo triangle-index]
1766 (triangle-seq
1767 (let [scratch (Triangle.)]
1768 (.getTriangle (.getMesh geo) triangle-index scratch) scratch)))
1770 (defn triangles
1771 "Return a sequence of all the Triangles which comprise a given
1772 Geometry."
1773 [#^Geometry geo]
1774 (map (partial triangle geo) (range (.getTriangleCount (.getMesh geo)))))
1776 (defn triangle-vertex-indices
1777 "Get the triangle vertex indices of a given triangle from a given
1778 mesh."
1779 [#^Mesh mesh triangle-index]
1780 (let [indices (int-array 3)]
1781 (.getTriangle mesh triangle-index indices)
1782 (vec indices)))
1784 (defn vertex-UV-coord
1785 "Get the UV-coordinates of the vertex named by vertex-index"
1786 [#^Mesh mesh vertex-index]
1787 (let [UV-buffer
1788 (.getData
1789 (.getBuffer
1790 mesh
1791 VertexBuffer$Type/TexCoord))]
1792 [(.get UV-buffer (* vertex-index 2))
1793 (.get UV-buffer (+ 1 (* vertex-index 2)))]))
1795 (defn pixel-triangle [#^Geometry geo image index]
1796 (let [mesh (.getMesh geo)
1797 width (.getWidth image)
1798 height (.getHeight image)]
1799 (vec (map (fn [[u v]] (vector (* width u) (* height v)))
1800 (map (partial vertex-UV-coord mesh)
1801 (triangle-vertex-indices mesh index))))))
1803 (defn pixel-triangles
1804 "The pixel-space triangles of the Geometry, in the same order as
1805 (triangles geo)"
1806 [#^Geometry geo image]
1807 (let [height (.getHeight image)
1808 width (.getWidth image)]
1809 (map (partial pixel-triangle geo image)
1810 (range (.getTriangleCount (.getMesh geo))))))
1811 #+END_SRC
1812 #+end_listing
1814 *** The Affine Transform from one Triangle to Another
1816 =pixel-triangles= gives us the mesh triangles expressed in pixel
1817 coordinates and =triangles= gives us the mesh triangles expressed
1818 in world coordinates. The tactile-sensor-profile gives the
1819 position of each feeler in pixel-space. In order to convert
1820 pixel-space coordinates into world-space coordinates we need
1821 something that takes coordinates on the surface of one triangle
1822 and gives the corresponding coordinates on the surface of another
1823 triangle.
1825 Triangles are related by [[http://mathworld.wolfram.com/AffineTransformation.html][affine transformations]]: any triangle can be
1826 mapped onto any other by a combination of translation, rotation,
1827 scaling, and shearing. The affine transformation from one triangle
1828 to another is readily computable if each triangle is expressed as a
1829 $4 \times 4$ matrix.
1831 #+BEGIN_LaTeX
1832 $$
1833 \begin{bmatrix}
1834 x_1 & x_2 & x_3 & n_x \\
1835 y_1 & y_2 & y_3 & n_y \\
1836 z_1 & z_2 & z_3 & n_z \\
1837 1 & 1 & 1 & 1
1838 \end{bmatrix}
1839 $$
1840 #+END_LaTeX
1842 Here, the first three columns of the matrix are the vertices of
1843 the triangle. The last column is the right-handed unit normal of
1844 the triangle.
1846 With two triangles $T_{1}$ and $T_{2}$ each expressed as a
1847 matrix like above, the affine transform from $T_{1}$ to $T_{2}$
1848 is $T_{2}T_{1}^{-1}$.
1850 The clojure code below recapitulates the formulas above, using
1851 jMonkeyEngine's =Matrix4f= objects, which can describe any affine
1852 transformation.
1854 #+caption: Program to interpret triangles as affine transforms.
1855 #+name: triangle-affine
1856 #+begin_listing clojure
1857 #+BEGIN_SRC clojure
1858 (defn triangle->matrix4f
1859 "Converts the triangle into a 4x4 matrix: The first three columns
1860 contain the vertices of the triangle; the last contains the unit
1861 normal of the triangle. The bottom row is filled with 1s."
1862 [#^Triangle t]
1863 (let [mat (Matrix4f.)
1864 [vert-1 vert-2 vert-3]
1865 (mapv #(.get t %) (range 3))
1866 unit-normal (do (.calculateNormal t)(.getNormal t))
1867 vertices [vert-1 vert-2 vert-3 unit-normal]]
1868 (dorun
1869 (for [row (range 4) col (range 3)]
1870 (do
1871 (.set mat col row (.get (vertices row) col))
1872 (.set mat 3 row 1)))) mat))
1874 (defn triangles->affine-transform
1875 "Returns the affine transformation that converts each vertex in the
1876 first triangle into the corresponding vertex in the second
1877 triangle."
1878 [#^Triangle tri-1 #^Triangle tri-2]
1879 (.mult
1880 (triangle->matrix4f tri-2)
1881 (.invert (triangle->matrix4f tri-1))))
1882 #+END_SRC
1883 #+end_listing
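
As a quick sanity check, the transform computed this way should carry
each vertex of the first triangle onto the corresponding vertex of the
second. The REPL sketch below assumes jMonkeyEngine's =Triangle= and
=Vector3f= classes are imported, as they are elsewhere in =CORTEX=.

#+begin_src clojure
;; Sanity-check sketch: map a vertex of tri-1 onto the matching vertex
;; of tri-2 (tri-2 is tri-1 scaled by a factor of two).
(let [tri-1 (Triangle. (Vector3f. 0 0 0) (Vector3f. 1 0 0) (Vector3f. 0 1 0))
      tri-2 (Triangle. (Vector3f. 0 0 0) (Vector3f. 2 0 0) (Vector3f. 0 2 0))
      T (triangles->affine-transform tri-1 tri-2)]
  (.mult T (Vector3f. 1 0 0)))
;; => approximately (2.0, 0.0, 0.0)
#+end_src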
1885 *** Triangle Boundaries
1887 For efficiency's sake I will divide the tactile-profile image into
1888 small bounding boxes which circumscribe each pixel-triangle, then
1889 extract the points which lie inside the triangle and map them to
1890 3D-space using =triangles->affine-transform= above. To do this I need
1891 a function, =convex-bounds=, which finds the smallest box that
1892 contains a 2D triangle.
1894 =inside-triangle?= determines whether a point is inside a triangle
1895 in 2D pixel-space.
1897 #+caption: Program to efficiently determine point inclusion
1898 #+caption: in a triangle.
1899 #+name: in-triangle
1900 #+begin_listing clojure
1901 #+BEGIN_SRC clojure
1902 (defn convex-bounds
1903 "Returns the smallest square containing the given vertices, as a
1904 vector of integers [left top width height]."
1905 [verts]
1906 (let [xs (map first verts)
1907 ys (map second verts)
1908 x0 (Math/floor (apply min xs))
1909 y0 (Math/floor (apply min ys))
1910 x1 (Math/ceil (apply max xs))
1911 y1 (Math/ceil (apply max ys))]
1912 [x0 y0 (- x1 x0) (- y1 y0)]))
1914 (defn same-side?
1915 "Given the points p1 and p2 and the reference point ref, is point p
1916 on the same side of the line that goes through p1 and p2 as ref is?"
1917 [p1 p2 ref p]
1918 (<=
1919 0
1920 (.dot
1921 (.cross (.subtract p2 p1) (.subtract p p1))
1922 (.cross (.subtract p2 p1) (.subtract ref p1)))))
1924 (defn inside-triangle?
1925 "Is the point inside the triangle?"
1926 {:author "Dylan Holmes"}
1927 [#^Triangle tri #^Vector3f p]
1928 (let [[vert-1 vert-2 vert-3] [(.get1 tri) (.get2 tri) (.get3 tri)]]
1929 (and
1930 (same-side? vert-1 vert-2 vert-3 p)
1931 (same-side? vert-2 vert-3 vert-1 p)
1932 (same-side? vert-3 vert-1 vert-2 p))))
1933 #+END_SRC
1934 #+end_listing
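
A quick check of these predicates at the REPL (again assuming
jMonkeyEngine's =Triangle= and =Vector3f= are imported): an interior
point of a triangle is recognized as inside, while a distant point is
not.

#+begin_src clojure
(let [tri (Triangle. (Vector3f. 0 0 0) (Vector3f. 10 0 0) (Vector3f. 0 10 0))]
  [(inside-triangle? tri (Vector3f. 3 3 0))
   (inside-triangle? tri (Vector3f. 30 30 0))])
;; => [true false]
#+end_src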
1936 *** Feeler Coordinates
1938 The triangle-related functions above make short work of
1939 calculating the positions and orientations of each feeler in
1940 world-space.
1942 #+caption: Program to get the coordinates of ``feelers'' in
1943 #+caption: both world and UV-coordinates.
1944 #+name: feeler-coordinates
1945 #+begin_listing clojure
1946 #+BEGIN_SRC clojure
1947 (defn feeler-pixel-coords
1948 "Returns the coordinates of the feelers in pixel space in lists, one
1949 list for each triangle, ordered in the same way as (triangles) and
1950 (pixel-triangles)."
1951 [#^Geometry geo image]
1952 (map
1953 (fn [pixel-triangle]
1954 (filter
1955 (fn [coord]
1956 (inside-triangle? (->triangle pixel-triangle)
1957 (->vector3f coord)))
1958 (white-coordinates image (convex-bounds pixel-triangle))))
1959 (pixel-triangles geo image)))
1961 (defn feeler-world-coords
1962 "Returns the coordinates of the feelers in world space in lists, one
1963 list for each triangle, ordered in the same way as (triangles) and
1964 (pixel-triangles)."
1965 [#^Geometry geo image]
1966 (let [transforms
1967 (map #(triangles->affine-transform
1968 (->triangle %1) (->triangle %2))
1969 (pixel-triangles geo image)
1970 (triangles geo))]
1971 (map (fn [transform coords]
1972 (map #(.mult transform (->vector3f %)) coords))
1973 transforms (feeler-pixel-coords geo image))))
1974 #+END_SRC
1975 #+end_listing
1977 #+caption: Program to get the position of the base and tip of
1978 #+caption: each ``feeler''
1979 #+name: feeler-tips
1980 #+begin_listing clojure
1981 #+BEGIN_SRC clojure
1982 (defn feeler-origins
1983 "The world space coordinates of the root of each feeler."
1984 [#^Geometry geo image]
1985 (reduce concat (feeler-world-coords geo image)))
1987 (defn feeler-tips
1988 "The world space coordinates of the tip of each feeler."
1989 [#^Geometry geo image]
1990 (let [world-coords (feeler-world-coords geo image)
1991 normals
1992 (map
1993 (fn [triangle]
1994 (.calculateNormal triangle)
1995 (.clone (.getNormal triangle)))
1996 (map ->triangle (triangles geo)))]
1998 (mapcat (fn [origins normal]
1999 (map #(.add % normal) origins))
2000 world-coords normals)))
2002 (defn touch-topology
2003 [#^Geometry geo image]
2004 (collapse (reduce concat (feeler-pixel-coords geo image))))
2005 #+END_SRC
2006 #+end_listing
2008 *** Simulated Touch
2010 Now that the functions to construct feelers are complete,
2011 =touch-kernel= generates functions to be called from within a
2012 simulation that perform the necessary physics collisions to
2013 collect tactile data, and =touch!= recursively applies it to every
2014 node in the creature.
2016 #+caption: Efficient program to transform a ray from
2017 #+caption: one position to another.
2018 #+name: set-ray
2019 #+begin_listing clojure
2020 #+BEGIN_SRC clojure
2021 (defn set-ray [#^Ray ray #^Matrix4f transform
2022 #^Vector3f origin #^Vector3f tip]
2023 ;; Doing everything locally reduces garbage collection by enough to
2024 ;; be worth it.
2025 (.mult transform origin (.getOrigin ray))
2026 (.mult transform tip (.getDirection ray))
2027 (.subtractLocal (.getDirection ray) (.getOrigin ray))
2028 (.normalizeLocal (.getDirection ray)))
2029 #+END_SRC
2030 #+end_listing
2032 #+caption: This is the core of touch in =CORTEX=. Each feeler
2033 #+caption: follows the object it is bound to, reporting any
2034 #+caption: collisions that may happen.
2035 #+name: touch-kernel
2036 #+begin_listing clojure
2037 #+BEGIN_SRC clojure
2038 (defn touch-kernel
2039 "Constructs a function which will return tactile sensory data from
2040 'geo when called from inside a running simulation"
2041 [#^Geometry geo]
2042 (if-let
2043 [profile (tactile-sensor-profile geo)]
2044 (let [ray-reference-origins (feeler-origins geo profile)
2045 ray-reference-tips (feeler-tips geo profile)
2046 ray-length (tactile-scale geo)
2047 current-rays (map (fn [_] (Ray.)) ray-reference-origins)
2048 topology (touch-topology geo profile)
2049 correction (float (* ray-length -0.2))]
2050 ;; slight tolerance for very close collisions.
2051 (dorun
2052 (map (fn [origin tip]
2053 (.addLocal origin (.mult (.subtract tip origin)
2054 correction)))
2055 ray-reference-origins ray-reference-tips))
2056 (dorun (map #(.setLimit % ray-length) current-rays))
2057 (fn [node]
2058 (let [transform (.getWorldMatrix geo)]
2059 (dorun
2060 (map (fn [ray ref-origin ref-tip]
2061 (set-ray ray transform ref-origin ref-tip))
2062 current-rays ray-reference-origins
2063 ray-reference-tips))
2064 (vector
2065 topology
2066 (vec
2067 (for [ray current-rays]
2068 (do
2069 (let [results (CollisionResults.)]
2070 (.collideWith node ray results)
2071 (let [touch-objects
2072 (filter #(not (= geo (.getGeometry %)))
2073 results)
2074 limit (.getLimit ray)]
2075 [(if (empty? touch-objects)
2076 limit
2077 (let [response
2078 (apply min (map #(.getDistance %)
2079 touch-objects))]
2080 (FastMath/clamp
2081 (float
2082 (if (> response limit) (float 0.0)
2083 (+ response correction)))
2084 (float 0.0)
2085 limit)))
2086 limit])))))))))))
2087 #+END_SRC
2088 #+end_listing
2090 Armed with the =touch!= function, =CORTEX= becomes capable of
2091 giving creatures a sense of touch. A simple test is to create a
2092 cube that is outfitted with a uniform distribution of touch
2093 sensors. It can feel the ground and any balls that it touches.
2095 #+caption: =CORTEX= interface for creating touch in a simulated
2096 #+caption: creature.
2097 #+name: touch
2098 #+begin_listing clojure
2099 #+BEGIN_SRC clojure
2100 (defn touch!
2101 "Endow the creature with the sense of touch. Returns a sequence of
2102 functions, one for each body part with a tactile-sensor-profile,
2103 each of which when called returns sensory data for that body part."
2104 [#^Node creature]
2105 (filter
2106 (comp not nil?)
2107 (map touch-kernel
2108 (filter #(isa? (class %) Geometry)
2109 (node-seq creature)))))
2110 #+END_SRC
2111 #+end_listing
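
As a usage sketch, the functions returned by =touch!= can be polled
once per frame. Each one takes the node to collide against (as in
=touch-kernel= above) and returns a =[topology data]= pair; the
=creature= and =world= names below are assumed.

#+begin_src clojure
;; Hedged usage sketch: gather one frame of tactile data from every
;; touch-capable body part of a loaded creature.
(let [touch-fns (touch! creature)]
  (doall
   (for [get-touch touch-fns]
     (get-touch (.getRootNode world)))))
#+end_src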
2113 The tactile-sensor-profile image for the touch cube is a simple
2114 cross with a uniform distribution of touch sensors:
2116 #+caption: The touch profile for the touch-cube. Each pure white
2117 #+caption: pixel defines a touch sensitive feeler.
2118 #+name: touch-cube-uv-map
2119 #+ATTR_LaTeX: :width 7cm
2120 [[./images/touch-profile.png]]
2122 #+caption: The touch cube reacts to cannonballs. The black, red,
2123 #+caption: and white cross on the right is a visual display of
2124 #+caption: the creature's touch. White means that it is feeling
2125 #+caption: something strongly, black is not feeling anything,
2126 #+caption: and gray is in-between. The cube can feel both the
2127 #+caption: floor and the ball. Notice that when the ball causes
2128 #+caption: the cube to tip, the bottom face can still feel
2129 #+caption: part of the ground.
2130 #+name: touch-cube-test
2131 #+ATTR_LaTeX: :width 15cm
2132 [[./images/touch-cube.png]]
2134 ** Proprioception provides knowledge of your own body's position
2136 Close your eyes, and touch your nose with your right index finger.
2137 How did you do it? You could not see your hand, and neither your
2138 hand nor your nose could use the sense of touch to guide the path
2139 of your hand. There are no sound cues, and Taste and Smell
2140 certainly don't provide any help. You know where your hand is
2141 without your other senses because of Proprioception.
2143 Humans can sometimes lose this sense through viral infections or
2144 damage to the spinal cord or brain, and when they do, they lose
2145 the ability to control their own bodies without looking directly at
2146 the parts they want to move. In [[http://en.wikipedia.org/wiki/The_Man_Who_Mistook_His_Wife_for_a_Hat][The Man Who Mistook His Wife for a
2147 Hat]], a woman named Christina loses this sense and has to learn how
2148 to move by carefully watching her arms and legs. She describes
2149 proprioception as the "eyes of the body, the way the body sees
2150 itself".
2152 Proprioception in humans is mediated by [[http://en.wikipedia.org/wiki/Articular_capsule][joint capsules]], [[http://en.wikipedia.org/wiki/Muscle_spindle][muscle
2153 spindles]], and the [[http://en.wikipedia.org/wiki/Golgi_tendon_organ][Golgi tendon organs]]. These measure the relative
2154 positions of each body part by monitoring muscle strain and length.
2156 It's clear that this is a vital sense for fluid, graceful movement.
2157 It's also particularly easy to implement in jMonkeyEngine.
2159 My simulated proprioception calculates the relative angles of each
2160 joint from the rest position defined in the blender file. This
2161 simulates the muscle-spindles and joint capsules. I will deal with
2162 Golgi tendon organs, which calculate muscle strain, in the next
2163 section.
2165 *** Helper functions
2167 =absolute-angle= calculates the angle between two vectors,
2168 relative to a third axis vector. This angle is the number of
2169 radians you have to move counterclockwise around the axis vector
2170 to get from the first to the second vector. It is not commutative
2171 like a normal dot-product angle is.
2173 The purpose of these functions is to build a system of angle
2174 measurement that is biologically plausible.
2176 #+caption: Program to measure angles along a vector
2177 #+name: helpers
2178 #+begin_listing clojure
2179 #+BEGIN_SRC clojure
2180 (defn right-handed?
2181 "true iff the three vectors form a right handed coordinate
2182 system. The three vectors do not have to be normalized or
2183 orthogonal."
2184 [vec1 vec2 vec3]
2185 (pos? (.dot (.cross vec1 vec2) vec3)))
2187 (defn absolute-angle
2188 "The angle between 'vec1 and 'vec2 around 'axis. In the range
2189 [0 (* 2 Math/PI)]."
2190 [vec1 vec2 axis]
2191 (let [angle (.angleBetween vec1 vec2)]
2192 (if (right-handed? vec1 vec2 axis)
2193 angle (- (* 2 Math/PI) angle))))
2194 #+END_SRC
2195 #+end_listing
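
A short REPL interaction illustrates the non-commutativity mentioned
above (assuming jMonkeyEngine's =Vector3f= is imported):

#+begin_src clojure
(absolute-angle Vector3f/UNIT_X Vector3f/UNIT_Y Vector3f/UNIT_Z)
;; => ~1.571   ; a quarter turn counterclockwise around Z
(absolute-angle Vector3f/UNIT_Y Vector3f/UNIT_X Vector3f/UNIT_Z)
;; => ~4.712   ; swapping the arguments gives three quarters of a turn
#+end_src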
2197 *** Proprioception Kernel
2199 Given a joint, =proprioception-kernel= produces a function that
2200 calculates the Euler angles between the objects the joint
2201 connects. The only tricky part here is making the angles relative
2202 to the joint's initial ``straightness''.
2204 #+caption: Program to return biologically reasonable proprioceptive
2205 #+caption: data for each joint.
2206 #+name: proprioception
2207 #+begin_listing clojure
2208 #+BEGIN_SRC clojure
2209 (defn proprioception-kernel
2210 "Returns a function which returns proprioceptive sensory data when
2211 called inside a running simulation."
2212 [#^Node parts #^Node joint]
2213 (let [[obj-a obj-b] (joint-targets parts joint)
2214 joint-rot (.getWorldRotation joint)
2215 x0 (.mult joint-rot Vector3f/UNIT_X)
2216 y0 (.mult joint-rot Vector3f/UNIT_Y)
2217 z0 (.mult joint-rot Vector3f/UNIT_Z)]
2218 (fn []
2219 (let [rot-a (.clone (.getWorldRotation obj-a))
2220 rot-b (.clone (.getWorldRotation obj-b))
2221 x (.mult rot-a x0)
2222 y (.mult rot-a y0)
2223 z (.mult rot-a z0)
2225 X (.mult rot-b x0)
2226 Y (.mult rot-b y0)
2227 Z (.mult rot-b z0)
2228 heading (Math/atan2 (.dot X z) (.dot X x))
2229 pitch (Math/atan2 (.dot X y) (.dot X x))
2231 ;; rotate x-vector back to origin
2232 reverse
2233 (doto (Quaternion.)
2234 (.fromAngleAxis
2235 (.angleBetween X x)
2236 (let [cross (.normalize (.cross X x))]
2237 (if (= 0 (.length cross)) y cross))))
2238 roll (absolute-angle (.mult reverse Y) y x)]
2239 [heading pitch roll]))))
2241 (defn proprioception!
2242 "Endow the creature with the sense of proprioception. Returns a
2243 sequence of functions, one for each child of the \"joints\" node in
2244 the creature, which each report proprioceptive information about
2245 that joint."
2246 [#^Node creature]
2247 ;; extract the body's joints
2248 (let [senses (map (partial proprioception-kernel creature)
2249 (joints creature))]
2250 (fn []
2251 (map #(%) senses))))
2252 #+END_SRC
2253 #+end_listing
2255 =proprioception!= maps =proprioception-kernel= across all the
2256 joints of the creature. It uses the same list of joints that
2257 =joints= uses. Proprioception is the easiest sense to implement in
2258 =CORTEX=, and it will play a crucial role when efficiently
2259 implementing empathy.
2261 #+caption: In the upper right corner, the three proprioceptive
2262 #+caption: angle measurements are displayed. Red is yaw, Green is
2263 #+caption: pitch, and White is roll.
2264 #+name: proprio
2265 #+ATTR_LaTeX: :width 11cm
2266 [[./images/proprio.png]]
2268 ** Muscles contain both sensors and effectors
2270 Surprisingly enough, terrestrial creatures only move by using
2271 torque applied about their joints. There's not a single straight
2272 line of force in the human body at all! (A straight line of force
2273 would correspond to some sort of jet or rocket propulsion.)
2275 In humans, muscles are composed of muscle fibers which can contract
2276 to exert force. The muscle fibers which compose a muscle are
2277 partitioned into discrete groups which are each controlled by a
2278 single alpha motor neuron. A single alpha motor neuron might
2279 control as little as three or as many as one thousand muscle
2280 fibers. When the alpha motor neuron is engaged by the spinal cord,
2281 it activates all of the muscle fibers to which it is attached. The
2282 spinal cord generally engages the alpha motor neurons which control
2283 few muscle fibers before the motor neurons which control many
2284 muscle fibers. This recruitment strategy allows for precise
2285 movements at low strength. The collection of all motor neurons that
2286 control a muscle is called the motor pool. The brain essentially
2287 says "activate 30% of the motor pool" and the spinal cord recruits
2288 motor neurons until 30% are activated. Since the distribution of
2289 power among motor neurons is unequal and recruitment goes from
2290 weakest to strongest, the first 30% of the motor pool might be 5%
2291 of the strength of the muscle.
2293 My simulated muscles follow a similar design: Each muscle is
2294 defined by a 1-D array of numbers (the "motor pool"). Each entry in
2295 the array represents a motor neuron which controls a number of
2296 muscle fibers equal to the value of the entry. Each muscle has a
2297 scalar strength factor which determines the total force the muscle
2298 can exert when all motor neurons are activated. The effector
2299 function for a muscle takes a number to index into the motor pool,
2300 and then "activates" all the motor neurons whose index is lower or
2301 equal to the number. Each motor-neuron will apply force in
2302 proportion to its value in the array. Lower values cause less
2303 force. The lower values can be put at the "beginning" of the 1-D
2304 array to simulate the layout of actual human muscles, which are
2305 capable of more precise movements when exerting less force. Or, the
2306 motor pool can simulate more exotic recruitment strategies which do
2307 not correspond to human muscles.
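
A small numerical example may help. Suppose a (hypothetical) motor
pool of five motor neurons with strengths =[1 1 2 4 8]= and a total
muscle strength of 10.0. The running sum of the pool determines the
force at each recruitment level, exactly as =movement-kernel= below
computes it:

#+begin_src clojure
(let [pool     [1 1 2 4 8]           ; weak motor neurons first
      strength 10.0                  ; force with every fiber recruited
      integral (reductions + pool)]  ; => (1 2 4 8 16)
  (mapv #(float (* strength (/ % (last integral)))) integral))
;; => [0.625 1.25 2.5 5.0 10.0]
;; Recruiting the first two of the five motor neurons (40% of the pool)
;; yields only 1.25 of the 10.0 available units of force.
#+end_src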
2309 This 1D array is defined in an image file for ease of
2310 creation/visualization. Here is an example muscle profile image.
2312 #+caption: A muscle profile image that describes the strengths
2313 #+caption: of each motor neuron in a muscle. White is weakest
2314 #+caption: and dark red is strongest. This particular pattern
2315 #+caption: has weaker motor neurons at the beginning, just
2316 #+caption: like human muscle.
2317 #+name: muscle-recruit
2318 #+ATTR_LaTeX: :width 7cm
2319 [[./images/basic-muscle.png]]
2321 *** Muscle meta-data
2323 #+caption: Program to deal with loading muscle data from a blender
2324 #+caption: file's metadata.
2325 #+name: motor-pool
2326 #+begin_listing clojure
2327 #+BEGIN_SRC clojure
2328 (defn muscle-profile-image
2329 "Get the muscle-profile image from the node's blender meta-data."
2330 [#^Node muscle]
2331 (if-let [image (meta-data muscle "muscle")]
2332 (load-image image)))
2334 (defn muscle-strength
2335 "Return the strength of this muscle, or 1 if it is not defined."
2336 [#^Node muscle]
2337 (if-let [strength (meta-data muscle "strength")]
2338 strength 1))
2340 (defn motor-pool
2341 "Return a vector where each entry is the strength of the \"motor
2342 neuron\" at that part in the muscle."
2343 [#^Node muscle]
2344 (let [profile (muscle-profile-image muscle)]
2345 (vec
2346 (let [width (.getWidth profile)]
2347 (for [x (range width)]
2348 (- 255
2349 (bit-and
2350 0x0000FF
2351 (.getRGB profile x 0))))))))
2352 #+END_SRC
2353 #+end_listing
2355 Of note here is =motor-pool= which interprets the muscle-profile
2356 image in a way that allows me to use gradients between white and
2357 red, instead of shades of gray as I've been using for all the
2358 other senses. This is purely an aesthetic touch.
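
For example, decoding two pixels with the rule used in =motor-pool=
above:

#+begin_src clojure
;; a white pixel 0xFFFFFF has blue = 255, so strength = 255 - 255 = 0
;; a red pixel   0xFF0000 has blue = 0,   so strength = 255 - 0   = 255
(map #(- 255 (bit-and 0x0000FF %)) [0xFFFFFF 0xFF0000])
;; => (0 255)
#+end_src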
2360 *** Creating muscles
2362 #+caption: This is the core movement function in =CORTEX=, which
2363 #+caption: implements muscles that report on their activation.
2364 #+name: muscle-kernel
2365 #+begin_listing clojure
2366 #+BEGIN_SRC clojure
2367 (defn movement-kernel
2368 "Returns a function which when called with a integer value inside a
2369 running simulation will cause movement in the creature according
2370 to the muscle's position and strength profile. Each function
2371 returns the amount of force applied / max force."
2372 [#^Node creature #^Node muscle]
2373 (let [target (closest-node creature muscle)
2374 axis
2375 (.mult (.getWorldRotation muscle) Vector3f/UNIT_Y)
2376 strength (muscle-strength muscle)
2378 pool (motor-pool muscle)
2379 pool-integral (reductions + pool)
2380 forces
2381 (vec (map #(float (* strength (/ % (last pool-integral))))
2382 pool-integral))
2383 control (.getControl target RigidBodyControl)]
2384 ;;(println-repl (.getName target) axis)
2385 (fn [n]
2386 (let [pool-index (max 0 (min n (dec (count pool))))
2387 force (forces pool-index)]
2388 (.applyTorque control (.mult axis force))
2389 (float (/ force strength))))))
2391 (defn movement!
2392 "Endow the creature with the power of movement. Returns a sequence
2393 of functions, each of which accept an integer value and will
2394 activate their corresponding muscle."
2395 [#^Node creature]
2396 (for [muscle (muscles creature)]
2397 (movement-kernel creature muscle)))
2398 #+END_SRC
2399 #+end_listing
2402 =movement-kernel= creates a function that will move the nearest
2403 physical object to the muscle node. The muscle exerts a rotational
2404 force dependent on its orientation to the object in the blender
2405 file. The function returned by =movement-kernel= is also a sense
2406 function: it returns the percent of the total muscle strength that
2407 is currently being employed. This is analogous to muscle tension
2408 in humans and completes the sense of proprioception begun in the
2409 last section.
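
A hedged usage sketch: inside a running simulation, the functions
returned by =movement!= can be called with a recruitment index each
frame, and the value they return can be recorded as the muscle-tension
sense. The =creature= name below is assumed to be a loaded model.

#+begin_src clojure
(let [muscles (movement! creature)
      flex-first! (first muscles)]
  ;; recruit the first 50 motor neurons of the first muscle; the return
  ;; value is the fraction of that muscle's maximum force now applied.
  (flex-first! 50))
#+end_src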
2411 ** =CORTEX= brings complex creatures to life!
2413 The ultimate test of =CORTEX= is to create a creature with the full
2414 gamut of senses and put it through its paces.
2416 With all senses enabled, my right hand model looks like an
2417 intricate marionette hand with several strings for each finger:
2419 #+caption: View of the hand model with all sense nodes. You can see
2420 #+caption: the joint, muscle, ear, and eye nodes here.
2421 #+name: hand-nodes-1
2422 #+ATTR_LaTeX: :width 11cm
2423 [[./images/hand-with-all-senses2.png]]
2425 #+caption: An alternate view of the hand.
2426 #+name: hand-nodes-2
2427 #+ATTR_LaTeX: :width 15cm
2428 [[./images/hand-with-all-senses3.png]]
2430 With the hand fully rigged with senses, I can run it through a test
2431 that exercises everything at once.
2433 #+caption: A full test of the hand with all senses. Note especially
2434 #+caption: the interactions the hand has with itself: it feels
2435 #+caption: its own palm and fingers, and when it curls its fingers,
2436 #+caption: it sees them with its eye (which is located in the center
2437 #+caption: of the palm). The red block appears with a pure tone sound.
2438 #+caption: The hand then uses its muscles to launch the cube!
2439 #+name: integration
2440 #+ATTR_LaTeX: :width 16cm
2441 [[./images/integration.png]]
2443 ** =CORTEX= enables many possibilities for further research
2445 Oftentimes, the hardest part of building a system involving
2446 creatures is dealing with physics and graphics. =CORTEX= removes
2447 much of this initial difficulty and leaves researchers free to
2448 directly pursue their ideas. I hope that even undergrads with a
2449 passing curiosity about simulated touch or creature evolution will
2450 be able to use =CORTEX= for experimentation. =CORTEX= is a completely
2451 simulated world, and far from being a disadvantage, its simulated
2452 nature enables you to create senses and creatures that would be
2453 impossible to make in the real world.
2455 While not by any means a complete list, here are some paths
2456 =CORTEX= is well suited to help you explore:
2458 - Empathy :: my empathy program leaves many areas for
2459 improvement, among which are using vision to infer
2460 proprioception and looking up sensory experience with imagined
2461 vision, touch, and sound.
2462 - Evolution :: Karl Sims created a rich environment for
2463 simulating the evolution of creatures on a connection
2464 machine. Today, this can be redone and expanded with =CORTEX=
2465 on an ordinary computer.
2466 - Exotic senses :: =CORTEX= enables many fascinating senses that are
2467 not possible to build in the real world. For example,
2468 telekinesis is an interesting avenue to explore. You can also
2469 make a ``semantic'' sense which looks up metadata tags on
2470 objects in the environment; the metadata tags might contain
2471 other sensory information.
2472 - Imagination via subworlds :: this would involve a creature with
2473 an effector which creates an entire new sub-simulation where
2474 the creature has direct control over placement/creation of
2475 objects via simulated telekinesis. The creature observes this
2476 sub-world through its normal senses and uses its observations
2477 to make predictions about its top level world.
2478 - Simulated prescience :: step the simulation forward a few ticks,
2479 gather sensory data, then supply this data for the creature as
2480 one of its actual senses. The cost of prescience is slowing
2481 the simulation down by a factor proportional to however far
2482 you want the entities to see into the future. What happens
2483 when two evolved creatures that can each see into the future
2484 fight each other?
2485 - Swarm creatures :: Program a group of creatures that cooperate
2486 with each other. Because the creatures would be simulated, you
2487 could investigate computationally complex rules of behavior
2488 which still, from the group's point of view, would happen in
2489 ``real time''. Interactions could be as simple as cellular
2490 organisms communicating via flashing lights, or as complex as
2491 humanoids completing social tasks, etc.
2492 - =HACKER= for writing muscle-control programs :: Presented with
2493 low-level muscle control/ sense API, generate higher level
2494 programs for accomplishing various stated goals. Example goals
2495 might be "extend all your fingers" or "move your hand into the
2496 area with blue light" or "decrease the angle of this joint".
2497 It would be like Sussman's HACKER, except it would operate
2498 with much more data in a more realistic world. Start off with
2499 "calisthenics" to develop subroutines over the motor control
2500 API. This would be the "spinal cord" of a more intelligent
2501 creature. The low level programming code might be a Turing
2502 machine that could develop programs to iterate over a "tape"
2503 where each entry in the tape could control recruitment of the
2504 fibers in a muscle.
2505 - Sense fusion :: There is much work to be done on sense
2506 integration -- building up a coherent picture of the world and
2507 the things in it. With =CORTEX= as a base, you can explore
2508 concepts like self-organizing maps or cross modal clustering
2509 in ways that have never before been tried.
2510 - Inverse kinematics :: experiments in sense guided motor control
2511 are easy given =CORTEX='s support -- you can get right to the
2512 hard control problems without worrying about physics or
2513 senses.
2515 * =EMPATH=: action recognition in a simulated worm
2517 Here I develop a computational model of empathy, using =CORTEX= as a
2518 base. Empathy in this context is the ability to observe another
2519 creature and infer what sorts of sensations that creature is
2520 feeling. My empathy algorithm involves multiple phases. First is
2521 free-play, where the creature moves around and gains sensory
2522 experience. From this experience I construct a representation of the
2523 creature's sensory state space, which I call \Phi-space. Using
2524 \Phi-space, I construct an efficient function which takes the
2525 limited data that comes from observing another creature and enriches
2526 it to a full complement of imagined sensory data. I can then use the
2527 imagined sensory data to recognize what the observed creature is
2528 doing and feeling, using straightforward embodied action predicates.
2529 This is all demonstrated using a simple worm-like creature, and
2530 recognizing worm-actions based on limited data.
2532 #+caption: Here is the worm with which we will be working.
2533 #+caption: It is composed of 5 segments. Each segment has a
2534 #+caption: pair of extensor and flexor muscles. Each of the
2535 #+caption: worm's four joints is a hinge joint which allows
2536 #+caption: about 30 degrees of rotation to either side. Each segment
2537 #+caption: of the worm is touch-capable and has a uniform
2538 #+caption: distribution of touch sensors on each of its faces.
2539 #+caption: Each joint has a proprioceptive sense to detect
2540 #+caption: relative positions. The worm segments are all the
2541 #+caption: same except for the first one, which has a much
2542 #+caption: higher weight than the others to allow for easy
2543 #+caption: manual motor control.
2544 #+name: basic-worm-view
2545 #+ATTR_LaTeX: :width 10cm
2546 [[./images/basic-worm-view.png]]
2548 #+caption: Program for reading a worm from a blender file and
2549 #+caption: outfitting it with the senses of proprioception,
2550 #+caption: touch, and the ability to move, as specified in the
2551 #+caption: blender file.
2552 #+name: get-worm
2553 #+begin_listing clojure
2554 #+begin_src clojure
2555 (defn worm []
2556 (let [model (load-blender-model "Models/worm/worm.blend")]
2557 {:body (doto model (body!))
2558 :touch (touch! model)
2559 :proprioception (proprioception! model)
2560 :muscles (movement! model)}))
2561 #+end_src
2562 #+end_listing
2564 ** Embodiment factors action recognition into manageable parts
2566 Using empathy, I divide the problem of action recognition into a
2567 recognition process expressed in the language of a full complement
2568 of senses, and an imaginative process that generates full sensory
2569 data from partial sensory data. Splitting the action recognition
2570 problem in this manner greatly reduces the total amount of work to
2571 recognize actions: The imaginative process is mostly just matching
2572 previous experience, and the recognition process gets to use all
2573 the senses to directly describe any action.
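To make this factoring concrete, here is a minimal sketch; the names
=imagine= and =action-predicate= are illustrative placeholders rather
than functions from =CORTEX=.
#+begin_src clojure
(defn recognize-action
  "Hypothetical sketch of the factoring described above; =imagine=
   stands for the imaginative process (partial data -> full sensory
   data) and =action-predicate= for any predicate written against a
   full complement of senses."
  [imagine action-predicate partial-observations]
  (action-predicate (imagine partial-observations)))
#+end_src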
2575 ** Action recognition is easy with a full gamut of senses
2577 Embodied representations using multiple senses such as touch,
2578 proprioception, and muscle tension turn out to be exceedingly
2579 efficient at describing body-centered actions. It is the ``right
2580 language for the job''. For example, it takes only around 5 lines
2581 of LISP code to describe the action of ``curling'' using embodied
2582 primitives. It takes about 10 lines to describe the seemingly
2583 complicated action of wiggling.
2585 The following action predicates each take a stream of sensory
2586 experience, observe however much of it they desire, and decide
2587 whether the worm is doing the action they describe. =curled?=
2588 relies on proprioception, =resting?= relies on touch, =wiggling?=
2589 relies on a Fourier analysis of muscle contraction, and
2590 =grand-circle?= relies on touch and reuses =curled?= as a guard.
2592 #+caption: Program for detecting whether the worm is curled. This is the
2593 #+caption: simplest action predicate, because it only uses the last frame
2594 #+caption: of sensory experience, and only uses proprioceptive data. Even
2595 #+caption: this simple predicate, however, is automatically frame
2596 #+caption: independent and ignores vermopomorphic differences such as
2597 #+caption: worm textures and colors.
2598 #+name: curled
2599 #+begin_listing clojure
2600 #+begin_src clojure
2601 (defn curled?
2602 "Is the worm curled up?"
2603 [experiences]
2604 (every?
2605 (fn [[_ _ bend]]
2606 (> (Math/sin bend) 0.64))
2607 (:proprioception (peek experiences))))
2608 #+end_src
2609 #+end_listing
2611 #+caption: Program for summarizing the touch information in a patch
2612 #+caption: of skin.
2613 #+name: touch-summary
2614 #+begin_listing clojure
2615 #+begin_src clojure
2616 (defn contact
2617 "Determine how much contact a particular worm segment has with
2618 other objects. Returns a value between 0 and 1, where 1 is full
2619 contact and 0 is no contact."
2620 [touch-region [coords contact :as touch]]
2621 (-> (zipmap coords contact)
2622 (select-keys touch-region)
2623 (vals)
2624 (#(map first %))
2625 (average)
2626 (* 10)
2627 (- 1)
2628 (Math/abs)))
2629 #+end_src
2630 #+end_listing
2633 #+caption: Program for detecting whether the worm is at rest. This program
2634 #+caption: uses a summary of the tactile information from the underbelly
2635 #+caption: of the worm, and is only true if every segment is touching the
2636 #+caption: floor. Note that this function contains no references to
2637 #+caption: proprioction at all.
2638 #+name: resting
2639 #+begin_listing clojure
2640 #+begin_src clojure
2641 (def worm-segment-bottom (rect-region [8 15] [14 22]))
2643 (defn resting?
2644 "Is the worm resting on the ground?"
2645 [experiences]
2646 (every?
2647 (fn [touch-data]
2648 (< 0.9 (contact worm-segment-bottom touch-data)))
2649 (:touch (peek experiences))))
2650 #+end_src
2651 #+end_listing
2653 #+caption: Program for detecting whether the worm is curled up into a
2654 #+caption: full circle. Here the embodied approach begins to shine, as
2655 #+caption: I am able to both use a previous action predicate (=curled?=)
2656 #+caption: as well as the direct tactile experience of the head and tail.
2657 #+name: grand-circle
2658 #+begin_listing clojure
2659 #+begin_src clojure
2660 (def worm-segment-bottom-tip (rect-region [15 15] [22 22]))
2662 (def worm-segment-top-tip (rect-region [0 15] [7 22]))
2664 (defn grand-circle?
2665 "Does the worm form a majestic circle (one end touching the other)?"
2666 [experiences]
2667 (and (curled? experiences)
2668 (let [worm-touch (:touch (peek experiences))
2669 tail-touch (worm-touch 0)
2670 head-touch (worm-touch 4)]
2671 (and (< 0.55 (contact worm-segment-bottom-tip tail-touch))
2672 (< 0.55 (contact worm-segment-top-tip head-touch))))))
2673 #+end_src
2674 #+end_listing
2677 #+caption: Program for detecting whether the worm has been wiggling for
2678 #+caption: the last few frames. It uses a Fourier analysis of the muscle
2679 #+caption: contractions of the worm's tail to determine wiggling. This is
2680 #+caption: significant because there is no particular frame that clearly
2681 #+caption: indicates that the worm is wiggling --- only when multiple frames
2682 #+caption: are analyzed together is the wiggling revealed. Defining
2683 #+caption: wiggling this way also gives the worm an opportunity to learn
2684 #+caption: and recognize ``frustrated wiggling'', where the worm tries to
2685 #+caption: wiggle but can't. Frustrated wiggling is very visually different
2686 #+caption: from actual wiggling, but this definition gives it to us for free.
2687 #+name: wiggling
2688 #+begin_listing clojure
2689 #+begin_src clojure
2690 (defn fft [nums]
2691 (map
2692 #(.getReal %)
2693 (.transform
2694 (FastFourierTransformer. DftNormalization/STANDARD)
2695 (double-array nums) TransformType/FORWARD)))
2697 (def indexed (partial map-indexed vector))
2699 (defn max-indexed [s]
2700 (first (sort-by (comp - second) (indexed s))))
2702 (defn wiggling?
2703 "Is the worm wiggling?"
2704 [experiences]
2705 (let [analysis-interval 0x40]
2706 (when (> (count experiences) analysis-interval)
2707 (let [a-flex 3
2708 a-ex 2
2709 muscle-activity
2710 (map :muscle (vector:last-n experiences analysis-interval))
2711 base-activity
2712 (map #(- (% a-flex) (% a-ex)) muscle-activity)]
2713 (= 2
2714 (first
2715 (max-indexed
2716 (map #(Math/abs %)
2717 (take 20 (fft base-activity))))))))))
2718 #+end_src
2719 #+end_listing
2721 With these action predicates, I can now recognize the actions of
2722 the worm while it is moving under my control and I have access to
2723 all the worm's senses.
2725 #+caption: Use the action predicates defined earlier to report on
2726 #+caption: what the worm is doing while in simulation.
2727 #+name: report-worm-activity
2728 #+begin_listing clojure
2729 #+begin_src clojure
2730 (defn debug-experience
2731 [experiences text]
2732 (cond
2733 (grand-circle? experiences) (.setText text "Grand Circle")
2734 (curled? experiences) (.setText text "Curled")
2735 (wiggling? experiences) (.setText text "Wiggling")
2736 (resting? experiences) (.setText text "Resting")))
2737 #+end_src
2738 #+end_listing
2740 #+caption: Using =debug-experience=, the body-centered predicates
2741 #+caption: work together to classify the behaviour of the worm.
2742 #+caption: The predicates are operating with access to the worm's
2743 #+caption: full sensory data.
2744 #+name: basic-worm-view
2745 #+ATTR_LaTeX: :width 10cm
2746 [[./images/worm-identify-init.png]]
2748 These action predicates satisfy the recognition requirement of an
2749 empathic recognition system. There is power in the simplicity of
2750 the action predicates. They describe their actions without getting
2751 confused by visual details of the worm. Each one is frame
2752 independent, but more than that, they are each independent of
2753 irrelevant visual details of the worm and the environment. They
2754 will work regardless of whether the worm is a different color or
2755 heavily textured, or if the environment has strange lighting.
2757 The trick now is to make the action predicates work even when the
2758 sensory data on which they depend is absent. If I can do that, then
2759 I will have gained much.
2761 ** \Phi-space describes the worm's experiences
2763 As a first step towards building empathy, I need to gather all of
2764 the worm's experiences during free play. I use a simple vector to
2765 store all the experiences.
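Each entry in this vector is a map holding (at least) the
=:proprioception=, =:touch=, and =:muscle= data for one frame. As a
hedged illustration of how the vector is laid out, the following
sketch pulls the most recent joint bend angles out of it, mirroring
the access pattern the action predicates above use.
#+begin_src clojure
(defn latest-bend-angles
  "Hedged sketch: extract the most recent joint bend angles from an
   experience vector, the same way =curled?= reads it."
  [experiences]
  (map (fn [[_ _ bend]] bend)
       (:proprioception (peek experiences))))
#+end_src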
2767 Each element of the experience vector exists in the vast space of
2768 all possible worm-experiences. Most of this vast space is actually
2769 unreachable due to physical constraints of the worm's body. For
2770 example, the worm's segments are connected by hinge joints that put
2771 a practical limit on the worm's range of motions without limiting
2772 its degrees of freedom. Some groupings of senses are impossible;
2773 the worm can not be bent into a circle so that its ends are
2774 touching and at the same time not also experience the sensation of
2775 touching itself.
2777 As the worm moves around during free play and its experience vector
2778 grows larger, the vector begins to define a subspace which is all
2779 the sensations the worm can practically experience during normal
2780 operation. I call this subspace \Phi-space, short for
2781 physical-space. The experience vector defines a path through
2782 \Phi-space. This path has interesting properties that all derive
2783 from physical embodiment. The proprioceptive components are
2784 completely smooth, because in order for the worm to move from one
2785 position to another, it must pass through the intermediate
2786 positions. The path invariably forms loops as actions are repeated.
2787 Finally and most importantly, proprioception actually gives very
2788 strong inference about the other senses. For example, when the worm
2789 is flat, you can infer that it is touching the ground and that its
2790 muscles are not active, because if the muscles were active, the
2791 worm would be moving and would not be perfectly flat. In order to
2792 stay flat, the worm has to be touching the ground, or it would
2793 again be moving out of the flat position due to gravity. If the
2794 worm is positioned in such a way that it interacts with itself,
2795 then it is very likely to be feeling the same tactile feelings as
2796 the last time it was in that position, because it has the same body
2797 as then. If you observe multiple frames of proprioceptive data,
2798 then you can become increasingly confident about the exact
2799 activations of the worm's muscles, because it generally takes a
2800 unique combination of muscle contractions to transform the worm's
2801 body along a specific path through \Phi-space.
2803 There is a simple way of taking \Phi-space and the total ordering
2804 provided by an experience vector and reliably inferring the rest of
2805 the senses.
2807 ** Empathy is the process of tracing through \Phi-space
2809 Here is the core of a basic empathy algorithm, starting with an
2810 experience vector:
2812 First, group the experiences into tiered proprioceptive bins. I use
2813 three tiers of bins whose sizes are powers of 10; the smallest bin has
2814 an approximate size of 0.001 radians in all proprioceptive dimensions.
2816 Then, given a sequence of proprioceptive input, generate a set of
2817 matching experience records for each input, using the tiered
2818 proprioceptive bins.
2820 Finally, to infer sensory data, select the longest consecutive chain
2821 of experiences. Consecutive means that the experiences
2822 appear next to each other in the experience vector.
2824 This algorithm has three advantages:
2826 1. It's simple
2828 2. It's very fast -- retrieving possible interpretations takes
2829 constant time. Tracing through chains of interpretations takes
2830 time proportional to the average number of experiences in a
2831 proprioceptive bin. Redundant experiences in \Phi-space can be
2832 merged to save computation.
2834 3. It protects against wrong interpretations of transient ambiguous
2835 proprioceptive data. For example, if the worm is flat for just
2836 an instant, this flatness will not be interpreted as implying
2837 that the worm has its muscles relaxed, since the flatness is
2838 part of a longer chain which includes a distinct pattern of
2839 muscle activation. Markov chains or other memoryless statistical
2840 models that operate on individual frames may very well make this
2841 mistake.
2843 #+caption: Program to convert an experience vector into a
2844 #+caption: proprioceptively binned lookup function.
2845 #+name: bin
2846 #+begin_listing clojure
2847 #+begin_src clojure
2848 (defn bin [digits]
2849 (fn [angles]
2850 (->> angles
2851 (flatten)
2852 (map (juxt #(Math/sin %) #(Math/cos %)))
2853 (flatten)
2854 (mapv #(Math/round (* % (Math/pow 10 (dec digits))))))))
2856 (defn gen-phi-scan
2857 "Nearest-neighbors with binning. Only returns a result if
2858 the proprioceptive data is within 10% of a previously recorded
2859 result in all dimensions."
2860 [phi-space]
2861 (let [bin-keys (map bin [3 2 1])
2862 bin-maps
2863 (map (fn [bin-key]
2864 (group-by
2865 (comp bin-key :proprioception phi-space)
2866 (range (count phi-space)))) bin-keys)
2867 lookups (map (fn [bin-key bin-map]
2868 (fn [proprio] (bin-map (bin-key proprio))))
2869 bin-keys bin-maps)]
2870 (fn lookup [proprio-data]
2871 (set (some #(% proprio-data) lookups)))))
2872 #+end_src
2873 #+end_listing
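As a rough usage sketch (the numbers are computed by hand, so treat
them as approximate), here is what the binning keys look like for a
single joint held at a bend of 0.5 radians:
#+begin_src clojure
;; One joint with [heading pitch bend] = [0 0 0.5]; coarser bins use
;; fewer digits, so nearby angles collide into the same key.
((bin 3) [[0 0 0.5]]) ; => [0 100 0 100 48 88]
((bin 1) [[0 0 0.5]]) ; => [0 1 0 1 0 1]
#+end_src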
2875 #+caption: =longest-thread= finds the longest path of consecutive
2876 #+caption: experiences to explain proprioceptive worm data.
2877 #+name: phi-space-history-scan
2878 #+ATTR_LaTeX: :width 10cm
2879 [[./images/aurellem-gray.png]]
2881 =longest-thread= infers sensory data by stitching together pieces
2882 from previous experience. It prefers longer chains of previous
2883 experience to shorter ones. For example, during training the worm
2884 might rest on the ground for one second before it performs its
2885 exercises. If during recognition the worm rests on the ground for
2886 five seconds, =longest-thread= will accommodate this five-second
2887 rest period by looping the one-second rest chain five times.
2889 =longest-thread= takes time proportional to the average number of
2890 entries in a proprioceptive bin, because for each element in the
2891 starting bin it performs a series of set lookups in the preceding
2892 bins. If the total history is limited, then this is only a constant
2893 multiple times the number of entries in the starting bin. This
2894 analysis also applies even if the action requires multiple longest
2895 chains -- it's still the average number of entries in a
2896 proprioceptive bin times the desired chain length. Because
2897 =longest-thread= is so efficient and simple, I can interpret
2898 worm-actions in real time.
2900 #+caption: Program to calculate empathy by tracing through \Phi-space
2901 #+caption: and finding the longest (i.e. most coherent) interpretation
2902 #+caption: of the data.
2903 #+name: longest-thread
2904 #+begin_listing clojure
2905 #+begin_src clojure
2906 (defn longest-thread
2907 "Find the longest thread from phi-index-sets. The index sets should
2908 be ordered from most recent to least recent."
2909 [phi-index-sets]
2910 (loop [result '()
2911 [thread-bases & remaining :as phi-index-sets] phi-index-sets]
2912 (if (empty? phi-index-sets)
2913 (vec result)
2914 (let [threads
2915 (for [thread-base thread-bases]
2916 (loop [thread (list thread-base)
2917 remaining remaining]
2918 (let [next-index (dec (first thread))]
2919 (cond (empty? remaining) thread
2920 (contains? (first remaining) next-index)
2921 (recur
2922 (cons next-index thread) (rest remaining))
2923 :else thread))))
2924 longest-thread
2925 (reduce (fn [thread-a thread-b]
2926 (if (> (count thread-a) (count thread-b))
2927 thread-a thread-b))
2928 '(nil)
2929 threads)]
2930 (recur (concat longest-thread result)
2931 (drop (count longest-thread) phi-index-sets))))))
2932 #+end_src
2933 #+end_listing
2935 There is one final piece, which is to replace missing sensory data
2936 with a best-guess estimate. While I could fill in missing data by
2937 using a gradient over the closest known sensory data points,
2938 averages can be misleading. It is certainly possible to create an
2939 impossible sensory state by averaging two possible sensory states.
2940 Therefore, I simply replicate the most recent sensory experience to
2941 fill in the gaps.
2943 #+caption: Fill in blanks in sensory experience by replicating the most
2944 #+caption: recent experience.
2945 #+name: infer-nils
2946 #+begin_listing clojure
2947 #+begin_src clojure
2948 (defn infer-nils
2949 "Replace nils with the next available non-nil element in the
2950 sequence, or barring that, 0."
2951 [s]
2952 (loop [i (dec (count s))
2953 v (transient s)]
2954 (if (zero? i) (persistent! v)
2955 (if-let [cur (v i)]
2956 (if (get v (dec i) 0)
2957 (recur (dec i) v)
2958 (recur (dec i) (assoc! v (dec i) cur)))
2959 (recur i (assoc! v i 0))))))
2960 #+end_src
2961 #+end_listing
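A quick hand-worked usage sketch: each =nil= takes the value of the
nearest non-nil entry to its right.
#+begin_src clojure
;; nils are filled with the next non-nil element to their right.
(infer-nils [nil 3 nil nil 7 8]) ; => [3 3 7 7 7 8]
#+end_src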
2963 ** =EMPATH= recognizes actions efficiently
2965 To use =EMPATH= with the worm, I first need to gather a set of
2966 experiences from the worm that includes the actions I want to
2967 recognize. The =generate-phi-space= program (listing
2968 \ref{generate-phi-space}) runs the worm through a series of
2969 exercises and gathers those experiences into a vector. The
2970 =do-all-the-things= program is a routine expressed in a simple
2971 muscle contraction script language for automated worm control. It
2972 causes the worm to rest, curl, and wiggle over about 700 frames
2973 (approx. 11 seconds).
2975 #+caption: Program to gather the worm's experiences into a vector for
2976 #+caption: further processing. The =motor-control-program= line uses
2977 #+caption: a motor control script that causes the worm to execute a series
2978 #+caption: of ``exercises'' that include all the action predicates.
2979 #+name: generate-phi-space
2980 #+begin_listing clojure
2981 #+begin_src clojure
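;; Each script entry appears to be [frame muscle-label recruitment];
;; this reading is inferred from how motor-control-program uses the
;; script together with worm-muscle-labels below.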
2982 (def do-all-the-things
2983 (concat
2984 curl-script
2985 [[300 :d-ex 40]
2986 [320 :d-ex 0]]
2987 (shift-script 280 (take 16 wiggle-script))))
2989 (defn generate-phi-space []
2990 (let [experiences (atom [])]
2991 (run-world
2992 (apply-map
2993 worm-world
2994 (merge
2995 (worm-world-defaults)
2996 {:end-frame 700
2997 :motor-control
2998 (motor-control-program worm-muscle-labels do-all-the-things)
2999 :experiences experiences})))
3000 @experiences))
3001 #+end_src
3002 #+end_listing
3004 #+caption: Use longest thread and a phi-space generated from a short
3005 #+caption: exercise routine to interpret actions during free play.
3006 #+name: empathy-debug
3007 #+begin_listing clojure
3008 #+begin_src clojure
3009 (defn init []
3010 (def phi-space (generate-phi-space))
3011 (def phi-scan (gen-phi-scan phi-space)))
3013 (defn empathy-demonstration []
3014 (let [proprio (atom ())]
3015 (fn
3016 [experiences text]
3017 (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
3018 (swap! proprio (partial cons phi-indices))
3019 (let [exp-thread (longest-thread (take 300 @proprio))
3020 empathy (mapv phi-space (infer-nils exp-thread))]
3021 (println-repl (vector:last-n exp-thread 22))
3022 (cond
3023 (grand-circle? empathy) (.setText text "Grand Circle")
3024 (curled? empathy) (.setText text "Curled")
3025 (wiggling? empathy) (.setText text "Wiggling")
3026 (resting? empathy) (.setText text "Resting")
3027 :else (.setText text "Unknown")))))))
3029 (defn empathy-experiment [record]
3030 (.start (worm-world :experience-watch (debug-experience-phi)
3031 :record record :worm worm*)))
3032 #+end_src
3033 #+end_listing
3035 The result of running =empathy-experiment= is that the system is
3036 generally able to interpret worm actions using the action-predicates
3037 on simulated sensory data just as well as with actual data. Figure
3038 \ref{empathy-debug-image} was generated using =empathy-experiment=:
3040 #+caption: From only proprioceptive data, =EMPATH= was able to infer
3041 #+caption: the complete sensory experience and classify four poses
3042 #+caption: (The last panel shows a composite image of \emph{wiggling},
3043 #+caption: a dynamic pose.)
3044 #+name: empathy-debug-image
3045 #+ATTR_LaTeX: :width 10cm :placement [H]
3046 [[./images/empathy-1.png]]
3048 One way to measure the performance of =EMPATH= is to compare the
3049 suitability of the imagined sense experience to trigger the same
3050 action predicates as the real sensory experience.
3052 #+caption: Determine how closely empathy approximates actual
3053 #+caption: sensory data.
3054 #+name: test-empathy-accuracy
3055 #+begin_listing clojure
3056 #+begin_src clojure
3057 (def worm-action-label
3058 (juxt grand-circle? curled? wiggling?))
3060 (defn compare-empathy-with-baseline [matches]
3061 (let [proprio (atom ())]
3062 (fn
3063 [experiences text]
3064 (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
3065 (swap! proprio (partial cons phi-indices))
3066 (let [exp-thread (longest-thread (take 300 @proprio))
3067 empathy (mapv phi-space (infer-nils exp-thread))
3068 experience-matches-empathy
3069 (= (worm-action-label experiences)
3070 (worm-action-label empathy))]
3071 (println-repl experience-matches-empathy)
3072 (swap! matches #(conj % experience-matches-empathy)))))))
3074 (defn accuracy [v]
3075 (float (/ (count (filter true? v)) (count v))))
3077 (defn test-empathy-accuracy []
3078 (let [res (atom [])]
3079 (run-world
3080 (worm-world :experience-watch
3081 (compare-empathy-with-baseline res)
3082 :worm worm*))
3083 (accuracy @res)))
3084 #+end_src
3085 #+end_listing
3087 Running =test-empathy-accuracy= using the very short exercise
3088 program defined in listing \ref{generate-phi-space}, and then doing
3089 a similar pattern of activity manually yields an accuracy of around
3090 73%. This is based on very limited worm experience. By training the
3091 worm for longer, the accuracy dramatically improves.
3093 #+caption: Program to generate \Phi-space using manual training.
3094 #+name: manual-phi-space
3095 #+begin_listing clojure
3096 #+begin_src clojure
3097 (defn init-interactive []
3098 (def phi-space
3099 (let [experiences (atom [])]
3100 (run-world
3101 (apply-map
3102 worm-world
3103 (merge
3104 (worm-world-defaults)
3105 {:experiences experiences})))
3106 @experiences))
3107 (def phi-scan (gen-phi-scan phi-space)))
3108 #+end_src
3109 #+end_listing
3111 After about 1 minute of manual training, I was able to achieve 95%
3112 accuracy on manual testing of the worm using =init-interactive= and
3113 =test-empathy-accuracy=. The majority of errors are near the
3114 boundaries of transitioning from one type of action to another.
3115 During these transitions the exact label for the action is more open
3116 to interpretation, and disagreement between empathy and experience
3117 is more excusable.
3119 ** Digression: Learn touch sensor layout through free play
3121 In the previous section I showed how to compute actions in terms of
3122 body-centered predicates which relied on average touch activation of
3123 pre-defined regions of the worm's skin. What if, instead of
3124 receiving touch pre-grouped into the six faces of each worm
3125 segment, the true topology of the worm's skin were unknown? This is
3126 more similar to how a nerve fiber bundle might be arranged. While
3127 two fibers that are close in a nerve bundle /might/ correspond to
3128 two touch sensors that are close together on the skin, the process
3129 of taking a complicated surface and forcing it into essentially a
3130 circle requires some cuts and rearrangements.
3132 In this section I show how to automatically learn the skin-topology of
3133 a worm segment by free exploration. As the worm rolls around on the
3134 floor, large sections of its surface get activated. If the worm has
3135 stopped moving, then whatever region of skin is touching the
3136 floor is probably an important region, and should be recorded.
3138 #+caption: Program to detect whether the worm is in a resting state
3139 #+caption: with one face touching the floor.
3140 #+name: pure-touch
3141 #+begin_listing clojure
3142 #+begin_src clojure
3143 (def full-contact [(float 0.0) (float 0.1)])
3145 (defn pure-touch?
3146 "This is worm specific code to determine if a large region of touch
3147 sensors is either all on or all off."
3148 [[coords touch :as touch-data]]
3149 (= (set (map first touch)) (set full-contact)))
3150 #+end_src
3151 #+end_listing
3153 After collecting these important regions, there will be many nearly
3154 similar touch regions. While for some purposes the subtle
3155 differences between these regions will be important, for my
3156 purposes I collapse them into mostly non-overlapping sets using
3157 =remove-similar= in listing \ref{remove-similar}.
3159 #+caption: Program to take a list of sets of points and ``collapse them''
3160 #+caption: so that the remaining sets in the list are significantly
3161 #+caption: different from each other. Prefer smaller sets to larger ones.
3162 #+name: remove-similar
3163 #+begin_listing clojure
3164 #+begin_src clojure
3165 (defn remove-similar
3166 [coll]
3167 (loop [result () coll (sort-by (comp - count) coll)]
3168 (if (empty? coll) result
3169 (let [[x & xs] coll
3170 c (count x)]
3171 (if (some
3172 (fn [other-set]
3173 (let [oc (count other-set)]
3174 (< (- (count (union other-set x)) c) (* oc 0.1))))
3175 xs)
3176 (recur result xs)
3177 (recur (cons x result) xs))))))
3178 #+end_src
3179 #+end_listing
3181 Actually running this simulation is easy given =CORTEX='s facilities.
3183 #+caption: Collect experiences while the worm moves around. Filter the touch
3184 #+caption: sensations by stable ones, collapse similar ones together,
3185 #+caption: and report the regions learned.
3186 #+name: learn-touch
3187 #+begin_listing clojure
3188 #+begin_src clojure
3189 (defn learn-touch-regions []
3190 (let [experiences (atom [])
3191 world (apply-map
3192 worm-world
3193 (assoc (worm-segment-defaults)
3194 :experiences experiences))]
3195 (run-world world)
3196 (->>
3197 @experiences
3198 (drop 175)
3199 ;; access the single segment's touch data
3200 (map (comp first :touch))
3201 ;; only deal with "pure" touch data to determine surfaces
3202 (filter pure-touch?)
3203 ;; associate coordinates with touch values
3204 (map (partial apply zipmap))
3205 ;; select those regions where contact is being made
3206 (map (partial group-by second))
3207 (map #(get % full-contact))
3208 (map (partial map first))
3209 ;; remove redundant/subset regions
3210 (map set)
3211 remove-similar)))
3213 (defn learn-and-view-touch-regions []
3214 (map view-touch-region
3215 (learn-touch-regions)))
3216 #+end_src
3217 #+end_listing
3219 The only thing remaining to define is the particular motion the worm
3220 must take. I accomplish this with a simple motor control program.
3222 #+caption: Motor control program for making the worm roll on the ground.
3223 #+caption: This could also be replaced with random motion.
3224 #+name: worm-roll
3225 #+begin_listing clojure
3226 #+begin_src clojure
3227 (defn touch-kinesthetics []
3228 [[170 :lift-1 40]
3229 [190 :lift-1 19]
3230 [206 :lift-1 0]
3232 [400 :lift-2 40]
3233 [410 :lift-2 0]
3235 [570 :lift-2 40]
3236 [590 :lift-2 21]
3237 [606 :lift-2 0]
3239 [800 :lift-1 30]
3240 [809 :lift-1 0]
3242 [900 :roll-2 40]
3243 [905 :roll-2 20]
3244 [910 :roll-2 0]
3246 [1000 :roll-2 40]
3247 [1005 :roll-2 20]
3248 [1010 :roll-2 0]
3250 [1100 :roll-2 40]
3251 [1105 :roll-2 20]
3252 [1110 :roll-2 0]
3253 ])
3254 #+end_src
3255 #+end_listing
3258 #+caption: The small worm rolls around on the floor, driven
3259 #+caption: by the motor control program in listing \ref{worm-roll}.
3260 #+name: worm-roll
3261 #+ATTR_LaTeX: :width 12cm
3262 [[./images/worm-roll.png]]
3265 #+caption: After completing its adventures, the worm now knows
3266 #+caption: how its touch sensors are arranged along its skin. These
3267 #+caption: are the regions that were deemed important by
3268 #+caption: =learn-touch-regions=. Note that the worm has discovered
3269 #+caption: that it has six sides.
3270 #+name: worm-touch-map
3271 #+ATTR_LaTeX: :width 12cm
3272 [[./images/touch-learn.png]]
3274 While simple, =learn-touch-regions= exploits regularities in both
3275 the worm's physiology and the worm's environment to correctly
3276 deduce that the worm has six sides. Note that =learn-touch-regions=
3277 would work just as well even if the worm's touch sense data were
3278 completely scrambled. The cross shape is just for convenience. This
3279 example justifies the use of pre-defined touch regions in =EMPATH=.
3281 * Contributions
3283 In this thesis you have seen the =CORTEX= system, a complete
3284 environment for creating simulated creatures. You have seen how to
3285 implement five senses: touch, proprioception, hearing, vision, and
3286 muscle tension. You have seen how to create new creatures using
3287 Blender, a 3D modeling tool. I hope that =CORTEX= will be useful in
3288 further research projects. To this end I have included the full
3289 source to =CORTEX= along with a large suite of tests and examples. I
3290 have also created a user guide for =CORTEX= which is included in an
3291 appendix to this thesis \ref{}.
3292 # dxh: todo reference appendix
3294 You have also seen how I used =CORTEX= as a platform to attack the
3295 /action recognition/ problem, which is the problem of recognizing
3296 actions in video. You saw a simple system called =EMPATH= which
3297 identifies actions by first describing actions in a body-centered,
3298 rich sense language, then inferring a full range of sensory
3299 experience from limited data using previous experience gained from
3300 free play.
3302 As a minor digression, you also saw how I used =CORTEX= to enable a
3303 tiny worm to discover the topology of its skin simply by rolling on
3304 the ground.
3306 In conclusion, the main contributions of this thesis are:
3308 - =CORTEX=, a system for creating simulated creatures with rich
3309 senses.
3310 - =EMPATH=, a program for recognizing actions by imagining sensory
3311 experience.
3313 # An anatomical joke:
3314 # - Training
3315 # - Skeletal imitation
3316 # - Sensory fleshing-out
3317 # - Classification
3318 #+BEGIN_LaTeX
3319 \appendix
3320 #+END_LaTeX
3321 * Appendix: =CORTEX= User Guide
3323 Those who write a thesis should endeavor to make their code not only
3324 accessible, but actually usable, as a way to pay back the community
3325 that made the thesis possible in the first place. This thesis would
3326 not be possible without Free Software such as jMonkeyEngine3,
3327 Blender, clojure, emacs, ffmpeg, and many other tools. That is why I
3328 have included this user guide, in the hope that someone else might
3329 find =CORTEX= useful.
3331 ** Obtaining =CORTEX=
3333 You can get =CORTEX= from its Mercurial repository at
3334 http://hg.bortreb.com/cortex. You may also download =CORTEX=
3335 releases at http://aurellem.org/cortex/releases/. As a condition of
3336 making this thesis, I have also provided Professor Winston the
3337 =CORTEX= source, and he knows how to run the demos and get started.
3338 You may also email me at =cortex@aurellem.org= and I may help where
3339 I can.
3341 ** Running =CORTEX=
3343 =CORTEX= comes with README and INSTALL files that will guide you
3344 through installation and running the test suite. In particular you
3345 should look at =cortex.test=, which contains test suites that
3346 run through all senses and multiple creatures.
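If the test suites follow the standard =clojure.test= conventions
(this is an assumption; when in doubt, the README and INSTALL files
are authoritative), they can be run from a REPL roughly as follows:
#+BEGIN_SRC clojure
;; Hedged sketch -- assumes the suites under cortex.test use clojure.test.
(require 'clojure.test 'cortex.test)
(clojure.test/run-all-tests #"cortex\.test.*")
#+END_SRC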
3348 ** Creating creatures
3350 Creatures are created using /Blender/, a free 3D modeling program.
3351 You will need Blender version 2.6 when using the =CORTEX= included
3352 in this thesis. You create a =CORTEX= creature in a similar manner
3353 to modeling anything in Blender, except that you also create
3354 several trees of empty nodes which define the creature's senses.
3356 *** Mass
3358 To give an object mass in =CORTEX=, add a ``mass'' metadata label
3359 to the object with the mass in jMonkeyEngine units. Note that
3360 setting the mass to 0 causes the object to be immovable.
3362 *** Joints
3364 Joints are created by creating an empty node named =joints= and
3365 then creating any number of empty child nodes to represent your
3366 creature's joints. The joint will automatically connect the
3367 closest two physical objects. It will help to set the empty node's
3368 display mode to ``Arrows'' so that you can clearly see the
3369 direction of the axes.
3371 Joint nodes should have the following metadata under the ``joint''
3372 label:
3374 #+BEGIN_SRC clojure
3375 ;; ONE OF the following, under the label "joint":
3376 {:type :point}
3378 ;; OR
3380 {:type :hinge
3381 :limit [<limit-low> <limit-high>]
3382 :axis (Vector3f. <x> <y> <z>)}
3383 ;;(:axis defaults to (Vector3f. 1 0 0) if not provided for hinge joints)
3385 ;; OR
3387 {:type :cone
3388 :limit-xz <lim-xz>
3389 :limit-xy <lim-xy>
3390 :twist <lim-twist>} ;(use XZY rotation mode in blender!)
3391 #+END_SRC
3393 *** Eyes
3395 Eyes are created by creating an empty node named =eyes= and then
3396 creating any number of empty child nodes to represent your
3397 creature's eyes.
3399 Eye nodes should have the following metadata under the ``eye''
3400 label:
3402 #+BEGIN_SRC clojure
3403 {:red <red-retina-definition>
3404 :blue <blue-retina-definition>
3405 :green <green-retina-definition>
3406 :all <all-retina-definition>
3407 (<0xrrggbb> <custom-retina-image>)...}
3409 #+END_SRC
3411 Any of the color channels may be omitted. You may also include
3412 your own color selectors, and in fact :red is equivalent to
3413 0xFF0000 and so forth. The eye will be placed at the same position
3414 as the empty node and will bind to the nearest physical object.
3415 The eye will point outward from the X-axis of the node, and ``up''
3416 will be in the direction of the X-axis of the node. It will help
3417 to set the empty node's display mode to ``Arrows'' so that you can
3418 clearly see the direction of the axes.
3420 Each retina file should contain white pixels wherever you want to be
3421 sensitive to your chosen color. If you want the entire field of
3422 view, specify :all of 0xFFFFFF and a retinal map that is entirely
3423 white.
3425 Here is a sample retinal map:
3427 #+caption: An example retinal profile image. White pixels are
3428 #+caption: photo-sensitive elements. The distribution of white
3429 #+caption: pixels is denser in the middle and falls off at the
3430 #+caption: edges and is inspired by the human retina.
3431 #+name: retina
3432 #+ATTR_LaTeX: :width 7cm :placement [H]
3433 [[./images/retina-small.png]]
3435 *** Hearing
3437 Ears are created by creating an empty node named =ears= and then
3438 creating any number of empty child nodes to represent your
3439 creature's ears.
3441 Ear nodes do not require any metadata.
3443 The ear will bind to and follow the closest physical node.
3445 *** Touch
3447 Touch is handled similarly to mass. To make a particular object
3448 touch sensitive, add metadata of the following form under the
3449 object's ``touch'' metadata field:
3451 #+BEGIN_EXAMPLE
3452 <touch-UV-map-file-name>
3453 #+END_EXAMPLE
3455 You may also include an optional ``scale'' metadata number to
3456 specify the length of the touch feelers. The default is $0.1$,
3457 and this is generally sufficient.
3459 The touch UV should contain white pixels for each touch sensor.
3461 Here is an example touch-uv map that approximates a human finger,
3462 and its corresponding model.
3464 #+caption: This is the tactile-sensor-profile for the upper segment
3465 #+caption: of a fingertip. It defines regions of high touch sensitivity
3466 #+caption: (where there are many white pixels) and regions of low
3467 #+caption: sensitivity (where white pixels are sparse).
3468 #+name: guide-fingertip-UV
3469 #+ATTR_LaTeX: :width 9cm :placement [H]
3470 [[./images/finger-UV.png]]
3472 #+caption: The fingertip UV-image from above applied to a simple
3473 #+caption: model of a fingertip.
3474 #+name: guide-fingertip
3475 #+ATTR_LaTeX: :width 9cm :placement [H]
3476 [[./images/finger-2.png]]
3478 *** Proprioception
3480 Proprioception is tied to each joint node -- nothing special must
3481 be done in a blender model to enable proprioception other than
3482 creating joint nodes.
3484 *** Muscles
3486 Muscles are created by creating an empty node named =muscles= and
3487 then creating any number of empty child nodes to represent your
3488 creature's muscles.
3491 Muscle nodes should have the following metadata under the
3492 ``muscle'' label:
3494 #+BEGIN_EXAMPLE
3495 <muscle-profile-file-name>
3496 #+END_EXAMPLE
3498 Muscles should also have a ``strength'' metadata entry describing
3499 the muscle's total strength at full activation.
3501 Muscle profiles are simple images that contain the relative amount
3502 of muscle power in each simulated alpha motor neuron. The width of
3503 the image is the total size of the motor pool, and the redness of
3504 each neuron is the relative power of that neuron.
3506 While the profile image can have any dimensions, only the first
3507 line of pixels is used to define the muscle. Here is a sample
3508 muscle profile image that defines a human-like muscle.
3510 #+caption: A muscle profile image that describes the strengths
3511 #+caption: of each motor neuron in a muscle. White is weakest
3512 #+caption: and dark red is strongest. This particular pattern
3513 #+caption: has weaker motor neurons at the beginning, just
3514 #+caption: like human muscle.
3515 #+name: muscle-recruit
3516 #+ATTR_LaTeX: :width 7cm :placement [H]
3517 [[./images/basic-muscle.png]]
3519 Muscles twist the nearest physical object about the muscle node's
3520 Z-axis. I recommend using the ``Single Arrow'' display mode for
3521 muscles and using the right hand rule to determine which way the
3522 muscle will twist. To make a segment that can twist in multiple
3523 directions, create multiple, differently aligned muscles.
3525 ** =CORTEX= API
3527 These are some of the functions exposed by =CORTEX= for creating
3528 worlds and simulating creatures, in addition to jMonkeyEngine3's
3529 extensive library, which is documented elsewhere. Short usage
sketches follow the Simulation and the Creature Manipulation /
Import listings below.
3531 *** Simulation
3532 - =(world root-node key-map setup-fn update-fn)= :: create
3533 a simulation.
3534 - /root-node/ :: a =com.jme3.scene.Node= object which
3535 contains all of the objects that should be in the
3536 simulation.
3538 - /key-map/ :: a map from strings describing keys to
3539 functions that should be executed whenever that key is
3540 pressed. The functions should take a SimpleApplication
3541 object and a boolean value. The SimpleApplication is the
3542 current simulation that is running, and the boolean is true
3543 if the key is being pressed, and false if it is being
3544 released. As an example,
3545 #+BEGIN_SRC clojure
3546 {"key-j" (fn [game value] (if value (println "key j pressed")))}
3547 #+END_SRC
3548 is a valid key-map which will cause the simulation to print
3549 a message whenever the 'j' key on the keyboard is pressed.
3551 - /setup-fn/ :: a function that takes a =SimpleApplication=
3552 object. It is called once when initializing the simulation.
3553 Use it to create things like lights, change the gravity,
3554 initialize debug nodes, etc.
3556 - /update-fn/ :: this function takes a =SimpleApplication=
3557 object and a float and is called every frame of the
3558 simulation. The float tells how many seconds it has been
3559 since the last frame was rendered, according to whatever
3560 clock jme is currently using. The default is to use IsoTimer
3561 which will result in this value always being the same.
3563 - =(position-camera world position rotation)= :: set the position
3564 of the simulation's main camera.
3566 - =(enable-debug world)= :: turn on debug wireframes for each
3567 simulated object.
3569 - =(set-gravity world gravity)= :: set the gravity of a running
3570 simulation.
3572 - =(box length width height & {options})= :: create a box in the
3573 simulation. Options is a hash map specifying texture, mass,
3574 etc. Possible options are =:name=, =:color=, =:mass=,
3575 =:friction=, =:texture=, =:material=, =:position=,
3576 =:rotation=, =:shape=, and =:physical?=.
3578 - =(sphere radius & {options})= :: create a sphere in the simulation.
3579 Options are the same as in =box=.
3581 - =(load-blender-model file-name)= :: create a node structure
3582 representing that described in a blender file.
3584 - =(light-up-everything world)= :: distribute a standard complement
3585 of lights throughout the simulation. Should be adequate for most
3586 purposes.
3588 - =(node-seq node)= :: return a recursive list of the node's
3589 children.
3591 - =(nodify name children)= :: construct a node given a node-name and
3592 desired children.
3594 - =(add-element world element)= :: add an object to a running world
3595 simulation.
3597 - =(set-accuracy world accuracy)= :: change the accuracy of the
3598 world's physics simulator.
3600 - =(asset-manager)= :: get an /AssetManager/, a jMonkeyEngine
3601 construct that is useful for loading textures and is required
3602 for smooth interaction with jMonkeyEngine library functions.
3604 - =(load-bullet)= :: unpack native libraries and initialize the
3605 Bullet physics engine. This function is required before other world-building
3606 functions are called.
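Here is a minimal usage sketch for the simulation functions above. It
assumes =(mega-import-jme3)= has already been evaluated so that
classes like =Vector3f= are available, and that =run-world=, the
helper used throughout the listings in this thesis, starts the
simulation; treat the option values as illustrative.
#+BEGIN_SRC clojure
;; Hedged usage sketch: a world containing a single falling box.
(load-bullet)                        ; unpack native libraries first
(def hello-world
  (world
   (nodify "root" [(box 1 1 1 :mass 10 :position (Vector3f. 0 5 0))])
   {"key-j" (fn [game value] (if value (println "key j pressed")))}
   (fn [world]                       ; setup-fn, called once
     (enable-debug world)
     (light-up-everything world))
   (fn [world tpf] nil)))            ; update-fn, called every frame
(run-world hello-world)
#+END_SRC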
3608 *** Creature Manipulation / Import
3610 - =(body! creature)= :: give the creature a physical body.
3612 - =(vision! creature)= :: give the creature a sense of vision.
3613 Returns a list of functions which will each, when called
3614 during a simulation, return the vision data for the channel of
3615 one of the eyes. The functions are ordered depending on the
3616 alphabetical order of the names of the eye nodes in the
3617 blender file. The data returned by the functions is a vector
3618 containing the eye's /topology/, a vector of coordinates, and
3619 the eye's /data/, a vector of RGB values filtered by the eye's
3620 sensitivity.
3622 - =(hearing! creature)= :: give the creature a sense of hearing.
3623 Returns a list of functions, one for each ear, that when
3624 called will return a frame's worth of hearing data for that
3625 ear. The functions are ordered depending on the alphabetical
3626 order of the names of the ear nodes in the blender file. The
3627 data returned by the functions is an array of PCM-encoded wav
3628 data.
3630 - =(touch! creature)= :: give the creature a sense of touch. Returns
3631 a single function that must be called with the /root node/ of
3632 the world, and which will return a vector of /touch-data/,
3633 one entry for each touch-sensitive component, each entry of
3634 which contains a /topology/ that specifies the distribution of
3635 touch sensors, and the /data/, which is a vector of
3636 =[activation, length]= pairs for each touch hair.
3638 - =(proprioception! creature)= :: give the creature the sense of
3639 proprioception. Returns a list of functions, one for each
3640 joint, that when called during a running simulation will
3641 report the =[heading, pitch, roll]= of the joint.
3643 - =(movement! creature)= :: give the creature the power of movement.
3644 Creates a list of functions, one for each muscle, that when
3645 called with an integer, will set the recruitment of that
3646 muscle to that integer, and will report the current power
3647 being exerted by the muscle. Order of muscles is determined by
3648 the alphabetical sort order of the names of the muscle nodes.
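The following hedged sketch shows how the functions returned by the
sense constructors above might be polled from within an update
function. The way the world's root node is obtained and the
particular muscle index used are assumptions, not part of the
documented API.
#+BEGIN_SRC clojure
(defn sense-poller
  "Hedged sketch: equip creature with senses and return an update-fn
   that polls each sense once per frame."
  [creature]
  (body! creature)                              ; give it a physical body first
  (let [eyes    (vision! creature)
        feel    (touch! creature)
        joints  (proprioception! creature)
        muscles (movement! creature)]
    (fn [world tpf]
      (let [images  (map #(%) eyes)             ; per-channel [topology data]
            touches (feel (.getRootNode world)) ; touch! wants the root node
            angles  (map #(%) joints)           ; [heading pitch roll] per joint
            power   ((first muscles) 20)]       ; set recruitment, read back power
        (println-repl (first angles))))))
#+END_SRC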
3650 *** Visualization/Debug
3652 - =(view-vision)= :: create a function that when called with a list
3653 of visual data returned from the functions made by =vision!=,
3654 will display that visual data on the screen.
3656 - =(view-hearing)= :: same as =view-vision= but for hearing.
3658 - =(view-touch)= :: same as =view-vision= but for touch.
3660 - =(view-proprioception)= :: same as =view-vision= but for
3661 proprioception.
3663 - =(view-movement)= :: same as =view-vision= but for
3664 muscle movement data.
3666 - =(view anything)= :: =view= is a polymorphic function that allows
3667 you to inspect almost anything you could reasonably expect to
3668 be able to ``see'' in =CORTEX=.
3670 - =(text anything)= :: =text= is a polymorphic function that allows
3671 you to convert practically anything into a text string.
3673 - =(println-repl anything)= :: print messages to clojure's repl
3674 instead of the simulation's terminal window.
3676 - =(mega-import-jme3)= :: for experimenting at the REPL. This
3677 function will import all jMonkeyEngine3 classes for immediate
3678 use.
3680 - =(display-dialated-time world timer)= :: Shows the time as it is
3681 flowing in the simulation on a HUD display.