annotate thesis/cortex.org @ 468:258078f78b33

insert argument about senses.
author Robert McIntyre <rlm@mit.edu>
date Fri, 28 Mar 2014 16:25:31 -0400
parents ade64947d2bf
children ae10f35022ba
rev   line source
rlm@425 1 #+title: =CORTEX=
rlm@425 2 #+author: Robert McIntyre
rlm@425 3 #+email: rlm@mit.edu
rlm@425 4 #+description: Using embodied AI to facilitate Artificial Imagination.
rlm@425 5 #+keywords: AI, clojure, embodiment
rlm@451 6 #+LaTeX_CLASS_OPTIONS: [nofloat]
rlm@422 7
rlm@465 8 * COMMENT templates
rlm@465 9 #+caption:
rlm@465 10 #+caption:
rlm@465 11 #+caption:
rlm@465 12 #+caption:
rlm@465 13 #+name: name
rlm@465 14 #+begin_listing clojure
rlm@465 15 #+begin_src clojure
rlm@465 16 #+end_src
rlm@465 17 #+end_listing
rlm@465 18
rlm@465 19 #+caption:
rlm@465 20 #+caption:
rlm@465 21 #+caption:
rlm@465 22 #+name: name
rlm@465 23 #+ATTR_LaTeX: :width 10cm
rlm@466 24 [[./images/aurellem-gray.png]]
rlm@465 25
rlm@465 26
rlm@465 27
rlm@465 28 * COMMENT Empathy and Embodiment as problem solving strategies
rlm@437 29
rlm@437 30 By the end of this thesis, you will have seen a novel approach to
rlm@437 31 interpreting video using embodiment and empathy. You will have also
rlm@437 32 seen one way to efficiently implement empathy for embodied
rlm@447 33 creatures. Finally, you will become familiar with =CORTEX=, a system
rlm@447 34 for designing and simulating creatures with rich senses, which you
rlm@447 35 may choose to use in your own research.
rlm@437 36
rlm@441 37 This is the core vision of my thesis: That one of the important ways
rlm@441 38 in which we understand others is by imagining ourselves in their
rlm@441 39 position and empathically feeling experiences relative to our own
rlm@441 40 bodies. By understanding events in terms of our own previous
rlm@441 41 corporeal experience, we greatly constrain the possibilities of what
rlm@441 42 would otherwise be an unwieldy exponential search. This extra
rlm@441 43 constraint can be the difference between easily understanding what
rlm@441 44 is happening in a video and being completely lost in a sea of
rlm@441 45 incomprehensible color and movement.
rlm@435 46
rlm@436 47 ** Recognizing actions in video is extremely difficult
rlm@437 48
rlm@447 49 Consider for example the problem of determining what is happening
rlm@447 50 in a video of which this is one frame:
rlm@437 51
rlm@441 52 #+caption: A cat drinking some water. Identifying this action is
rlm@441 53 #+caption: beyond the state of the art for computers.
rlm@441 54 #+ATTR_LaTeX: :width 7cm
rlm@441 55 [[./images/cat-drinking.jpg]]
rlm@441 56
rlm@441 57 It is currently impossible for any computer program to reliably
rlm@447 58 label such a video as ``drinking''. And rightly so -- it is a very
rlm@441 59 hard problem! What features can you describe in terms of low level
rlm@441 60 functions of pixels that can even begin to describe at a high level
rlm@441 61 what is happening here?
rlm@437 62
rlm@447 63 Or suppose that you are building a program that recognizes chairs.
rlm@448 64 How could you ``see'' the chair in figure \ref{hidden-chair}?
rlm@441 65
rlm@441 66 #+caption: The chair in this image is quite obvious to humans, but I
rlm@448 67 #+caption: doubt that any modern computer vision program can find it.
rlm@441 68 #+name: hidden-chair
rlm@441 69 #+ATTR_LaTeX: :width 10cm
rlm@441 70 [[./images/fat-person-sitting-at-desk.jpg]]
rlm@441 71
rlm@441 72 Finally, how is it that you can easily tell the difference between
rlm@441 73 how the girl's /muscles/ are working in figure \ref{girl}?
rlm@441 74
rlm@441 75 #+caption: The mysterious ``common sense'' appears here as you are able
rlm@441 76 #+caption: to discern the difference in how the girl's arm muscles
rlm@441 77 #+caption: are activated between the two images.
rlm@441 78 #+name: girl
rlm@448 79 #+ATTR_LaTeX: :width 7cm
rlm@441 80 [[./images/wall-push.png]]
rlm@437 81
rlm@441 82 Each of these examples tells us something about what might be going
rlm@441 83 on in our minds as we easily solve these recognition problems.
rlm@441 84
rlm@441 85 The hidden chairs show us that we are strongly triggered by cues
rlm@447 86 relating to the position of human bodies, and that we can determine
rlm@447 87 the overall physical configuration of a human body even if much of
rlm@447 88 that body is occluded.
rlm@437 89
rlm@441 90 The picture of the girl pushing against the wall tells us that we
rlm@441 91 have common sense knowledge about the kinetics of our own bodies.
rlm@441 92 We know well how our muscles would have to work to maintain us in
rlm@441 93 most positions, and we can easily project this self-knowledge to
rlm@441 94 imagined positions triggered by images of the human body.
rlm@441 95
rlm@441 96 ** =EMPATH= neatly solves recognition problems
rlm@441 97
rlm@441 98 I propose a system that can express the types of recognition
rlm@441 99 problems above in a form amenable to computation. It is split into
rlm@441 100 four parts:
rlm@441 101
rlm@448 102 - Free/Guided Play :: The creature moves around and experiences the
rlm@448 103 world through its unique perspective. Many otherwise
rlm@448 104 complicated actions are easily described in the language of a
rlm@448 105 full suite of body-centered, rich senses. For example,
rlm@448 106 drinking is the feeling of water sliding down your throat, and
rlm@448 107 cooling your insides. It's often accompanied by bringing your
rlm@448 108 hand close to your face, or bringing your face close to water.
rlm@448 109 Sitting down is the feeling of bending your knees, activating
rlm@448 110 your quadriceps, then feeling a surface with your bottom and
rlm@448 111 relaxing your legs. These body-centered action descriptions
rlm@448 112 can be either learned or hard coded.
rlm@448 113 - Posture Imitation :: When trying to interpret a video or image,
rlm@448 114 the creature takes a model of itself and aligns it with
rlm@448 115 whatever it sees. This alignment can even cross species, as
rlm@448 116 when humans try to align themselves with things like ponies,
rlm@448 117 dogs, or other humans with a different body type.
rlm@448 118 - Empathy :: The alignment triggers associations with
rlm@448 119 sensory data from prior experiences. For example, the
rlm@448 120 alignment itself easily maps to proprioceptive data. Any
rlm@448 121 sounds or obvious skin contact in the video can to a lesser
rlm@448 122 extent trigger previous experience. Segments of previous
rlm@448 123 experiences are stitched together to form a coherent and
rlm@448 124 complete sensory portrait of the scene.
rlm@448 125 - Recognition :: With the scene described in terms of first
rlm@448 126 person sensory events, the creature can now run its
rlm@447 127 action-identification programs on this synthesized sensory
rlm@447 128 data, just as it would if it were actually experiencing the
rlm@447 129 scene first-hand. If previous experience has been accurately
rlm@447 130 retrieved, and if it is analogous enough to the scene, then
rlm@447 131 the creature will correctly identify the action in the scene.
rlm@447 132
rlm@441 133 For example, I think humans are able to label the cat video as
rlm@447 134 ``drinking'' because they imagine /themselves/ as the cat, and
rlm@441 135 imagine putting their face up against a stream of water and
rlm@441 136 sticking out their tongue. In that imagined world, they can feel
rlm@441 137 the cool water hitting their tongue, and feel the water entering
rlm@447 138 their body, and are able to recognize that /feeling/ as drinking.
rlm@447 139 So, the label of the action is not really in the pixels of the
rlm@447 140 image, but is found clearly in a simulation inspired by those
rlm@447 141 pixels. An imaginative system, having been trained on drinking and
rlm@447 142 non-drinking examples and learning that the most important
rlm@447 143 component of drinking is the feeling of water sliding down one's
rlm@447 144 throat, would analyze a video of a cat drinking in the following
rlm@447 145 manner:
rlm@441 146
rlm@447 147 1. Create a physical model of the video by putting a ``fuzzy''
rlm@447 148 model of its own body in place of the cat. Possibly also create
rlm@447 149 a simulation of the stream of water.
rlm@441 150
rlm@441 151 2. Play out this simulated scene and generate imagined sensory
rlm@441 152 experience. This will include relevant muscle contractions, a
rlm@441 153 close up view of the stream from the cat's perspective, and most
rlm@441 154 importantly, the imagined feeling of water entering the
rlm@443 155 mouth. The imagined sensory experience can come from a
rlm@441 156 simulation of the event, but can also be pattern-matched from
rlm@441 157 previous, similar embodied experience.
rlm@441 158
rlm@441 159 3. The action is now easily identified as drinking by the sense of
rlm@441 160 taste alone. The other senses (such as the tongue moving in and
rlm@441 161 out) help to give plausibility to the simulated action. Note that
rlm@441 162 the sense of vision, while critical in creating the simulation,
rlm@441 163 is not critical for identifying the action from the simulation.
rlm@441 164
rlm@441 165 For the chair examples, the process is even easier:
rlm@441 166
rlm@441 167 1. Align a model of your body to the person in the image.
rlm@441 168
rlm@441 169 2. Generate proprioceptive sensory data from this alignment.
rlm@437 170
rlm@441 171 3. Use the imagined proprioceptive data as a key to lookup related
rlm@441 172 sensory experience associated with that particular proprioceptive
rlm@441 173 feeling.
rlm@437 174
rlm@443 175 4. Retrieve the feeling of your bottom resting on a surface, your
rlm@443 176 knees bent, and your leg muscles relaxed.
rlm@437 177
rlm@441 178 5. This sensory information is consistent with the =sitting?=
rlm@441 179 sensory predicate, so you (and the entity in the image) must be
rlm@441 180 sitting.
rlm@440 181
rlm@441 182 6. There must be a chair-like object since you are sitting.
rlm@440 183
rlm@441 184 Empathy offers yet another alternative to the age-old AI
rlm@441 185 representation question: ``What is a chair?'' --- A chair is the
rlm@441 186 feeling of sitting.
rlm@441 187
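To make the =sitting?= predicate concrete, here is a sketch of what it
might look like if written in the same style as the worm action
predicates developed later in this thesis. The helpers =bent?= and
=human-seat-region= are hypothetical; only the worm versions of such
predicates are actually implemented.

#+caption: A hypothetical sketch of a =sitting?= predicate in the
#+caption: style of the worm action predicates that appear later. The
#+caption: helpers =bent?= and =human-seat-region= are assumed, not
#+caption: part of =CORTEX=.
#+name: sitting-sketch
#+begin_listing clojure
#+begin_src clojure
;; Hypothetical sketch only -- the thesis implements predicates like
;; this for a simple worm, not for a human model. bent? and
;; human-seat-region are assumed helpers.
(defn sitting?
  "Does the latest sensory experience feel like sitting?"
  [experiences]
  (let [{:keys [proprioception touch]} (peek experiences)]
    (and (bent? :knees proprioception)        ; knees are flexed
         (< 0.5 (contact human-seat-region    ; bottom is resting on
                         touch)))))           ; a supporting surface
#+end_src
#+end_listing
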
rlm@441 188 My program, =EMPATH=, uses this empathic problem-solving technique
rlm@441 189 to interpret the actions of a simple, worm-like creature.
rlm@437 190
rlm@441 191 #+caption: The worm performs many actions during free play such as
rlm@441 192 #+caption: curling, wiggling, and resting.
rlm@441 193 #+name: worm-intro
rlm@446 194 #+ATTR_LaTeX: :width 15cm
rlm@445 195 [[./images/worm-intro-white.png]]
rlm@437 196
rlm@462 197 #+caption: =EMPATH= recognized and classified each of these
rlm@462 198 #+caption: poses by inferring the complete sensory experience
rlm@462 199 #+caption: from proprioceptive data.
rlm@441 200 #+name: worm-recognition-intro
rlm@446 201 #+ATTR_LaTeX: :width 15cm
rlm@445 202 [[./images/worm-poses.png]]
rlm@441 203
rlm@441 204 One powerful advantage of empathic problem solving is that it
rlm@441 205 factors the action recognition problem into two easier problems. To
rlm@441 206 use empathy, you need an /aligner/, which takes the video and a
rlm@441 207 model of your body, and aligns the model with the video. Then, you
rlm@441 208 need a /recognizer/, which uses the aligned model to interpret the
rlm@441 209 action. The power in this method lies in the fact that you describe
rlm@448 210 all actions from a body-centered viewpoint. You are less tied to
rlm@447 211 the particulars of any visual representation of the actions. If you
rlm@441 212 teach the system what ``running'' is, and you have a good enough
rlm@441 213 aligner, the system will from then on be able to recognize running
rlm@441 214 from any point of view, even strange points of view like above or
rlm@441 215 underneath the runner. This is in contrast to action recognition
rlm@448 216 schemes that try to identify actions using a non-embodied approach.
rlm@448 217 If these systems learn about running as viewed from the side, they
rlm@448 218 will not automatically be able to recognize running from any other
rlm@448 219 viewpoint.
rlm@441 220
rlm@441 221 Another powerful advantage is that using the language of multiple
rlm@441 222 body-centered rich senses to describe body-centered actions offers a
rlm@441 223 massive boost in descriptive capability. Consider how difficult it
rlm@441 224 would be to compose a set of HOG filters to describe the action of
rlm@447 225 a simple worm-creature ``curling'' so that its head touches its
rlm@447 226 tail, and then behold the simplicity of describing this action in a
rlm@441 227 language designed for the task (listing \ref{grand-circle-intro}):
rlm@441 228
rlm@446 229 #+caption: Body-centered actions are best expressed in a body-centered
rlm@446 230 #+caption: language. This code detects when the worm has curled into a
rlm@446 231 #+caption: full circle. Imagine how you would replicate this functionality
rlm@446 232 #+caption: using low-level pixel features such as HOG filters!
rlm@446 233 #+name: grand-circle-intro
rlm@452 234 #+attr_latex: [htpb]
rlm@452 235 #+begin_listing clojure
rlm@446 236 #+begin_src clojure
rlm@446 237 (defn grand-circle?
rlm@446 238 "Does the worm form a majestic circle (one end touching the other)?"
rlm@446 239 [experiences]
rlm@446 240 (and (curled? experiences)
rlm@446 241 (let [worm-touch (:touch (peek experiences))
rlm@446 242 tail-touch (worm-touch 0)
rlm@446 243 head-touch (worm-touch 4)]
rlm@462 244 (and (< 0.2 (contact worm-segment-bottom-tip tail-touch))
rlm@462 245 (< 0.2 (contact worm-segment-top-tip head-touch))))))
rlm@446 246 #+end_src
rlm@446 247 #+end_listing
rlm@446 248
rlm@435 249
rlm@449 250 ** =CORTEX= is a toolkit for building sensate creatures
rlm@435 251
rlm@448 252 I built =CORTEX= to be a general AI research platform for doing
rlm@448 253 experiments involving multiple rich senses and a wide variety and
rlm@448 254 number of creatures. I intend it to be useful as a library for many
rlm@462 255 more projects than just this thesis. =CORTEX= addresses a need among
rlm@462 256 AI researchers at CSAIL and beyond: people often invent neat ideas
rlm@462 257 that are best expressed in the
rlm@448 258 language of creatures and senses, but in order to explore those
rlm@448 259 ideas they must first build a platform in which they can create
rlm@448 260 simulated creatures with rich senses! There are many ideas that
rlm@448 261 would be simple to execute (such as =EMPATH=), but attached to them
rlm@448 262 is the multi-month effort to make a good creature simulator. Often,
rlm@448 263 that initial investment of time proves to be too much, and the
rlm@448 264 project must make do with a lesser environment.
rlm@435 265
rlm@448 266 =CORTEX= is well suited as an environment for embodied AI research
rlm@448 267 for three reasons:
rlm@448 268
rlm@448 269 - You can create new creatures using Blender, a popular 3D modeling
rlm@448 270 program. Each sense can be specified using special Blender nodes
rlm@448 271 with biologically inspired parameters. You need not write any
rlm@448 272 code to create a creature, and can use a wide library of
rlm@448 273 pre-existing Blender models as a base for your own creatures.
rlm@448 274
rlm@448 275 - =CORTEX= implements a wide variety of senses, including touch,
rlm@448 276 proprioception, vision, hearing, and muscle tension. Complicated
rlm@448 277 senses like touch and vision involve multiple sensory elements
rlm@448 278 embedded in a 2D surface. You have complete control over the
rlm@448 279 distribution of these sensor elements through the use of simple
rlm@448 280 png image files. In particular, =CORTEX= implements more
rlm@448 281 comprehensive hearing than any other creature simulation system
rlm@448 282 available.
rlm@448 283
rlm@448 284 - =CORTEX= supports any number of creatures and any number of
rlm@448 285 senses. Time in =CORTEX= dilates so that the simulated creatures
rlm@448 286 always perceive a perfectly smooth flow of time, regardless of
rlm@448 287 the actual computational load.
rlm@448 288
rlm@448 289 =CORTEX= is built on top of =jMonkeyEngine3=, which is a video game
rlm@448 290 engine designed to create cross-platform 3D desktop games. =CORTEX=
rlm@448 291 is mainly written in clojure, a dialect of =LISP= that runs on the
rlm@448 292 java virtual machine (JVM). The API for creating and simulating
rlm@449 293 creatures and senses is entirely expressed in clojure, though many
rlm@449 294 senses are implemented at the layer of jMonkeyEngine or below. For
rlm@449 295 example, for the sense of hearing I use a layer of clojure code on
rlm@449 296 top of a layer of java JNI bindings that drive a layer of =C++=
rlm@449 297 code which implements a modified version of =OpenAL= to support
rlm@449 298 multiple listeners. =CORTEX= is the only simulation environment
rlm@449 299 that I know of that can support multiple entities that can each
rlm@449 300 hear the world from their own perspective. Other senses also
rlm@449 301 require a small layer of Java code. =CORTEX= also uses =bullet=, a
rlm@449 302 physics simulator written in =C++=.
rlm@448 303
rlm@448 304 #+caption: Here is the worm from above modeled in Blender, a free
rlm@448 305 #+caption: 3D-modeling program. Senses and joints are described
rlm@448 306 #+caption: using special nodes in Blender.
rlm@448 307 #+name: blender-worm
rlm@448 308 #+ATTR_LaTeX: :width 12cm
rlm@448 309 [[./images/blender-worm.png]]
rlm@448 310
rlm@449 311 Here are some things I anticipate that =CORTEX= might be used for:
rlm@449 312
rlm@449 313 - exploring new ideas about sensory integration
rlm@449 314 - distributed communication among swarm creatures
rlm@449 315 - self-learning using free exploration
rlm@449 316 - evolutionary algorithms involving creature construction
rlm@449 317 - exploration of exotic senses and effectors that are not possible
rlm@449 318 in the real world (such as telekinesis or a semantic sense)
rlm@449 319 - imagination using subworlds
rlm@449 320
rlm@451 321 During one test with =CORTEX=, I created 3,000 creatures each with
rlm@448 322 their own independent senses and ran them all at only 1/80 real
rlm@448 323 time. In another test, I created a detailed model of my own hand,
rlm@448 324 equipped with a realistic distribution of touch (more sensitive at
rlm@448 325 the fingertips), as well as eyes and ears, and it ran at around 1/4
rlm@451 326 real time.
rlm@448 327
rlm@451 328 #+BEGIN_LaTeX
rlm@449 329 \begin{sidewaysfigure}
rlm@449 330 \includegraphics[width=9.5in]{images/full-hand.png}
rlm@451 331 \caption{
rlm@451 332 I modeled my own right hand in Blender and rigged it with all the
rlm@451 333 senses that {\tt CORTEX} supports. My simulated hand has a
rlm@451 334 biologically inspired distribution of touch sensors. The senses are
rlm@451 335 displayed on the right, and the simulation is displayed on the
rlm@451 336 left. Notice that my hand is curling its fingers, that it can see
rlm@451 337 its own finger from the eye in its palm, and that it can feel its
rlm@451 338 own thumb touching its palm.}
rlm@449 339 \end{sidewaysfigure}
rlm@451 340 #+END_LaTeX
rlm@448 341
rlm@437 342 ** Contributions
rlm@435 343
rlm@451 344 - I built =CORTEX=, a comprehensive platform for embodied AI
rlm@451 345 experiments. =CORTEX= supports many features lacking in other
rlm@451 346 systems, such as proper simulation of hearing. It is easy to create
rlm@451 347 new =CORTEX= creatures using Blender, a free 3D modeling program.
rlm@449 348
rlm@451 349 - I built =EMPATH=, which uses =CORTEX= to identify the actions of
rlm@451 350 a worm-like creature using a computational model of empathy.
rlm@449 351
rlm@436 352 * Building =CORTEX=
rlm@435 353
rlm@462 354 I intend for =CORTEX= to be used as a general purpose library for
rlm@462 355 building creatures and outfitting them with senses, so that it will
rlm@462 356 be useful for other researchers who want to test out ideas of their
rlm@462 357 own. To this end, wherever I have had to make architectural choices
rlm@462 358 about =CORTEX=, I have chosen to give as much freedom to the user as
rlm@462 359 possible, so that =CORTEX= may be used for things I have not
rlm@462 360 foreseen.
rlm@462 361
rlm@465 362 ** COMMENT Simulation or Reality?
rlm@462 363
rlm@462 364 The most important architectural decision of all is the choice to
rlm@462 365 use a computer-simulated environment in the first place! The world
rlm@462 366 is a vast and rich place, and for now simulations are a very poor
rlm@462 367 reflection of its complexity. It may be that there is a significant
rlm@462 368 qualitative difference between dealing with senses in the real
rlm@468 369 world and dealing with pale facsimiles of them in a simulation.
rlm@468 370 What are the advantages and disadvantages of a simulation vs.
rlm@468 371 reality?
rlm@462 372
rlm@462 373 *** Simulation
rlm@462 374
rlm@462 375 The advantages of virtual reality are that when everything is a
rlm@462 376 simulation, experiments in that simulation are absolutely
rlm@462 377 reproducible. It's also easier to change the character and world
rlm@462 378 to explore new situations and different sensory combinations.
rlm@462 379
rlm@462 380 If the world is to be simulated on a computer, then not only do
rlm@462 381 you have to worry about whether the character's senses are rich
rlm@462 382 enough to learn from the world, but whether the world itself is
rlm@462 383 rendered with enough detail and realism to give enough working
rlm@462 384 material to the character's senses. To name just a few
rlm@462 385 difficulties facing modern physics simulators: destructibility of
rlm@462 386 the environment, simulation of water/other fluids, large areas,
rlm@462 387 nonrigid bodies, lots of objects, smoke. I don't know of any
rlm@462 388 computer simulation that would allow a character to take a rock
rlm@462 389 and grind it into fine dust, then use that dust to make a clay
rlm@462 390 sculpture, at least not without spending years calculating the
rlm@462 391 interactions of every single small grain of dust. Maybe a
rlm@462 392 simulated world with today's limitations doesn't provide enough
rlm@462 393 richness for real intelligence to evolve.
rlm@462 394
rlm@462 395 *** Reality
rlm@462 396
rlm@462 397 The other approach for playing with senses is to hook your
rlm@462 398 software up to real cameras, microphones, robots, etc., and let it
rlm@462 399 loose in the real world. This has the advantage of eliminating
rlm@462 400 concerns about simulating the world at the expense of increasing
rlm@462 401 the complexity of implementing the senses. Instead of just
rlm@462 402 grabbing the current rendered frame for processing, you have to
rlm@462 403 use an actual camera with real lenses and interact with photons to
rlm@462 404 get an image. It is much harder to change the character, which is
rlm@462 405 now partly a physical robot of some sort, since doing so involves
rlm@462 406 changing things around in the real world instead of modifying
rlm@462 407 lines of code. While the real world is very rich and definitely
rlm@462 408 provides enough stimulation for intelligence to develop as
rlm@462 409 evidenced by our own existence, it is also uncontrollable in the
rlm@462 410 sense that a particular situation cannot be recreated perfectly or
rlm@462 411 saved for later use. It is harder to conduct science because it is
rlm@462 412 harder to repeat an experiment. The worst thing about using the
rlm@462 413 real world instead of a simulation is the matter of time. Instead
rlm@462 414 of simulated time you get the constant and unstoppable flow of
rlm@462 415 real time. This severely limits the sorts of software you can use
rlm@462 416 to program the AI because all sense inputs must be handled in real
rlm@462 417 time. Complicated ideas may have to be implemented in hardware or
rlm@462 418 may simply be impossible given the current speed of our
rlm@462 419 processors. Contrast this with a simulation, in which the flow of
rlm@462 420 time in the simulated world can be slowed down to accommodate the
rlm@462 421 limitations of the character's programming. In terms of cost,
rlm@462 422 doing everything in software is far cheaper than building custom
rlm@462 423 real-time hardware. All you need is a laptop and some patience.
rlm@435 424
rlm@465 425 ** COMMENT Because of Time, simulation is preferable to reality
rlm@435 426
rlm@462 427 I envision =CORTEX= being used to support rapid prototyping and
rlm@462 428 iteration of ideas. Even if I could put together a well constructed
rlm@462 429 kit for creating robots, it would still not be enough because of
rlm@462 430 the scourge of real-time processing. Anyone who wants to test their
rlm@462 431 ideas in the real world must always worry about getting their
rlm@465 432 algorithms to run fast enough to process information in real time.
rlm@465 433 The need for real time processing only increases if multiple senses
rlm@465 434 are involved. In the extreme case, even simple algorithms will have
rlm@465 435 to be accelerated by ASIC chips or FPGAs, turning what would
rlm@465 436 otherwise be a few lines of code and a 10x speed penalty into a
rlm@465 437 multi-month ordeal. For this reason, =CORTEX= supports
rlm@462 438 /time-dilation/, which scales back the framerate of the
rlm@465 439 simulation in proportion to the amount of processing each frame requires.
rlm@465 440 From the perspective of the creatures inside the simulation, time
rlm@465 441 always appears to flow at a constant rate, regardless of how
rlm@462 442 complicated the environment becomes or how many creatures are in
rlm@462 443 the simulation. The cost is that =CORTEX= can sometimes run slower
rlm@462 444 than real time. This can also be an advantage, however ---
rlm@462 445 simulations of very simple creatures in =CORTEX= generally run at
rlm@462 446 40x real time on my machine!
rlm@462 447
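A minimal sketch of the idea, assuming a simple hand-rolled update
loop: the physics is always advanced by the same simulated interval,
no matter how long the sensory processing for a frame takes in
wall-clock time. The functions =step-physics!= and =process-senses!=
are placeholders, not the actual =CORTEX= API.

#+caption: Sketch of time dilation, assuming hypothetical helpers
#+caption: =step-physics!= and =process-senses!=. Each iteration
#+caption: advances simulated time by a fixed tick, however long the
#+caption: real-time work takes.
#+name: time-dilation-sketch
#+begin_listing clojure
#+begin_src clojure
;; Sketch only: step-physics! and process-senses! are hypothetical
;; placeholders, not the real CORTEX API.
(def simulated-tick (/ 1.0 60))  ; seconds of simulated time per frame

(defn run-dilated
  "Advance the world by a constant simulated tick each frame,
   regardless of how much wall-clock time the processing takes."
  [world creatures]
  (loop [world world]
    (let [world' (step-physics! world simulated-tick)]
      ;; this step may take far longer than simulated-tick of real
      ;; time; the creatures never notice.
      (doseq [creature creatures]
        (process-senses! creature world'))
      ;; runs forever for simplicity; a real loop would have an exit
      (recur world'))))
#+end_src
#+end_listing
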
rlm@468 448 ** What is a sense?
rlm@468 449
rlm@468 450 If =CORTEX= is to support a wide variety of senses, it would help
rlm@468 451 to have a better understanding of what a ``sense'' actually is!
rlm@468 452 While vision, touch, and hearing all seem like they are quite
rlm@468 453 different things, I was surprised to learn during the course of
rlm@468 454 this thesis that they (and all physical senses) can be expressed as
rlm@468 455 exactly the same mathematical object due to a dimensional argument!
rlm@468 456
rlm@468 457 Human beings are three-dimensional objects, and the nerves that
rlm@468 458 transmit data from our various sense organs to our brain are
rlm@468 459 essentially one-dimensional. This leaves up to two dimensions in
rlm@468 460 which our sensory information may flow. For example, imagine your
rlm@468 461 skin: it is a two-dimensional surface around a three-dimensional
rlm@468 462 object (your body). It has discrete touch sensors embedded at
rlm@468 463 various points, and the density of these sensors corresponds to the
rlm@468 464 sensitivity of that region of skin. Each touch sensor connects to a
rlm@468 465 nerve, all of which eventually are bundled together as they travel
rlm@468 466 up the spinal cord to the brain. Intersect the spinal nerves with a
rlm@468 467 guillotining plane and you will see all of the sensory data of the
rlm@468 468 skin revealed in a roughly circular two-dimensional image which is
rlm@468 469 the cross section of the spinal cord. Points on this image that are
rlm@468 470 close together in this circle represent touch sensors that are
rlm@468 471 /probably/ close together on the skin, although there is of course
rlm@468 472 some cutting and rearrangement that has to be done to transfer the
rlm@468 473 complicated surface of the skin onto a two-dimensional image.
rlm@468 474
rlm@468 475 Most human senses consist of many discrete sensors of various
rlm@468 476 properties distributed along a surface at various densities. For
rlm@468 477 skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's
rlm@468 478 disks, and Ruffini's endings, which detect pressure and vibration
rlm@468 479 of various intensities. For ears, it is the stereocilia distributed
rlm@468 480 along the basilar membrane inside the cochlea; each one is
rlm@468 481 sensitive to a slightly different frequency of sound. For eyes, it
rlm@468 482 is rods and cones distributed along the surface of the retina. In
rlm@468 483 each case, we can describe the sense with a surface and a
rlm@468 484 distribution of sensors along that surface.
rlm@468 485
rlm@468 486 The neat idea is that every human sense can be effectively
rlm@468 487 described in terms of a surface containing embedded sensors. If the
rlm@468 488 sense had any more dimensions, then there wouldn't be enough room
rlm@468 489 in the spinal cord to transmit the information!
rlm@468 490
rlm@468 491 Therefore, =CORTEX= must support the ability to create objects and
rlm@468 492 then be able to ``paint'' points along their surfaces to describe
rlm@468 493 each sense.
rlm@468 494
rlm@468 495 Fortunately this idea is already a well-known computer graphics
rlm@468 496 technique called /UV-mapping/. The three-dimensional surface
rlm@468 497 of a model is cut and smooshed until it fits on a two-dimensional
rlm@468 498 image. You paint whatever you want on that image, and when the
rlm@468 499 three-dimensional shape is rendered in a game the smooshing and
rlm@468 500 cutting is reversed and the image appears on the three-dimensional
rlm@468 501 object.
rlm@468 502
rlm@468 503 To make a sense, interpret the UV-image as describing the
rlm@468 504 distribution of that sense's sensors. To get different types of
rlm@468 505 sensors, you can either use a different color for each type of
rlm@468 506 sensor, or use multiple UV-maps, each labeled with that sensor
rlm@468 507 type. I generally use a white pixel to mean the presence of a
rlm@468 508 sensor and a black pixel to mean the absence of a sensor, and use
rlm@468 509 one UV-map for each sensor-type within a given sense.
rlm@468 510
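As a rough sketch of how such a sensor-profile image might be read
(this is illustrative only, assuming standard =javax.imageio=; the
touch implementation =CORTEX= actually uses, described in a later
section, may differ), one can scan the UV image and collect the
coordinates of every white pixel:

#+caption: Illustrative sketch of reading a sensor-profile UV image
#+caption: by collecting the coordinates of every white pixel. This
#+caption: is not the actual =CORTEX= implementation.
#+name: white-pixel-sketch
#+begin_listing clojure
#+begin_src clojure
;; Sketch: find the UV coordinates of white pixels in a sensor
;; profile image, using standard javax.imageio.
(import '[javax.imageio ImageIO]
        '[java.io File])

(defn white-coordinates
  "Return the [x y] coordinates of every (nearly) white pixel."
  [^String image-path]
  (let [image (ImageIO/read (File. image-path))]
    (for [x (range (.getWidth image))
          y (range (.getHeight image))
          :let [rgb (.getRGB image x y)
                r   (bit-and 0xFF (bit-shift-right rgb 16))
                g   (bit-and 0xFF (bit-shift-right rgb 8))
                b   (bit-and 0xFF rgb)]
          :when (every? #(< 200 %) [r g b])]
      [x y])))
#+end_src
#+end_listing
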
rlm@468 511 #+CAPTION: The UV-map for an elongated icosphere. The white
rlm@468 512 #+caption: dots each represent a touch sensor. They are dense
rlm@468 513 #+caption: in the regions that describe the tip of the finger,
rlm@468 514 #+caption: and less dense along the dorsal side of the finger
rlm@468 515 #+caption: opposite the tip.
rlm@468 516 #+name: finger-UV
rlm@468 517 #+ATTR_latex: :width 10cm
rlm@468 518 [[./images/finger-UV.png]]
rlm@468 519
rlm@468 520 #+caption: Ventral side of the UV-mapped finger. Notice the
rlm@468 521 #+caption: density of touch sensors at the tip.
rlm@468 522 #+name: finger-side-view
rlm@468 523 #+ATTR_LaTeX: :width 10cm
rlm@468 524 [[./images/finger-1.png]]
rlm@468 525
rlm@468 526
rlm@465 527 ** COMMENT Video game engines are a great starting point
rlm@462 528
rlm@462 529 I did not need to write my own physics simulation code or shader to
rlm@462 530 build =CORTEX=. Doing so would lead to a system that is impossible
rlm@462 531 for anyone but myself to use anyway. Instead, I use a video game
rlm@462 532 engine as a base and modify it to accommodate the additional needs
rlm@462 533 of =CORTEX=. Video game engines are an ideal starting point to
rlm@462 534 build =CORTEX=, because they are not far from being creature
rlm@463 535 building systems themselves.
rlm@462 536
rlm@462 537 First off, general purpose video game engines come with a physics
rlm@462 538 engine and lighting / sound system. The physics system provides
rlm@462 539 tools that can be co-opted to serve as touch, proprioception, and
rlm@462 540 muscles. Since some games support split screen views, a good video
rlm@462 541 game engine will allow you to efficiently create multiple cameras
rlm@463 542 in the simulated world that can be used as eyes. Video game systems
rlm@463 543 offer integrated asset management for things like textures and
rlm@468 544 creature models, providing an avenue for defining creatures. They
rlm@468 545 also understand UV-mapping, since this technique is used to apply a
rlm@468 546 texture to a model. Finally, because video game engines support a
rlm@468 547 large number of users, as long as =CORTEX= doesn't stray too far
rlm@468 548 from the base system, other researchers can turn to this community
rlm@468 549 for help when doing their research.
rlm@463 550
rlm@465 551 ** COMMENT =CORTEX= is based on jMonkeyEngine3
rlm@463 552
rlm@463 553 While preparing to build =CORTEX= I studied several video game
rlm@463 554 engines to see which would best serve as a base. The top contenders
rlm@463 555 were:
rlm@463 556
rlm@463 557 - [[http://www.idsoftware.com][Quake II]]/[[http://www.bytonic.de/html/jake2.html][Jake2]] :: The Quake II engine was designed by id
rlm@463 558 Software in 1997. All the source code was released by id
rlm@463 559 Software into the public domain several years ago, and as a
rlm@463 560 result it has been ported to many different languages. This
rlm@463 561 engine was famous for its advanced use of realistic shading
rlm@463 562 and had decent and fast physics simulation. The main advantage
rlm@463 563 of the Quake II engine is its simplicity, but I ultimately
rlm@463 564 rejected it because the engine is too tied to the concept of a
rlm@463 565 first-person shooter game. One of the problems I had was that
rlm@463 566 there does not seem to be any easy way to attach multiple
rlm@463 567 cameras to a single character. There are also several physics
rlm@463 568 clipping issues that are corrected in a way that only applies
rlm@463 569 to the main character and does not apply to arbitrary objects.
rlm@463 570
rlm@463 571 - [[http://source.valvesoftware.com/][Source Engine]] :: The Source Engine evolved from the Quake II
rlm@463 572 and Quake I engines and is used by Valve in the Half-Life
rlm@463 573 series of games. The physics simulation in the Source Engine
rlm@463 574 is quite accurate and probably the best out of all the engines
rlm@463 575 I investigated. There is also an extensive community actively
rlm@463 576 working with the engine. However, applications that use the
rlm@463 577 Source Engine must be written in C++, the code is not open, it
rlm@463 578 only runs on Windows, and the tools that come with the SDK to
rlm@463 579 handle models and textures are complicated and awkward to use.
rlm@463 580
rlm@463 581 - [[http://jmonkeyengine.com/][jMonkeyEngine3]] :: jMonkeyEngine3 is a new library for creating
rlm@463 582 games in Java. It uses OpenGL to render to the screen and uses
rlm@463 583 scene graphs to avoid drawing things that do not appear on the
rlm@463 584 screen. It has an active community and several games in the
rlm@463 585 pipeline. The engine was not built to serve any particular
rlm@463 586 game but is instead meant to be used for any 3D game.
rlm@463 587
rlm@463 588 I chose jMonkeyEngine3 because it had the most features
rlm@464 589 out of all the free projects I looked at, and because I could then
rlm@463 590 write my code in clojure, an implementation of =LISP= that runs on
rlm@463 591 the JVM.
rlm@435 592
rlm@468 593 ** =CORTEX= uses Blender to create creature models
rlm@435 594
rlm@464 595 For the simple worm-like creatures I will use later on in this
rlm@464 596 thesis, I could define a simple API in =CORTEX= that would allow
rlm@464 597 one to create boxes, spheres, etc., and leave that API as the sole
rlm@464 598 way to create creatures. However, for =CORTEX= to truly be useful
rlm@468 599 for other projects, it needs a way to construct complicated
rlm@464 600 creatures. If possible, it would be nice to leverage work that has
rlm@464 601 already been done by the community of 3D modelers, or at least
rlm@464 602 enable people who are talented at modeling but not programming to
rlm@468 603 design =CORTEX= creatures.
rlm@464 604
rlm@464 605 Therefore, I use Blender, a free 3D modeling program, as the main
rlm@464 606 way to create creatures in =CORTEX=. However, the creatures modeled
rlm@464 607 in Blender must also be simple to simulate in jMonkeyEngine3's game
rlm@468 608 engine, and must also be easy to rig with =CORTEX='s senses. I
rlm@468 609 accomplish this with extensive use of Blender's ``empty nodes.''
rlm@464 610
rlm@468 611 Empty nodes have no mass, physical presence, or appearance, but
rlm@468 612 they can hold metadata and have names. I use a tree structure of
rlm@468 613 empty nodes to specify senses in the following manner:
rlm@468 614
rlm@468 615 - Create a single top-level empty node whose name is the name of
rlm@468 616 the sense.
rlm@468 617 - Add empty nodes which each contain meta-data relevant to the
rlm@468 618 sense, including a UV-map describing the number/distribution of
rlm@468 619 sensors if applicable.
rlm@468 620 - Make each empty-node the child of the top-level node.
rlm@468 621
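Later listings use a helper called =meta-data= to read these
annotations back out of a node. As an illustration of the shape such
a helper might take (this is a guess; the actual =cortex.sense=
implementation may retrieve the data differently), jMonkeyEngine3
exposes per-node user data:

#+caption: Illustrative guess at a =meta-data= helper built on
#+caption: jMonkeyEngine3's per-node user data. The real =CORTEX=
#+caption: helper may work differently.
#+name: meta-data-sketch
#+begin_listing clojure
#+begin_src clojure
;; Sketch only: assumes the blender loader has stashed each node's
;; custom properties as jMonkeyEngine user data.
(import '[com.jme3.scene Spatial])

(defn meta-data
  "Retrieve the custom property named 'key from a node, or nil if it
   is not present."
  [#^Spatial node #^String key]
  (.getUserData node key))
#+end_src
#+end_listing
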
rlm@468 622 #+caption: An example of annotating a creature model with empty
rlm@468 623 #+caption: nodes to describe the layout of senses. There are
rlm@468 624 #+caption: multiple empty nodes which each describe the position
rlm@468 625 #+caption: of muscles, ears, eyes, or joints.
rlm@468 626 #+name: sense-nodes
rlm@468 627 #+ATTR_LaTeX: :width 10cm
rlm@468 628 [[./images/empty-sense-nodes.png]]
rlm@468 629
rlm@468 630
rlm@468 631 ** Bodies are composed of segments connected by joints
rlm@468 632
rlm@468 633 Blender is a general purpose animation tool, which has been used in
rlm@468 634 the past to create high quality movies such as Sintel
rlm@468 635 \cite{sintel}. Though Blender can model and render even complicated
rlm@468 636 things like water, it is crucial to keep models that are meant to
rlm@468 637 be simulated as creatures simple. =Bullet=, which =CORTEX= uses
rlm@468 638 through jMonkeyEngine3, is a rigid-body physics system. This offers
rlm@468 639 a compromise between the expressiveness of a game level and the
rlm@468 640 speed at which it can be simulated, and it means that creatures
rlm@468 641 should be naturally expressed as rigid components held together by
rlm@468 642 joint constraints.
rlm@468 643
rlm@468 644 But humans are more like a squishy bag wrapped around some
rlm@468 645 hard bones which define the overall shape. When we move, our skin
rlm@468 646 bends and stretches to accommodate the new positions of our bones.
rlm@468 647
rlm@468 648 One way to make bodies composed of rigid pieces connected by joints
rlm@468 649 /seem/ more human-like is to use an /armature/ (or /rigging/)
rlm@468 650 system, which defines an overall ``body mesh'' and defines how the
rlm@468 651 mesh deforms as a function of the position of each ``bone,'' which
rlm@468 652 is a standard rigid body. This technique is used extensively to
rlm@468 653 model humans and create realistic animations. It is not a good
rlm@468 654 technique for physical simulation, however, because it creates a lie
rlm@468 655 -- the skin is not a physical part of the simulation and does not
rlm@468 656 interact with any objects in the world or itself. Objects will pass
rlm@468 657 right through the skin until they come in contact with the
rlm@468 658 underlying bone, which is a physical object. Without simulating
rlm@468 659 the skin, the sense of touch has little meaning, and the creature's
rlm@468 660 own vision will lie to it about the true extent of its body.
rlm@468 661 Simulating the skin as a physical object requires some way to
rlm@468 662 continuously update the physical model of the skin along with the
rlm@468 663 movement of the bones, which is unacceptably slow compared to rigid
rlm@468 664 body simulation.
rlm@468 665
rlm@468 666 Therefore, instead of using the human-like ``deformable bag of
rlm@468 667 bones'' approach, I decided to base my body plans on multiple solid
rlm@468 668 objects that are connected by joints, inspired by the robot =EVE=
rlm@468 669 from the movie WALL-E.
rlm@464 670
rlm@464 671 #+caption: =EVE= from the movie WALL-E. This body plan turns
rlm@464 672 #+caption: out to be much better suited to my purposes than a more
rlm@464 673 #+caption: human-like one.
rlm@465 674 #+ATTR_LaTeX: :width 10cm
rlm@464 675 [[./images/Eve.jpg]]
rlm@464 676
rlm@464 677 =EVE='s body is composed of several rigid components that are held
rlm@464 678 together by invisible joint constraints. This is what I mean by
rlm@464 679 ``eve-like''. The main reason that I use eve-style bodies is for
rlm@464 680 efficiency, and so that there will be correspondence between the
rlm@468 681 AI's senses and the physical presence of its body. Each individual
rlm@464 682 section is simulated by a separate rigid body that corresponds
rlm@464 683 exactly with its visual representation and does not change.
rlm@464 684 Sections are connected by invisible joints that are well supported
rlm@464 685 in jMonkeyEngine3. Bullet, the physics backend for jMonkeyEngine3,
rlm@464 686 can efficiently simulate hundreds of rigid bodies connected by
rlm@468 687 joints. Just because sections are rigid does not mean they have to
rlm@468 688 stay as one piece forever; they can be dynamically replaced with
rlm@468 689 multiple sections to simulate splitting in two. This could be used
rlm@468 690 to simulate retractable claws or =EVE='s hands, which are able to
rlm@468 691 coalesce into one object in the movie.
rlm@465 692
rlm@465 693 *** Solidifying/Connecting the body
rlm@465 694
rlm@465 695 #+caption: View of the hand model in Blender showing the main ``joints''
rlm@465 696 #+caption: node (highlighted in yellow) and its children which each
rlm@465 697 #+caption: represent a joint in the hand. Each joint node has metadata
rlm@465 698 #+caption: specifying what sort of joint it is.
rlm@466 699 #+name: blender-hand
rlm@465 700 #+ATTR_LaTeX: :width 10cm
rlm@465 701 [[./images/hand-screenshot1.png]]
rlm@465 702
rlm@466 703 =CORTEX= creates a creature in two steps: first, it traverses the
rlm@466 704 nodes in the blender file and creates physical representations for
rlm@466 705 any of them that have mass defined.
rlm@466 706
rlm@466 707 #+caption: Program for iterating through the nodes in a blender file
rlm@466 708 #+caption: and generating physical jMonkeyEngine3 objects with mass
rlm@466 709 #+caption: and a matching physics shape.
rlm@466 710 #+name: physical
rlm@466 711 #+begin_listing clojure
rlm@466 712 #+begin_src clojure
rlm@466 713 (defn physical!
rlm@466 714 "Iterate through the nodes in creature and make them real physical
rlm@466 715 objects in the simulation."
rlm@466 716 [#^Node creature]
rlm@466 717 (dorun
rlm@466 718 (map
rlm@466 719 (fn [geom]
rlm@466 720 (let [physics-control
rlm@466 721 (RigidBodyControl.
rlm@466 722 (HullCollisionShape.
rlm@466 723 (.getMesh geom))
rlm@466 724 (if-let [mass (meta-data geom "mass")]
rlm@466 725 (float mass) (float 1)))]
rlm@466 726 (.addControl geom physics-control)))
rlm@466 727 (filter #(isa? (class %) Geometry )
rlm@466 728 (node-seq creature)))))
rlm@466 729 #+end_src
rlm@466 730 #+end_listing
rlm@465 731
rlm@466 732 The next step to making a proper body is to connect those pieces
rlm@466 733 together with joints. jMonkeyEngine has a large array of joints
rlm@466 734 available via =bullet=, such as Point2Point, Cone, Hinge, and a
rlm@466 735 generic Six Degree of Freedom joint, with or without spring
rlm@466 736 restitution. =CORTEX='s procedure for binding the creature together
rlm@466 737 with joints is as follows:
rlm@465 738
rlm@466 739 - Find the children of the "joints" node.
rlm@466 740 - Determine the two spatials the joint is meant to connect.
rlm@466 741 - Create the joint based on the meta-data of the empty node.
rlm@466 742
rlm@466 743 The higher order function =sense-nodes= from =cortex.sense=
rlm@466 744 simplifies finding the joints based on their parent ``joints''
rlm@466 745 node.
rlm@466 746
rlm@466 747 #+caption: Retrieving the child empty nodes from a single
rlm@466 748 #+caption: named empty node is a common pattern in =CORTEX=;
rlm@466 749 #+caption: further instances of this technique for the senses
rlm@466 750 #+caption: will be omitted.
rlm@466 751 #+name: get-empty-nodes
rlm@466 752 #+begin_listing clojure
rlm@466 753 #+begin_src clojure
rlm@466 754 (defn sense-nodes
rlm@466 755 "For some senses there is a special empty blender node whose
rlm@466 756 children are considered markers for an instance of that sense. This
rlm@466 757 function generates functions to find those children, given the name
rlm@466 758 of the special parent node."
rlm@466 759 [parent-name]
rlm@466 760 (fn [#^Node creature]
rlm@466 761 (if-let [sense-node (.getChild creature parent-name)]
rlm@466 762 (seq (.getChildren sense-node)) [])))
rlm@466 763
rlm@466 764 (def
rlm@466 765 ^{:doc "Return the children of the creature's \"joints\" node."
rlm@466 766 :arglists '([creature])}
rlm@466 767 joints
rlm@466 768 (sense-nodes "joints"))
rlm@466 769 #+end_src
rlm@466 770 #+end_listing
rlm@466 771
rlm@466 772 To find a joint's targets, =CORTEX= creates a small cube,
rlm@466 773 centered around the empty-node, and grows the cube exponentially
rlm@466 774 until it intersects two /physical/ objects. The objects are ordered
rlm@466 775 according to the joint's rotation, with the first one being the
rlm@466 776 object that has more negative coordinates in the joint's reference
rlm@466 777 frame. Since the objects must be physical, the empty-node itself
rlm@466 778 escapes detection. For the same reason,
rlm@466 779 =joint-targets= must be called /after/ =physical!= is called.
rlm@464 780
rlm@466 781 #+caption: Program to find the targets of a joint node by
rlm@466 782 #+caption: exponential growth of a search cube.
rlm@466 783 #+name: joint-targets
rlm@466 784 #+begin_listing clojure
rlm@466 785 #+begin_src clojure
rlm@466 786 (defn joint-targets
rlm@466 787 "Return the two closest objects to the joint object, ordered
rlm@466 788 from bottom to top according to the joint's rotation."
rlm@466 789 [#^Node parts #^Node joint]
rlm@466 790 (loop [radius (float 0.01)]
rlm@466 791 (let [results (CollisionResults.)]
rlm@466 792 (.collideWith
rlm@466 793 parts
rlm@466 794 (BoundingBox. (.getWorldTranslation joint)
rlm@466 795 radius radius radius) results)
rlm@466 796 (let [targets
rlm@466 797 (distinct
rlm@466 798 (map #(.getGeometry %) results))]
rlm@466 799 (if (>= (count targets) 2)
rlm@466 800 (sort-by
rlm@466 801 #(let [joint-ref-frame-position
rlm@466 802 (jme-to-blender
rlm@466 803 (.mult
rlm@466 804 (.inverse (.getWorldRotation joint))
rlm@466 805 (.subtract (.getWorldTranslation %)
rlm@466 806 (.getWorldTranslation joint))))]
rlm@466 807 (.dot (Vector3f. 1 1 1) joint-ref-frame-position))
rlm@466 808 (take 2 targets))
rlm@466 809 (recur (float (* radius 2))))))))
rlm@466 810 #+end_src
rlm@466 811 #+end_listing
rlm@464 812
rlm@466 813 Once =CORTEX= finds all joints and targets, it creates them using a
rlm@466 814 simple dispatch on the metadata of the joint node.
rlm@466 815
rlm@466 816 #+caption: Program to dispatch on blender metadata and create joints
rlm@466 817 #+caption: suitable for physical simulation.
rlm@466 818 #+name: joint-dispatch
rlm@466 819 #+begin_listing clojure
rlm@466 820 #+begin_src clojure
rlm@466 821 (defmulti joint-dispatch
rlm@466 822 "Translate blender pseudo-joints into real JME joints."
rlm@466 823 (fn [constraints & _]
rlm@466 824 (:type constraints)))
rlm@466 825
rlm@466 826 (defmethod joint-dispatch :point
rlm@466 827 [constraints control-a control-b pivot-a pivot-b rotation]
rlm@466 828 (doto (SixDofJoint. control-a control-b pivot-a pivot-b false)
rlm@466 829 (.setLinearLowerLimit Vector3f/ZERO)
rlm@466 830 (.setLinearUpperLimit Vector3f/ZERO)))
rlm@466 831
rlm@466 832 (defmethod joint-dispatch :hinge
rlm@466 833 [constraints control-a control-b pivot-a pivot-b rotation]
rlm@466 834 (let [axis (if-let [axis (:axis constraints)] axis Vector3f/UNIT_X)
rlm@466 835 [limit-1 limit-2] (:limit constraints)
rlm@466 836 hinge-axis (.mult rotation (blender-to-jme axis))]
rlm@466 837 (doto (HingeJoint. control-a control-b pivot-a pivot-b
rlm@466 838 hinge-axis hinge-axis)
rlm@466 839 (.setLimit limit-1 limit-2))))
rlm@466 840
rlm@466 841 (defmethod joint-dispatch :cone
rlm@466 842 [constraints control-a control-b pivot-a pivot-b rotation]
rlm@466 843 (let [limit-xz (:limit-xz constraints)
rlm@466 844 limit-xy (:limit-xy constraints)
rlm@466 845 twist (:twist constraints)]
rlm@466 846 (doto (ConeJoint. control-a control-b pivot-a pivot-b
rlm@466 847 rotation rotation)
rlm@466 848 (.setLimit (float limit-xz) (float limit-xy)
rlm@466 849 (float twist)))))
rlm@466 850 #+end_src
rlm@466 851 #+end_listing
rlm@466 852
rlm@466 853 All that is left for joints is to combine the above pieces into
rlm@466 854 something that can operate on the collection of nodes that a
rlm@466 855 blender file represents.
rlm@466 856
rlm@466 857 #+caption: Program to completely create a joint given information
rlm@466 858 #+caption: from a blender file.
rlm@466 859 #+name: connect
rlm@466 860 #+begin_listing clojure
rlm@466 861 #+begin_src clojure
rlm@466 862 (defn connect
rlm@466 863 "Create a joint between 'obj-a and 'obj-b at the location of
rlm@466 864 'joint. The type of joint is determined by the metadata on 'joint.
rlm@466 865
rlm@466 866 Here are some examples:
rlm@466 867 {:type :point}
rlm@466 868 {:type :hinge :limit [0 (/ Math/PI 2)] :axis (Vector3f. 0 1 0)}
rlm@466 869 (:axis defaults to (Vector3f. 1 0 0) if not provided for hinge joints)
rlm@466 870
rlm@466 871 {:type :cone :limit-xz 0
rlm@466 872 :limit-xy 0
rlm@466 873 :twist 0} (use XZY rotation mode in blender!)"
rlm@466 874 [#^Node obj-a #^Node obj-b #^Node joint]
rlm@466 875 (let [control-a (.getControl obj-a RigidBodyControl)
rlm@466 876 control-b (.getControl obj-b RigidBodyControl)
rlm@466 877 joint-center (.getWorldTranslation joint)
rlm@466 878 joint-rotation (.toRotationMatrix (.getWorldRotation joint))
rlm@466 879 pivot-a (world-to-local obj-a joint-center)
rlm@466 880 pivot-b (world-to-local obj-b joint-center)]
rlm@466 881 (if-let
rlm@466 882 [constraints (map-vals eval (read-string (meta-data joint "joint")))]
rlm@466 883 ;; A side-effect of creating a joint registers
rlm@466 884 ;; it with both physics objects which in turn
rlm@466 885 ;; will register the joint with the physics system
rlm@466 886 ;; when the simulation is started.
rlm@466 887 (joint-dispatch constraints
rlm@466 888 control-a control-b
rlm@466 889 pivot-a pivot-b
rlm@466 890 joint-rotation))))
rlm@466 891 #+end_src
rlm@466 892 #+end_listing
rlm@466 893
rlm@466 894 In general, whenever =CORTEX= exposes a sense (or in this case
rlm@466 895 physicality), it provides a function of the type =sense!=, which
rlm@466 896 takes in a collection of nodes and augments it to support that
rlm@466 897 sense. The function returns any controls necessary to use that
rlm@466 898 sense. In this case =body!= creates a physical body and returns no
rlm@466 899 control functions.
rlm@466 900
rlm@466 901 #+caption: Program to give joints to a creature.
rlm@466 902 #+name: joints
rlm@466 903 #+begin_listing clojure
rlm@466 904 #+begin_src clojure
rlm@466 905 (defn joints!
rlm@466 906 "Connect the solid parts of the creature with physical joints. The
rlm@466 907 joints are taken from the \"joints\" node in the creature."
rlm@466 908 [#^Node creature]
rlm@466 909 (dorun
rlm@466 910 (map
rlm@466 911 (fn [joint]
rlm@466 912 (let [[obj-a obj-b] (joint-targets creature joint)]
rlm@466 913 (connect obj-a obj-b joint)))
rlm@466 914 (joints creature))))
rlm@466 915 (defn body!
rlm@466 916 "Endow the creature with a physical body connected with joints. The
rlm@466 917 particulars of the joints and the masses of each body part are
rlm@466 918 determined in blender."
rlm@466 919 [#^Node creature]
rlm@466 920 (physical! creature)
rlm@466 921 (joints! creature))
rlm@466 922 #+end_src
rlm@466 923 #+end_listing
rlm@466 924
rlm@466 925 All of the code you have just seen amounts to only 130 lines, yet
rlm@466 926 because it builds on top of Blender and jMonkeyEngine3, those few
rlm@466 927 lines pack quite a punch!
rlm@466 928
rlm@466 929 The hand from figure \ref{blender-hand}, which was modeled after my
rlm@466 930 own right hand, can now be given joints and simulated as a
rlm@466 931 creature.
rlm@466 932
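Assuming the hand model is saved at a path like
=Models/test-creatures/hand.blend= (the path here is illustrative),
giving it a physical body uses the same pattern as the worm later in
this thesis:

#+caption: Illustrative usage of =body!= on the hand model. The
#+caption: model path is hypothetical.
#+name: hand-body-sketch
#+begin_listing clojure
#+begin_src clojure
;; Illustrative usage; the model path is a placeholder.
(def physical-hand
  (doto (load-blender-model "Models/test-creatures/hand.blend")
    (body!)))
#+end_src
#+end_listing
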
rlm@466 933 #+caption: With the ability to create physical creatures from blender,
rlm@466 934 #+caption: =CORTEX= gets one step closer to a full creature simulation
rlm@466 935 #+caption: environment.
rlm@466 936 #+name: physical-hand
rlm@466 937 #+ATTR_LaTeX: :width 15cm
rlm@466 938 [[./images/physical-hand.png]]
rlm@466 939
rlm@464 940
rlm@468 941
rlm@436 942 ** Eyes reuse standard video game components
rlm@436 943
rlm@436 944 ** Hearing is hard; =CORTEX= does it right
rlm@436 945
rlm@436 946 ** Touch uses hundreds of hair-like elements
rlm@436 947
rlm@440 948 ** Proprioception is the sense that makes everything ``real''
rlm@436 949
rlm@436 950 ** Muscles are both effectors and sensors
rlm@436 951
rlm@436 952 ** =CORTEX= brings complex creatures to life!
rlm@436 953
rlm@436 954 ** =CORTEX= enables many possibilities for further research
rlm@435 955
rlm@465 956 * COMMENT Empathy in a simulated worm
rlm@435 957
rlm@449 958 Here I develop a computational model of empathy, using =CORTEX= as a
rlm@449 959 base. Empathy in this context is the ability to observe another
rlm@449 960 creature and infer what sorts of sensations that creature is
rlm@449 961 feeling. My empathy algorithm involves multiple phases. First is
rlm@449 962 free-play, where the creature moves around and gains sensory
rlm@449 963 experience. From this experience I construct a representation of the
rlm@449 964 creature's sensory state space, which I call \Phi-space. Using
rlm@449 965 \Phi-space, I construct an efficient function which takes the
rlm@449 966 limited data that comes from observing another creature and enriches
rlm@449 967 it with a full complement of imagined sensory data. I can then use the
rlm@449 968 imagined sensory data to recognize what the observed creature is
rlm@449 969 doing and feeling, using straightforward embodied action predicates.
rlm@449 970 This is all demonstrated using a simple worm-like creature, and
rlm@449 971 recognizing worm-actions based on limited data.
rlm@449 972
rlm@449 973 #+caption: Here is the worm with which we will be working.
rlm@449 974 #+caption: It is composed of 5 segments. Each segment has a
rlm@449 975 #+caption: pair of extensor and flexor muscles. Each of the
rlm@449 976 #+caption: worm's four joints is a hinge joint which allows
rlm@451 977 #+caption: about 30 degrees of rotation to either side. Each segment
rlm@449 978 #+caption: of the worm is touch-capable and has a uniform
rlm@449 979 #+caption: distribution of touch sensors on each of its faces.
rlm@449 980 #+caption: Each joint has a proprioceptive sense to detect
rlm@449 981 #+caption: relative positions. The worm segments are all the
rlm@449 982 #+caption: same except for the first one, which has a much
rlm@449 983 #+caption: higher weight than the others to allow for easy
rlm@449 984 #+caption: manual motor control.
rlm@449 985 #+name: basic-worm-view
rlm@449 986 #+ATTR_LaTeX: :width 10cm
rlm@449 987 [[./images/basic-worm-view.png]]
rlm@449 988
rlm@449 989 #+caption: Program for reading a worm from a blender file and
rlm@449 990 #+caption: outfitting it with the senses of proprioception,
rlm@449 991 #+caption: touch, and the ability to move, as specified in the
rlm@449 992 #+caption: blender file.
rlm@449 993 #+name: get-worm
rlm@449 994 #+begin_listing clojure
rlm@449 995 #+begin_src clojure
rlm@449 996 (defn worm []
rlm@449 997 (let [model (load-blender-model "Models/worm/worm.blend")]
rlm@449 998 {:body (doto model (body!))
rlm@449 999 :touch (touch! model)
rlm@449 1000 :proprioception (proprioception! model)
rlm@449 1001 :muscles (movement! model)}))
rlm@449 1002 #+end_src
rlm@449 1003 #+end_listing
rlm@452 1004
rlm@436 1005 ** Embodiment factors action recognition into manageable parts
rlm@435 1006
rlm@449 1007 Using empathy, I divide the problem of action recognition into a
rlm@449 1008 recognition process expressed in the language of a full complement
rlm@449 1009 of senses, and an imaginative process that generates full sensory
rlm@449 1010 data from partial sensory data. Splitting the action recognition
rlm@449 1011 problem in this manner greatly reduces the total amount of work to
rlm@449 1012 recognize actions: the imaginative process is mostly just matching
rlm@449 1013 previous experience, and the recognition process gets to use all
rlm@449 1014 the senses to directly describe any action.
rlm@449 1015
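Schematically, the factorization looks like the sketch below;
=infer-senses= and =action?= are placeholders for the imaginative and
recognition halves, whose real counterparts are developed in the rest
of this chapter.

#+caption: Sketch of the empathic factorization. =infer-senses= and
#+caption: =action?= are placeholders for the imaginative and
#+caption: recognition processes described above.
#+name: empathy-factorization-sketch
#+begin_listing clojure
#+begin_src clojure
;; Sketch only: infer-senses and action? stand in for the imaginative
;; and recognition halves of the empathy algorithm.
(defn empathic-recognize
  "Recognize an action from partial observations by first imagining
   the missing sensory data, then running an ordinary body-centered
   action predicate on the result."
  [partial-observations]
  (-> partial-observations
      infer-senses   ; imaginative: fill in full sensory data
      action?))      ; recognition: embodied action predicate
#+end_src
#+end_listing
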
rlm@436 1016 ** Action recognition is easy with a full gamut of senses
rlm@435 1017
rlm@449 1018 Embodied representations using multiple senses such as touch,
rlm@449 1019 proprioception, and muscle tension turn out to be exceedingly
rlm@449 1020 efficient at describing body-centered actions. It is the ``right
rlm@449 1021 language for the job''. For example, it takes only around 5 lines
rlm@449 1022 of LISP code to describe the action of ``curling'' using embodied
rlm@451 1023 primitives. It takes about 10 lines to describe the seemingly
rlm@449 1024 complicated action of wiggling.
rlm@449 1025
rlm@449 1026 The following action predicates each take a stream of sensory
rlm@449 1027 experience, observe however much of it they desire, and decide
rlm@449 1028 whether the worm is doing the action they describe. =curled?=
rlm@449 1029 relies on proprioception, =resting?= relies on touch, =wiggling?=
relies on a Fourier analysis of muscle contraction, and
=grand-circle?= relies on touch and reuses =curled?= as a guard.
rlm@449 1032
rlm@449 1033 #+caption: Program for detecting whether the worm is curled. This is the
rlm@449 1034 #+caption: simplest action predicate, because it only uses the last frame
rlm@449 1035 #+caption: of sensory experience, and only uses proprioceptive data. Even
rlm@449 1036 #+caption: this simple predicate, however, is automatically frame
rlm@449 1037 #+caption: independent and ignores vermopomorphic differences such as
rlm@449 1038 #+caption: worm textures and colors.
rlm@449 1039 #+name: curled
rlm@452 1040 #+attr_latex: [htpb]
rlm@452 1041 #+begin_listing clojure
rlm@449 1042 #+begin_src clojure
rlm@449 1043 (defn curled?
rlm@449 1044 "Is the worm curled up?"
rlm@449 1045 [experiences]
rlm@449 1046 (every?
rlm@449 1047 (fn [[_ _ bend]]
rlm@449 1048 (> (Math/sin bend) 0.64))
rlm@449 1049 (:proprioception (peek experiences))))
rlm@449 1050 #+end_src
rlm@449 1051 #+end_listing
rlm@449 1052
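Because =curled?= is a pure function of the experience vector, it
can be exercised directly on hand-written data. The two calls below
use toy experience maps (hypothetical values, containing only the
proprioceptive field) rather than real simulation output:

#+begin_src clojure
;; A sharply bent toy worm: every joint has sin(bend) > 0.64.
(curled? [{:proprioception [[0 0 0.8] [0 0 0.9]]}])  ;=> true

;; A flat toy worm: sin(0.0) is not greater than 0.64.
(curled? [{:proprioception [[0 0 0.0] [0 0 0.0]]}])  ;=> false
#+end_src
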
rlm@449 1053 #+caption: Program for summarizing the touch information in a patch
rlm@449 1054 #+caption: of skin.
rlm@449 1055 #+name: touch-summary
rlm@452 1056 #+attr_latex: [htpb]
rlm@452 1058 #+begin_listing clojure
rlm@449 1059 #+begin_src clojure
rlm@449 1060 (defn contact
rlm@449 1061 "Determine how much contact a particular worm segment has with
rlm@449 1062 other objects. Returns a value between 0 and 1, where 1 is full
rlm@449 1063 contact and 0 is no contact."
rlm@449 1064 [touch-region [coords contact :as touch]]
rlm@449 1065 (-> (zipmap coords contact)
rlm@449 1066 (select-keys touch-region)
rlm@449 1067 (vals)
rlm@449 1068 (#(map first %))
rlm@449 1069 (average)
rlm@449 1070 (* 10)
rlm@449 1071 (- 1)
rlm@449 1072 (Math/abs)))
rlm@449 1073 #+end_src
rlm@449 1074 #+end_listing
rlm@449 1075
rlm@449 1076
rlm@449 1077 #+caption: Program for detecting whether the worm is at rest. This program
rlm@449 1078 #+caption: uses a summary of the tactile information from the underbelly
rlm@449 1079 #+caption: of the worm, and is only true if every segment is touching the
rlm@449 1080 #+caption: floor. Note that this function contains no references to
#+caption: proprioception at all.
rlm@449 1082 #+name: resting
rlm@452 1083 #+attr_latex: [htpb]
rlm@452 1084 #+begin_listing clojure
rlm@449 1085 #+begin_src clojure
rlm@449 1086 (def worm-segment-bottom (rect-region [8 15] [14 22]))
rlm@449 1087
rlm@449 1088 (defn resting?
rlm@449 1089 "Is the worm resting on the ground?"
rlm@449 1090 [experiences]
rlm@449 1091 (every?
rlm@449 1092 (fn [touch-data]
rlm@449 1093 (< 0.9 (contact worm-segment-bottom touch-data)))
rlm@449 1094 (:touch (peek experiences))))
rlm@449 1095 #+end_src
rlm@449 1096 #+end_listing
rlm@449 1097
rlm@449 1098 #+caption: Program for detecting whether the worm is curled up into a
rlm@449 1099 #+caption: full circle. Here the embodied approach begins to shine, as
rlm@449 1100 #+caption: I am able to both use a previous action predicate (=curled?=)
rlm@449 1101 #+caption: as well as the direct tactile experience of the head and tail.
rlm@449 1102 #+name: grand-circle
rlm@452 1103 #+attr_latex: [htpb]
rlm@452 1104 #+begin_listing clojure
rlm@449 1105 #+begin_src clojure
rlm@449 1106 (def worm-segment-bottom-tip (rect-region [15 15] [22 22]))
rlm@449 1107
rlm@449 1108 (def worm-segment-top-tip (rect-region [0 15] [7 22]))
rlm@449 1109
rlm@449 1110 (defn grand-circle?
rlm@449 1111 "Does the worm form a majestic circle (one end touching the other)?"
rlm@449 1112 [experiences]
rlm@449 1113 (and (curled? experiences)
rlm@449 1114 (let [worm-touch (:touch (peek experiences))
rlm@449 1115 tail-touch (worm-touch 0)
rlm@449 1116 head-touch (worm-touch 4)]
rlm@449 1117 (and (< 0.55 (contact worm-segment-bottom-tip tail-touch))
rlm@449 1118 (< 0.55 (contact worm-segment-top-tip head-touch))))))
rlm@449 1119 #+end_src
rlm@449 1120 #+end_listing
rlm@449 1121
rlm@449 1122
rlm@449 1123 #+caption: Program for detecting whether the worm has been wiggling for
#+caption: the last few frames. It uses a Fourier analysis of the muscle
rlm@449 1125 #+caption: contractions of the worm's tail to determine wiggling. This is
#+caption: significant because there is no particular frame that clearly
rlm@449 1127 #+caption: indicates that the worm is wiggling --- only when multiple frames
rlm@449 1128 #+caption: are analyzed together is the wiggling revealed. Defining
rlm@449 1129 #+caption: wiggling this way also gives the worm an opportunity to learn
rlm@449 1130 #+caption: and recognize ``frustrated wiggling'', where the worm tries to
rlm@449 1131 #+caption: wiggle but can't. Frustrated wiggling is very visually different
rlm@449 1132 #+caption: from actual wiggling, but this definition gives it to us for free.
rlm@449 1133 #+name: wiggling
rlm@452 1134 #+attr_latex: [htpb]
rlm@452 1135 #+begin_listing clojure
rlm@449 1136 #+begin_src clojure
rlm@449 1137 (defn fft [nums]
rlm@449 1138 (map
rlm@449 1139 #(.getReal %)
rlm@449 1140 (.transform
rlm@449 1141 (FastFourierTransformer. DftNormalization/STANDARD)
rlm@449 1142 (double-array nums) TransformType/FORWARD)))
rlm@449 1143
rlm@449 1144 (def indexed (partial map-indexed vector))
rlm@449 1145
rlm@449 1146 (defn max-indexed [s]
rlm@449 1147 (first (sort-by (comp - second) (indexed s))))
rlm@449 1148
rlm@449 1149 (defn wiggling?
rlm@449 1150 "Is the worm wiggling?"
rlm@449 1151 [experiences]
rlm@449 1152 (let [analysis-interval 0x40]
rlm@449 1153 (when (> (count experiences) analysis-interval)
rlm@449 1154 (let [a-flex 3
rlm@449 1155 a-ex 2
rlm@449 1156 muscle-activity
rlm@449 1157 (map :muscle (vector:last-n experiences analysis-interval))
rlm@449 1158 base-activity
rlm@449 1159 (map #(- (% a-flex) (% a-ex)) muscle-activity)]
rlm@449 1160 (= 2
rlm@449 1161 (first
rlm@449 1162 (max-indexed
rlm@449 1163 (map #(Math/abs %)
rlm@449 1164 (take 20 (fft base-activity))))))))))
rlm@449 1165 #+end_src
rlm@449 1166 #+end_listing
rlm@449 1167
rlm@449 1168 With these action predicates, I can now recognize the actions of
rlm@449 1169 the worm while it is moving under my control and I have access to
rlm@449 1170 all the worm's senses.
rlm@449 1171
rlm@449 1172 #+caption: Use the action predicates defined earlier to report on
rlm@449 1173 #+caption: what the worm is doing while in simulation.
rlm@449 1174 #+name: report-worm-activity
rlm@452 1175 #+attr_latex: [htpb]
rlm@452 1176 #+begin_listing clojure
rlm@449 1177 #+begin_src clojure
rlm@449 1178 (defn debug-experience
rlm@449 1179 [experiences text]
rlm@449 1180 (cond
rlm@449 1181 (grand-circle? experiences) (.setText text "Grand Circle")
rlm@449 1182 (curled? experiences) (.setText text "Curled")
rlm@449 1183 (wiggling? experiences) (.setText text "Wiggling")
rlm@449 1184 (resting? experiences) (.setText text "Resting")))
rlm@449 1185 #+end_src
rlm@449 1186 #+end_listing
rlm@449 1187
rlm@449 1188 #+caption: Using =debug-experience=, the body-centered predicates
rlm@449 1189 #+caption: work together to classify the behaviour of the worm.
#+caption: The predicates are operating with access to the worm's
rlm@451 1191 #+caption: full sensory data.
#+name: worm-identify-init
rlm@449 1193 #+ATTR_LaTeX: :width 10cm
rlm@449 1194 [[./images/worm-identify-init.png]]
rlm@449 1195
rlm@449 1196 These action predicates satisfy the recognition requirement of an
rlm@451 1197 empathic recognition system. There is power in the simplicity of
the action predicates. They describe their actions without getting
lost in the visual details of the worm. Each one is frame
independent, but more than that, they are each independent of
rlm@449 1201 irrelevant visual details of the worm and the environment. They
rlm@449 1202 will work regardless of whether the worm is a different color or
heavily textured, or if the environment has strange lighting.
rlm@449 1204
rlm@449 1205 The trick now is to make the action predicates work even when the
rlm@449 1206 sensory data on which they depend is absent. If I can do that, then
I will have gained much.
rlm@435 1208
rlm@436 1209 ** \Phi-space describes the worm's experiences
rlm@449 1210
rlm@449 1211 As a first step towards building empathy, I need to gather all of
rlm@449 1212 the worm's experiences during free play. I use a simple vector to
rlm@449 1213 store all the experiences.
rlm@449 1214
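Concretely, each entry in this vector is a map from sense names to
that frame's sense data, keyed in the same way as the sense
functions installed by =worm=. The sketch below shows the rough
shape of one entry; the values are placeholders for illustration,
not real sensor output:

#+begin_src clojure
;; Hypothetical sketch of a single experience-vector entry.
(def example-experience
  {:proprioception [[0.1 0.0 0.3] [0.1 0.0 0.2]]  ; one [_ _ bend] triple per joint
   :touch  [[[[0 0] [1 0]]                        ; per segment: sensor coordinates
             [[0.1 0.1] [0.0 0.1]]]]              ; paired with their readings
   :muscle [0 0 12 0]})                           ; activation of each muscle
#+end_src
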
rlm@449 1215 Each element of the experience vector exists in the vast space of
rlm@449 1216 all possible worm-experiences. Most of this vast space is actually
rlm@449 1217 unreachable due to physical constraints of the worm's body. For
rlm@449 1218 example, the worm's segments are connected by hinge joints that put
rlm@451 1219 a practical limit on the worm's range of motions without limiting
rlm@451 1220 its degrees of freedom. Some groupings of senses are impossible;
rlm@451 1221 the worm can not be bent into a circle so that its ends are
rlm@451 1222 touching and at the same time not also experience the sensation of
rlm@451 1223 touching itself.
rlm@449 1224
rlm@451 1225 As the worm moves around during free play and its experience vector
rlm@451 1226 grows larger, the vector begins to define a subspace which is all
the sensations the worm can practically experience during normal
rlm@451 1228 operation. I call this subspace \Phi-space, short for
rlm@451 1229 physical-space. The experience vector defines a path through
rlm@451 1230 \Phi-space. This path has interesting properties that all derive
rlm@451 1231 from physical embodiment. The proprioceptive components are
rlm@451 1232 completely smooth, because in order for the worm to move from one
rlm@451 1233 position to another, it must pass through the intermediate
rlm@451 1234 positions. The path invariably forms loops as actions are repeated.
rlm@451 1235 Finally and most importantly, proprioception actually gives very
rlm@451 1236 strong inference about the other senses. For example, when the worm
rlm@451 1237 is flat, you can infer that it is touching the ground and that its
rlm@451 1238 muscles are not active, because if the muscles were active, the
rlm@451 1239 worm would be moving and would not be perfectly flat. In order to
rlm@451 1240 stay flat, the worm has to be touching the ground, or it would
rlm@451 1241 again be moving out of the flat position due to gravity. If the
rlm@451 1242 worm is positioned in such a way that it interacts with itself,
rlm@451 1243 then it is very likely to be feeling the same tactile feelings as
rlm@451 1244 the last time it was in that position, because it has the same body
rlm@451 1245 as then. If you observe multiple frames of proprioceptive data,
rlm@451 1246 then you can become increasingly confident about the exact
rlm@451 1247 activations of the worm's muscles, because it generally takes a
rlm@451 1248 unique combination of muscle contractions to transform the worm's
rlm@451 1249 body along a specific path through \Phi-space.
rlm@449 1250
rlm@449 1251 There is a simple way of taking \Phi-space and the total ordering
provided by an experience vector and reliably inferring the rest of
rlm@449 1253 the senses.
rlm@435 1254
** Empathy is the process of tracing through \Phi-space
rlm@449 1256
rlm@450 1257 Here is the core of a basic empathy algorithm, starting with an
rlm@451 1258 experience vector:
rlm@451 1259
First, group the experiences into tiered proprioceptive bins. I use
three tiers of bins, with sizes in powers of 10, and the smallest bin
has an approximate size of 0.001 radians in all proprioceptive dimensions.
rlm@450 1263
rlm@450 1264 Then, given a sequence of proprioceptive input, generate a set of
rlm@451 1265 matching experience records for each input, using the tiered
rlm@451 1266 proprioceptive bins.
rlm@449 1267
Finally, to infer sensory data, select the longest consecutive chain
of experiences. Consecutive experiences are those that
rlm@451 1270 appear next to each other in the experience vector.
rlm@449 1271
rlm@450 1272 This algorithm has three advantages:
rlm@450 1273
rlm@450 1274 1. It's simple
rlm@450 1275
2. It's very fast -- retrieving possible interpretations takes
rlm@451 1277 constant time. Tracing through chains of interpretations takes
rlm@451 1278 time proportional to the average number of experiences in a
rlm@451 1279 proprioceptive bin. Redundant experiences in \Phi-space can be
rlm@451 1280 merged to save computation.
rlm@450 1281
3. It protects from wrong interpretations of transient ambiguous
proprioceptive data. For example, if the worm is flat for just
an instant, this flatness will not be interpreted as implying
that the worm has its muscles relaxed, since the flatness is
rlm@450 1286 part of a longer chain which includes a distinct pattern of
rlm@451 1287 muscle activation. Markov chains or other memoryless statistical
rlm@451 1288 models that operate on individual frames may very well make this
rlm@451 1289 mistake.
rlm@450 1290
rlm@450 1291 #+caption: Program to convert an experience vector into a
rlm@450 1292 #+caption: proprioceptively binned lookup function.
rlm@450 1293 #+name: bin
rlm@452 1294 #+attr_latex: [htpb]
rlm@452 1295 #+begin_listing clojure
rlm@450 1296 #+begin_src clojure
rlm@449 1297 (defn bin [digits]
rlm@449 1298 (fn [angles]
rlm@449 1299 (->> angles
rlm@449 1300 (flatten)
rlm@449 1301 (map (juxt #(Math/sin %) #(Math/cos %)))
rlm@449 1302 (flatten)
rlm@449 1303 (mapv #(Math/round (* % (Math/pow 10 (dec digits))))))))
rlm@449 1304
rlm@449 1305 (defn gen-phi-scan
rlm@450 1306 "Nearest-neighbors with binning. Only returns a result if
the proprioceptive data is within 10% of a previously recorded
rlm@450 1308 result in all dimensions."
rlm@450 1309 [phi-space]
rlm@449 1310 (let [bin-keys (map bin [3 2 1])
rlm@449 1311 bin-maps
rlm@449 1312 (map (fn [bin-key]
rlm@449 1313 (group-by
rlm@449 1314 (comp bin-key :proprioception phi-space)
rlm@449 1315 (range (count phi-space)))) bin-keys)
rlm@449 1316 lookups (map (fn [bin-key bin-map]
rlm@450 1317 (fn [proprio] (bin-map (bin-key proprio))))
rlm@450 1318 bin-keys bin-maps)]
rlm@449 1319 (fn lookup [proprio-data]
rlm@449 1320 (set (some #(% proprio-data) lookups)))))
rlm@450 1321 #+end_src
rlm@450 1322 #+end_listing
rlm@449 1323
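To make the binning concrete, here is what =bin= does to a single
hypothetical joint angle of 0.5 radians at two of the tiers
(sin 0.5 is roughly 0.479 and cos 0.5 roughly 0.878):

#+begin_src clojure
((bin 3) [0.5]) ;=> [48 88]   sine and cosine scaled by 100 and rounded
((bin 2) [0.5]) ;=> [5 9]     the same angle in a coarser tier
#+end_src
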
rlm@451 1324 #+caption: =longest-thread= finds the longest path of consecutive
rlm@451 1325 #+caption: experiences to explain proprioceptive worm data.
rlm@451 1326 #+name: phi-space-history-scan
rlm@451 1327 #+ATTR_LaTeX: :width 10cm
rlm@451 1328 [[./images/aurellem-gray.png]]
rlm@451 1329
rlm@451 1330 =longest-thread= infers sensory data by stitching together pieces
rlm@451 1331 from previous experience. It prefers longer chains of previous
rlm@451 1332 experience to shorter ones. For example, during training the worm
rlm@451 1333 might rest on the ground for one second before it performs its
exercises. If during recognition the worm rests on the ground for
five seconds, =longest-thread= will accommodate this five second
rlm@451 1336 rest period by looping the one second rest chain five times.
rlm@451 1337
=longest-thread= takes time proportional to the average number of
entries in a proprioceptive bin, because for each element in the
starting bin it performs a series of set lookups in the preceding
rlm@451 1341 bins. If the total history is limited, then this is only a constant
rlm@451 1342 multiple times the number of entries in the starting bin. This
rlm@451 1343 analysis also applies even if the action requires multiple longest
rlm@451 1344 chains -- it's still the average number of entries in a
rlm@451 1345 proprioceptive bin times the desired chain length. Because
rlm@451 1346 =longest-thread= is so efficient and simple, I can interpret
rlm@451 1347 worm-actions in real time.
rlm@449 1348
#+caption: Program to calculate empathy by tracing through \Phi-space
#+caption: and finding the longest (i.e. most coherent) interpretation
rlm@450 1351 #+caption: of the data.
rlm@450 1352 #+name: longest-thread
rlm@452 1353 #+attr_latex: [htpb]
rlm@452 1354 #+begin_listing clojure
rlm@450 1355 #+begin_src clojure
rlm@449 1356 (defn longest-thread
rlm@449 1357 "Find the longest thread from phi-index-sets. The index sets should
rlm@449 1358 be ordered from most recent to least recent."
rlm@449 1359 [phi-index-sets]
rlm@449 1360 (loop [result '()
rlm@449 1361 [thread-bases & remaining :as phi-index-sets] phi-index-sets]
rlm@449 1362 (if (empty? phi-index-sets)
rlm@449 1363 (vec result)
rlm@449 1364 (let [threads
rlm@449 1365 (for [thread-base thread-bases]
rlm@449 1366 (loop [thread (list thread-base)
rlm@449 1367 remaining remaining]
rlm@449 1368 (let [next-index (dec (first thread))]
rlm@449 1369 (cond (empty? remaining) thread
rlm@449 1370 (contains? (first remaining) next-index)
rlm@449 1371 (recur
rlm@449 1372 (cons next-index thread) (rest remaining))
rlm@449 1373 :else thread))))
rlm@449 1374 longest-thread
rlm@449 1375 (reduce (fn [thread-a thread-b]
rlm@449 1376 (if (> (count thread-a) (count thread-b))
rlm@449 1377 thread-a thread-b))
rlm@449 1378 '(nil)
rlm@449 1379 threads)]
rlm@449 1380 (recur (concat longest-thread result)
rlm@449 1381 (drop (count longest-thread) phi-index-sets))))))
rlm@450 1382 #+end_src
rlm@450 1383 #+end_listing
rlm@450 1384
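As a toy illustration (the indices below are hypothetical, not taken
from a real \Phi-space), suppose the five most recent proprioceptive
readings matched the following sets of experience indices, ordered
from most recent to least recent:

#+begin_src clojure
(longest-thread [#{22} #{21} #{20} #{7 19} #{6}])
;; => [6 19 20 21 22]
;; The consecutive run 19..22 explains the four most recent frames,
;; so the lone match 7 is ignored; the remaining frame falls back to
;; the singleton chain containing index 6.
#+end_src
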
rlm@451 1385 There is one final piece, which is to replace missing sensory data
rlm@451 1386 with a best-guess estimate. While I could fill in missing data by
rlm@451 1387 using a gradient over the closest known sensory data points,
rlm@451 1388 averages can be misleading. It is certainly possible to create an
impossible sensory state by averaging two possible sensory states:
for example, the average of two poses bent in opposite directions is
a straight pose, but the average of their touch data still reports
the self-contact of the bent poses. Therefore, I simply replicate the
most recent sensory experience to fill in the gaps.
rlm@449 1392
rlm@449 1393 #+caption: Fill in blanks in sensory experience by replicating the most
rlm@449 1394 #+caption: recent experience.
rlm@449 1395 #+name: infer-nils
rlm@452 1396 #+attr_latex: [htpb]
rlm@452 1397 #+begin_listing clojure
rlm@449 1398 #+begin_src clojure
rlm@449 1399 (defn infer-nils
rlm@449 1400 "Replace nils with the next available non-nil element in the
rlm@449 1401 sequence, or barring that, 0."
rlm@449 1402 [s]
rlm@449 1403 (loop [i (dec (count s))
rlm@449 1404 v (transient s)]
rlm@449 1405 (if (zero? i) (persistent! v)
rlm@449 1406 (if-let [cur (v i)]
rlm@449 1407 (if (get v (dec i) 0)
rlm@449 1408 (recur (dec i) v)
rlm@449 1409 (recur (dec i) (assoc! v (dec i) cur)))
rlm@449 1410 (recur i (assoc! v i 0))))))
rlm@449 1411 #+end_src
rlm@449 1412 #+end_listing
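
For example (toy data), gaps in the middle of a sequence are filled
from the element to their right:

#+begin_src clojure
;; Illustrative only: each nil takes the value of the next non-nil element.
(infer-nils [1 nil nil 2]) ;=> [1 2 2 2]
#+end_src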
rlm@435 1413
rlm@441 1414 ** Efficient action recognition with =EMPATH=
rlm@451 1415
rlm@451 1416 To use =EMPATH= with the worm, I first need to gather a set of
rlm@451 1417 experiences from the worm that includes the actions I want to
rlm@452 1418 recognize. The =generate-phi-space= program (listing
\ref{generate-phi-space}) runs the worm through a series of
exercises and gathers those experiences into a vector. The
rlm@451 1421 =do-all-the-things= program is a routine expressed in a simple
rlm@452 1422 muscle contraction script language for automated worm control. It
rlm@452 1423 causes the worm to rest, curl, and wiggle over about 700 frames
rlm@452 1424 (approx. 11 seconds).
rlm@425 1425
rlm@451 1426 #+caption: Program to gather the worm's experiences into a vector for
rlm@451 1427 #+caption: further processing. The =motor-control-program= line uses
rlm@451 1428 #+caption: a motor control script that causes the worm to execute a series
#+caption: of ``exercises'' that include all the action predicates.
rlm@451 1430 #+name: generate-phi-space
rlm@452 1431 #+attr_latex: [htpb]
rlm@452 1432 #+begin_listing clojure
rlm@451 1433 #+begin_src clojure
rlm@451 1434 (def do-all-the-things
rlm@451 1435 (concat
rlm@451 1436 curl-script
rlm@451 1437 [[300 :d-ex 40]
rlm@451 1438 [320 :d-ex 0]]
rlm@451 1439 (shift-script 280 (take 16 wiggle-script))))
rlm@451 1440
rlm@451 1441 (defn generate-phi-space []
rlm@451 1442 (let [experiences (atom [])]
rlm@451 1443 (run-world
rlm@451 1444 (apply-map
rlm@451 1445 worm-world
rlm@451 1446 (merge
rlm@451 1447 (worm-world-defaults)
rlm@451 1448 {:end-frame 700
rlm@451 1449 :motor-control
rlm@451 1450 (motor-control-program worm-muscle-labels do-all-the-things)
rlm@451 1451 :experiences experiences})))
rlm@451 1452 @experiences))
rlm@451 1453 #+end_src
rlm@451 1454 #+end_listing
rlm@451 1455
rlm@451 1456 #+caption: Use longest thread and a phi-space generated from a short
rlm@451 1457 #+caption: exercise routine to interpret actions during free play.
rlm@451 1458 #+name: empathy-debug
rlm@452 1459 #+attr_latex: [htpb]
rlm@452 1460 #+begin_listing clojure
rlm@451 1461 #+begin_src clojure
rlm@451 1462 (defn init []
rlm@451 1463 (def phi-space (generate-phi-space))
rlm@451 1464 (def phi-scan (gen-phi-scan phi-space)))
rlm@451 1465
rlm@451 1466 (defn empathy-demonstration []
rlm@451 1467 (let [proprio (atom ())]
rlm@451 1468 (fn
rlm@451 1469 [experiences text]
rlm@451 1470 (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
rlm@451 1471 (swap! proprio (partial cons phi-indices))
rlm@451 1472 (let [exp-thread (longest-thread (take 300 @proprio))
rlm@451 1473 empathy (mapv phi-space (infer-nils exp-thread))]
rlm@451 1474 (println-repl (vector:last-n exp-thread 22))
rlm@451 1475 (cond
rlm@451 1476 (grand-circle? empathy) (.setText text "Grand Circle")
rlm@451 1477 (curled? empathy) (.setText text "Curled")
rlm@451 1478 (wiggling? empathy) (.setText text "Wiggling")
rlm@451 1479 (resting? empathy) (.setText text "Resting")
rlm@451 1480 :else (.setText text "Unknown")))))))
rlm@451 1481
rlm@451 1482 (defn empathy-experiment [record]
rlm@451 1483 (.start (worm-world :experience-watch (debug-experience-phi)
rlm@451 1484 :record record :worm worm*)))
rlm@451 1485 #+end_src
rlm@451 1486 #+end_listing
rlm@451 1487
rlm@451 1488 The result of running =empathy-experiment= is that the system is
rlm@451 1489 generally able to interpret worm actions using the action-predicates
rlm@451 1490 on simulated sensory data just as well as with actual data. Figure
rlm@451 1491 \ref{empathy-debug-image} was generated using =empathy-experiment=:
rlm@451 1492
rlm@451 1493 #+caption: From only proprioceptive data, =EMPATH= was able to infer
rlm@451 1494 #+caption: the complete sensory experience and classify four poses
#+caption: (The last panel shows a composite image of \emph{wiggling},
rlm@451 1496 #+caption: a dynamic pose.)
rlm@451 1497 #+name: empathy-debug-image
rlm@451 1498 #+ATTR_LaTeX: :width 10cm :placement [H]
rlm@451 1499 [[./images/empathy-1.png]]
rlm@451 1500
rlm@451 1501 One way to measure the performance of =EMPATH= is to compare the
suitability of the imagined sense experience to trigger the same
rlm@451 1503 action predicates as the real sensory experience.
rlm@451 1504
rlm@451 1505 #+caption: Determine how closely empathy approximates actual
rlm@451 1506 #+caption: sensory data.
rlm@451 1507 #+name: test-empathy-accuracy
rlm@452 1508 #+attr_latex: [htpb]
rlm@452 1509 #+begin_listing clojure
rlm@451 1510 #+begin_src clojure
rlm@451 1511 (def worm-action-label
rlm@451 1512 (juxt grand-circle? curled? wiggling?))
rlm@451 1513
rlm@451 1514 (defn compare-empathy-with-baseline [matches]
rlm@451 1515 (let [proprio (atom ())]
rlm@451 1516 (fn
rlm@451 1517 [experiences text]
rlm@451 1518 (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
rlm@451 1519 (swap! proprio (partial cons phi-indices))
rlm@451 1520 (let [exp-thread (longest-thread (take 300 @proprio))
rlm@451 1521 empathy (mapv phi-space (infer-nils exp-thread))
rlm@451 1522 experience-matches-empathy
rlm@451 1523 (= (worm-action-label experiences)
rlm@451 1524 (worm-action-label empathy))]
rlm@451 1525 (println-repl experience-matches-empathy)
rlm@451 1526 (swap! matches #(conj % experience-matches-empathy)))))))
rlm@451 1527
rlm@451 1528 (defn accuracy [v]
rlm@451 1529 (float (/ (count (filter true? v)) (count v))))
rlm@451 1530
rlm@451 1531 (defn test-empathy-accuracy []
rlm@451 1532 (let [res (atom [])]
rlm@451 1533 (run-world
rlm@451 1534 (worm-world :experience-watch
rlm@451 1535 (compare-empathy-with-baseline res)
rlm@451 1536 :worm worm*))
rlm@451 1537 (accuracy @res)))
rlm@451 1538 #+end_src
rlm@451 1539 #+end_listing
rlm@451 1540
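=accuracy= simply reports the fraction of frames on which the labels
derived from imagined experience agree with the labels derived from
the real sensory data; for example:

#+begin_src clojure
(accuracy [true true false true]) ;=> 0.75
#+end_src
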
rlm@451 1541 Running =test-empathy-accuracy= using the very short exercise
rlm@451 1542 program defined in listing \ref{generate-phi-space}, and then doing
a similar pattern of activity manually yields an accuracy of around
rlm@451 1544 73%. This is based on very limited worm experience. By training the
rlm@451 1545 worm for longer, the accuracy dramatically improves.
rlm@451 1546
rlm@451 1547 #+caption: Program to generate \Phi-space using manual training.
rlm@451 1548 #+name: manual-phi-space
rlm@452 1549 #+attr_latex: [htpb]
rlm@451 1550 #+begin_listing clojure
rlm@451 1551 #+begin_src clojure
rlm@451 1552 (defn init-interactive []
rlm@451 1553 (def phi-space
rlm@451 1554 (let [experiences (atom [])]
rlm@451 1555 (run-world
rlm@451 1556 (apply-map
rlm@451 1557 worm-world
rlm@451 1558 (merge
rlm@451 1559 (worm-world-defaults)
rlm@451 1560 {:experiences experiences})))
rlm@451 1561 @experiences))
rlm@451 1562 (def phi-scan (gen-phi-scan phi-space)))
rlm@451 1563 #+end_src
rlm@451 1564 #+end_listing
rlm@451 1565
rlm@451 1566 After about 1 minute of manual training, I was able to achieve 95%
rlm@451 1567 accuracy on manual testing of the worm using =init-interactive= and
rlm@452 1568 =test-empathy-accuracy=. The majority of errors are near the
boundaries where one type of action transitions into another.
During these transitions the exact label for the action is more open
to interpretation, and disagreement between empathy and experience
is more excusable.
rlm@450 1573
rlm@449 1574 ** Digression: bootstrapping touch using free exploration
rlm@449 1575
rlm@452 1576 In the previous section I showed how to compute actions in terms of
body-centered predicates which relied on average touch activation of
pre-defined regions of the worm's skin. What if, instead of receiving
rlm@452 1579 touch pre-grouped into the six faces of each worm segment, the true
topology of the worm's skin were unknown? This is more similar to how
rlm@452 1581 a nerve fiber bundle might be arranged. While two fibers that are
rlm@452 1582 close in a nerve bundle /might/ correspond to two touch sensors that
rlm@452 1583 are close together on the skin, the process of taking a complicated
rlm@452 1584 surface and forcing it into essentially a circle requires some cuts
and rearrangements.
rlm@452 1586
rlm@452 1587 In this section I show how to automatically learn the skin-topology of
rlm@452 1588 a worm segment by free exploration. As the worm rolls around on the
rlm@452 1589 floor, large sections of its surface get activated. If the worm has
stopped moving, then whatever region of skin is touching the
rlm@452 1591 floor is probably an important region, and should be recorded.
rlm@452 1592
rlm@452 1593 #+caption: Program to detect whether the worm is in a resting state
rlm@452 1594 #+caption: with one face touching the floor.
rlm@452 1595 #+name: pure-touch
rlm@452 1596 #+begin_listing clojure
rlm@452 1597 #+begin_src clojure
rlm@452 1598 (def full-contact [(float 0.0) (float 0.1)])
rlm@452 1599
rlm@452 1600 (defn pure-touch?
rlm@452 1601 "This is worm specific code to determine if a large region of touch
rlm@452 1602 sensors is either all on or all off."
rlm@452 1603 [[coords touch :as touch-data]]
rlm@452 1604 (= (set (map first touch)) (set full-contact)))
rlm@452 1605 #+end_src
rlm@452 1606 #+end_listing
rlm@452 1607
After collecting these important regions, there will be many nearly
similar touch regions. While for some purposes the subtle
differences between these regions will be important, for my
purposes I collapse them into mostly non-overlapping sets using
=remove-similar= in listing \ref{remove-similar}.
rlm@452 1613
#+caption: Program to take a list of sets of points and ``collapse them''
#+caption: so that the remaining sets in the list are significantly
rlm@452 1616 #+caption: different from each other. Prefer smaller sets to larger ones.
#+name: remove-similar
rlm@452 1618 #+begin_listing clojure
rlm@452 1619 #+begin_src clojure
rlm@452 1620 (defn remove-similar
rlm@452 1621 [coll]
rlm@452 1622 (loop [result () coll (sort-by (comp - count) coll)]
rlm@452 1623 (if (empty? coll) result
rlm@452 1624 (let [[x & xs] coll
rlm@452 1625 c (count x)]
rlm@452 1626 (if (some
rlm@452 1627 (fn [other-set]
rlm@452 1628 (let [oc (count other-set)]
rlm@452 1629 (< (- (count (union other-set x)) c) (* oc 0.1))))
rlm@452 1630 xs)
rlm@452 1631 (recur result xs)
rlm@452 1632 (recur (cons x result) xs))))))
rlm@452 1633 #+end_src
rlm@452 1634 #+end_listing
rlm@452 1635
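As a toy example (hypothetical point sets, and assuming =union= is
=clojure.set/union= as in the rest of the code), a larger set that
almost entirely contains one of the smaller sets is discarded in
favor of the smaller one:

#+begin_src clojure
;; #{1 2 3} lies entirely inside #{1 2 3 4}, well within the 10%
;; threshold, so the larger set is dropped; #{10 20} is unrelated
;; and survives.
(remove-similar [#{1 2 3} #{1 2 3 4} #{10 20}])
;; => a sequence containing #{1 2 3} and #{10 20}
#+end_src
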
rlm@452 1636 Actually running this simulation is easy given =CORTEX='s facilities.
rlm@452 1637
rlm@452 1638 #+caption: Collect experiences while the worm moves around. Filter the touch
#+caption: sensations by stable ones, collapse similar ones together,
rlm@452 1640 #+caption: and report the regions learned.
rlm@452 1641 #+name: learn-touch
rlm@452 1642 #+begin_listing clojure
rlm@452 1643 #+begin_src clojure
rlm@452 1644 (defn learn-touch-regions []
rlm@452 1645 (let [experiences (atom [])
rlm@452 1646 world (apply-map
rlm@452 1647 worm-world
rlm@452 1648 (assoc (worm-segment-defaults)
rlm@452 1649 :experiences experiences))]
rlm@452 1650 (run-world world)
rlm@452 1651 (->>
rlm@452 1652 @experiences
rlm@452 1653 (drop 175)
rlm@452 1654 ;; access the single segment's touch data
rlm@452 1655 (map (comp first :touch))
rlm@452 1656 ;; only deal with "pure" touch data to determine surfaces
rlm@452 1657 (filter pure-touch?)
rlm@452 1658 ;; associate coordinates with touch values
rlm@452 1659 (map (partial apply zipmap))
rlm@452 1660 ;; select those regions where contact is being made
rlm@452 1661 (map (partial group-by second))
rlm@452 1662 (map #(get % full-contact))
rlm@452 1663 (map (partial map first))
rlm@452 1664 ;; remove redundant/subset regions
rlm@452 1665 (map set)
rlm@452 1666 remove-similar)))
rlm@452 1667
rlm@452 1668 (defn learn-and-view-touch-regions []
rlm@452 1669 (map view-touch-region
rlm@452 1670 (learn-touch-regions)))
rlm@452 1671 #+end_src
rlm@452 1672 #+end_listing
rlm@452 1673
The only thing remaining to define is the particular motion the worm
rlm@452 1675 must take. I accomplish this with a simple motor control program.
rlm@452 1676
rlm@452 1677 #+caption: Motor control program for making the worm roll on the ground.
rlm@452 1678 #+caption: This could also be replaced with random motion.
rlm@452 1679 #+name: worm-roll
rlm@452 1680 #+begin_listing clojure
rlm@452 1681 #+begin_src clojure
rlm@452 1682 (defn touch-kinesthetics []
rlm@452 1683 [[170 :lift-1 40]
rlm@452 1684 [190 :lift-1 19]
rlm@452 1685 [206 :lift-1 0]
rlm@452 1686
rlm@452 1687 [400 :lift-2 40]
rlm@452 1688 [410 :lift-2 0]
rlm@452 1689
rlm@452 1690 [570 :lift-2 40]
rlm@452 1691 [590 :lift-2 21]
rlm@452 1692 [606 :lift-2 0]
rlm@452 1693
rlm@452 1694 [800 :lift-1 30]
rlm@452 1695 [809 :lift-1 0]
rlm@452 1696
rlm@452 1697 [900 :roll-2 40]
rlm@452 1698 [905 :roll-2 20]
rlm@452 1699 [910 :roll-2 0]
rlm@452 1700
rlm@452 1701 [1000 :roll-2 40]
rlm@452 1702 [1005 :roll-2 20]
rlm@452 1703 [1010 :roll-2 0]
rlm@452 1704
rlm@452 1705 [1100 :roll-2 40]
rlm@452 1706 [1105 :roll-2 20]
rlm@452 1707 [1110 :roll-2 0]
rlm@452 1708 ])
rlm@452 1709 #+end_src
rlm@452 1710 #+end_listing
rlm@452 1711
rlm@452 1712
rlm@452 1713 #+caption: The small worm rolls around on the floor, driven
rlm@452 1714 #+caption: by the motor control program in listing \ref{worm-roll}.
#+name: worm-roll-image
rlm@452 1716 #+ATTR_LaTeX: :width 12cm
rlm@452 1717 [[./images/worm-roll.png]]
rlm@452 1718
rlm@452 1719
rlm@452 1720 #+caption: After completing its adventures, the worm now knows
rlm@452 1721 #+caption: how its touch sensors are arranged along its skin. These
rlm@452 1722 #+caption: are the regions that were deemed important by
rlm@452 1723 #+caption: =learn-touch-regions=. Note that the worm has discovered
rlm@452 1724 #+caption: that it has six sides.
rlm@452 1725 #+name: worm-touch-map
rlm@452 1726 #+ATTR_LaTeX: :width 12cm
rlm@452 1727 [[./images/touch-learn.png]]
rlm@452 1728
rlm@452 1729 While simple, =learn-touch-regions= exploits regularities in both
rlm@452 1730 the worm's physiology and the worm's environment to correctly
rlm@452 1731 deduce that the worm has six sides. Note that =learn-touch-regions=
rlm@452 1732 would work just as well even if the worm's touch sense data were
completely scrambled. The cross shape is just for convenience. This
rlm@452 1734 example justifies the use of pre-defined touch regions in =EMPATH=.
rlm@452 1735
rlm@465 1736 * COMMENT Contributions
rlm@454 1737
rlm@461 1738 In this thesis you have seen the =CORTEX= system, a complete
rlm@461 1739 environment for creating simulated creatures. You have seen how to
rlm@461 1740 implement five senses including touch, proprioception, hearing,
vision, and muscle tension. You have seen how to create new creatures
rlm@461 1742 using blender, a 3D modeling tool. I hope that =CORTEX= will be
rlm@461 1743 useful in further research projects. To this end I have included the
rlm@461 1744 full source to =CORTEX= along with a large suite of tests and
rlm@461 1745 examples. I have also created a user guide for =CORTEX= which is
included in an appendix to this thesis.
rlm@447 1747
You have also seen how I used =CORTEX= as a platform to attack the
rlm@461 1749 /action recognition/ problem, which is the problem of recognizing
rlm@461 1750 actions in video. You saw a simple system called =EMPATH= which
identifies actions by first describing actions in a body-centered,
rich sense language, then inferring a full range of sensory
rlm@461 1753 experience from limited data using previous experience gained from
rlm@461 1754 free play.
rlm@447 1755
rlm@461 1756 As a minor digression, you also saw how I used =CORTEX= to enable a
rlm@461 1757 tiny worm to discover the topology of its skin simply by rolling on
rlm@461 1758 the ground.
rlm@461 1759
rlm@461 1760 In conclusion, the main contributions of this thesis are:
rlm@461 1761
rlm@461 1762 - =CORTEX=, a system for creating simulated creatures with rich
rlm@461 1763 senses.
rlm@461 1764 - =EMPATH=, a program for recognizing actions by imagining sensory
rlm@461 1765 experience.
rlm@447 1766
rlm@447 1767 # An anatomical joke:
rlm@447 1768 # - Training
rlm@447 1769 # - Skeletal imitation
rlm@447 1770 # - Sensory fleshing-out
rlm@447 1771 # - Classification