#+title: =CORTEX=
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Using embodied AI to facilitate Artificial Imagination.
#+keywords: AI, clojure, embodiment
#+LaTeX_CLASS_OPTIONS: [nofloat]

* COMMENT templates
#+caption:
#+caption:
#+caption:
#+caption:
#+name: name
#+begin_listing clojure
#+begin_src clojure
#+end_src
#+end_listing

#+caption:
#+caption:
#+caption:
#+name: name
#+ATTR_LaTeX: :width 10cm
[[./images/aurellem-gray.png]]

* COMMENT Empathy and Embodiment as problem solving strategies

By the end of this thesis, you will have seen a novel approach to interpreting video using embodiment and empathy. You will have also seen one way to efficiently implement empathy for embodied creatures. Finally, you will become familiar with =CORTEX=, a system for designing and simulating creatures with rich senses, which you may choose to use in your own research.

This is the core vision of my thesis: that one of the important ways in which we understand others is by imagining ourselves in their position and empathically feeling experiences relative to our own bodies. By understanding events in terms of our own previous corporeal experience, we greatly constrain the possibilities of what would otherwise be an unwieldy exponential search. This extra constraint can be the difference between easily understanding what is happening in a video and being completely lost in a sea of incomprehensible color and movement.

** Recognizing actions in video is extremely difficult

Consider for example the problem of determining what is happening in a video of which this is one frame:

#+caption: A cat drinking some water. Identifying this action is
#+caption: beyond the state of the art for computers.
#+ATTR_LaTeX: :width 7cm
[[./images/cat-drinking.jpg]]

It is currently impossible for any computer program to reliably label such a video as ``drinking''. And rightly so -- it is a very hard problem! What features can you describe in terms of low-level functions of pixels that can even begin to describe at a high level what is happening here?

Or suppose that you are building a program that recognizes chairs. How could you ``see'' the chair in figure \ref{hidden-chair}?

#+caption: The chair in this image is quite obvious to humans, but I
#+caption: doubt that any modern computer vision program can find it.
#+name: hidden-chair
#+ATTR_LaTeX: :width 10cm
[[./images/fat-person-sitting-at-desk.jpg]]

Finally, how is it that you can easily tell the difference in how the girl's /muscles/ are working between the two images in figure \ref{girl}?

#+caption: The mysterious ``common sense'' appears here as you are able
#+caption: to discern the difference in how the girl's arm muscles
#+caption: are activated between the two images.
#+name: girl
#+ATTR_LaTeX: :width 7cm
[[./images/wall-push.png]]

Each of these examples tells us something about what might be going on in our minds as we easily solve these recognition problems.

The hidden chairs show us that we are strongly triggered by cues relating to the position of human bodies, and that we can determine the overall physical configuration of a human body even if much of that body is occluded.

The picture of the girl pushing against the wall tells us that we have common sense knowledge about the kinetics of our own bodies. We know well how our muscles would have to work to maintain us in most positions, and we can easily project this self-knowledge to imagined positions triggered by images of the human body.
** =EMPATH= neatly solves recognition problems

I propose a system that can express the types of recognition problems above in a form amenable to computation. It is split into four parts:

- Free/Guided Play :: The creature moves around and experiences the world through its unique perspective. Many otherwise complicated actions are easily described in the language of a full suite of body-centered, rich senses. For example, drinking is the feeling of water sliding down your throat, and cooling your insides. It's often accompanied by bringing your hand close to your face, or bringing your face close to water. Sitting down is the feeling of bending your knees, activating your quadriceps, then feeling a surface with your bottom and relaxing your legs. These body-centered action descriptions can be either learned or hard coded.
- Posture Imitation :: When trying to interpret a video or image, the creature takes a model of itself and aligns it with whatever it sees. This alignment can even cross species, as when humans try to align themselves with things like ponies, dogs, or other humans with a different body type.
- Empathy :: The alignment triggers associations with sensory data from prior experiences. For example, the alignment itself easily maps to proprioceptive data. Any sounds or obvious skin contact in the video can to a lesser extent trigger previous experience. Segments of previous experiences are stitched together to form a coherent and complete sensory portrait of the scene.
- Recognition :: With the scene described in terms of first person sensory events, the creature can now run its action-identification programs on this synthesized sensory data, just as it would if it were actually experiencing the scene first-hand. If previous experience has been accurately retrieved, and if it is analogous enough to the scene, then the creature will correctly identify the action in the scene.
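Put as code, the four parts compose into a single pipeline. The following is only a schematic sketch: =align-model=, =infer-sensations=, and =recognize-action= are hypothetical placeholder names for machinery that is developed (under other names) later in this thesis, not actual =EMPATH= functions.

#+begin_src clojure
;; Schematic sketch of the four parts above composed into a pipeline.
;; align-model, infer-sensations, and recognize-action are
;; hypothetical placeholders, not actual EMPATH functions.
(defn interpret-video
  "Guess the action in a video by empathic simulation. `model` is a
  body model of the observing creature; `experiences` is its record
  of free play."
  [model experiences video]
  (let [alignment  (align-model model video)                 ; posture imitation
        sensations (infer-sensations experiences alignment)] ; empathy
    (recognize-action sensations)))                          ; recognition
#+end_src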
For example, I think humans are able to label the cat video as ``drinking'' because they imagine /themselves/ as the cat, and imagine putting their face up against a stream of water and sticking out their tongue. In that imagined world, they can feel the cool water hitting their tongue, and feel the water entering their body, and are able to recognize that /feeling/ as drinking. So, the label of the action is not really in the pixels of the image, but is found clearly in a simulation inspired by those pixels. An imaginative system, having been trained on drinking and non-drinking examples and learning that the most important component of drinking is the feeling of water sliding down one's throat, would analyze a video of a cat drinking in the following manner:

1. Create a physical model of the video by putting a ``fuzzy'' model of its own body in place of the cat. Possibly also create a simulation of the stream of water.

2. Play out this simulated scene and generate imagined sensory experience. This will include relevant muscle contractions, a close-up view of the stream from the cat's perspective, and most importantly, the imagined feeling of water entering the mouth. The imagined sensory experience can come from a simulation of the event, but can also be pattern-matched from previous, similar embodied experience.

3. The action is now easily identified as drinking by the sense of taste alone. The other senses (such as the tongue moving in and out) help to give plausibility to the simulated action. Note that the sense of vision, while critical in creating the simulation, is not critical for identifying the action from the simulation.

For the chair examples, the process is even easier:

1. Align a model of your body to the person in the image.

2. Generate proprioceptive sensory data from this alignment.

3. Use the imagined proprioceptive data as a key to lookup related sensory experience associated with that particular proprioceptive feeling.

4. Retrieve the feeling of your bottom resting on a surface, your knees bent, and your leg muscles relaxed.

5. This sensory information is consistent with the =sitting?= sensory predicate, so you (and the entity in the image) must be sitting.

6. There must be a chair-like object since you are sitting.

Empathy offers yet another alternative to the age-old AI representation question: ``What is a chair?'' --- A chair is the feeling of sitting.
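To give a flavor of what the =sitting?= predicate in step 5 could look like, here is a minimal sketch written in the style of the worm action predicates that appear later in this thesis. The names =bottom-region= and =knee-bend?= and the thresholds are invented for illustration; only the general shape (peek at the most recent experience, then test touch, proprioception, and muscle data) carries over to the real predicates.

#+begin_src clojure
;; Illustrative sketch only: bottom-region, knee-bend?, and the
;; thresholds are invented. The structure mirrors the worm action
;; predicates defined later in this thesis.
(defn sitting?
  "Is the creature's bottom supported while its knees are bent and
  its leg muscles are relaxed?"
  [experiences]
  (let [exp (peek experiences)]
    (and (< 0.9 (contact bottom-region (:touch exp))) ; bottom on a surface
         (every? knee-bend? (:proprioception exp))    ; knees bent
         (every? #(< % 0.1) (:muscle exp)))))         ; legs relaxed
#+end_src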
My program, =EMPATH=, uses this empathic problem solving technique to interpret the actions of a simple, worm-like creature.

#+caption: The worm performs many actions during free play such as
#+caption: curling, wiggling, and resting.
#+name: worm-intro
#+ATTR_LaTeX: :width 15cm
[[./images/worm-intro-white.png]]

#+caption: =EMPATH= recognized and classified each of these
#+caption: poses by inferring the complete sensory experience
#+caption: from proprioceptive data.
#+name: worm-recognition-intro
#+ATTR_LaTeX: :width 15cm
[[./images/worm-poses.png]]

One powerful advantage of empathic problem solving is that it factors the action recognition problem into two easier problems. To use empathy, you need an /aligner/, which takes the video and a model of your body, and aligns the model with the video. Then, you need a /recognizer/, which uses the aligned model to interpret the action. The power in this method lies in the fact that you describe all actions from a body-centered viewpoint. You are less tied to the particulars of any visual representation of the actions. If you teach the system what ``running'' is, and you have a good enough aligner, the system will from then on be able to recognize running from any point of view, even strange points of view like above or underneath the runner. This is in contrast to action recognition schemes that try to identify actions using a non-embodied approach. If these systems learn about running as viewed from the side, they will not automatically be able to recognize running from any other viewpoint.

Another powerful advantage is that using the language of multiple body-centered rich senses to describe body-centered actions offers a massive boost in descriptive capability. Consider how difficult it would be to compose a set of HOG filters to describe the action of a simple worm-creature ``curling'' so that its head touches its tail, and then behold the simplicity of describing this action in a language designed for the task (listing \ref{grand-circle-intro}):

#+caption: Body-centered actions are best expressed in a body-centered
#+caption: language. This code detects when the worm has curled into a
#+caption: full circle. Imagine how you would replicate this functionality
#+caption: using low-level pixel features such as HOG filters!
#+name: grand-circle-intro
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn grand-circle?
  "Does the worm form a majestic circle (one end touching the other)?"
  [experiences]
  (and (curled? experiences)
       (let [worm-touch (:touch (peek experiences))
             tail-touch (worm-touch 0)
             head-touch (worm-touch 4)]
         (and (< 0.2 (contact worm-segment-bottom-tip tail-touch))
              (< 0.2 (contact worm-segment-top-tip head-touch))))))
#+end_src
#+end_listing

** =CORTEX= is a toolkit for building sensate creatures

I built =CORTEX= to be a general AI research platform for doing experiments involving multiple rich senses and a wide variety and number of creatures. I intend it to be useful as a library for many more projects than just this thesis. =CORTEX= was necessary to meet a need among AI researchers at CSAIL and beyond, which is that people often will invent neat ideas that are best expressed in the language of creatures and senses, but in order to explore those ideas they must first build a platform in which they can create simulated creatures with rich senses! There are many ideas that would be simple to execute (such as =EMPATH=), but attached to them is the multi-month effort to make a good creature simulator. Often, that initial investment of time proves to be too much, and the project must make do with a lesser environment.

=CORTEX= is well suited as an environment for embodied AI research for three reasons:

- You can create new creatures using Blender, a popular 3D modeling program. Each sense can be specified using special Blender nodes with biologically inspired parameters. You need not write any code to create a creature, and can use a wide library of pre-existing Blender models as a base for your own creatures.

- =CORTEX= implements a wide variety of senses, including touch, proprioception, vision, hearing, and muscle tension. Complicated senses like touch and vision involve multiple sensory elements embedded in a 2D surface. You have complete control over the distribution of these sensor elements through the use of simple png image files. In particular, =CORTEX= implements more comprehensive hearing than any other creature simulation system available.

- =CORTEX= supports any number of creatures and any number of senses. Time in =CORTEX= dilates so that the simulated creatures always perceive a perfectly smooth flow of time, regardless of the actual computational load.
=CORTEX= is built on top of =jMonkeyEngine3=, which is a video game engine designed to create cross-platform 3D desktop games. =CORTEX= is mainly written in clojure, a dialect of =LISP= that runs on the java virtual machine (JVM). The API for creating and simulating creatures and senses is entirely expressed in clojure, though many senses are implemented at the layer of jMonkeyEngine or below. For example, for the sense of hearing I use a layer of clojure code on top of a layer of java JNI bindings that drive a layer of =C++= code which implements a modified version of =OpenAL= to support multiple listeners. =CORTEX= is the only simulation environment that I know of that can support multiple entities that can each hear the world from their own perspective. Other senses also require a small layer of Java code. =CORTEX= also uses =bullet=, a physics simulator written in =C=.

#+caption: Here is the worm from above modeled in Blender, a free
#+caption: 3D-modeling program. Senses and joints are described
#+caption: using special nodes in Blender.
#+name: blender-worm
#+ATTR_LaTeX: :width 12cm
[[./images/blender-worm.png]]

Here are some things I anticipate that =CORTEX= might be used for:

- exploring new ideas about sensory integration
- distributed communication among swarm creatures
- self-learning using free exploration
- evolutionary algorithms involving creature construction
- exploration of exotic senses and effectors that are not possible in the real world (such as telekinesis or a semantic sense)
- imagination using subworlds

During one test with =CORTEX=, I created 3,000 creatures each with their own independent senses and ran them all at only 1/80 real time. In another test, I created a detailed model of my own hand, equipped with a realistic distribution of touch (more sensitive at the fingertips), as well as eyes and ears, and it ran at around 1/4 real time.

#+BEGIN_LaTeX
\begin{sidewaysfigure}
\includegraphics[width=9.5in]{images/full-hand.png}
\caption{
I modeled my own right hand in Blender and rigged it with all the
senses that {\tt CORTEX} supports. My simulated hand has a
biologically inspired distribution of touch sensors. The senses are
displayed on the right, and the simulation is displayed on the
left. Notice that my hand is curling its fingers, that it can see
its own finger from the eye in its palm, and that it can feel its
own thumb touching its palm.}
\end{sidewaysfigure}
#+END_LaTeX

** Contributions

- I built =CORTEX=, a comprehensive platform for embodied AI experiments. =CORTEX= supports many features lacking in other systems, such as proper simulation of hearing. It is easy to create new =CORTEX= creatures using Blender, a free 3D modeling program.

- I built =EMPATH=, which uses =CORTEX= to identify the actions of a worm-like creature using a computational model of empathy.
* Building =CORTEX=

I intend for =CORTEX= to be used as a general purpose library for building creatures and outfitting them with senses, so that it will be useful for other researchers who want to test out ideas of their own. To this end, wherever I have had to make architectural choices about =CORTEX=, I have chosen to give as much freedom to the user as possible, so that =CORTEX= may be used for things I have not foreseen.

** COMMENT Simulation or Reality?

The most important architectural decision of all is the choice to use a computer-simulated environment in the first place! The world is a vast and rich place, and for now simulations are a very poor reflection of its complexity. It may be that there is a significant qualitative difference between dealing with senses in the real world and dealing with pale facsimiles of them in a simulation. What are the advantages and disadvantages of a simulation vs. reality?

*** Simulation

The advantages of virtual reality are that when everything is a simulation, experiments in that simulation are absolutely reproducible. It's also easier to change the character and world to explore new situations and different sensory combinations.

If the world is to be simulated on a computer, then not only do you have to worry about whether the character's senses are rich enough to learn from the world, but whether the world itself is rendered with enough detail and realism to give enough working material to the character's senses. To name just a few difficulties facing modern physics simulators: destructibility of the environment, simulation of water/other fluids, large areas, nonrigid bodies, lots of objects, smoke. I don't know of any computer simulation that would allow a character to take a rock and grind it into fine dust, then use that dust to make a clay sculpture, at least not without spending years calculating the interactions of every single small grain of dust. Maybe a simulated world with today's limitations doesn't provide enough richness for real intelligence to evolve.

*** Reality

The other approach for playing with senses is to hook your software up to real cameras, microphones, robots, etc., and let it loose in the real world. This has the advantage of eliminating concerns about simulating the world at the expense of increasing the complexity of implementing the senses. Instead of just grabbing the current rendered frame for processing, you have to use an actual camera with real lenses and interact with photons to get an image. It is much harder to change the character, which is now partly a physical robot of some sort, since doing so involves changing things around in the real world instead of modifying lines of code. While the real world is very rich and definitely provides enough stimulation for intelligence to develop (as evidenced by our own existence), it is also uncontrollable in the sense that a particular situation cannot be recreated perfectly or saved for later use. It is harder to conduct science because it is harder to repeat an experiment. The worst thing about using the real world instead of a simulation is the matter of time. Instead of simulated time you get the constant and unstoppable flow of real time. This severely limits the sorts of software you can use to program the AI, because all sense inputs must be handled in real time. Complicated ideas may have to be implemented in hardware or may simply be impossible given the current speed of our processors. Contrast this with a simulation, in which the flow of time in the simulated world can be slowed down to accommodate the limitations of the character's programming. In terms of cost, doing everything in software is far cheaper than building custom real-time hardware. All you need is a laptop and some patience.

** COMMENT Because of Time, simulation is preferable to reality

I envision =CORTEX= being used to support rapid prototyping and iteration of ideas. Even if I could put together a well constructed kit for creating robots, it would still not be enough because of the scourge of real-time processing. Anyone who wants to test their ideas in the real world must always worry about getting their algorithms to run fast enough to process information in real time. The need for real time processing only increases if multiple senses are involved. In the extreme case, even simple algorithms will have to be accelerated by ASIC chips or FPGAs, turning what would otherwise be a few lines of code and a 10x speed penalty into a multi-month ordeal. For this reason, =CORTEX= supports /time-dilation/, which scales back the framerate of the simulation in proportion to the amount of processing each frame requires. From the perspective of the creatures inside the simulation, time always appears to flow at a constant rate, regardless of how complicated the environment becomes or how many creatures are in the simulation. The cost is that =CORTEX= can sometimes run slower than real time. This can also be an advantage, however --- simulations of very simple creatures in =CORTEX= generally run at 40x real time on my machine!
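The principle behind time-dilation fits in a few lines. This is only a sketch of the idea, not =CORTEX='s actual implementation; =advance-physics= and =sense-and-act!= are hypothetical names. The point is that the world advances by a fixed simulated timestep per frame, however long that frame takes in wall-clock time.

#+begin_src clojure
;; Sketch of the time-dilation principle (not CORTEX's actual code).
;; advance-physics and sense-and-act! are hypothetical placeholders.
(def simulated-timestep (/ 1.0 60)) ; each frame = 1/60 simulated second

(defn simulation-step
  "Advance the world by exactly one frame of simulated time, no
  matter how much wall-clock time sensing and thinking consume."
  [world creatures]
  (let [world (advance-physics world simulated-timestep)]
    (doseq [creature creatures]
      (sense-and-act! creature world))
    world))
#+end_src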
** COMMENT What is a sense?

If =CORTEX= is to support a wide variety of senses, it would help to have a better understanding of what a ``sense'' actually is! While vision, touch, and hearing all seem like they are quite different things, I was surprised to learn during the course of this thesis that they (and all physical senses) can be expressed as exactly the same mathematical object, due to a dimensional argument!

Human beings are three-dimensional objects, and the nerves that transmit data from our various sense organs to our brain are essentially one-dimensional. This leaves up to two dimensions in which our sensory information may flow. For example, imagine your skin: it is a two-dimensional surface around a three-dimensional object (your body). It has discrete touch sensors embedded at various points, and the density of these sensors corresponds to the sensitivity of that region of skin. Each touch sensor connects to a nerve, all of which eventually are bundled together as they travel up the spinal cord to the brain. Intersect the spinal nerves with a guillotining plane and you will see all of the sensory data of the skin revealed in a roughly circular two-dimensional image which is the cross section of the spinal cord. Points on this image that are close together in this circle represent touch sensors that are /probably/ close together on the skin, although there is of course some cutting and rearrangement that has to be done to transfer the complicated surface of the skin onto a two-dimensional image.

Most human senses consist of many discrete sensors of various properties distributed along a surface at various densities. For skin, it is Pacinian corpuscles, Meissner's corpuscles, Merkel's disks, and Ruffini's endings, which detect pressure and vibration of various intensities. For ears, it is the stereocilia distributed along the basilar membrane inside the cochlea; each one is sensitive to a slightly different frequency of sound. For eyes, it is rods and cones distributed along the surface of the retina. In each case, we can describe the sense with a surface and a distribution of sensors along that surface.

The neat idea is that every human sense can be effectively described in terms of a surface containing embedded sensors. If the sense had any more dimensions, then there wouldn't be enough room in the spinal cord to transmit the information!

Therefore, =CORTEX= must support the ability to create objects and then be able to ``paint'' points along their surfaces to describe each sense.

Fortunately this idea is already a well known computer graphics technique called /UV-mapping/. The three-dimensional surface of a model is cut and smooshed until it fits on a two-dimensional image. You paint whatever you want on that image, and when the three-dimensional shape is rendered in a game the smooshing and cutting is reversed and the image appears on the three-dimensional object.

To make a sense, interpret the UV-image as describing the distribution of that sense's sensors. To get different types of sensors, you can either use a different color for each type of sensor, or use multiple UV-maps, each labeled with that sensor type. I generally use a white pixel to mean the presence of a sensor and a black pixel to mean the absence of a sensor, and use one UV-map for each sensor-type within a given sense.
#+CAPTION: The UV-map for an elongated icosphere. The white
#+caption: dots each represent a touch sensor. They are dense
#+caption: in the regions that describe the tip of the finger,
#+caption: and less dense along the dorsal side of the finger
#+caption: opposite the tip.
#+name: finger-UV
#+ATTR_latex: :width 10cm
[[./images/finger-UV.png]]

#+caption: Ventral side of the UV-mapped finger. Notice the
#+caption: density of touch sensors at the tip.
#+name: finger-side-view
#+ATTR_LaTeX: :width 10cm
[[./images/finger-1.png]]
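Reading such a sensor map off of a UV-image is straightforward. Here is a minimal sketch of the idea using Java's =ImageIO=; =CORTEX='s actual implementation differs in details such as color filtering and coordinate handling, but the principle is the same: white pixel = sensor.

#+begin_src clojure
;; Minimal sketch: collect the UV coordinates of every pure-white
;; pixel in a sensor-map image. CORTEX's real implementation differs
;; in its details, but the idea is the same.
(import '(javax.imageio ImageIO) '(java.io File))

(defn white-coordinates
  "Return the [x y] coordinates of every white pixel in an image file."
  [path]
  (let [image (ImageIO/read (File. path))]
    (for [x (range (.getWidth image))
          y (range (.getHeight image))
          :when (= 0xFFFFFF (bit-and 0xFFFFFF (.getRGB image x y)))]
      [x y])))

;; e.g. (white-coordinates "images/finger-UV.png") would yield the
;; touch-sensor positions painted in figure finger-UV.
#+end_src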
** COMMENT Video game engines are a great starting point

I did not need to write my own physics simulation code or shader to build =CORTEX=. Doing so would lead to a system that is impossible for anyone but myself to use anyway. Instead, I use a video game engine as a base and modify it to accommodate the additional needs of =CORTEX=. Video game engines are an ideal starting point to build =CORTEX=, because they are not far from being creature building systems themselves.

First off, general purpose video game engines come with a physics engine and lighting/sound system. The physics system provides tools that can be co-opted to serve as touch, proprioception, and muscles. Since some games support split screen views, a good video game engine will allow you to efficiently create multiple cameras in the simulated world that can be used as eyes. Video game systems offer integrated asset management for things like textures and creature models, providing an avenue for defining creatures. They also understand UV-mapping, since this technique is used to apply a texture to a model. Finally, because video game engines support a large number of users, as long as =CORTEX= doesn't stray too far from the base system, other researchers can turn to this community for help when doing their research.

** COMMENT =CORTEX= is based on jMonkeyEngine3

While preparing to build =CORTEX= I studied several video game engines to see which would best serve as a base. The top contenders were:

- [[http://www.idsoftware.com][Quake II]]/[[http://www.bytonic.de/html/jake2.html][Jake2]] :: The Quake II engine was designed by ID software in 1997. All the source code was released by ID software into the Public Domain several years ago, and as a result it has been ported to many different languages. This engine was famous for its advanced use of realistic shading and had decent and fast physics simulation. The main advantage of the Quake II engine is its simplicity, but I ultimately rejected it because the engine is too tied to the concept of a first-person shooter game. One of the problems I had was that there does not seem to be any easy way to attach multiple cameras to a single character. There are also several physics clipping issues that are corrected in a way that only applies to the main character and does not apply to arbitrary objects.

- [[http://source.valvesoftware.com/][Source Engine]] :: The Source Engine evolved from the Quake II and Quake I engines and is used by Valve in the Half-Life series of games. The physics simulation in the Source Engine is quite accurate and probably the best out of all the engines I investigated. There is also an extensive community actively working with the engine. However, applications that use the Source Engine must be written in C++, the code is not open, it only runs on Windows, and the tools that come with the SDK to handle models and textures are complicated and awkward to use.

- [[http://jmonkeyengine.com/][jMonkeyEngine3]] :: jMonkeyEngine3 is a new library for creating games in Java. It uses OpenGL to render to the screen and uses scene graphs to avoid drawing things that do not appear on the screen. It has an active community and several games in the pipeline. The engine was not built to serve any particular game but is instead meant to be used for any 3D game.

I chose jMonkeyEngine3 because it had the most features out of all the free projects I looked at, and because I could then write my code in clojure, an implementation of =LISP= that runs on the JVM.
** COMMENT =CORTEX= uses Blender to create creature models

For the simple worm-like creatures I will use later on in this thesis, I could define a simple API in =CORTEX= that would allow one to create boxes, spheres, etc., and leave that API as the sole way to create creatures. However, for =CORTEX= to truly be useful for other projects, it needs a way to construct complicated creatures. If possible, it would be nice to leverage work that has already been done by the community of 3D modelers, or at least enable people who are talented at modeling but not programming to design =CORTEX= creatures.

Therefore, I use Blender, a free 3D modeling program, as the main way to create creatures in =CORTEX=. However, the creatures modeled in Blender must also be simple to simulate in jMonkeyEngine3's game engine, and must also be easy to rig with =CORTEX='s senses. I accomplish this with extensive use of Blender's ``empty nodes.''

Empty nodes have no mass, physical presence, or appearance, but they can hold metadata and have names. I use a tree structure of empty nodes to specify senses in the following manner (a code sketch of the resulting structure appears after this list):

- Create a single top-level empty node whose name is the name of the sense.
- Add empty nodes which each contain meta-data relevant to the sense, including a UV-map describing the number/distribution of sensors if applicable.
- Make each empty-node the child of the top-level node.
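When such a file is loaded, jMonkeyEngine represents the empties as ordinary named =Node= objects, with the Blender metadata exposed as user data. As an illustration only, here is that convention built by hand in code; in practice the tree comes from the Blender file, and the metadata key and image path shown here are invented.

#+begin_src clojure
;; Illustration only: construct by hand the node tree that CORTEX
;; expects to find in a Blender file. The "touch" metadata key and
;; the image path are invented for this example.
(import '(com.jme3.scene Node))

(defn example-touch-tree []
  (let [touch  (Node. "touch")              ; top-level node, named
        sensor (Node. "worm-segment-skin")] ;   after the sense
    ;; metadata relevant to the sense: here, the UV-map image that
    ;; gives the number/distribution of touch sensors.
    (.setUserData sensor "touch" "Models/worm/touch-map.png")
    (.attachChild touch sensor)             ; child of the top level
    touch))
#+end_src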
#+caption: An example of annotating a creature model with empty
#+caption: nodes to describe the layout of senses. There are
#+caption: multiple empty nodes which each describe the position
#+caption: of muscles, ears, eyes, or joints.
#+name: sense-nodes
#+ATTR_LaTeX: :width 10cm
[[./images/empty-sense-nodes.png]]

** COMMENT Bodies are composed of segments connected by joints

Blender is a general purpose animation tool, which has been used in the past to create high quality movies such as Sintel \cite{sintel}. Though Blender can model and render even complicated things like water, it is crucial to keep models that are meant to be simulated as creatures simple. =Bullet=, which =CORTEX= uses through jMonkeyEngine3, is a rigid-body physics system. This offers a compromise between the expressiveness of a game level and the speed at which it can be simulated, and it means that creatures should be naturally expressed as rigid components held together by joint constraints.

But humans are more like a squishy bag wrapped around some hard bones which define the overall shape. When we move, our skin bends and stretches to accommodate the new positions of our bones.

One way to make bodies composed of rigid pieces connected by joints /seem/ more human-like is to use an /armature/ (or /rigging/) system, which defines an overall ``body mesh'' and defines how the mesh deforms as a function of the position of each ``bone'', which is a standard rigid body. This technique is used extensively to model humans and create realistic animations. It is not a good technique for physical simulation, however, because it creates a lie -- the skin is not a physical part of the simulation and does not interact with any objects in the world or itself. Objects will pass right through the skin until they come in contact with the underlying bone, which is a physical object. Without simulating the skin, the sense of touch has little meaning, and the creature's own vision will lie to it about the true extent of its body. Simulating the skin as a physical object requires some way to continuously update the physical model of the skin along with the movement of the bones, which is unacceptably slow compared to rigid body simulation.

Therefore, instead of using the human-like ``deformable bag of bones'' approach, I decided to base my body plans on multiple solid objects that are connected by joints, inspired by the robot =EVE= from the movie WALL-E.

#+caption: =EVE= from the movie WALL-E. This body plan turns
#+caption: out to be much better suited to my purposes than a more
#+caption: human-like one.
#+ATTR_LaTeX: :width 10cm
[[./images/Eve.jpg]]

=EVE='s body is composed of several rigid components that are held together by invisible joint constraints. This is what I mean by ``eve-like''. The main reason that I use eve-style bodies is for efficiency, and so that there will be correspondence between the AI's senses and the physical presence of its body. Each individual section is simulated by a separate rigid body that corresponds exactly with its visual representation and does not change. Sections are connected by invisible joints that are well supported in jMonkeyEngine3. Bullet, the physics backend for jMonkeyEngine3, can efficiently simulate hundreds of rigid bodies connected by joints. Just because sections are rigid does not mean they have to stay as one piece forever; they can be dynamically replaced with multiple sections to simulate splitting in two. This could be used to simulate retractable claws or =EVE='s hands, which are able to coalesce into one object in the movie.

*** Solidifying/Connecting a body

=CORTEX= creates a creature in two steps: first, it traverses the nodes in the Blender file and creates physical representations for any of them that have mass defined in their Blender meta-data.

#+caption: Program for iterating through the nodes in a blender file
#+caption: and generating physical jMonkeyEngine3 objects with mass
#+caption: and a matching physics shape.
#+name: physical
#+begin_listing clojure
#+begin_src clojure
(defn physical!
  "Iterate through the nodes in creature and make them real physical
  objects in the simulation."
  [#^Node creature]
  (dorun
   (map
    (fn [geom]
      (let [physics-control
            (RigidBodyControl.
             (HullCollisionShape.
              (.getMesh geom))
             (if-let [mass (meta-data geom "mass")]
               (float mass) (float 1)))]
        (.addControl geom physics-control)))
    (filter #(isa? (class %) Geometry)
            (node-seq creature)))))
#+end_src
#+end_listing
The next step to making a proper body is to connect those pieces together with joints. jMonkeyEngine has a large array of joints available via =bullet=, such as Point2Point, Cone, Hinge, and a generic Six Degree of Freedom joint, with or without spring restitution.

Joints are treated a lot like proper senses, in that there is a top-level empty node named ``joints'' whose children each represent a joint.

#+caption: View of the hand model in Blender showing the main ``joints''
#+caption: node (highlighted in yellow) and its children, which each
#+caption: represent a joint in the hand. Each joint node has metadata
#+caption: specifying what sort of joint it is.
#+name: blender-hand
#+ATTR_LaTeX: :width 10cm
[[./images/hand-screenshot1.png]]

=CORTEX='s procedure for binding the creature together with joints is as follows:

- Find the children of the ``joints'' node.
- Determine the two spatials the joint is meant to connect.
- Create the joint based on the meta-data of the empty node.

The higher order function =sense-nodes= from =cortex.sense= simplifies finding the joints based on their parent ``joints'' node.

#+caption: Retrieving the children empty nodes from a single
#+caption: named empty node is a common pattern in =CORTEX=;
#+caption: further instances of this technique for the senses
#+caption: will be omitted.
#+name: get-empty-nodes
#+begin_listing clojure
#+begin_src clojure
(defn sense-nodes
  "For some senses there is a special empty blender node whose
  children are considered markers for an instance of that sense. This
  function generates functions to find those children, given the name
  of the special parent node."
  [parent-name]
  (fn [#^Node creature]
    (if-let [sense-node (.getChild creature parent-name)]
      (seq (.getChildren sense-node)) [])))

(def
  ^{:doc "Return the children of the creature's \"joints\" node."
    :arglists '([creature])}
  joints
  (sense-nodes "joints"))
#+end_src
#+end_listing

To find a joint's targets, =CORTEX= creates a small cube, centered around the empty-node, and grows the cube exponentially until it intersects two physical objects. The objects are ordered according to the joint's rotation, with the first one being the object that has more negative coordinates in the joint's reference frame. Since the objects must be physical, the empty-node itself escapes detection. Because the objects must be physical, =joint-targets= must be called /after/ =physical!= is called.

#+caption: Program to find the targets of a joint node by
#+caption: exponential growth of a search cube.
#+name: joint-targets
#+begin_listing clojure
#+begin_src clojure
(defn joint-targets
  "Return the two closest objects to the joint object, ordered
  from bottom to top according to the joint's rotation."
  [#^Node parts #^Node joint]
  (loop [radius (float 0.01)]
    (let [results (CollisionResults.)]
      (.collideWith
       parts
       (BoundingBox. (.getWorldTranslation joint)
                     radius radius radius) results)
      (let [targets
            (distinct
             (map #(.getGeometry %) results))]
        (if (>= (count targets) 2)
          (sort-by
           #(let [joint-ref-frame-position
                  (jme-to-blender
                   (.mult
                    (.inverse (.getWorldRotation joint))
                    (.subtract (.getWorldTranslation %)
                               (.getWorldTranslation joint))))]
              (.dot (Vector3f. 1 1 1) joint-ref-frame-position))
           (take 2 targets))
          (recur (float (* radius 2))))))))
#+end_src
#+end_listing

Once =CORTEX= finds all joints and targets, it creates them using a dispatch on the metadata of each joint node.

#+caption: Program to dispatch on blender metadata and create joints
#+caption: suitable for physical simulation.
#+name: joint-dispatch
#+begin_listing clojure
#+begin_src clojure
(defmulti joint-dispatch
  "Translate blender pseudo-joints into real JME joints."
  (fn [constraints & _]
    (:type constraints)))

(defmethod joint-dispatch :point
  [constraints control-a control-b pivot-a pivot-b rotation]
  (doto (SixDofJoint. control-a control-b pivot-a pivot-b false)
    (.setLinearLowerLimit Vector3f/ZERO)
    (.setLinearUpperLimit Vector3f/ZERO)))

(defmethod joint-dispatch :hinge
  [constraints control-a control-b pivot-a pivot-b rotation]
  (let [axis (if-let [axis (:axis constraints)] axis Vector3f/UNIT_X)
        [limit-1 limit-2] (:limit constraints)
        hinge-axis (.mult rotation (blender-to-jme axis))]
    (doto (HingeJoint. control-a control-b pivot-a pivot-b
                       hinge-axis hinge-axis)
      (.setLimit limit-1 limit-2))))

(defmethod joint-dispatch :cone
  [constraints control-a control-b pivot-a pivot-b rotation]
  (let [limit-xz (:limit-xz constraints)
        limit-xy (:limit-xy constraints)
        twist    (:twist constraints)]
    (doto (ConeJoint. control-a control-b pivot-a pivot-b
                      rotation rotation)
      (.setLimit (float limit-xz) (float limit-xy)
                 (float twist)))))
#+end_src
#+end_listing
All that is left for joints is to combine the above pieces into something that can operate on the collection of nodes that a Blender file represents.

#+caption: Program to completely create a joint given information
#+caption: from a blender file.
#+name: connect
#+begin_listing clojure
#+begin_src clojure
(defn connect
  "Create a joint between 'obj-a and 'obj-b at the location of
  'joint. The type of joint is determined by the metadata on 'joint.

  Here are some examples:
  {:type :point}
  {:type :hinge :limit [0 (/ Math/PI 2)] :axis (Vector3f. 0 1 0)}
  (:axis defaults to (Vector3f. 1 0 0) if not provided for hinge joints)

  {:type :cone :limit-xz 0
               :limit-xy 0
               :twist 0}   (use XZY rotation mode in blender!)"
  [#^Node obj-a #^Node obj-b #^Node joint]
  (let [control-a (.getControl obj-a RigidBodyControl)
        control-b (.getControl obj-b RigidBodyControl)
        joint-center (.getWorldTranslation joint)
        joint-rotation (.toRotationMatrix (.getWorldRotation joint))
        pivot-a (world-to-local obj-a joint-center)
        pivot-b (world-to-local obj-b joint-center)]
    (if-let
        [constraints (map-vals eval (read-string (meta-data joint "joint")))]
      ;; A side-effect of creating a joint registers
      ;; it with both physics objects, which in turn
      ;; will register the joint with the physics system
      ;; when the simulation is started.
      (joint-dispatch constraints
                      control-a control-b
                      pivot-a pivot-b
                      joint-rotation))))
#+end_src
#+end_listing

In general, whenever =CORTEX= exposes a sense (or in this case physicality), it provides a function of the type =sense!=, which takes in a collection of nodes and augments it to support that sense. The function returns any controls necessary to use that sense. In this case =body!= creates a physical body and returns no control functions.

#+caption: Program to give joints to a creature.
#+name: joints-code
#+begin_listing clojure
#+begin_src clojure
(defn joints!
  "Connect the solid parts of the creature with physical joints. The
  joints are taken from the \"joints\" node in the creature."
  [#^Node creature]
  (dorun
   (map
    (fn [joint]
      (let [[obj-a obj-b] (joint-targets creature joint)]
        (connect obj-a obj-b joint)))
    (joints creature))))

(defn body!
  "Endow the creature with a physical body connected with joints. The
  particulars of the joints and the masses of each body part are
  determined in blender."
  [#^Node creature]
  (physical! creature)
  (joints! creature))
#+end_src
#+end_listing

All of the code you have just seen amounts to only 130 lines, yet because it builds on top of Blender and jMonkeyEngine3, those few lines pack quite a punch!

The hand from figure \ref{blender-hand}, which was modeled after my own right hand, can now be given joints and simulated as a creature.

#+caption: With the ability to create physical creatures from blender,
#+caption: =CORTEX= gets one step closer to becoming a full creature
#+caption: simulation environment.
#+name: physical-hand
#+ATTR_LaTeX: :width 15cm
[[./images/physical-hand.png]]
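Using these pieces is a one-liner. As a sketch of typical usage (the model path here is hypothetical; =load-blender-model= and =body!= are the =CORTEX= functions shown above and used again in the worm definition later on), giving the hand a physical body looks like:

#+begin_src clojure
;; Sketch of typical usage; the model path is hypothetical.
(def hand
  (doto (load-blender-model "Models/test-creature/hand.blend")
    (body!)))
#+end_src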
** Eyes reuse standard video game components

** Hearing is hard; =CORTEX= does it right

** Touch uses hundreds of hair-like elements

** Proprioception is the sense that makes everything ``real''

** Muscles are both effectors and sensors

** =CORTEX= brings complex creatures to life!

** =CORTEX= enables many possibilities for further research

* COMMENT Empathy in a simulated worm

Here I develop a computational model of empathy, using =CORTEX= as a base. Empathy in this context is the ability to observe another creature and infer what sorts of sensations that creature is feeling. My empathy algorithm involves multiple phases. First is free-play, where the creature moves around and gains sensory experience. From this experience I construct a representation of the creature's sensory state space, which I call \Phi-space. Using \Phi-space, I construct an efficient function which takes the limited data that comes from observing another creature and enriches it to a full complement of imagined sensory data. I can then use the imagined sensory data to recognize what the observed creature is doing and feeling, using straightforward embodied action predicates. This is all demonstrated using a simple worm-like creature, and recognizing worm-actions based on limited data.

#+caption: Here is the worm with which we will be working.
#+caption: It is composed of 5 segments. Each segment has a
#+caption: pair of extensor and flexor muscles. Each of the
#+caption: worm's four joints is a hinge joint which allows
#+caption: about 30 degrees of rotation to either side. Each segment
#+caption: of the worm is touch-capable and has a uniform
#+caption: distribution of touch sensors on each of its faces.
#+caption: Each joint has a proprioceptive sense to detect
#+caption: relative positions. The worm segments are all the
#+caption: same except for the first one, which has a much
#+caption: higher weight than the others to allow for easy
#+caption: manual motor control.
#+name: basic-worm-view
#+ATTR_LaTeX: :width 10cm
[[./images/basic-worm-view.png]]

#+caption: Program for reading a worm from a blender file and
#+caption: outfitting it with the senses of proprioception,
#+caption: touch, and the ability to move, as specified in the
#+caption: blender file.
#+name: get-worm
#+begin_listing clojure
#+begin_src clojure
(defn worm []
  (let [model (load-blender-model "Models/worm/worm.blend")]
    {:body (doto model (body!))
     :touch (touch! model)
     :proprioception (proprioception! model)
     :muscles (movement! model)}))
#+end_src
#+end_listing
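For a taste of how this map is used, the demonstrations later in this chapter run the worm in a simulated world along the following lines. =run-world=, =apply-map=, =worm-world=, and =worm-world-defaults= are the same helpers that appear in listing \ref{generate-phi-space}; the option map here is a deliberately minimal sketch.

#+begin_src clojure
;; Sketch: run the worm in a simulated world and collect its
;; experiences during free play. The helpers used here all appear
;; in listing generate-phi-space; only the option map is trimmed.
(defn free-play-experiences []
  (let [experiences (atom [])]
    (run-world
     (apply-map
      worm-world
      (merge (worm-world-defaults)
             {:experiences experiences})))
    @experiences))
#+end_src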
** Embodiment factors action recognition into manageable parts

Using empathy, I divide the problem of action recognition into a recognition process expressed in the language of a full complement of senses, and an imaginative process that generates full sensory data from partial sensory data. Splitting the action recognition problem in this manner greatly reduces the total amount of work to recognize actions: the imaginative process is mostly just matching previous experience, and the recognition process gets to use all the senses to directly describe any action.

** Action recognition is easy with a full gamut of senses

Embodied representations using multiple senses such as touch, proprioception, and muscle tension turn out to be exceedingly efficient at describing body-centered actions. It is the ``right language for the job''. For example, it takes only around 5 lines of LISP code to describe the action of ``curling'' using embodied primitives. It takes about 10 lines to describe the seemingly complicated action of wiggling.

The following action predicates each take a stream of sensory experience, observe however much of it they desire, and decide whether the worm is doing the action they describe. =curled?= relies on proprioception, =resting?= relies on touch, =wiggling?= relies on a fourier analysis of muscle contraction, and =grand-circle?= relies on touch and reuses =curled?= as a guard.

#+caption: Program for detecting whether the worm is curled. This is the
#+caption: simplest action predicate, because it only uses the last frame
#+caption: of sensory experience, and only uses proprioceptive data. Even
#+caption: this simple predicate, however, is automatically frame
#+caption: independent and ignores vermopomorphic differences such as
#+caption: worm textures and colors.
#+name: curled
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn curled?
  "Is the worm curled up?"
  [experiences]
  (every?
   (fn [[_ _ bend]]
     (> (Math/sin bend) 0.64))
   (:proprioception (peek experiences))))
#+end_src
#+end_listing

#+caption: Program for summarizing the touch information in a patch
#+caption: of skin.
#+name: touch-summary
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn contact
  "Determine how much contact a particular worm segment has with
  other objects. Returns a value between 0 and 1, where 1 is full
  contact and 0 is no contact."
  [touch-region [coords contact :as touch]]
  (-> (zipmap coords contact)
      (select-keys touch-region)
      (vals)
      (#(map first %))
      (average)
      (* 10)
      (- 1)
      (Math/abs)))
#+end_src
#+end_listing

#+caption: Program for detecting whether the worm is at rest. This program
#+caption: uses a summary of the tactile information from the underbelly
#+caption: of the worm, and is only true if every segment is touching the
#+caption: floor. Note that this function contains no references to
#+caption: proprioception at all.
#+name: resting
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(def worm-segment-bottom (rect-region [8 15] [14 22]))

(defn resting?
  "Is the worm resting on the ground?"
  [experiences]
  (every?
   (fn [touch-data]
     (< 0.9 (contact worm-segment-bottom touch-data)))
   (:touch (peek experiences))))
#+end_src
#+end_listing
#+caption: Program for detecting whether the worm is curled up into a
#+caption: full circle. Here the embodied approach begins to shine, as
#+caption: I am able to both use a previous action predicate (=curled?=)
#+caption: as well as the direct tactile experience of the head and tail.
#+name: grand-circle
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(def worm-segment-bottom-tip (rect-region [15 15] [22 22]))

(def worm-segment-top-tip (rect-region [0 15] [7 22]))

(defn grand-circle?
  "Does the worm form a majestic circle (one end touching the other)?"
  [experiences]
  (and (curled? experiences)
       (let [worm-touch (:touch (peek experiences))
             tail-touch (worm-touch 0)
             head-touch (worm-touch 4)]
         (and (< 0.55 (contact worm-segment-bottom-tip tail-touch))
              (< 0.55 (contact worm-segment-top-tip head-touch))))))
#+end_src
#+end_listing

#+caption: Program for detecting whether the worm has been wiggling for
#+caption: the last few frames. It uses a fourier analysis of the muscle
#+caption: contractions of the worm's tail to determine wiggling. This is
#+caption: significant because there is no particular frame that clearly
#+caption: indicates that the worm is wiggling --- only when multiple frames
#+caption: are analyzed together is the wiggling revealed. Defining
#+caption: wiggling this way also gives the worm an opportunity to learn
#+caption: and recognize ``frustrated wiggling'', where the worm tries to
#+caption: wiggle but can't. Frustrated wiggling is very visually different
#+caption: from actual wiggling, but this definition gives it to us for free.
#+name: wiggling
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn fft [nums]
  (map
   #(.getReal %)
   (.transform
    (FastFourierTransformer. DftNormalization/STANDARD)
    (double-array nums) TransformType/FORWARD)))

(def indexed (partial map-indexed vector))

(defn max-indexed [s]
  (first (sort-by (comp - second) (indexed s))))

(defn wiggling?
  "Is the worm wiggling?"
  [experiences]
  (let [analysis-interval 0x40]
    (when (> (count experiences) analysis-interval)
      (let [a-flex 3
            a-ex 2
            muscle-activity
            (map :muscle (vector:last-n experiences analysis-interval))
            base-activity
            (map #(- (% a-flex) (% a-ex)) muscle-activity)]
        (= 2
           (first
            (max-indexed
             (map #(Math/abs %)
                  (take 20 (fft base-activity))))))))))
#+end_src
#+end_listing

With these action predicates, I can now recognize the actions of the worm while it is moving under my control and I have access to all the worm's senses.

#+caption: Use the action predicates defined earlier to report on
#+caption: what the worm is doing while in simulation.
#+name: report-worm-activity
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn debug-experience
  [experiences text]
  (cond
   (grand-circle? experiences) (.setText text "Grand Circle")
   (curled? experiences) (.setText text "Curled")
   (wiggling? experiences) (.setText text "Wiggling")
   (resting? experiences) (.setText text "Resting")))
#+end_src
#+end_listing

#+caption: Using =debug-experience=, the body-centered predicates
#+caption: work together to classify the behavior of the worm. The
#+caption: predicates are operating with access to the worm's
#+caption: full sensory data.
#+name: worm-identify-init
#+ATTR_LaTeX: :width 10cm
[[./images/worm-identify-init.png]]

These action predicates satisfy the recognition requirement of an empathic recognition system. There is power in the simplicity of the action predicates. They describe their actions without getting confused in visual details of the worm. Each one is frame independent, but more than that, they are each independent of irrelevant visual details of the worm and the environment. They will work regardless of whether the worm is a different color or heavily textured, or if the environment has strange lighting.

The trick now is to make the action predicates work even when the sensory data on which they depend is absent. If I can do that, then I will have gained much.
** \Phi-space describes the worm's experiences

As a first step towards building empathy, I need to gather all of the worm's experiences during free play. I use a simple vector to store all the experiences.
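Concretely, each element of the experience vector is a map from sense names to that frame's sensory data, using the keys the action predicates above read. The following is only a schematic sketch of one entry; the values are tiny, invented examples rather than real recorded data.

#+begin_src clojure
;; Schematic shape of one element of the experience vector. The
;; values are invented miniature examples, not real recorded data.
(def example-experience
  {:proprioception [[0.0 0.0 0.5]            ; one triple per joint;
                    [0.0 0.0 0.4]]           ;   curled? reads the third
                                             ;   entry as the bend angle
   :touch          [[[[9 16] [10 16]]        ; per segment: sensor
                     [[0.0 1.0] [0.1 1.0]]]] ;   coordinates and readings,
                                             ;   as consumed by contact
   :muscle         [0.0 0.2 0.0 0.7]})       ; activation per muscle
#+end_src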
Each element of the experience vector exists in the vast space of all possible worm-experiences. Most of this vast space is actually unreachable due to physical constraints of the worm's body. For example, the worm's segments are connected by hinge joints that put a practical limit on the worm's range of motions without limiting its degrees of freedom. Some groupings of senses are impossible; the worm can not be bent into a circle so that its ends are touching and at the same time not also experience the sensation of touching itself.

As the worm moves around during free play and its experience vector grows larger, the vector begins to define a subspace which is all the sensations the worm can practically experience during normal operation. I call this subspace \Phi-space, short for physical-space. The experience vector defines a path through \Phi-space. This path has interesting properties that all derive from physical embodiment. The proprioceptive components are completely smooth, because in order for the worm to move from one position to another, it must pass through the intermediate positions. The path invariably forms loops as actions are repeated. Finally and most importantly, proprioception actually gives very strong inference about the other senses. For example, when the worm is flat, you can infer that it is touching the ground and that its muscles are not active, because if the muscles were active, the worm would be moving and would not be perfectly flat. In order to stay flat, the worm has to be touching the ground, or it would again be moving out of the flat position due to gravity. If the worm is positioned in such a way that it interacts with itself, then it is very likely to be feeling the same tactile feelings as the last time it was in that position, because it has the same body as then. If you observe multiple frames of proprioceptive data, then you can become increasingly confident about the exact activations of the worm's muscles, because it generally takes a unique combination of muscle contractions to transform the worm's body along a specific path through \Phi-space.

There is a simple way of taking \Phi-space and the total ordering provided by an experience vector and reliably inferring the rest of the senses.

** Empathy is the process of tracing through \Phi-space

Here is the core of a basic empathy algorithm, starting with an experience vector:

First, group the experiences into tiered proprioceptive bins. I use three tiers of bins, with sizes in powers of 10; the smallest bin has an approximate size of 0.001 radians in all proprioceptive dimensions.

Then, given a sequence of proprioceptive input, generate a set of matching experience records for each input, using the tiered proprioceptive bins.

Finally, to infer sensory data, select the longest consecutive chain of experiences. Consecutive experience means that the experiences appear next to each other in the experience vector.

This algorithm has three advantages:

1. It's simple.

2. It's very fast -- retrieving possible interpretations takes constant time. Tracing through chains of interpretations takes time proportional to the average number of experiences in a proprioceptive bin. Redundant experiences in \Phi-space can be merged to save computation.

3. It protects from wrong interpretations of transient ambiguous proprioceptive data. For example, if the worm is flat for just an instant, this flatness will not be interpreted as implying that the worm has its muscles relaxed, since the flatness is part of a longer chain which includes a distinct pattern of muscle activation. Markov chains or other memoryless statistical models that operate on individual frames may very well make this mistake.

#+caption: Program to convert an experience vector into a
#+caption: proprioceptively binned lookup function.
#+name: bin
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn bin [digits]
  (fn [angles]
    (->> angles
         (flatten)
         (map (juxt #(Math/sin %) #(Math/cos %)))
         (flatten)
         (mapv #(Math/round (* % (Math/pow 10 (dec digits))))))))

(defn gen-phi-scan
  "Nearest-neighbors with binning. Only returns a result if
  the proprioceptive data is within 10% of a previously recorded
  result in all dimensions."
  [phi-space]
  (let [bin-keys (map bin [3 2 1])
        bin-maps
        (map (fn [bin-key]
               (group-by
                (comp bin-key :proprioception phi-space)
                (range (count phi-space)))) bin-keys)
        lookups (map (fn [bin-key bin-map]
                       (fn [proprio] (bin-map (bin-key proprio))))
                     bin-keys bin-maps)]
    (fn lookup [proprio-data]
      (set (some #(% proprio-data) lookups)))))
#+end_src
#+end_listing

#+caption: =longest-thread= finds the longest path of consecutive
#+caption: experiences to explain proprioceptive worm data.
#+name: phi-space-history-scan
#+ATTR_LaTeX: :width 10cm
[[./images/aurellem-gray.png]]

=longest-thread= infers sensory data by stitching together pieces from previous experience. It prefers longer chains of previous experience to shorter ones. For example, during training the worm might rest on the ground for one second before it performs its exercises. If during recognition the worm rests on the ground for five seconds, =longest-thread= will accommodate this five second rest period by looping the one second rest chain five times.

=longest-thread= takes time proportional to the average number of entries in a proprioceptive bin, because for each element in the starting bin it performs a series of set lookups in the preceding bins. If the total history is limited, then this is only a constant multiple times the number of entries in the starting bin. This analysis also applies even if the action requires multiple longest chains -- it's still the average number of entries in a proprioceptive bin times the desired chain length. Because =longest-thread= is so efficient and simple, I can interpret worm-actions in real time.
#+caption: Program to convert an experience vector into a
#+caption: proprioceptively binned lookup function.
#+name: bin
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn bin [digits]
  (fn [angles]
    (->> angles
         (flatten)
         ;; represent each angle as a point on the unit circle, so
         ;; that binning has no wrap-around discontinuity
         (map (juxt #(Math/sin %) #(Math/cos %)))
         (flatten)
         (mapv #(Math/round (* % (Math/pow 10 (dec digits))))))))

(defn gen-phi-scan
  "Nearest-neighbors with binning. Only returns a result if
   the proprioceptive data is within 10% of a previously recorded
   result in all dimensions."
  [phi-space]
  (let [bin-keys (map bin [3 2 1])
        bin-maps
        (map (fn [bin-key]
               (group-by
                (comp bin-key :proprioception phi-space)
                (range (count phi-space)))) bin-keys)
        lookups (map (fn [bin-key bin-map]
                       (fn [proprio] (bin-map (bin-key proprio))))
                     bin-keys bin-maps)]
    (fn lookup [proprio-data]
      (set (some #(% proprio-data) lookups)))))
#+end_src
#+end_listing
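As a quick REPL illustration of the tiered keys (these calls are not
part of the thesis listings), two nearby readings fall into distinct
bins at the finest tier but share a bin at the coarsest tier, which
is what lets =gen-phi-scan= fall back to progressively coarser
matches:

#+begin_src clojure
;; Each angle becomes a rounded, scaled (sin, cos) pair, so nearby
;; angles separate at fine resolution and collide at coarse
;; resolution.
((bin 3) [0.25])  ;=> [25 97]
((bin 3) [0.26])  ;=> [26 97]   ; distinct at the finest tier
((bin 1) [0.25])  ;=> [0 1]
((bin 1) [0.26])  ;=> [0 1]     ; identical at the coarsest tier
#+end_src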
#+caption: =longest-thread= finds the longest path of consecutive
#+caption: experiences to explain proprioceptive worm data.
#+name: phi-space-history-scan
#+ATTR_LaTeX: :width 10cm
[[./images/aurellem-gray.png]]

=longest-thread= infers sensory data by stitching together pieces
from previous experience. It prefers longer chains of previous
experience to shorter ones. For example, during training the worm
might rest on the ground for one second before it performs its
exercises. If during recognition the worm rests on the ground for
five seconds, =longest-thread= will accommodate this five second
rest period by looping the one second rest chain five times.

=longest-thread= takes time proportional to the average number of
entries in a proprioceptive bin, because for each element in the
starting bin it performs a series of set lookups in the preceding
bins. If the total history is limited, then this is only a constant
multiple times the number of entries in the starting bin. This
analysis also applies even if the action requires multiple longest
chains -- it's still the average number of entries in a
proprioceptive bin times the desired chain length. Because
=longest-thread= is so efficient and simple, I can interpret
worm-actions in real time.

#+caption: Program to calculate empathy by tracing through \Phi-space
#+caption: and finding the longest (i.e. most coherent) interpretation
#+caption: of the data.
#+name: longest-thread
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn longest-thread
  "Find the longest thread from phi-index-sets. The index sets should
   be ordered from most recent to least recent."
  [phi-index-sets]
  (loop [result '()
         [thread-bases & remaining :as phi-index-sets] phi-index-sets]
    (if (empty? phi-index-sets)
      (vec result)
      (let [threads
            ;; grow a chain backwards in time from each candidate
            ;; index in the most recent match set
            (for [thread-base thread-bases]
              (loop [thread (list thread-base)
                     remaining remaining]
                (let [next-index (dec (first thread))]
                  (cond (empty? remaining) thread
                        (contains? (first remaining) next-index)
                        (recur (cons next-index thread) (rest remaining))
                        :else thread))))
            longest-thread
            (reduce (fn [thread-a thread-b]
                      (if (> (count thread-a) (count thread-b))
                        thread-a thread-b))
                    '(nil)
                    threads)]
        (recur (concat longest-thread result)
               (drop (count longest-thread) phi-index-sets))))))
#+end_src
#+end_listing
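To see the chain tracing on a small input (the \Phi-space indices
here are invented), consider three match sets ordered
most-recent-first. Indices 7 and 20 both match the newest frame, but
only 7 extends backwards through 6 and 5, so the three-long chain
wins over the two-long chain 19-20:

#+begin_src clojure
;; Toy REPL sketch with invented phi-space indices.
(longest-thread [#{7 20} #{6 19} #{5}])
;; => [5 6 7]   ; oldest frame first, newest frame last
#+end_src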
There is one final piece, which is to replace missing sensory data
with a best-guess estimate. While I could fill in missing data by
using a gradient over the closest known sensory data points,
averages can be misleading. It is certainly possible to create an
impossible sensory state by averaging two possible sensory states.
Therefore, I simply replicate the most recent sensory experience to
fill in the gaps.

#+caption: Fill in blanks in sensory experience by replicating the most
#+caption: recent experience.
#+name: infer-nils
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn infer-nils
  "Replace nils with the next available non-nil element in the
   sequence, or barring that, 0."
  [s]
  (loop [i (dec (count s))
         v (transient s)]
    (if (zero? i) (persistent! v)
        (if-let [cur (v i)]
          (if (get v (dec i) 0)
            (recur (dec i) v)
            (recur (dec i) (assoc! v (dec i) cur)))
          (recur i (assoc! v i 0))))))
#+end_src
#+end_listing
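A quick REPL sketch of =infer-nils= on a toy vector: interior nils
copy the next non-nil value to their right, and a trailing nil,
having no successor, becomes 0.

#+begin_src clojure
(infer-nils [1 nil nil 2 3 nil])
;; => [1 2 2 2 3 0]
#+end_src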
** Efficient action recognition with =EMPATH=

To use =EMPATH= with the worm, I first need to gather a set of
experiences from the worm that includes the actions I want to
recognize. The =generate-phi-space= program (listing
\ref{generate-phi-space}) runs the worm through a series of
exercises and gathers those experiences into a vector. The
=do-all-the-things= program is a routine expressed in a simple
muscle contraction script language for automated worm control. It
causes the worm to rest, curl, and wiggle over about 700 frames
(approx. 11 seconds).

#+caption: Program to gather the worm's experiences into a vector for
#+caption: further processing. The =motor-control-program= line uses
#+caption: a motor control script that causes the worm to execute a series
#+caption: of ``exercises'' that include all the action predicates.
#+name: generate-phi-space
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(def do-all-the-things
  (concat
   curl-script
   [[300 :d-ex 40]
    [320 :d-ex 0]]
   (shift-script 280 (take 16 wiggle-script))))

(defn generate-phi-space []
  (let [experiences (atom [])]
    (run-world
     (apply-map
      worm-world
      (merge
       (worm-world-defaults)
       {:end-frame 700
        :motor-control
        (motor-control-program worm-muscle-labels do-all-the-things)
        :experiences experiences})))
    @experiences))
#+end_src
#+end_listing

#+caption: Use =longest-thread= and a \Phi-space generated from a short
#+caption: exercise routine to interpret actions during free play.
#+name: empathy-debug
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn init []
  (def phi-space (generate-phi-space))
  (def phi-scan (gen-phi-scan phi-space)))

(defn empathy-demonstration []
  (let [proprio (atom ())]
    (fn
      [experiences text]
      (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
        (swap! proprio (partial cons phi-indices))
        (let [exp-thread (longest-thread (take 300 @proprio))
              empathy (mapv phi-space (infer-nils exp-thread))]
          (println-repl (vector:last-n exp-thread 22))
          (cond
           (grand-circle? empathy) (.setText text "Grand Circle")
           (curled? empathy)       (.setText text "Curled")
           (wiggling? empathy)     (.setText text "Wiggling")
           (resting? empathy)      (.setText text "Resting")
           :else                   (.setText text "Unknown")))))))

(defn empathy-experiment [record]
  (.start (worm-world :experience-watch (debug-experience-phi)
                      :record record :worm worm*)))
#+end_src
#+end_listing

The result of running =empathy-experiment= is that the system is
generally able to interpret worm actions using the action predicates
on simulated sensory data just as well as with actual data. Figure
\ref{empathy-debug-image} was generated using =empathy-experiment=:

#+caption: From only proprioceptive data, =EMPATH= was able to infer
#+caption: the complete sensory experience and classify four poses
#+caption: (The last panel shows a composite image of \emph{wiggling},
#+caption: a dynamic pose.)
#+name: empathy-debug-image
#+ATTR_LaTeX: :width 10cm :placement [H]
[[./images/empathy-1.png]]

One way to measure the performance of =EMPATH= is to compare the
suitability of the imagined sense experience to trigger the same
action predicates as the real sensory experience.

#+caption: Determine how closely empathy approximates actual
#+caption: sensory data.
#+name: test-empathy-accuracy
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(def worm-action-label
  (juxt grand-circle? curled? wiggling?))

(defn compare-empathy-with-baseline [matches]
  (let [proprio (atom ())]
    (fn
      [experiences text]
      (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
        (swap! proprio (partial cons phi-indices))
        (let [exp-thread (longest-thread (take 300 @proprio))
              empathy (mapv phi-space (infer-nils exp-thread))
              experience-matches-empathy
              (= (worm-action-label experiences)
                 (worm-action-label empathy))]
          (println-repl experience-matches-empathy)
          (swap! matches #(conj % experience-matches-empathy)))))))

(defn accuracy [v]
  (float (/ (count (filter true? v)) (count v))))

(defn test-empathy-accuracy []
  (let [res (atom [])]
    (run-world
     (worm-world :experience-watch
                 (compare-empathy-with-baseline res)
                 :worm worm*))
    (accuracy @res)))
#+end_src
#+end_listing

Running =test-empathy-accuracy= using the very short exercise
program defined in listing \ref{generate-phi-space}, and then doing
a similar pattern of activity manually, yields an accuracy of around
73%. This is based on very limited worm experience. By training the
worm for longer, the accuracy dramatically improves.

#+caption: Program to generate \Phi-space using manual training.
#+name: manual-phi-space
#+attr_latex: [htpb]
#+begin_listing clojure
#+begin_src clojure
(defn init-interactive []
  (def phi-space
    (let [experiences (atom [])]
      (run-world
       (apply-map
        worm-world
        (merge
         (worm-world-defaults)
         {:experiences experiences})))
      @experiences))
  (def phi-scan (gen-phi-scan phi-space)))
#+end_src
#+end_listing

After about 1 minute of manual training, I was able to achieve 95%
accuracy on manual testing of the worm using =init-interactive= and
=test-empathy-accuracy=. The majority of errors occur near the
boundaries where one type of action transitions into another. During
these transitions the exact label for the action is more open to
interpretation, and disagreement between empathy and experience is
more excusable.

** Digression: bootstrapping touch using free exploration

In the previous section I showed how to compute actions in terms of
body-centered predicates which relied on the average touch
activation of pre-defined regions of the worm's skin. What if,
instead of receiving touch pre-grouped into the six faces of each
worm segment, the true topology of the worm's skin were unknown?
This is more similar to how a nerve fiber bundle might be arranged.
While two fibers that are close in a nerve bundle /might/ correspond
to two touch sensors that are close together on the skin, the
process of taking a complicated surface and forcing it into
essentially a circle requires some cuts and rearrangements.

In this section I show how to automatically learn the skin topology
of a worm segment by free exploration. As the worm rolls around on
the floor, large sections of its surface get activated. If the worm
has stopped moving, then whatever region of skin is touching the
floor is probably an important region, and should be recorded.

#+caption: Program to detect whether the worm is in a resting state
#+caption: with one face touching the floor.
#+name: pure-touch
#+begin_listing clojure
#+begin_src clojure
(def full-contact [(float 0.0) (float 0.1)])

(defn pure-touch?
  "This is worm-specific code to determine if a large region of touch
   sensors is either all on or all off."
  [[coords touch :as touch-data]]
  (= (set (map first touch)) (set full-contact)))
#+end_src
#+end_listing
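The following REPL sketch uses invented readings; I am assuming the
touch format here (each sensor reports a pair whose first element is
the intersection distance and whose second is the feeler length, so
that =full-contact=, [0.0 0.1], is the reading of a fully compressed
feeler):

#+begin_src clojure
;; A frame is "pure" when every sensor is either fully on (0.0) or
;; fully off (0.1), with no partial readings in between.  Note that
;; the set comparison also requires both kinds of reading to be
;; present.
(pure-touch? [[[0 0] [1 0]]                  ; sensor coordinates
              [[(float 0.0)  (float 0.1)]    ; full contact
               [(float 0.1)  (float 0.1)]]]) ; feeling nothing
;; => true

(pure-touch? [[[0 0] [1 0]]
              [[(float 0.05) (float 0.1)]    ; partial contact
               [(float 0.1)  (float 0.1)]]])
;; => false
#+end_src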
After collecting these important regions, there will be many nearly
similar touch regions. While for some purposes the subtle
differences between these regions will be important, for my purposes
I collapse them into mostly non-overlapping sets using
=remove-similar= in listing \ref{remove-similar}.

#+caption: Program to take a list of sets of points and ``collapse them''
#+caption: so that the remaining sets in the list are significantly
#+caption: different from each other. Prefer smaller sets to larger ones.
#+name: remove-similar
#+begin_listing clojure
#+begin_src clojure
(defn remove-similar
  [coll]
  (loop [result () coll (sort-by (comp - count) coll)]
    (if (empty? coll) result
        (let [[x & xs] coll
              c (count x)]
          ;; drop the larger set x whenever some smaller set differs
          ;; from it by less than 10% of the smaller set's size
          (if (some
               (fn [other-set]
                 (let [oc (count other-set)]
                   (< (- (count (union other-set x)) c) (* oc 0.1))))
               xs)
            (recur result xs)
            (recur (cons x result) xs))))))
#+end_src
#+end_listing
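As a toy illustration (the point-sets here are invented), a large
set that almost entirely contains a smaller one is dropped in favor
of the smaller one, while genuinely different sets survive:

#+begin_src clojure
;; REPL sketch with invented point-sets.  The ten-element set is
;; dropped because the nine-element set adds fewer new points than
;; 10% of its own size; the unrelated two-element set survives.
(remove-similar [#{1 2 3 4 5 6 7 8 9 10}
                 #{1 2 3 4 5 6 7 8 9}
                 #{20 21}])
;; => (#{20 21} #{1 2 3 4 5 6 7 8 9})   ; set element order may vary
#+end_src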
Actually running this simulation is easy given =CORTEX='s
facilities.

#+caption: Collect experiences while the worm moves around. Filter the touch
#+caption: sensations by stable ones, collapse similar ones together,
#+caption: and report the regions learned.
#+name: learn-touch
#+begin_listing clojure
#+begin_src clojure
(defn learn-touch-regions []
  (let [experiences (atom [])
        world (apply-map
               worm-world
               (assoc (worm-segment-defaults)
                      :experiences experiences))]
    (run-world world)
    (->>
     @experiences
     (drop 175)
     ;; access the single segment's touch data
     (map (comp first :touch))
     ;; only deal with "pure" touch data to determine surfaces
     (filter pure-touch?)
     ;; associate coordinates with touch values
     (map (partial apply zipmap))
     ;; select those regions where contact is being made
     (map (partial group-by second))
     (map #(get % full-contact))
     (map (partial map first))
     ;; remove redundant/subset regions
     (map set)
     remove-similar)))

(defn learn-and-view-touch-regions []
  (map view-touch-region
       (learn-touch-regions)))
#+end_src
#+end_listing

The only thing remaining to define is the particular motion the worm
must take. I accomplish this with a simple motor control program.

#+caption: Motor control program for making the worm roll on the ground.
#+caption: This could also be replaced with random motion.
#+name: worm-roll
#+begin_listing clojure
#+begin_src clojure
(defn touch-kinesthetics []
  [[170 :lift-1 40]
   [190 :lift-1 19]
   [206 :lift-1 0]

   [400 :lift-2 40]
   [410 :lift-2 0]

   [570 :lift-2 40]
   [590 :lift-2 21]
   [606 :lift-2 0]

   [800 :lift-1 30]
   [809 :lift-1 0]

   [900 :roll-2 40]
   [905 :roll-2 20]
   [910 :roll-2 0]

   [1000 :roll-2 40]
   [1005 :roll-2 20]
   [1010 :roll-2 0]

   [1100 :roll-2 40]
   [1105 :roll-2 20]
   [1110 :roll-2 0]])
#+end_src
#+end_listing

#+caption: The small worm rolls around on the floor, driven
#+caption: by the motor control program in listing \ref{worm-roll}.
#+name: worm-roll-image
#+ATTR_LaTeX: :width 12cm
[[./images/worm-roll.png]]

#+caption: After completing its adventures, the worm now knows
#+caption: how its touch sensors are arranged along its skin. These
#+caption: are the regions that were deemed important by
#+caption: =learn-touch-regions=. Note that the worm has discovered
#+caption: that it has six sides.
#+name: worm-touch-map
#+ATTR_LaTeX: :width 12cm
[[./images/touch-learn.png]]

While simple, =learn-touch-regions= exploits regularities in both
the worm's physiology and the worm's environment to correctly deduce
that the worm has six sides. Note that =learn-touch-regions= would
work just as well even if the worm's touch sense data were
completely scrambled. The cross shape is just for convenience. This
example justifies the use of pre-defined touch regions in =EMPATH=.

* COMMENT Contributions

In this thesis you have seen the =CORTEX= system, a complete
environment for creating simulated creatures. You have seen how to
implement five senses including touch, proprioception, hearing,
vision, and muscle tension. You have seen how to create new
creatures using Blender, a 3D modeling tool. I hope that =CORTEX=
will be useful in further research projects. To this end I have
included the full source to =CORTEX= along with a large suite of
tests and examples. I have also created a user guide for =CORTEX=
which is included in an appendix to this thesis.

You have also seen how I used =CORTEX= as a platform to attack the
/action recognition/ problem, which is the problem of recognizing
actions in video. You saw a simple system called =EMPATH= which
identifies actions by first describing actions in a body-centered,
rich sense language, then inferring a full range of sensory
experience from limited data using previous experience gained from
free play.

As a minor digression, you also saw how I used =CORTEX= to enable a
tiny worm to discover the topology of its skin simply by rolling on
the ground.

In conclusion, the main contributions of this thesis are:

- =CORTEX=, a system for creating simulated creatures with rich
  senses.
- =EMPATH=, a program for recognizing actions by imagining sensory
  experience.

# An anatomical joke:
# - Training
# - Skeletal imitation
# - Sensory fleshing-out
# - Classification