changeset 462:bb81cef09ad7

stuff about simulation vs reality.
author Robert McIntyre <rlm@mit.edu>
date Thu, 27 Mar 2014 20:18:51 -0400
parents b345650a0baa
children 6d55ac73bc6f
files org/intro.org thesis/cortex.org
diffstat 2 files changed, 114 insertions(+), 42 deletions(-)
     1.1 --- a/org/intro.org	Thu Mar 27 18:17:23 2014 -0400
     1.2 +++ b/org/intro.org	Thu Mar 27 20:18:51 2014 -0400
     1.3 @@ -195,36 +195,3 @@
     1.4  out of all the open projects I looked at, and because I
     1.5  could then write my code in Clojure, an implementation of
     1.6  LISP that runs on the JVM.
     1.7 -
     1.8 -
     1.9 -
    1.10 -
    1.11 -
    1.12 -
    1.13 -
    1.14 -
    1.15 -
    1.16 -
    1.17 -
    1.18 -
    1.19 -
    1.20 -
    1.21 -
    1.22 -
    1.23 -
    1.24 -
    1.25 -
    1.26 -
    1.27 -
    1.28 -
    1.29 -
    1.30 -
    1.31 -
    1.32 -
    1.33 -
    1.34 -
    1.35 -
    1.36 -
    1.37 -
    1.38 -
    1.39 -
     2.1 --- a/thesis/cortex.org	Thu Mar 27 18:17:23 2014 -0400
     2.2 +++ b/thesis/cortex.org	Thu Mar 27 20:18:51 2014 -0400
     2.3 @@ -174,9 +174,9 @@
     2.4     #+ATTR_LaTeX: :width 15cm
     2.5     [[./images/worm-intro-white.png]]
     2.6  
     2.7 -   #+caption: =EMPATH= recognized and classified each of these poses by
     2.8 -   #+caption: inferring the complete sensory experience from 
     2.9 -   #+caption: proprioceptive data.
    2.10 +   #+caption: =EMPATH= recognized and classified each of these 
    2.11 +   #+caption: poses by inferring the complete sensory experience 
    2.12 +   #+caption: from proprioceptive data.
    2.13     #+name: worm-recognition-intro
    2.14     #+ATTR_LaTeX: :width 15cm
    2.15     [[./images/worm-poses.png]]
    2.16 @@ -221,8 +221,8 @@
    2.17         (let [worm-touch (:touch (peek experiences))
    2.18               tail-touch (worm-touch 0)
    2.19               head-touch (worm-touch 4)]
    2.20 -         (and (< 0.55 (contact worm-segment-bottom-tip tail-touch))
    2.21 -              (< 0.55 (contact worm-segment-top-tip    head-touch))))))
    2.22 +         (and (< 0.2 (contact worm-segment-bottom-tip tail-touch))
    2.23 +              (< 0.2 (contact worm-segment-top-tip    head-touch))))))
    2.24     #+end_src
    2.25     #+end_listing
    2.26  
    2.27 @@ -232,9 +232,9 @@
    2.28     I built =CORTEX= to be a general AI research platform for doing
    2.29     experiments involving multiple rich senses and a wide variety and
    2.30     number of creatures. I intend it to be useful as a library for many
    2.31 -   more projects than just this one. =CORTEX= was necessary to meet a
    2.32 -   need among AI researchers at CSAIL and beyond, which is that people
    2.33 -   often will invent neat ideas that are best expressed in the
    2.34 +   more projects than just this thesis. =CORTEX= was necessary to meet
    2.35 +   a need among AI researchers at CSAIL and beyond, which is that
    2.36 +   people often will invent neat ideas that are best expressed in the
    2.37     language of creatures and senses, but in order to explore those
    2.38     ideas they must first build a platform in which they can create
    2.39     simulated creatures with rich senses! There are many ideas that
    2.40 @@ -331,11 +331,116 @@
    2.41     
    2.42  * Building =CORTEX=
    2.43  
    2.44 -** To explore embodiment, we need a world, body, and senses
    2.45 +  I intend for =CORTEX= to be used as a general-purpose library for
    2.46 +  building creatures and outfitting them with senses, so that it will
    2.47 +  be useful for other researchers who want to test out ideas of their
    2.48 +  own. To this end, wherever I have had to make architectural choices
    2.49 +  about =CORTEX=, I have chosen to give as much freedom to the user as
    2.50 +  possible, so that =CORTEX= may be used for things I have not
    2.51 +  foreseen.
    2.52 +
    2.53 +** Simulation or Reality?
    2.54 +   
    2.55 +   The most important architectural decision of all is the choice to
    2.56 +   use a computer-simulated environment in the first place! The world
    2.57 +   is a vast and rich place, and for now simulations are a very poor
    2.58 +   reflection of its complexity. It may be that there is a significant
    2.59 +   qualitative difference between dealing with senses in the real
    2.60 +   world and dealing with pale facsimiles of them in a
    2.61 +   simulation. What are the advantages and disadvantages of a
    2.62 +   simulation vs. reality?
    2.63 +
    2.64 +*** Simulation
    2.65 +
    2.66 +    The advantage of virtual reality is that when everything is a
    2.67 +    simulation, experiments in that simulation are absolutely
    2.68 +    reproducible. It's also easier to change the character and world
    2.69 +    to explore new situations and different sensory combinations.
    2.70 +
    2.71 +    If the world is to be simulated on a computer, then not only do
    2.72 +    you have to worry about whether the character's senses are rich
    2.73 +    enough to learn from the world, but also whether the world itself is
    2.74 +    rendered with enough detail and realism to give enough working
    2.75 +    material to the character's senses. To name just a few
    2.76 +    difficulties facing modern physics simulators: destructibility of
    2.77 +    the environment, simulation of water/other fluids, large areas,
    2.78 +    nonrigid bodies, lots of objects, smoke. I don't know of any
    2.79 +    computer simulation that would allow a character to take a rock
    2.80 +    and grind it into fine dust, then use that dust to make a clay
    2.81 +    sculpture, at least not without spending years calculating the
    2.82 +    interactions of every single small grain of dust. Maybe a
    2.83 +    simulated world with today's limitations doesn't provide enough
    2.84 +    richness for real intelligence to evolve.
    2.85 +
    2.86 +*** Reality
    2.87 +
    2.88 +    The other approach for playing with senses is to hook your
    2.89 +    software up to real cameras, microphones, robots, etc., and let it
    2.90 +    loose in the real world. This has the advantage of eliminating
    2.91 +    concerns about simulating the world at the expense of increasing
    2.92 +    the complexity of implementing the senses. Instead of just
    2.93 +    grabbing the current rendered frame for processing, you have to
    2.94 +    use an actual camera with real lenses and interact with photons to
    2.95 +    get an image. It is much harder to change the character, which is
    2.96 +    now partly a physical robot of some sort, since doing so involves
    2.97 +    changing things around in the real world instead of modifying
    2.98 +    lines of code. While the real world is very rich and definitely
    2.99 +    provides enough stimulation for intelligence to develop, as
   2.100 +    evidenced by our own existence, it is also uncontrollable in the
   2.101 +    sense that a particular situation cannot be recreated perfectly or
   2.102 +    saved for later use. It is harder to conduct science because it is
   2.103 +    harder to repeat an experiment. The worst thing about using the
   2.104 +    real world instead of a simulation is the matter of time. Instead
   2.105 +    of simulated time you get the constant and unstoppable flow of
   2.106 +    real time. This severely limits the sorts of software you can use
   2.107 +    to program the AI because all sense inputs must be handled in real
   2.108 +    time. Complicated ideas may have to be implemented in hardware or
   2.109 +    may simply be impossible given the current speed of our
   2.110 +    processors. Contrast this with a simulation, in which the flow of
   2.111 +    time in the simulated world can be slowed down to accommodate the
   2.112 +    limitations of the character's programming. In terms of cost,
   2.113 +    doing everything in software is far cheaper than building custom
   2.114 +    real-time hardware. All you need is a laptop and some patience.
   2.115  
   2.116  ** Because of Time, simulation is preferable to reality
   2.117  
   2.118 +   I envision =CORTEX= being used to support rapid prototyping and
   2.119 +   iteration of ideas. Even if I could put together a well-constructed
   2.120 +   kit for creating robots, it would still not be enough because of
   2.121 +   the scourge of real-time processing. Anyone who wants to test their
   2.122 +   ideas in the real world must always worry about getting their
   2.123 +   algorithms to run fast enough to process information in real
   2.124 +   time. The need for real time processing only increases if multiple
   2.125 +   senses are involved. In the extreme case, even simple algorithms
   2.126 +   will have to be accelerated by ASIC chips or FPGAs, turning what
   2.127 +   would otherwise be a few lines of code and a 10x speed penalty
   2.128 +   into a multi-month ordeal. For this reason, =CORTEX= supports
   2.129 +   /time-dilation/, which scales back the framerate of the
   2.130 +   simulation in proportion to the amount of processing required each
   2.131 +   frame. From the perspective of the creatures inside the simulation,
   2.132 +   time always appears to flow at a constant rate, regardless of how
   2.133 +   complicated the environment becomes or how many creatures are in
   2.134 +   the simulation. The cost is that =CORTEX= can sometimes run slower
   2.135 +   than real time. This can also be an advantage, however ---
   2.136 +   simulations of very simple creatures in =CORTEX= generally run at
   2.137 +   40x real time on my machine!
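
   The core of the idea is small enough to sketch in a few lines of
   Clojure. The following is only an illustration of the principle,
   not the actual =CORTEX= implementation; the names
   =step-with-time-dilation=, =advance-physics!=, and
   =process-senses!= are hypothetical.

   #+begin_src clojure
   ;; Illustrative sketch of time-dilation (hypothetical names, not the
   ;; real =CORTEX= API). Each frame advances the physics by a *fixed*
   ;; amount of simulated time, however long the sense-processing takes
   ;; in wall-clock time, so time appears to flow at a constant rate
   ;; from the creature's point of view.
   (defn step-with-time-dilation
     "Return a per-frame update function. `advance-physics!` steps the
      physics engine by `simulated-dt` seconds of simulated time;
      `process-senses!` runs the (possibly slow) sensory and control
      code. The wall-clock duration of a frame never affects how much
      simulated time passes."
     [advance-physics! process-senses! simulated-dt]
     (fn [world]
       (advance-physics! world simulated-dt)
       (process-senses! world)
       world))
   #+end_src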
   2.138 +
   2.139  ** Video game engines are a great starting point
   2.140 +   
   2.141 +   I did not need to write my own physics simulation code or shader to
   2.142 +   build =CORTEX=. Doing so would lead to a system that is impossible
   2.143 +   for anyone but me to use anyway. Instead, I use a video game
   2.144 +   engine as a base and modify it to accommodate the additional needs
   2.145 +   of =CORTEX=. Video game engines are an ideal starting point to
   2.146 +   build =CORTEX=, because they are not far from being creature
   2.147 +   building systems themselves. 
   2.148 +   
   2.149 +   First off, general-purpose video game engines come with a physics
   2.150 +   engine and a lighting and sound system. The physics system provides
   2.151 +   tools that can be co-opted to serve as touch, proprioception, and
   2.152 +   muscles. Since some games support split-screen views, a good video
   2.153 +   game engine will allow you to efficiently create multiple cameras
   2.154 +   in the simulated world that can be used as eyes. 
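
   As a rough illustration, here is one way an extra camera could be
   registered as an eye from Clojure, using a Java engine such as
   jMonkeyEngine3. The function =add-eye!= is a hypothetical sketch,
   not part of the real =CORTEX= API.

   #+begin_src clojure
   (ns cortex.example.eye
     (:import (com.jme3.math Vector3f)
              (com.jme3.renderer Camera)))

   ;; Hypothetical sketch: register an extra camera as an off-screen
   ;; "eye". `render-manager` and `root-node` come from the running
   ;; jMonkeyEngine3 application; `position` and `target` are Vector3f
   ;; instances placing the eye in the simulated world.
   (defn add-eye!
     [render-manager root-node position target width height]
     (let [eye (Camera. (int width) (int height))]
       (.setFrustumPerspective
        eye (float 45) (float (/ width height)) (float 1) (float 1000))
       (.setLocation eye position)
       (.lookAt eye target Vector3f/UNIT_Y)
       (doto (.createPreView render-manager "eye-view" eye)
         (.attachScene root-node))))
   #+end_src

   Each such view can then be rendered off-screen and its pixels read
   back as the creature's visual input.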
   2.155  
   2.156  ** Bodies are composed of segments connected by joints
   2.157