changeset 22:157b416152ea

continuing splitting
author Robert McIntyre <rlm@mit.edu>
date Sun, 23 Oct 2011 23:35:04 -0700
parents 01e1427126af
children cab2da252494
files org/intro.org org/setup.org
diffstat 2 files changed, 313 insertions(+), 0 deletions(-)
line diff
     1.1 --- /dev/null	Thu Jan 01 00:00:00 1970 +0000
     1.2 +++ b/org/intro.org	Sun Oct 23 23:35:04 2011 -0700
     1.3 @@ -0,0 +1,177 @@
     1.4 +#+title: Simulated Senses
     1.5 +#+author: Robert McIntyre
     1.6 +#+email: rlm@mit.edu
     1.7 +#+description: Simulating senses for AI research using JMonkeyEngine3
     1.8 +#+SETUPFILE: ../../aurellem/org/setup.org
     1.9 +#+INCLUDE: ../../aurellem/org/level-0.org
    1.10 +#+babel: :mkdirp yes :noweb yes
    1.11 +
    1.12 +* Background
    1.13 +Artificial Intelligence has tried and failed for more than half a
    1.14 +century to produce programs as flexible, creative, and "intelligent"
    1.15 +as the human mind itself. Clearly, we are still missing some important
    1.16 +ideas concerning intelligent programs or we would have strong AI
    1.17 +already. What idea could be missing?
    1.18 +
    1.19 +When Turing first proposed his famous "Turing Test" in the
     1.20 +groundbreaking paper [[./sources/turing.pdf][/Computing Machinery and Intelligence/]], he gave
    1.21 +little importance to how a computer program might interact with the
    1.22 +world:
    1.23 +
    1.24 +#+BEGIN_QUOTE
    1.25 +\ldquo{}We need not be too concerned about the legs, eyes, etc. The example of
    1.26 +Miss Helen Keller shows that education can take place provided that
    1.27 +communication in both directions between teacher and pupil can take
    1.28 +place by some means or other.\rdquo{}
    1.29 +#+END_QUOTE
    1.30 +
     1.31 +And from the example of Helen Keller he went on to assume that the
     1.32 +only thing a fledgling AI program could need by way of communication
     1.33 +is a teletypewriter. But Helen Keller did possess vision and hearing
     1.34 +for the first few months of her life, and her tactile sense was far
     1.35 +richer than any text stream could hope to convey. She possessed a
     1.36 +body she could move freely, and had continual access to the real world
     1.37 +to learn from her actions.
    1.38 +
    1.39 +I believe that our programs are suffering from too little sensory
    1.40 +input to become really intelligent. Imagine for a moment that you
     1.41 +lived in a world completely cut off from all sensory stimulation. You
    1.42 +have no eyes to see, no ears to hear, no mouth to speak. No body, no
    1.43 +taste, no feeling whatsoever. The only sense you get at all is a
     1.44 +single point of light, flickering on and off in the void. If this were
    1.45 +your life from birth, you would never learn anything, and could never
    1.46 +become intelligent. Actual humans placed in sensory deprivation
     1.47 +chambers experience hallucinations and can begin to lose their sense
    1.48 +of reality in as little as 15 minutes[sensory-deprivation]. Most of
    1.49 +the time, the programs we write are in exactly this situation. They do
    1.50 +not interface with cameras and microphones, and they do not control a
    1.51 +real or simulated body or interact with any sort of world.
    1.52 +
    1.53 +
    1.54 +* Simulation vs. Reality
     1.55 +I want to demonstrate that multiple senses are what enable
    1.56 +intelligence. There are two ways of playing around with senses and
    1.57 +computer programs:
    1.58 +
    1.59 +The first is to go entirely with simulation: virtual world, virtual
    1.60 +character, virtual senses. The advantages are that when everything is
    1.61 +a simulation, experiments in that simulation are absolutely
    1.62 +reproducible. It's also easier to change the character and world to
    1.63 +explore new situations and different sensory combinations.
    1.64 +
    1.65 +
    1.66 +** Issues with Simulation
    1.67 +
    1.68 +If the world is to be simulated on a computer, then not only do you
    1.69 +have to worry about whether the character's senses are rich enough to
    1.70 +learn from the world, but whether the world itself is rendered with
    1.71 +enough detail and realism to give enough working material to the
    1.72 +character's senses. To name just a few difficulties facing modern
    1.73 +physics simulators: destructibility of the environment, simulation of
    1.74 +water/other fluids, large areas, nonrigid bodies, lots of objects,
    1.75 +smoke. I don't know of any computer simulation that would allow a
    1.76 +character to take a rock and grind it into fine dust, then use that
    1.77 +dust to make a clay sculpture, at least not without spending years
    1.78 +calculating the interactions of every single small grain of
    1.79 +dust. Maybe a simulated world with today's limitations doesn't provide
    1.80 +enough richness for real intelligence to evolve.
    1.81 +
    1.82 +** Issues with Reality
    1.83 +
    1.84 +The other approach for playing with senses is to hook your software up
    1.85 +to real cameras, microphones, robots, etc., and let it loose in the
    1.86 +real world. This has the advantage of eliminating concerns about
    1.87 +simulating the world at the expense of increasing the complexity of
    1.88 +implementing the senses. Instead of just grabbing the current rendered
    1.89 +frame for processing, you have to use an actual camera with real
    1.90 +lenses and interact with photons to get an image. It is much harder to
    1.91 +change the character, which is now partly a physical robot of some
    1.92 +sort, since doing so involves changing things around in the real world
    1.93 +instead of modifying lines of code. While the real world is very rich
     1.94 +and definitely provides enough stimulation for intelligence to develop,
     1.95 +as evidenced by our own existence, it is also uncontrollable in the
    1.96 +sense that a particular situation cannot be recreated perfectly or
    1.97 +saved for later use. It is harder to conduct science because it is
    1.98 +harder to repeat an experiment. The worst thing about using the real
    1.99 +world instead of a simulation is the matter of time. Instead of
   1.100 +simulated time you get the constant and unstoppable flow of real
   1.101 +time. This severely limits the sorts of software you can use to
   1.102 +program the AI because all sense inputs must be handled in real
   1.103 +time. Complicated ideas may have to be implemented in hardware or may
   1.104 +simply be impossible given the current speed of our
   1.105 +processors. Contrast this with a simulation, in which the flow of time
   1.106 +in the simulated world can be slowed down to accommodate the
   1.107 +limitations of the character's programming. In terms of cost, doing
   1.108 +everything in software is far cheaper than building custom real-time
   1.109 +hardware. All you need is a laptop and some patience.
   1.110 +
   1.111 +* Choose a Simulation Engine
   1.112 +
   1.113 +Mainly because of issues with controlling the flow of time, I chose to
   1.114 +simulate both the world and the character. I set out to make a minimal
   1.115 +world in which I could embed a character with multiple senses. My main
   1.116 +goal is to make an environment where I can perform further experiments
   1.117 +in simulated senses.
   1.118 +
   1.119 +As Carl Sagan once said, "If you wish to make an apple pie from
   1.120 +scratch, you must first invent the universe." I examined many
   1.121 +different 3D environments to try and find something I would use as the
   1.122 +base for my simulation; eventually the choice came down to three
   1.123 +engines: the Quake II engine, the Source Engine, and jMonkeyEngine.
   1.124 +
   1.125 +** Quake II/Jake2
   1.126 +
   1.127 +I spent a bit more than a month working with the Quake II Engine from
    1.128 +id Software to see if I could use it for my purposes. All the source
    1.129 +code was released by id Software under the GNU GPL several years
   1.130 +ago, and as a result it has been ported and modified for many
   1.131 +different reasons. This engine was famous for its advanced use of
   1.132 +realistic shading and had decent and fast physics
   1.133 +simulation. Researchers at Princeton [[http://www.nature.com/nature/journal/v461/n7266/pdf/nature08499.pdf][used this code]] to study spatial
    1.134 +information encoding in the hippocampal cells of mice. Those
    1.135 +researchers created a special Quake II level that simulated a maze,
    1.136 +and added an interface where a mouse could run on top of a ball in
    1.137 +various directions to move the character in the simulated maze. They
    1.138 +measured hippocampal activity during this exercise to tease out how
    1.139 +spatial data is stored in that area of the brain. I find this
    1.140 +promising because if a real living mouse can interact
   1.141 +with a computer simulation of a maze in the same way as it interacts
   1.142 +with a real-world maze, then maybe that simulation is close enough to
   1.143 +reality that a simulated sense of vision and motor control interacting
   1.144 +with that simulation could reveal useful information about the real
   1.145 +thing. It happens that there is a Java port of the original C source
   1.146 +code called Jake2. The port demonstrates Java's OpenGL bindings and
   1.147 +runs anywhere from 90% to 105% as fast as the C version. After
   1.148 +reviewing much of the source of Jake2, I eventually rejected it
   1.149 +because the engine is too tied to the concept of a first-person
   1.150 +shooter game. One of the problems I had was that there does not seem
   1.151 +to be any easy way to attach multiple cameras to a single
   1.152 +character. There are also several physics clipping issues that are
   1.153 +corrected in a way that only applies to the main character and does
   1.154 +not apply to arbitrary objects. While there is a large community of
   1.155 +level modders, I couldn't find a community to support using the engine
   1.156 +to make new things.
   1.157 +
   1.158 +** Source Engine
   1.159 +
   1.160 +The Source Engine evolved from the Quake II and Quake I engines and is
   1.161 +used by Valve in the Half-Life series of games. The physics simulation
   1.162 +in the Source Engine is quite accurate and probably the best out of
   1.163 +all the engines I investigated. There is also an extensive community
   1.164 +actively working with the engine. However, applications that use the
   1.165 +Source Engine must be written in C++, the code is not open, it only
   1.166 +runs on Windows, and the tools that come with the SDK to handle models
   1.167 +and textures are complicated and awkward to use.
   1.168 +
   1.169 +** jMonkeyEngine
   1.170 +
   1.171 +jMonkeyEngine is a new library for creating games in Java. It uses
    1.172 +OpenGL to render to the screen and uses scene graphs to avoid drawing
   1.173 +things that do not appear on the screen. It has an active community
   1.174 +and several games in the pipeline. The engine was not built to serve
   1.175 +any particular game but is instead meant to be used for any 3D
   1.176 +game. After experimenting with each of these three engines and a few
   1.177 +others for about 2 months I settled on jMonkeyEngine. I chose it
   1.178 +because it had the most features out of all the open projects I looked
    1.179 +at, and because I could then write my code in Clojure, a
    1.180 +dialect of Lisp that runs on the JVM...
     2.1 --- /dev/null	Thu Jan 01 00:00:00 1970 +0000
     2.2 +++ b/org/setup.org	Sun Oct 23 23:35:04 2011 -0700
     2.3 @@ -0,0 +1,136 @@
     2.4 +#+title: Setup jMonkeyEngine3
     2.5 +#+author: Robert McIntyre
     2.6 +#+email: rlm@mit.edu
     2.7 +#+description: Simulating senses for AI research using JMonkeyEngine3
     2.8 +#+SETUPFILE: ../../aurellem/org/setup.org
     2.9 +#+INCLUDE: ../../aurellem/org/level-0.org
    2.10 +#+babel: :mkdirp yes :noweb yes :exports both
    2.11 +
    2.12 +* Setup
    2.13 +
    2.14 +First, I checked out the source to jMonkeyEngine:
    2.15 +
    2.16 +#+srcname: checkout 
    2.17 +#+begin_src sh :results verbatim
    2.18 +svn checkout http://jmonkeyengine.googlecode.com/svn/trunk/engine jme3
    2.19 +#+end_src
    2.20 +
    2.21 +#+results: checkout
    2.22 +: Checked out revision 7975.
    2.23 +
    2.24 +
    2.25 +Building jMonkeyEngine is easy enough:
    2.26 +
    2.27 +#+srcname: build
    2.28 +#+begin_src sh :results verbatim
    2.29 +cd jme3
    2.30 +ant jar | tail -n 2
    2.31 +#+end_src
    2.32 +
    2.33 +#+results: build
    2.34 +: BUILD SUCCESSFUL
    2.35 +: Total time: 15 seconds
    2.36 +
    2.37 +
    2.38 +Also build the javadoc:
    2.39 +
    2.40 +#+srcname: javadoc
    2.41 +#+begin_src sh :results verbatim
    2.42 +cd jme3
    2.43 +ant javadoc | tail -n 2
    2.44 +#+end_src
    2.45 +
    2.46 +#+results: javadoc
    2.47 +: BUILD SUCCESSFUL
    2.48 +: Total time: 12 seconds
    2.49 +
     2.50 +Now, move the jars from the compilation into the project's =lib= folder.
    2.51 +
    2.52 +#+srcname: move-jars
    2.53 +#+begin_src sh :results verbatim
    2.54 +mkdir -p lib 
    2.55 +mkdir -p src
    2.56 +cp jme3/dist/jMonkeyEngine3.jar lib/
    2.57 +cp jme3/dist/lib/* lib/
    2.58 +ls lib
    2.59 +#+end_src
    2.60 +
    2.61 +#+results: move-jars
    2.62 +#+begin_example
    2.63 +eventbus-1.4.jar
    2.64 +jbullet.jar
    2.65 +jheora-jst-debug-0.6.0.jar
    2.66 +jinput.jar
    2.67 +jME3-jbullet.jar
    2.68 +jME3-lwjgl-natives.jar
    2.69 +jME3-testdata.jar
    2.70 +jME3-test.jar
    2.71 +jMonkeyEngine3.jar
    2.72 +j-ogg-oggd.jar
    2.73 +j-ogg-vorbisd.jar
    2.74 +lwjgl.jar
    2.75 +nifty-1.3.jar
    2.76 +nifty-default-controls-1.3.jar
    2.77 +nifty-examples-1.3.jar
    2.78 +nifty-lwjgl-renderer-1.3.jar
    2.79 +nifty-openal-soundsystem-1.0.jar
    2.80 +nifty-style-black-1.3.jar
    2.81 +nifty-style-grey-1.0.jar
    2.82 +noise-0.0.1-SNAPSHOT.jar
    2.83 +stack-alloc.jar
    2.84 +vecmath.jar
    2.85 +xmlpull-xpp3-1.1.4c.jar
    2.86 +#+end_example
    2.87 +
     2.88 +It's good to create an =assets= directory in the style that the
    2.89 +=AssetManager= will like.
    2.90 +
    2.91 +#+srcname: create-assets
    2.92 +#+begin_src sh :results verbatim
    2.93 +mkdir -p assets
    2.94 +mkdir -p assets/Interface
    2.95 +mkdir -p assets/Materials
    2.96 +mkdir -p assets/MatDefs
    2.97 +mkdir -p assets/Models
    2.98 +mkdir -p assets/Scenes
    2.99 +mkdir -p assets/Shaders
   2.100 +mkdir -p assets/Sounds
   2.101 +mkdir -p assets/Textures
   2.102 +tree -L 1 assets
   2.103 +#+end_src
   2.104 +
   2.105 +#+results: create-assets
   2.106 +#+begin_example
   2.107 +assets
   2.108 +|-- Interface
   2.109 +|-- MatDefs
   2.110 +|-- Materials
   2.111 +|-- Models
   2.112 +|-- Scenes
   2.113 +|-- Shaders
   2.114 +|-- Sounds
   2.115 +`-- Textures
   2.116 +
   2.117 +8 directories, 0 files
   2.118 +#+end_example
   2.119 +
   2.120 +
    2.121 +The Java classpath should include all the jars contained in the =lib=
    2.122 +directory as well as the =src= directory.
   2.123 +
    2.124 +For example, here is the file I use to run my REPL for Clojure.
   2.125 +
   2.126 +#+include: "/home/r/bin/swank-all" src sh :exports code
   2.127 +
   2.128 +The important thing here is that =cortex/lib/*=, =cortex/src=, and
   2.129 +=cortex/assets= appear on the classpath. (=cortex= is the base
   2.130 +directory of this project.)
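          +
          +Here is a minimal sketch of such an invocation (not the actual
          +=swank-all= script), assuming the project lives in =~/proj/cortex=
          +and that a Clojure jar has been added to =lib=; adjust the paths to
          +match your own setup.
          +
          +#+begin_src sh :eval no
          +# Hypothetical launcher (a sketch, not the actual swank-all script).
          +# It puts cortex/lib/*, cortex/src, and cortex/assets on the
          +# classpath and starts a plain Clojure REPL via clojure.main.
          +CORTEX=~/proj/cortex
          +java -cp "$CORTEX/lib/*:$CORTEX/src:$CORTEX/assets" clojure.main
          +#+end_src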
   2.131 +
   2.132 +#+srcname: pwd
   2.133 +#+begin_src sh 
   2.134 +pwd
   2.135 +#+end_src
   2.136 +
   2.137 +#+results: pwd
   2.138 +: /home/r/proj/cortex
   2.139 +