#+title: Simulated Senses
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulating senses for AI research using JMonkeyEngine3
#+keywords: Alan Turing, AI, simulated senses, jMonkeyEngine3, virtual world
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
#+babel: :mkdirp yes :noweb yes

* Background

Artificial Intelligence has tried and failed for more than
half a century to produce programs as flexible, creative,
and "intelligent" as the human mind itself. Clearly, we are
still missing some important ideas concerning intelligent
programs or we would have strong AI already. What idea could
be missing?

When Turing first proposed his famous "Turing Test" in the
groundbreaking paper [[../sources/turing.pdf][/Computing Machinery and Intelligence/]],
he gave little importance to how a computer program might
interact with the world:

#+BEGIN_QUOTE
\ldquo{}We need not be too concerned about the legs, eyes,
etc. The example of Miss Helen Keller shows that education
can take place provided that communication in both
directions between teacher and pupil can take place by some
means or other.\rdquo{}
#+END_QUOTE

And from the example of Helen Keller he went on to assume
that the only means of communication a fledgling AI program
would need is a teletypewriter. But Helen Keller did possess
vision and hearing for the first few months of her life, and
her tactile sense was far richer than any text stream could
hope to convey. She possessed a body she could move freely,
and had continual access to the real world to learn from her
actions.

I believe that our programs are suffering from too little
sensory input to become really intelligent. Imagine for a
moment that you lived in a world completely cut off from all
sensory stimulation. You have no eyes to see, no ears to
hear, no mouth to speak. No body, no taste, no feeling
whatsoever. The only sense you get at all is a single point
of light, flickering on and off in the void. If this were
your life from birth, you would never learn anything, and
could never become intelligent. Actual humans placed in
sensory deprivation chambers experience hallucinations and
can begin to lose their sense of reality. Most of the time,
the programs we write are in exactly this situation. They do
not interface with cameras and microphones, and they do not
control a real or simulated body or interact with any sort
of world.

* Simulation vs. Reality

I want to demonstrate that multiple senses are what enable
intelligence. There are two ways of playing around with
senses and computer programs:

** Simulation

The first is to go entirely with simulation: virtual world,
virtual character, virtual senses. The advantage is that
when everything is a simulation, experiments are perfectly
reproducible. It is also easier to change the character and
the world to explore new situations and different sensory
combinations.

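To make the reproducibility claim concrete, here is a toy
sketch of my own (not code from this project): if every
source of randomness is seeded and the world advances by a
deterministic update rule, an experiment can be replayed
exactly.

#+BEGIN_SRC clojure
;; Illustration only: when all randomness comes from a seeded
;; generator and the world advances by a deterministic rule, a run
;; can be reproduced bit-for-bit.
(defn run-experiment
  "Drift a toy character for `steps` ticks using a seeded RNG and
   return its final position."
  [seed steps]
  (let [rng (java.util.Random. seed)]
    (loop [x 0.0, n steps]
      (if (zero? n)
        x
        (recur (+ x (.nextGaussian rng)) (dec n))))))

;; Same seed, same run:
;; (= (run-experiment 42 1000) (run-experiment 42 1000)) ;=> true
#+END_SRC
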
If the world is to be simulated on a computer, then not only
do you have to worry about whether the character's senses
are rich enough to learn from the world, but also about
whether the world itself is rendered with enough detail and
realism to give the character's senses enough working
material. To name just a few difficulties facing modern
physics simulators: destructibility of the environment,
simulation of water and other fluids, large areas, nonrigid
bodies, lots of objects, smoke. I don't know of any computer
simulation that would allow a character to take a rock and
grind it into fine dust, then use that dust to make a clay
sculpture, at least not without spending years calculating
the interactions of every single small grain of dust. Maybe
a simulated world with today's limitations doesn't provide
enough richness for real intelligence to evolve.

** Reality

The other approach for playing with senses is to hook your
software up to real cameras, microphones, robots, etc., and
let it loose in the real world. This has the advantage of
eliminating concerns about simulating the world at the
expense of increasing the complexity of implementing the
senses. Instead of just grabbing the current rendered frame
for processing, you have to use an actual camera with real
lenses and interact with photons to get an image. It is much
harder to change the character, which is now partly a
physical robot of some sort, since doing so involves
changing things around in the real world instead of
modifying lines of code. While the real world is very rich
and definitely provides enough stimulation for intelligence
to develop, as evidenced by our own existence, it is also
uncontrollable in the sense that a particular situation
cannot be recreated perfectly or saved for later use. It is
harder to conduct science because it is harder to repeat an
experiment. The worst thing about using the real world
instead of a simulation is the matter of time. Instead of
simulated time you get the constant and unstoppable flow of
real time. This severely limits the sorts of software you
can use to program the AI because all sense inputs must be
handled in real time. Complicated ideas may have to be
implemented in hardware or may simply be impossible given
the current speed of our processors. Contrast this with a
simulation, in which the flow of time in the simulated world
can be slowed down to accommodate the limitations of the
character's programming. In terms of cost, doing everything
in software is far cheaper than building custom real-time
hardware. All you need is a laptop and some patience.

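As a sketch of this last point (my own illustration, not
code from this project): in a simulation the clock is just a
number that the program advances, so a slow sense-processing
step merely slows the whole world down instead of falling
behind it.

#+BEGIN_SRC clojure
;; Illustration only: simulated time advances by a fixed dt per
;; frame, no matter how long the (possibly very slow) sense
;; processing takes in wall-clock time.
(defn run-simulation
  "Advance `world` for `frames` frames. `step` maps [world dt] to a
   new world; `process-senses!` may be arbitrarily slow."
  [world step process-senses! dt frames]
  (loop [world world, sim-time 0.0, n frames]
    (if (zero? n)
      world
      (do (process-senses! world sim-time)  ; slow AI code runs here
          (recur (step world dt) (+ sim-time dt) (dec n))))))
#+END_SRC
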
* Choose a Simulation Engine

Mainly because of issues with controlling the flow of time,
I chose to simulate both the world and the character. I set
out to make a world in which I could embed a character with
multiple senses. My main goal is to make an environment
where I can perform further experiments in simulated senses.

I examined many different 3D environments in search of one I
could use as the base for my simulation; eventually the
choice came down to three engines: the Quake II engine, the
Source Engine, and jMonkeyEngine.

** [[http://www.idsoftware.com][Quake II]]/[[http://www.bytonic.de/html/jake2.html][Jake2]]

I spent a bit more than a month working with the Quake II
Engine from id Software to see if I could use it for my
purposes. id Software released all of the engine's source
code under the GNU GPL several years ago, and as a result it
has been ported and modified for many different
reasons. This engine was famous for its advanced use of
realistic shading and had decent and fast physics
simulation. Researchers at Princeton [[http://papers.cnl.salk.edu/PDFs/Intracelllular%20Dynamics%20of%20Virtual%20Place%20Cells%202011-4178.pdf][used this code]] ([[http://brainwindows.wordpress.com/2009/10/14/playing-quake-with-a-real-mouse/][video]])
to study spatial information encoding in the hippocampal
cells of mice. Those researchers created a special Quake II
level that simulated a maze, and added an interface where a
mouse could run on top of a ball in various directions to
move the character in the simulated maze. They measured
hippocampal activity during this exercise to tease out how
spatial data is stored in that area of the brain. I find
this promising because if a real living mouse can interact
with a computer simulation of a maze in the same way it
interacts with a real-world maze, then maybe that simulation
is close enough to reality that a simulated sense of vision
and motor control interacting with that simulation could
reveal useful information about the real thing. There is a
Java port of the original C source code called Jake2. The
port demonstrates Java's OpenGL bindings and runs anywhere
from 90% to 105% as fast as the C version. After reviewing
much of the source of Jake2, I rejected it because the
engine is too tied to the concept of a first-person shooter
game. One of the problems I had was that there does not seem
to be any easy way to attach multiple cameras to a single
character. There are also several physics clipping issues
that are corrected in a way that only applies to the main
character and does not apply to arbitrary objects. While
there is a large community of level modders, I couldn't find
a community devoted to using the engine to make new things.

** [[http://source.valvesoftware.com/][Source Engine]]

The Source Engine evolved from the Quake II and Quake I
engines and is used by Valve in the Half-Life series of
games. The physics simulation in the Source Engine is quite
accurate and probably the best out of all the engines I
investigated. There is also an extensive community actively
working with the engine. However, applications that use the
Source Engine must be written in C++, the code is not open,
it only runs on Windows, and the tools that come with the
SDK to handle models and textures are complicated and
awkward to use.

** [[http://jmonkeyengine.com/][jMonkeyEngine3]]

jMonkeyEngine is a new library for creating games in
Java. It uses OpenGL to render to the screen and uses scene
graphs to avoid drawing things that do not appear on the
screen. It has an active community and several games in the
pipeline. The engine was not built to serve any particular
game but is instead meant to be used for any 3D game. After
experimenting with each of these three engines and a few
others for about two months, I settled on jMonkeyEngine. I
chose it because it had the most features of all the open
projects I looked at, and because I could then write my code
in Clojure, a dialect of Lisp that runs on the JVM.
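
As a small taste of that combination (a minimal sketch of my
own, using only the stock jMonkeyEngine3 classes rather than
anything specific to this project), a jME3 application can
be written from Clojure by extending SimpleApplication with
proxy:

#+BEGIN_SRC clojure
;; Minimal sketch: a jMonkeyEngine3 application written from Clojure.
;; Class names are from the standard jME3 distribution; adjust them
;; if your jME3 version differs.
(ns hello.jme3
  (:import (com.jme3.app SimpleApplication)
           (com.jme3.material Material)
           (com.jme3.math ColorRGBA)
           (com.jme3.scene Geometry)
           (com.jme3.scene.shape Box)))

(defn make-app
  "Return a SimpleApplication that renders a single blue box."
  []
  (proxy [SimpleApplication] []
    (simpleInitApp []
      (let [mesh (Box. 1.0 1.0 1.0)
            geom (Geometry. "box" mesh)
            mat  (Material. (.getAssetManager this)
                            "Common/MatDefs/Misc/Unshaded.j3md")]
        (.setColor mat "Color" ColorRGBA/Blue)
        (.setMaterial geom mat)
        ;; anything attached to the root node becomes part of the
        ;; scene graph and is drawn each frame
        (.attachChild (.getRootNode this) geom)))))

;; (.start (make-app))  ; opens a window and starts the render loop
#+END_SRC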