#+title: Simulated Senses
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulating senses for AI research using JMonkeyEngine3
#+keywords: Alan Turing, AI, simulated senses, jMonkeyEngine3, virtual world
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
#+babel: :mkdirp yes :noweb yes

* Background

Artificial Intelligence has tried and failed for more than half a
century to produce programs as flexible, creative, and "intelligent"
as the human mind itself. Clearly, we are still missing some
important ideas concerning intelligent programs or we would have
strong AI already. What idea could be missing?

When Turing first proposed his famous "Turing Test" in the
groundbreaking paper [[../sources/turing.pdf][/Computing Machinery and Intelligence/]],
he gave little importance to how a computer program might
interact with the world:

#+BEGIN_QUOTE
\ldquo{}We need not be too concerned about the legs, eyes,
etc. The example of Miss Helen Keller shows that education
can take place provided that communication in both
directions between teacher and pupil can take place by some
means or other.\rdquo{}
#+END_QUOTE

And from the example of Helen Keller he went on to assume
that the only thing a fledgling AI program could need by way
of communication is a teletypewriter. But Helen Keller did
possess vision and hearing for the first few months of her
life, and her tactile sense was far richer than any
text-stream could hope to achieve. She possessed a body she
could move freely, and had continual access to the real
world to learn from her actions.

I believe that our programs are suffering from too little
sensory input to become really intelligent. Imagine for a
moment that you lived in a world completely cut off from all
sensory stimulation. You have no eyes to see, no ears to
hear, no mouth to speak. No body, no taste, no feeling
whatsoever. The only sense you get at all is a single point
of light, flickering on and off in the void. If this were
your life from birth, you would never learn anything, and
could never become intelligent. Actual humans placed in
sensory deprivation chambers experience hallucinations and
can begin to lose their sense of reality. Most of the time,
the programs we write are in exactly this situation. They do
not interface with cameras and microphones, and they do not
control a real or simulated body or interact with any sort
of world.

* Simulation vs. Reality

I want to demonstrate that multiple senses are what enable
intelligence. There are two ways of playing around with
senses and computer programs:

** Simulation

The first is to go entirely with simulation: virtual world,
virtual character, virtual senses. The advantages are that
when everything is a simulation, experiments in that
simulation are absolutely reproducible. It's also easier to
change the character and world to explore new situations and
different sensory combinations.

If the world is to be simulated on a computer, then not only
do you have to worry about whether the character's senses
are rich enough to learn from the world, but also about
whether the world itself is rendered with enough detail and
realism to give enough working material to the character's
senses. To name just a few difficulties facing modern physics
simulators: destructibility of the environment, simulation of
water and other fluids, large areas, nonrigid bodies, large
numbers of objects, and smoke. I don't know of any computer
simulation that would allow a character to take a rock and
grind it into fine dust, then use that dust to make a clay
sculpture, at least not without spending years calculating
the interactions of every single small grain of dust. Maybe a
simulated world with today's limitations doesn't provide
enough richness for real intelligence to evolve.

** Reality

The other approach for playing with senses is to hook your
software up to real cameras, microphones, robots, etc., and
let it loose in the real world. This has the advantage of
eliminating concerns about simulating the world at the
expense of increasing the complexity of implementing the
senses. Instead of just grabbing the current rendered frame
for processing, you have to use an actual camera with real
lenses and interact with photons to get an image. It is much
harder to change the character, which is now partly a
physical robot of some sort, since doing so involves
changing things around in the real world instead of
modifying lines of code. While the real world is very rich
and definitely provides enough stimulation for intelligence
to develop, as evidenced by our own existence, it is also
uncontrollable in the sense that a particular situation
cannot be recreated perfectly or saved for later use. It is
harder to conduct science because it is harder to repeat an
experiment. The worst thing about using the real world
instead of a simulation is the matter of time. Instead of
simulated time you get the constant and unstoppable flow of
real time. This severely limits the sorts of software you
can use to program the AI because all sense inputs must be
handled in real time. Complicated ideas may have to be
implemented in hardware or may simply be impossible given
the current speed of our processors. Contrast this with a
simulation, in which the flow of time in the simulated world
can be slowed down to accommodate the limitations of the
character's programming. In terms of cost, doing everything
in software is far cheaper than building custom real-time
hardware. All you need is a laptop and some patience.

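To make the contrast concrete, here is a minimal sketch (in Java, and
not taken from any existing code) of the idea that simulated time can
be decoupled from wall-clock time: each pass through the loop
advances the world by a fixed step, so the character's sense
processing can take as long as it needs without the world moving on
without it. The =World= and =Creature= interfaces are hypothetical
placeholders.

#+BEGIN_SRC java
// A fixed-timestep loop: simulated time advances by exactly dt per
// iteration, regardless of how much real time each iteration takes.
public class FixedStepLoop {
    interface World { void step(double dt); }           // hypothetical world interface
    interface Creature { void senseAndAct(World world); } // hypothetical character interface

    public static void run(World world, Creature creature,
                           double dt, long steps) {
        for (long i = 0; i < steps; i++) {
            world.step(dt);               // the world advances by a fixed simulated step
            creature.senseAndAct(world);  // may take seconds of real time; the world waits
        }
    }
}
#+END_SRC
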
* Choose a Simulation Engine

Mainly because of issues with controlling the flow of time,
I chose to simulate both the world and the character. I set
out to make a world in which I could embed a character with
multiple senses. My main goal is to make an environment
where I can perform further experiments in simulated senses.

I examined many different 3D environments to try and find
something I would use as the base for my simulation;
eventually the choice came down to three engines: the Quake
II engine, the Source Engine, and jMonkeyEngine.

** [[http://www.idsoftware.com][Quake II]]/[[http://www.bytonic.de/html/jake2.html][Jake2]]

I spent a bit more than a month working with the Quake II
Engine from id Software to see if I could use it for my
purposes. All the source code was released by id Software
into the Public Domain several years ago, and as a result it
has been ported and modified for many different
reasons. This engine was famous for its advanced use of
realistic shading and had decent and fast physics
simulation. Researchers at Princeton [[http://papers.cnl.salk.edu/PDFs/Intracelllular%20Dynamics%20of%20Virtual%20Place%20Cells%202011-4178.pdf][used this code]] ([[http://brainwindows.wordpress.com/2009/10/14/playing-quake-with-a-real-mouse/][video]])
to study spatial information encoding in the hippocampal
cells of mice. Those researchers created a special Quake II
level that simulated a maze, and added an interface where a
mouse could run on top of a ball in various directions to
move the character in the simulated maze. They measured
hippocampal activity during this exercise to try to tease
out how spatial data is encoded in that area of the
brain. I find this promising because if a real living mouse
can interact with a computer simulation of a maze in the
same way as it interacts with a real-world maze, then maybe
that simulation is close enough to reality that a simulated
sense of vision and motor control interacting with that
simulation could reveal useful information about the real
thing. There is a Java port of the original C source code
called Jake2. The port demonstrates Java's OpenGL bindings
and runs anywhere from 90% to 105% as fast as the C
version. After reviewing much of the source of Jake2, I
rejected it because the engine is too tied to the concept of
a first-person shooter game. One of the problems I had was
that there does not seem to be any easy way to attach
multiple cameras to a single character. There are also
several physics clipping issues that are corrected in a way
that applies only to the main character, not to arbitrary
objects. While there is a large community of level modders,
I couldn't find a community to support using the engine to
make new things.

** [[http://source.valvesoftware.com/][Source Engine]]

The Source Engine evolved from the Quake II and Quake I
engines and is used by Valve in the Half-Life series of
games. The physics simulation in the Source Engine is quite
accurate and probably the best out of all the engines I
investigated. There is also an extensive community actively
working with the engine. However, applications that use the
Source Engine must be written in C++, the code is not open,
it only runs on Windows, and the tools that come with the
SDK to handle models and textures are complicated and
awkward to use.

** [[http://jmonkeyengine.com/][jMonkeyEngine3]]

jMonkeyEngine is a new library for creating games in
Java. It uses OpenGL to render to the screen and organizes
the world in a scene graph so that it can avoid drawing
things that do not appear on the screen. It has an active
community and several games in the pipeline. The engine was
not built to serve any particular game but is instead meant
to be used for any 3D game. After experimenting with each of
these three engines and a few others for about two months, I
settled on jMonkeyEngine. I chose it because it had the most
features out of all the open projects I looked at, and
because I could then write my code in Clojure, a dialect of
Lisp that runs on the JVM.

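As a first sanity check of the engine, and of the multiple-camera
capability that was missing from Jake2, here is a minimal sketch of a
jMonkeyEngine3 application. It is illustrative rather than drawn from
the project itself: the class name, camera placement, and viewport
layout are arbitrary choices, but the calls used
(=SimpleApplication=, =renderManager.createMainView=,
=ViewPort.attachScene=) are the usual jME3 way to render one scene
graph through several cameras at once.

#+BEGIN_SRC java
import com.jme3.app.SimpleApplication;
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.math.Vector3f;
import com.jme3.renderer.Camera;
import com.jme3.renderer.ViewPort;
import com.jme3.scene.Geometry;
import com.jme3.scene.shape.Box;

/** One object in the scene graph, viewed by two cameras at once. */
public class TwoCameraDemo extends SimpleApplication {

    public static void main(String[] args) {
        new TwoCameraDemo().start();
    }

    @Override
    public void simpleInitApp() {
        // Attach a blue box to the scene graph.
        Geometry box = new Geometry("box", new Box(1, 1, 1));
        Material mat =
            new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        mat.setColor("Color", ColorRGBA.Blue);
        box.setMaterial(mat);
        rootNode.attachChild(box);

        // Clone the default camera and aim it at the box from above.
        Camera cam2 = cam.clone();
        cam2.setViewPort(0.5f, 1.0f, 0.0f, 0.5f); // lower-right quarter of the window
        cam2.setLocation(new Vector3f(0, 8, 0));
        cam2.lookAt(Vector3f.ZERO, Vector3f.UNIT_Z);

        // A second viewport renders the same scene graph through the new camera.
        ViewPort view2 = renderManager.createMainView("top view", cam2);
        view2.setClearFlags(true, true, true);
        view2.attachScene(rootNode);
    }
}
#+END_SRC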