comparison thesis/dylan-cortex-diff.diff @ 514:447c3c8405a2

accept/reject changes
author Robert McIntyre <rlm@mit.edu>
date Sun, 30 Mar 2014 10:50:05 -0400
parents 4c4d45f6f30b
children 58fa1ffd481e
292 -** Because of Time, simulation is perferable to reality
293 +** Simulated time enables rapid prototyping and complex scenes
294
295 I envision =CORTEX= being used to support rapid prototyping and
296 iteration of ideas. Even if I could put together a well constructed
297 @@ -459,8 +577,8 @@
298 simulations of very simple creatures in =CORTEX= generally run at
299 40x on my machine!
300
301 -** What is a sense?
302 -
303 +** All sense organs are two-dimensional surfaces
304 +# What is a sense?
305 If =CORTEX= is to support a wide variety of senses, it would help
306 to have a better understanding of what a ``sense'' actually is!
307 While vision, touch, and hearing all seem like they are quite
308 @@ -956,7 +1074,7 @@
309 #+ATTR_LaTeX: :width 15cm
310 [[./images/physical-hand.png]]
311
312 -** Eyes reuse standard video game components
313 +** Sight reuses standard video game components...
314
315 Vision is one of the most important senses for humans, so I need to
316 build a simulated sense of vision for my AI. I will do this with
317 @@ -1257,8 +1375,8 @@
318 community and is now (in modified form) part of a system for
319 capturing in-game video to a file.
320
321 -** Hearing is hard; =CORTEX= does it right
322 -
323 +** ...but hearing must be built from scratch
324 +# is hard; =CORTEX= does it right
325 At the end of this section I will have simulated ears that work the
326 same way as the simulated eyes in the last section. I will be able to
327 place any number of ear-nodes in a blender file, and they will bind to
328 @@ -1565,7 +1683,7 @@
329 jMonkeyEngine3 community and is used to record audio for demo
330 videos.
331
332 -** Touch uses hundreds of hair-like elements
333 +** Hundreds of hair-like elements provide a sense of touch
334
335 Touch is critical to navigation and spatial reasoning and as such I
336 need a simulated version of it to give to my AI creatures.
337 @@ -2059,7 +2177,7 @@
338 #+ATTR_LaTeX: :width 15cm
339 [[./images/touch-cube.png]]
340
341 -** Proprioception is the sense that makes everything ``real''
342 +** Proprioception provides knowledge of your own body's position
343
344 Close your eyes, and touch your nose with your right index finger.
345 How did you do it? You could not see your hand, and neither your
346 @@ -2193,7 +2311,7 @@
347 #+ATTR_LaTeX: :width 11cm
348 [[./images/proprio.png]]
349
350 -** Muscles are both effectors and sensors
351 +** Muscles contain both sensors and effectors
352
353 Surprisingly enough, terrestrial creatures only move by using
354 torque applied about their joints. There's not a single straight
355 @@ -2440,7 +2558,8 @@
356 hard control problems without worrying about physics or
357 senses.
358
359 -* Empathy in a simulated worm
369 -** Empathy is the process of tracing though \Phi-space
370 +** ``Empathy'' requires retracing steps through \Phi-space
371
372 Here is the core of a basic empathy algorithm, starting with an
373 experience vector:
374 @@ -2888,7 +3007,7 @@
375 #+end_src
376 #+end_listing
377
378 -** Efficient action recognition with =EMPATH=
379 +** =EMPATH= recognizes actions efficiently
380
381 To use =EMPATH= with the worm, I first need to gather a set of
382 experiences from the worm that includes the actions I want to
383 @@ -3044,9 +3163,9 @@
384 to interpretation, and disagreement between empathy and experience
385 is more excusable.
386
387 -** Digression: bootstrapping touch using free exploration
388 -
389 - In the previous section I showed how to compute actions in terms of
390 +** Digression: Learn touch sensor layout through haptic experimentation, instead
391 +# Bootstrapping touch using free exploration
392 +In the previous section I showed how to compute actions in terms of
393 body-centered predicates which relied on average touch activation of
394 pre-defined regions of the worm's skin. What if, instead of receiving
395 touch pre-grouped into the six faces of each worm segment, the true