diff thesis/cortex.org @ 514:447c3c8405a2

accept/reject changes
author Robert McIntyre <rlm@mit.edu>
date Sun, 30 Mar 2014 10:50:05 -0400
parents 07c3feb32df3
children 58fa1ffd481e
     1.1 --- a/thesis/cortex.org	Sun Mar 30 10:41:18 2014 -0400
     1.2 +++ b/thesis/cortex.org	Sun Mar 30 10:50:05 2014 -0400
     1.3 @@ -498,9 +498,9 @@
     1.4     is a vast and rich place, and for now simulations are a very poor
     1.5     reflection of its complexity. It may be that there is a significant
     1.6    qualitative difference between dealing with senses in the real
     1.7 -   world and dealing with pale facsimiles of them in a simulation.
     1.8 -   What are the advantages and disadvantages of a simulation vs.
     1.9 -   reality?
    1.10 +   world and dealing with pale facsimiles of them in a simulation
    1.11 +   \cite{brooks-representation}. What are the advantages and
    1.12 +   disadvantages of a simulation vs. reality?
    1.13  
    1.14  *** Simulation
    1.15  
    1.16 @@ -578,7 +578,7 @@
    1.17     40x on my machine!
    1.18  
    1.19  ** All sense organs are two-dimensional surfaces
    1.20 -# What is a sense?   
    1.21 +
    1.22     If =CORTEX= is to support a wide variety of senses, it would help
    1.23     to have a better understanding of what a ``sense'' actually is!
    1.24     While vision, touch, and hearing all seem like they are quite
    1.25 @@ -1376,7 +1376,7 @@
    1.26      capturing in-game video to a file.
    1.27  
    1.28  ** ...but hearing must be built from scratch
    1.29 -# is hard; =CORTEX= does it right
    1.30 +
    1.31     At the end of this section I will have simulated ears that work the
    1.32     same way as the simulated eyes in the last section. I will be able to
    1.33     place any number of ear-nodes in a blender file, and they will bind to
    1.34 @@ -3163,18 +3163,18 @@
    1.35   to interpretation, and disagreement between empathy and experience
    1.36    is more excusable.
    1.37  
    1.38 -** Digression: Learn touch sensor layout through haptic experimentation, instead 
    1.39 -# Bootstrapping touch using free exploration
    1.40 -In the previous section I showed how to compute actions in terms of
    1.41 +** Digression: Learn touch sensor layout through free play
    1.42 +
    1.43 +   In the previous section I showed how to compute actions in terms of
    1.44    body-centered predicates which relied on average touch activation of
    1.45 -   pre-defined regions of the worm's skin. What if, instead of receiving
    1.46 -   touch pre-grouped into the six faces of each worm segment, the true
    1.47 -   topology of the worm's skin was unknown? This is more similar to how
    1.48 -   a nerve fiber bundle might be arranged. While two fibers that are
    1.49 -   close in a nerve bundle /might/ correspond to two touch sensors that
    1.50 -   are close together on the skin, the process of taking a complicated
    1.51 -   surface and forcing it into essentially a circle requires some cuts
    1.52 -   and rearrangements.
    1.53 +   pre-defined regions of the worm's skin. What if, instead of
    1.54 +   receiving touch pre-grouped into the six faces of each worm
    1.55 +   segment, the true topology of the worm's skin was unknown? This is
    1.56 +   more similar to how a nerve fiber bundle might be arranged. While
    1.57 +   two fibers that are close in a nerve bundle /might/ correspond to
    1.58 +   two touch sensors that are close together on the skin, the process
    1.59 +   of taking a complicated surface and forcing it into essentially a
    1.60 +   circle requires some cuts and rearrangements.
    1.61     
    1.62     In this section I show how to automatically learn the skin-topology of
    1.63     a worm segment by free exploration. As the worm rolls around on the
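
For concreteness, here is a minimal sketch of the free-play idea behind this digression: if touch arrives as an unordered vector of sensor activations, then sensors that frequently fire together while the worm rolls around are probably neighbors on the skin. This is not the thesis's own =CORTEX= code; the touch-log format, the threshold, and the function names below are assumptions for illustration only.

#+begin_src python
# Hypothetical sketch: guess which unlabeled touch sensors are spatial
# neighbors from co-activation statistics gathered during free play.
# `touch_log` is assumed to be a (timesteps x sensors) array of 0/1 values.
import numpy as np

def coactivation(touch_log):
    """Fraction of frames in which each pair of sensors fires together."""
    log = np.asarray(touch_log, dtype=float)
    return (log.T @ log) / len(log)

def likely_neighbors(touch_log, threshold=0.9):
    """Pairs whose joint firing rate is a large fraction of the rarer
    sensor's own firing rate -- a crude proximity guess."""
    co = coactivation(touch_log)
    rates = np.diag(co)
    pairs = []
    for i in range(co.shape[0]):
        for j in range(i + 1, co.shape[0]):
            floor = min(rates[i], rates[j])
            if floor > 0 and co[i, j] / floor >= threshold:
                pairs.append((i, j))
    return pairs

# Toy log: sensors 0 and 1 always fire together; sensor 2 only sometimes.
log = [[1, 1, 0],
       [1, 1, 0],
       [0, 0, 1],
       [1, 1, 1]]
print(likely_neighbors(log))   # -> [(0, 1)]
#+end_src

A fuller treatment might feed the co-activation matrix into something like multidimensional scaling to lay the sensors out on a surface, but the pairwise test is enough to show why free exploration can recover adjacency without any prior labeling of the skin.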