annotate thesis/dylan-accept.diff @ 524:8e52a2802821

incorporating winston's changes.
author Robert McIntyre <rlm@mit.edu>
date Sun, 20 Apr 2014 21:46:46 -0400
parents 447c3c8405a2
children
rev   line source
rlm@513 1 @@ -3210,13 +3329,14 @@
rlm@513 2
rlm@513 3 In this thesis you have seen the =CORTEX= system, a complete
rlm@513 4 environment for creating simulated creatures. You have seen how to
rlm@513 5 - implement five senses including touch, proprioception, hearing,
rlm@513 6 - vision, and muscle tension. You have seen how to create new creatues
rlm@513 7 - using blender, a 3D modeling tool. I hope that =CORTEX= will be
rlm@513 8 - useful in further research projects. To this end I have included the
rlm@513 9 - full source to =CORTEX= along with a large suite of tests and
rlm@513 10 - examples. I have also created a user guide for =CORTEX= which is
rlm@513 11 - inculded in an appendix to this thesis.
rlm@513 12 + implement five senses: touch, proprioception, hearing, vision, and
rlm@513 13 + muscle tension. You have seen how to create new creatures using
rlm@513 14 + blender, a 3D modeling tool. I hope that =CORTEX= will be useful in
rlm@513 15 + further research projects. To this end I have included the full
rlm@513 16 + source to =CORTEX= along with a large suite of tests and examples. I
rlm@513 17 + have also created a user guide for =CORTEX= which is included in an
rlm@513 18 + appendix to this thesis \ref{}.
rlm@513 19 +# dxh: todo reference appendix
rlm@513 20
rlm@513 21 You have also seen how I used =CORTEX= as a platform to attack the
rlm@513 22 /action recognition/ problem, which is the problem of recognizing
rlm@514 23
rlm@514 24 @@ -2888,7 +3007,7 @@
rlm@514 25 #+end_src
rlm@514 26 #+end_listing
rlm@514 27
rlm@514 28 -** Efficient action recognition with =EMPATH=
rlm@514 29 +** =EMPATH= recognizes actions efficiently
rlm@514 30
rlm@514 31 To use =EMPATH= with the worm, I first need to gather a set of
rlm@514 32 experiences from the worm that includes the actions I want to
rlm@514 33
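To make the experience-gathering step concrete, here is a minimal Clojure sketch; =record-experience= and the sense keywords are illustrative assumptions, not =EMPATH='s actual interface.

#+begin_src clojure
;; Hypothetical sketch: one "experience" is a timestep of raw sense data
;; tagged with the action it exemplifies. All names here are illustrative.
(defn record-experience
  "Bundle one timestep of sense data with its action label."
  [action-label senses]
  (assoc senses :action action-label))

(def experiences
  "A tiny labeled corpus, of the kind =EMPATH= needs to gather."
  [(record-experience :curl    {:touch [0.9 0.1 0.0] :muscle [0.7 0.0]})
   (record-experience :wiggle  {:touch [0.2 0.8 0.2] :muscle [0.4 0.4]})
   (record-experience :resting {:touch [0.1 0.1 0.1] :muscle [0.0 0.0]})])
#+end_src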
rlm@514 34 @@ -3044,9 +3163,9 @@
rlm@514 35 to interpretation, and disagreement between empathy and experience
rlm@514 36 is more excusable.
rlm@514 37
rlm@514 38 -** Digression: bootstrapping touch using free exploration
rlm@514 39 -
rlm@514 40 - In the previous section I showed how to compute actions in terms of
rlm@514 41 +** Digression: Learn touch sensor layout through haptic experimentation, instead
rlm@514 42 +# Bootstrapping touch using free exploration
rlm@514 43 +In the previous section I showed how to compute actions in terms of
rlm@514 44 body-centered predicates which relied on average touch activation of
rlm@514 45 pre-defined regions of the worm's skin. What if, instead of receiving
rlm@514 46 touch pre-grouped into the six faces of each worm segment, the true
rlm@514 47
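To make the body-centered-predicate idea concrete, here is a minimal Clojure sketch of a predicate built from average touch activation over a pre-defined skin region; the region, threshold, and names are assumptions for illustration, not the worm's actual predicates.

#+begin_src clojure
;; Hypothetical sketch: an action predicate over pre-grouped touch data.
;; touch-data maps sensor ids to activations; a region is a list of ids.
(defn average-activation
  "Mean activation of the sensors in one named skin region."
  [touch-data region-sensors]
  (/ (reduce + (map touch-data region-sensors))
     (count region-sensors)))

(defn belly-covered?
  "Example body-centered predicate: is the belly region mostly touching?"
  [touch-data belly-sensors]
  (> (average-activation touch-data belly-sensors) 0.75))

(belly-covered? {0 1.0, 1 0.9, 2 0.8} [0 1 2]) ;; => true
#+end_src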
rlm@514 48 @@ -2193,7 +2311,7 @@
rlm@514 49 #+ATTR_LaTeX: :width 11cm
rlm@514 50 [[./images/proprio.png]]
rlm@514 51
rlm@514 52 -** Muscles are both effectors and sensors
rlm@514 53 +** Muscles contain both sensors and effectors
rlm@514 54
rlm@514 55 Surprisingly enough, terrestrial creatures only move by using
rlm@514 56 torque applied about their joints. There's not a single straight
rlm@514 57
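Here is a minimal Clojure sketch of a muscle serving as effector and sensor at once; the fields and the linear tension model are assumptions for illustration.

#+begin_src clojure
;; Hypothetical sketch: contracting a muscle applies torque about its
;; joint (effector role) and yields a tension reading (sensor role).
(defn contract
  "Fire a fraction (0.0 to 1.0) of the muscle's fibers."
  [muscle fraction]
  (let [torque (* fraction (:max-torque muscle))]
    (assoc muscle
           :applied-torque torque      ; effector: torque about the joint
           :sensed-tension fraction))) ; sensor: proprioceptive feedback

(contract {:max-torque 5.0} 0.6)
;; => {:max-torque 5.0, :applied-torque 3.0, :sensed-tension 0.6}
#+end_src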
rlm@514 58 @@ -2059,7 +2177,7 @@
rlm@514 59 #+ATTR_LaTeX: :width 15cm
rlm@514 60 [[./images/touch-cube.png]]
rlm@514 61
rlm@514 62 -** Proprioception is the sense that makes everything ``real''
rlm@514 63 +** Proprioception provides knowledge of your own body's position
rlm@514 64
rlm@514 65 Close your eyes, and touch your nose with your right index finger.
rlm@514 66 How did you do it? You could not see your hand, and neither your
rlm@514 67
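A minimal Clojure sketch of proprioception as a pure report of joint angles, with no reference to the outside world; the joint representation is an assumption for illustration.

#+begin_src clojure
;; Hypothetical sketch: proprioception reads each joint's current
;; [pitch yaw roll], which is enough to reconstruct the body's pose.
(defn proprioception
  "Return [pitch yaw roll] (radians) for every joint of a creature."
  [creature]
  (mapv (juxt :pitch :yaw :roll) (:joints creature)))

(proprioception {:joints [{:pitch 0.1 :yaw 0.0 :roll 0.2}
                          {:pitch -0.4 :yaw 0.3 :roll 0.0}]})
;; => [[0.1 0.0 0.2] [-0.4 0.3 0.0]]
#+end_src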
rlm@514 68 @@ -1257,8 +1375,8 @@
rlm@514 69 community and is now (in modified form) part of a system for
rlm@514 70 capturing in-game video to a file.
rlm@514 71
rlm@514 72 -** Hearing is hard; =CORTEX= does it right
rlm@514 73 -
rlm@514 74 +** ...but hearing must be built from scratch
rlm@514 75 +# is hard; =CORTEX= does it right
rlm@514 76 At the end of this section I will have simulated ears that work the
rlm@514 77 same way as the simulated eyes in the last section. I will be able to
rlm@514 78 place any number of ear-nodes in a blender file, and they will bind to
rlm@514 79
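The binding step can be sketched as a scan of the scene graph for specially named nodes; the node format and naming scheme below are assumptions for illustration, not the actual blender/jMonkeyEngine binding.

#+begin_src clojure
;; Hypothetical sketch: find every scene node whose name marks it as an
;; ear, so each can be paired with a sound-capture routine.
(defn ear-nodes
  "All nodes named \"ear\", \"ear.001\", \"ear.002\", ..."
  [scene]
  (filter #(re-matches #"ear(\.\d+)?" (:name %)) (:nodes scene)))

(ear-nodes {:nodes [{:name "ear"} {:name "ear.001"}
                    {:name "eye"} {:name "body"}]})
;; => ({:name "ear"} {:name "ear.001"})
#+end_src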
rlm@514 80 @@ -1565,7 +1683,7 @@
rlm@514 81 jMonkeyEngine3 community and is used to record audio for demo
rlm@514 82 videos.
rlm@514 83
rlm@514 84 -** Touch uses hundreds of hair-like elements
rlm@514 85 +** Hundreds of hair-like elements provide a sense of touch
rlm@514 86
rlm@514 87 Touch is critical to navigation and spatial reasoning, and as such I
rlm@514 88 need a simulated version of it to give to my AI creatures.
rlm@514 89
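A minimal Clojure sketch of the hair-like-element idea: each feeler reports how deeply it is bent by whatever it touches. The distances stand in for a real physics ray-cast; all names are illustrative.

#+begin_src clojure
;; Hypothetical sketch: one feeler's activation from a ray-cast result.
;; hit-distance is nil when nothing is within reach.
(defn feeler-activation
  "Activation in [0,1]: 0.0 for no contact, 1.0 at full contact."
  [feeler-length hit-distance]
  (if (and hit-distance (< hit-distance feeler-length))
    (- 1.0 (/ hit-distance feeler-length))
    0.0))

(map #(feeler-activation 0.1 %) [nil 0.1 0.05 0.0])
;; => (0.0 0.0 0.5 1.0)
#+end_src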
rlm@514 90 @@ -956,7 +1074,7 @@
rlm@514 91 #+ATTR_LaTeX: :width 15cm
rlm@514 92 [[./images/physical-hand.png]]
rlm@514 93
rlm@514 94 -** Eyes reuse standard video game components
rlm@514 95 +** Sight reuses standard video game components...
rlm@514 96
rlm@514 97 Vision is one of the most important senses for humans, so I need to
rlm@514 98 build a simulated sense of vision for my AI. I will do this with
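A minimal Clojure sketch of the reuse idea: an eye is just another camera whose frames are captured rather than displayed. The =render= function stands in for the game engine's render-to-texture call; everything here is an illustrative assumption.

#+begin_src clojure
;; Hypothetical sketch: build a vision function from an eye description.
(defn vision-fn
  "Return a thunk that produces the eye's current image when called."
  [render eye]
  (fn [] (render (:position eye) (:direction eye) (:resolution eye))))

;; usage with a dummy renderer that just echoes its camera position:
(def left-eye {:position [0 1 0] :direction [0 0 1] :resolution [64 64]})
((vision-fn (fn [pos dir res] {:pixels :rendered, :at pos}) left-eye))
;; => {:pixels :rendered, :at [0 1 0]}
#+end_src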
rlm@514 99 @@ -459,8 +577,8 @@
rlm@514 100 simulations of very simple creatures in =CORTEX= generally run at
rlm@514 101 40x on my machine!
rlm@514 102
rlm@514 103 -** What is a sense?
rlm@514 104 -
rlm@514 105 +** All sense organs are two-dimensional surfaces
rlm@514 106 +# What is a sense?
rlm@514 107 If =CORTEX= is to support a wide variety of senses, it would help
rlm@514 108 to have a better understanding of what a ``sense'' actually is!
rlm@514 109 While vision, touch, and hearing all seem like they are quite
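The heading's claim suggests a uniform representation, sketched minimally in Clojure below: any sense is a set of 2D sensor coordinates plus their activations. The toy sensor sheet is an assumption for illustration.

#+begin_src clojure
;; Hypothetical sketch: a sense as a 2D topology of sensors.
(def toy-sense
  "A 2x2 sensor sheet: [u v] coordinates on the organ's surface."
  {:topology    [[0 0] [1 0] [0 1] [1 1]]
   :activations [0.0   0.3   0.9   0.5]})

(defn sense->image
  "Pair each 2D sensor coordinate with its current activation."
  [{:keys [topology activations]}]
  (zipmap topology activations))

(sense->image toy-sense)
;; => {[0 0] 0.0, [1 0] 0.3, [0 1] 0.9, [1 1] 0.5}
#+end_src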