rlm@513: @@ -3210,13 +3329,14 @@
rlm@513: 
rlm@513:  In this thesis you have seen the =CORTEX= system, a complete
rlm@513:  environment for creating simulated creatures. You have seen how to
rlm@513: - implement five senses including touch, proprioception, hearing,
rlm@513: - vision, and muscle tension. You have seen how to create new creatues
rlm@513: - using blender, a 3D modeling tool. I hope that =CORTEX= will be
rlm@513: - useful in further research projects. To this end I have included the
rlm@513: - full source to =CORTEX= along with a large suite of tests and
rlm@513: - examples. I have also created a user guide for =CORTEX= which is
rlm@513: - inculded in an appendix to this thesis.
rlm@513: + implement five senses: touch, proprioception, hearing, vision, and
rlm@513: + muscle tension. You have seen how to create new creatures using
rlm@513: + blender, a 3D modeling tool. I hope that =CORTEX= will be useful in
rlm@513: + further research projects. To this end I have included the full
rlm@513: + source to =CORTEX= along with a large suite of tests and examples. I
rlm@513: + have also created a user guide for =CORTEX= which is included in an
rlm@513: + appendix to this thesis \ref{}.
rlm@513: +# dxh: todo reference appendix
rlm@513: 
rlm@513:  You have also seen how I used =CORTEX= as a platform to attack the
rlm@513:  /action recognition/ problem, which is the problem of recognizing
rlm@514: 
rlm@514: @@ -2888,7 +3007,7 @@
rlm@514:  #+end_src
rlm@514:  #+end_listing
rlm@514: 
rlm@514: -** Efficient action recognition with =EMPATH=
rlm@514: +** =EMPATH= recognizes actions efficiently
rlm@514: 
rlm@514:  To use =EMPATH= with the worm, I first need to gather a set of
rlm@514:  experiences from the worm that includes the actions I want to
rlm@514: 
rlm@514: @@ -3044,9 +3163,9 @@
rlm@514:  to interpretation, and disagreement between empathy and experience
rlm@514:  is more excusable.
rlm@514: 
rlm@514: -** Digression: bootstrapping touch using free exploration
rlm@514: -
rlm@514: -In the previous section I showed how to compute actions in terms of
rlm@514: +** Digression: Learn touch sensor layout through haptic experimentation
rlm@514: +# Bootstrapping touch using free exploration
rlm@514: +In the previous section I showed how to compute actions in terms of
rlm@514:  body-centered predicates which relied on average touch activation of
rlm@514:  pre-defined regions of the worm's skin. What if, instead of receiving
rlm@514:  touch pre-grouped into the six faces of each worm segment, the true
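To ground the hunk above: a body-centered predicate classifies an action from nothing but per-region touch statistics. The sketch below is an illustration only, not =CORTEX='s actual code; the =:touch= map shape, the region keywords =:bottom= and =:top=, and the thresholds are all assumptions.

#+begin_src clojure
;; Illustrative sketch only -- not =CORTEX='s actual code. Assumes each
;; experience frame carries a :touch map from pre-defined skin regions
;; (here :bottom and :top) to average activation values in [0, 1].
(defn touch-average
  "Average touch activation over a named skin region for one frame."
  [experience region]
  (get-in experience [:touch region] 0.0))

(defn resting?
  "Body-centered predicate: the worm counts as resting when its bottom
  surface shows strong contact and its top surface almost none."
  [experience]
  (and (> (touch-average experience :bottom) 0.9)
       (< (touch-average experience :top) 0.1)))

;; (resting? {:touch {:bottom 0.95 :top 0.02}}) ;=> true
#+end_src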
rlm@514: 
rlm@514: @@ -2193,7 +2311,7 @@
rlm@514:  #+ATTR_LaTeX: :width 11cm
rlm@514:  [[./images/proprio.png]]
rlm@514: 
rlm@514: -** Muscles are both effectors and sensors
rlm@514: +** Muscles contain both sensors and effectors
rlm@514: 
rlm@514:  Surprisingly enough, terrestrial creatures only move by using
rlm@514:  torque applied about their joints. There's not a single straight
rlm@514: 
rlm@514: @@ -2059,7 +2177,7 @@
rlm@514:  #+ATTR_LaTeX: :width 15cm
rlm@514:  [[./images/touch-cube.png]]
rlm@514: 
rlm@514: -** Proprioception is the sense that makes everything ``real''
rlm@514: +** Proprioception provides knowledge of your own body's position
rlm@514: 
rlm@514:  Close your eyes, and touch your nose with your right index finger.
rlm@514:  How did you do it? You could not see your hand, and neither your
rlm@514: 
rlm@514: @@ -1257,8 +1375,8 @@
rlm@514:  community and is now (in modified form) part of a system for
rlm@514:  capturing in-game video to a file.
rlm@514: 
rlm@514: -** Hearing is hard; =CORTEX= does it right
rlm@514: -
rlm@514: +** ...but hearing must be built from scratch
rlm@514: +# is hard; =CORTEX= does it right
rlm@514:  At the end of this section I will have simulated ears that work the
rlm@514:  same way as the simulated eyes in the last section. I will be able to
rlm@514:  place any number of ear-nodes in a blender file, and they will bind to
rlm@514: 
rlm@514: @@ -1565,7 +1683,7 @@
rlm@514:  jMonkeyEngine3 community and is used to record audio for demo
rlm@514:  videos.
rlm@514: 
rlm@514: -** Touch uses hundreds of hair-like elements
rlm@514: +** Hundreds of hair-like elements provide a sense of touch
rlm@514: 
rlm@514:  Touch is critical to navigation and spatial reasoning and as such I
rlm@514:  need a simulated version of it to give to my AI creatures.
rlm@514: 
rlm@514: @@ -956,7 +1074,7 @@
rlm@514:  #+ATTR_LaTeX: :width 15cm
rlm@514:  [[./images/physical-hand.png]]
rlm@514: 
rlm@514: -** Eyes reuse standard video game components
rlm@514: +** Sight reuses standard video game components...
rlm@514: 
rlm@514:  Vision is one of the most important senses for humans, so I need to
rlm@514:  build a simulated sense of vision for my AI. I will do this with
rlm@514: 
rlm@514: @@ -459,8 +577,8 @@
rlm@514:  simulations of very simple creatures in =CORTEX= generally run at
rlm@514:  40x on my machine!
rlm@514: 
rlm@514: -** What is a sense?
rlm@514: -
rlm@514: +** All sense organs are two-dimensional surfaces
rlm@514: +# What is a sense?
rlm@514:  If =CORTEX= is to support a wide variety of senses, it would help
rlm@514:  to have a better understanding of what a ``sense'' actually is!
rlm@514:  While vision, touch, and hearing all seem like they are quite
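The final hunk's new heading names the section's central abstraction: every sense organ can be modeled as a two-dimensional surface of sensor points. A minimal sketch of that idea follows; every name in it is hypothetical rather than drawn from =CORTEX=.

#+begin_src clojure
;; Minimal sketch of the ``senses are 2-D surfaces'' idea; the names
;; are hypothetical, not =CORTEX='s API. A sense is a topological map
;; -- a vector of [u v] sensor coordinates on a 2-D surface -- plus a
;; sampler from world state to one activation value per coordinate.
(defn make-sense
  [uv-coords sampler]
  (fn [world]
    (mapv (fn [[u v]] (sampler world u v)) uv-coords)))

;; Toy 2x2 `retina': the sampler reads brightness from a map keyed by
;; [u v], standing in for a real render-to-texture lookup.
(def tiny-eye
  (make-sense [[0.0 0.0] [0.0 1.0] [1.0 0.0] [1.0 1.0]]
              (fn [world u v] (get world [u v] 0.0))))

;; (tiny-eye {[0.0 0.0] 0.7}) ;=> [0.7 0.0 0.0 0.0]
#+end_src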