comparison thesis/cortex.org @ 530:21b8389922ee
fix vermopomorphic footnote.
| author   | Robert McIntyre <rlm@mit.edu>   |
|----------|---------------------------------|
| date     | Sat, 26 Apr 2014 19:36:13 -0400 |
| parents  | 96c189d4d15e                    |
| children | 749452f063e5                    |
| 529:96c189d4d15e | 530:21b8389922ee |
|------------------|------------------|
39 #+name: name | 39 #+name: name |
40 #+ATTR_LaTeX: :width 10cm | 40 #+ATTR_LaTeX: :width 10cm |
41 [[./images/aurellem-gray.png]] | 41 [[./images/aurellem-gray.png]] |
42 | 42 |
43 | 43 |
44 * Empathy \& Embodiment: problem solving strategies | 44 * COMMENT Empathy \& Embodiment: problem solving strategies |
45 | 45 |
46 By the end of this thesis, you will have seen a novel approach to | 46 By the end of this thesis, you will have seen a novel approach to |
47 interpreting video using embodiment and empathy. You will also see | 47 interpreting video using embodiment and empathy. You will also see |
48 one way to efficiently implement physical empathy for embodied | 48 one way to efficiently implement physical empathy for embodied |
49 creatures. Finally, you will become familiar with =CORTEX=, a system | 49 creatures. Finally, you will become familiar with =CORTEX=, a system |
413 its own finger from the eye in its palm, and that it can feel its | 413 its own finger from the eye in its palm, and that it can feel its |
414 own thumb touching its palm.} | 414 own thumb touching its palm.} |
415 \end{sidewaysfigure} | 415 \end{sidewaysfigure} |
416 #+END_LaTeX | 416 #+END_LaTeX |
417 | 417 |
418 * Designing =CORTEX= | 418 * COMMENT Designing =CORTEX= |
419 | 419 |
420 In this section, I outline the design decisions that went into | 420 In this section, I outline the design decisions that went into |
421 making =CORTEX=, along with some details about its implementation. | 421 making =CORTEX=, along with some details about its implementation. |
422 (A practical guide to getting started with =CORTEX=, which skips | 422 (A practical guide to getting started with =CORTEX=, which skips |
423 over the history and implementation details presented here, is | 423 over the history and implementation details presented here, is |
2543 :proprioception (proprioception! model) | 2543 :proprioception (proprioception! model) |
2544 :muscles (movement! model)})) | 2544 :muscles (movement! model)})) |
2545 #+end_src | 2545 #+end_src |
2546 #+end_listing | 2546 #+end_listing |
2547 | 2547 |
2548 ** Embodiment factors action recognition into manageable parts | 2548 ** COMMENT Embodiment factors action recognition into manageable parts |
2549 | 2549 |
2550 Using empathy, I divide the problem of action recognition into a | 2550 Using empathy, I divide the problem of action recognition into a |
2551 recognition process expressed in the language of a full complement | 2551 recognition process expressed in the language of a full complement |
2552 of senses, and an imaginative process that generates full sensory | 2552 of senses, and an imaginative process that generates full sensory |
2553 data from partial sensory data. Splitting the action recognition | 2553 data from partial sensory data. Splitting the action recognition |
2572 relies on proprioception, =resting?= relies on touch, =wiggling?= | 2572 relies on proprioception, =resting?= relies on touch, =wiggling?= |
2573 relies on a Fourier analysis of muscle contraction, and | 2573 relies on a Fourier analysis of muscle contraction, and |
2574 =grand-circle?= relies on touch and reuses =curled?= in its | 2574 =grand-circle?= relies on touch and reuses =curled?= in its |
2575 definition, showing how embodied predicates can be composed. | 2575 definition, showing how embodied predicates can be composed. |
2576 | 2576 |
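The composition mentioned above can be sketched directly. This is an illustrative stand-in, not the thesis source: `curled?` mirrors the real proprioception-based predicate shown in the listing below, while `tail-touching?` and the `:tail-touch` key are invented placeholders for the touch-based part of =grand-circle?=.

```clojure
;; Hypothetical sketch of embodied-predicate composition.
;; `curled?` mirrors the real predicate below; `tail-touching?`
;; and the :tail-touch key are made-up stand-ins for a touch test.
(defn curled? [experiences]
  (every? (fn [[_ _ bend]] (> (Math/sin bend) 0.64))
          (:proprioception (peek experiences))))

(defn tail-touching? [experiences]
  (pos? (:tail-touch (peek experiences) 0)))

(defn grand-circle? [experiences]
  ;; composition: reuse curled? and add a touch condition
  (and (curled? experiences) (tail-touching? experiences)))

;; usage on a fake one-frame experience vector:
(grand-circle? [{:proprioception [[0 0 (/ Math/PI 2)]]
                 :tail-touch 1.0}])
```

Because each predicate takes the same experience vector, new actions can be defined by conjoining existing ones, which is the composability the text describes.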
2577 | |
2577 #+caption: Program for detecting whether the worm is curled. This is the | 2578 #+caption: Program for detecting whether the worm is curled. This is the |
2578 #+caption: simplest action predicate, because it only uses the last frame | 2579 #+caption: simplest action predicate, because it only uses the last frame |
2579 #+caption: of sensory experience, and only uses proprioceptive data. Even | 2580 #+caption: of sensory experience, and only uses proprioceptive data. Even |
2580 #+caption: this simple predicate, however, is automatically frame | 2581 #+caption: this simple predicate, however, is automatically frame |
2581 #+caption: independent and ignores vermopomorphic \begin{footnote} Like | 2582 #+caption: independent and ignores vermopomorphic\protect\footnotemark |
2582 #+caption: \emph{anthropomorphic}, except for worms instead of humans. | 2583 #+caption: differences such as worm textures and colors. |
2583 #+caption: \end{footnote} differences such as worm textures and colors. | |
2584 #+name: curled | 2584 #+name: curled |
2585 #+begin_listing clojure | 2585 #+begin_listing clojure |
2586 #+begin_src clojure | 2586 #+begin_src clojure |
2587 (defn curled? | 2587 (defn curled? |
2588 "Is the worm curled up?" | 2588 "Is the worm curled up?" |
2591 (fn [[_ _ bend]] | 2591 (fn [[_ _ bend]] |
2592 (> (Math/sin bend) 0.64)) | 2592 (> (Math/sin bend) 0.64)) |
2593 (:proprioception (peek experiences)))) | 2593 (:proprioception (peek experiences)))) |
2594 #+end_src | 2594 #+end_src |
2595 #+end_listing | 2595 #+end_listing |
2596 | |
2597 #+BEGIN_LaTeX | |
2598 \footnotetext{Like \emph{anthropomorphic} except for worms instead of humans.} | |
2599 #+END_LaTeX | |
2596 | 2600 |
2597 #+caption: Program for summarizing the touch information in a patch | 2601 #+caption: Program for summarizing the touch information in a patch |
2598 #+caption: of skin. | 2602 #+caption: of skin. |
2599 #+name: touch-summary | 2603 #+name: touch-summary |
2600 #+begin_listing clojure | 2604 #+begin_listing clojure |
2742 | 2746 |
2743 The trick now is to make the action predicates work even when the | 2747 The trick now is to make the action predicates work even when the |
2744 sensory data on which they depend is absent. If I can do that, then | 2748 sensory data on which they depend is absent. If I can do that, then |
2745 I will have gained much. | 2749 I will have gained much. |
2746 | 2750 |
2747 ** \Phi-space describes the worm's experiences | 2751 ** COMMENT \Phi-space describes the worm's experiences |
2748 | 2752 |
2749 As a first step towards building empathy, I need to gather all of | 2753 As a first step towards building empathy, I need to gather all of |
2750 the worm's experiences during free play. I use a simple vector to | 2754 the worm's experiences during free play. I use a simple vector to |
2751 store all the experiences. | 2755 store all the experiences. |
2752 | 2756 |
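A minimal sketch of the "simple vector" idea, with assumed names (`experiences`, `record-experience!`, and the sense keys are illustrative, not the thesis's actual definitions): each frame's full sensory state is conjoined onto one growing vector.

```clojure
;; Sketch only: one growing vector of per-frame sensory maps.
;; Names and map keys are assumptions for illustration.
(def experiences (atom []))

(defn record-experience! [senses]
  (swap! experiences conj senses))

(record-experience! {:proprioception [[0 0 0.1]] :touch [0.0] :muscles [0.2]})
(record-experience! {:proprioception [[0 0 0.3]] :touch [0.5] :muscles [0.1]})
;; (peek @experiences) is always the most recent frame
```

A plain vector keeps the frames totally ordered by time, which is exactly the ordering the empathy algorithm later exploits.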
2788 | 2792 |
2789 There is a simple way of taking \Phi-space and the total ordering | 2793 There is a simple way of taking \Phi-space and the total ordering |
2790 provided by an experience vector and reliably inferring the rest of | 2794 provided by an experience vector and reliably inferring the rest of |
2791 the senses. | 2795 the senses. |
2792 | 2796 |
2793 ** Empathy is the process of tracing through \Phi-space | 2797 ** COMMENT Empathy is the process of tracing through \Phi-space |
2794 | 2798 |
2795 Here is the core of a basic empathy algorithm, starting with an | 2799 Here is the core of a basic empathy algorithm, starting with an |
2796 experience vector: | 2800 experience vector: |
2797 | 2801 |
2798 First, group the experiences into tiered proprioceptive bins. I use | 2802 First, group the experiences into tiered proprioceptive bins. I use |
2949 (recur (dec i) (assoc! v (dec i) cur))) | 2953 (recur (dec i) (assoc! v (dec i) cur))) |
2950 (recur i (assoc! v i 0)))))) | 2954 (recur i (assoc! v i 0)))))) |
2951 #+end_src | 2955 #+end_src |
2952 #+end_listing | 2956 #+end_listing |
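The proprioceptive binning step can be sketched as follows. This is a simplified, single-tier illustration with invented names (`bin-key`, the `resolution` knob, the pose data): each `[heading pitch bend]` triple is rounded onto a coarse grid so that similar poses share a bin key.

```clojure
;; Simplified sketch of proprioceptive binning (single tier).
;; bin-key, resolution, and the sample poses are assumptions.
(defn bin-key
  "Round each [heading pitch bend] triple to a coarse grid so
  that similar poses produce the same key."
  [proprioception resolution]
  (mapv (fn [[heading pitch bend]]
          (mapv #(Math/round (* resolution (double %)))
                [heading pitch bend]))
        proprioception))

(def bins
  (group-by #(bin-key (:proprioception %) 10)
            [{:proprioception [[0 0 0.10]] :t 0}
             {:proprioception [[0 0 0.11]] :t 1}
             {:proprioception [[0 0 0.90]] :t 2}]))
;; poses at t=0 and t=1 fall into the same bin; t=2 into another
```

The real system uses tiered bins at several resolutions; the point here is only that a coarse pose key lets nearby experiences be retrieved together.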
2953 | 2957 |
2954 ** =EMPATH= recognizes actions efficiently | 2958 ** COMMENT =EMPATH= recognizes actions efficiently |
2955 | 2959 |
2956 To use =EMPATH= with the worm, I first need to gather a set of | 2960 To use =EMPATH= with the worm, I first need to gather a set of |
2957 experiences from the worm that includes the actions I want to | 2961 experiences from the worm that includes the actions I want to |
2958 recognize. The =generate-phi-space= program (listing | 2962 recognize. The =generate-phi-space= program (listing |
2959 \ref{generate-phi-space}) runs the worm through a series of | 2963 \ref{generate-phi-space}) runs the worm through a series of |
3105 boundaries of transitioning from one type of action to another. | 3109 boundaries of transitioning from one type of action to another. |
3106 During these transitions the exact label for the action is more open | 3110 During these transitions the exact label for the action is more open |
3107 to interpretation, and disagreement between empathy and experience | 3111 to interpretation, and disagreement between empathy and experience |
3108 is more excusable. | 3112 is more excusable. |
3109 | 3113 |
3110 ** Digression: Learn touch sensor layout through free play | 3114 ** COMMENT Digression: Learn touch sensor layout through free play |
3111 | 3115 |
3112 In the previous section I showed how to compute actions in terms of | 3116 In the previous section I showed how to compute actions in terms of |
3113 body-centered predicates which relied on the average touch | 3117 body-centered predicates which relied on the average touch |
3114 activation of pre-defined regions of the worm's skin. What if, | 3118 activation of pre-defined regions of the worm's skin. What if, |
3115 instead of receiving touch pre-grouped into the six faces of each | 3119 instead of receiving touch pre-grouped into the six faces of each |
3267 deduce that the worm has six sides. Note that =learn-touch-regions= | 3271 deduce that the worm has six sides. Note that =learn-touch-regions= |
3268 would work just as well even if the worm's touch sense data were | 3272 would work just as well even if the worm's touch sense data were |
3269 completely scrambled. The cross shape is just for convenience. This | 3273 completely scrambled. The cross shape is just for convenience. This |
3270 example justifies the use of pre-defined touch regions in =EMPATH=. | 3274 example justifies the use of pre-defined touch regions in =EMPATH=. |
3271 | 3275 |
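The core intuition behind =learn-touch-regions= can be shown with a toy co-activation grouping. This sketch is not the thesis's implementation: `active-set`, `touch-regions`, and the four-sensor frames are invented for illustration. Sensors that fire together across frames end up in the same group, so regions emerge even if the sensor order were scrambled.

```clojure
;; Toy sketch (assumed names and data shape): group touch sensors
;; by which ones are active together in each frame.
(defn active-set [frame threshold]
  (set (keep-indexed (fn [i v] (when (> v threshold) i)) frame)))

(defn touch-regions [frames threshold]
  (distinct (map #(active-set % threshold) frames)))

(touch-regions [[1 1 0 0] [0 0 1 1] [1 1 0 0]] 0.5)
;; sensors 0,1 form one "region" and sensors 2,3 another
```

Since grouping depends only on co-activation, not on sensor indices, relabeling or scrambling the sensors yields the same regions, which is the robustness property the text claims.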
3272 * Contributions | 3276 * COMMENT Contributions |
3273 | 3277 |
3274 In this thesis you have seen the =CORTEX= system, a complete | 3278 In this thesis you have seen the =CORTEX= system, a complete |
3275 environment for creating simulated creatures. You have seen how to | 3279 environment for creating simulated creatures. You have seen how to |
3276 implement five senses: touch, proprioception, hearing, vision, and | 3280 implement five senses: touch, proprioception, hearing, vision, and |
3277 muscle tension. You have seen how to create new creatures using | 3281 muscle tension. You have seen how to create new creatures using |
3305 | 3309 |
3306 #+BEGIN_LaTeX | 3310 #+BEGIN_LaTeX |
3307 \appendix | 3311 \appendix |
3308 #+END_LaTeX | 3312 #+END_LaTeX |
3309 | 3313 |
3310 * Appendix: =CORTEX= User Guide | 3314 * COMMENT Appendix: =CORTEX= User Guide |
3311 | 3315 |
3312 Those who write a thesis should endeavor to make their code not only | 3316 Those who write a thesis should endeavor to make their code not only |
3313 accessible, but actually usable, as a way to pay back the community | 3317 accessible, but actually usable, as a way to pay back the community |
3314 that made the thesis possible in the first place. This thesis would | 3318 that made the thesis possible in the first place. This thesis would |
3315 not be possible without Free Software such as jMonkeyEngine3, | 3319 not be possible without Free Software such as jMonkeyEngine3, |