comparison thesis/cortex.org @ 531:749452f063e5
elaborate paragraph.
author   | Robert McIntyre <rlm@mit.edu>
date     | Sat, 26 Apr 2014 20:10:20 -0400
parents  | 21b8389922ee
children | 686f77b88292
530:21b8389922ee | 531:749452f063e5 |
---|---|
2379 (movement-kernel creature muscle))) | 2379 (movement-kernel creature muscle))) |
2380 #+END_SRC | 2380 #+END_SRC |
2381 #+end_listing | 2381 #+end_listing |
2382 | 2382 |
2383 | 2383 |
2384 =movement-kernel= creates a function that controlls the movement | 2384 =movement-kernel= creates a function that controls the movement |
2385 of the nearest physical node to the muscle node. The muscle exerts | 2385 of the nearest physical node to the muscle node. The muscle exerts |
2386 a rotational force dependent on its orientation to the object in | 2386 a rotational force dependent on its orientation to the object in |
2387 the blender file. The function returned by =movement-kernel= is | 2387 the blender file. The function returned by =movement-kernel= is |
2388 also a sense function: it returns the percent of the total muscle | 2388 also a sense function: it returns the percent of the total muscle |
2389 strength that is currently being employed. This is analogous to | 2389 strength that is currently being employed. This is analogous to |
2543 :proprioception (proprioception! model) | 2543 :proprioception (proprioception! model) |
2544 :muscles (movement! model)})) | 2544 :muscles (movement! model)})) |
2545 #+end_src | 2545 #+end_src |
2546 #+end_listing | 2546 #+end_listing |
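As a rough illustration of the muscle interface described above, the
sketch below drives every muscle kernel returned by =movement!= with
the same activation and collects the strength readings the kernels
report back as sense data. It assumes each kernel accepts a desired
activation level and returns the fraction of total strength in use;
that calling convention, and the names =flex-all= and =worm-senses=,
are illustrative rather than part of =CORTEX=.

#+begin_src clojure
(defn flex-all
  "Apply one activation level to every muscle kernel and collect the
   strength fractions that the kernels report back as sense data."
  [muscle-kernels activation]
  (doall (map (fn [actuate!] (actuate! activation)) muscle-kernels)))

;; e.g. (flex-all (:muscles worm-senses) 0.5)
;; might yield (0.5 0.5 ...) -- one reading per muscle node.
#+end_src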
2547 | 2547 |
2548 ** COMMENT Embodiment factors action recognition into manageable parts | 2548 ** Embodiment factors action recognition into manageable parts |
2549 | 2549 |
2550 Using empathy, I divide the problem of action recognition into a | 2550 Using empathy, I divide the problem of action recognition into a |
2551 recognition process expressed in the language of a full complement | 2551 recognition process expressed in the language of a full complement |
2552 of senses, and an imaginative process that generates full sensory | 2552 of senses, and an imaginative process that generates full sensory |
2553 data from partial sensory data. Splitting the action recognition | 2553 data from partial sensory data. Splitting the action recognition |
2736 [[./images/worm-identify-init.png]] | 2736 [[./images/worm-identify-init.png]] |
2737 | 2737 |
2738 These action predicates satisfy the recognition requirement of an | 2738 These action predicates satisfy the recognition requirement of an |
2739 empathic recognition system. There is power in the simplicity of | 2739 empathic recognition system. There is power in the simplicity of |
2740 the action predicates. They describe their actions without getting | 2740 the action predicates. They describe their actions without getting |
2741 confused in visual details of the worm. Each one is frame | 2741 confused in visual details of the worm. Each one is independent of |
2742 independent, but more than that, they are each independent of | 2742 position and rotation, but more than that, they are each |
2743 irrelevant visual details of the worm and the environment. They | 2743 independent of irrelevant visual details of the worm and the |
2744 will work regardless of whether the worm is a different color or | 2744 environment. They will work regardless of whether the worm is a |
2745 heavily textured, or if the environment has strange lighting. | 2745 different color or heavily textured, or if the environment has |
| 2746 strange lighting. |
| 2747 |
| 2748 Consider how the human act of jumping might be described with |
| 2749 body-centered action predicates: You might specify that jumping is |
| 2750 mainly the feeling of your knees bending, your thigh muscles |
| 2751 contracting, and your inner ear experiencing a certain sort of back |
| 2752 and forth acceleration. This representation is a very concrete |
| 2753 description of jumping, couched in terms of muscles and senses, but |
| 2754 it also has the ability to describe almost all kinds of jumping, a |
| 2755 generality that you might think could only be achieved by a very |
| 2756 abstract description. The body-centered jumping predicate does not |
| 2757 have terms that consider the color of a person's skin or whether |
| 2758 they are male or female; instead it gets right to the meat of what |
| 2759 jumping actually /is/. |
| 2760 |
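A minimal sketch of what such a body-centered predicate could look
like for the experience maps used in this thesis is given below. The
sense keys (=:muscles=, =:acceleration=) and the thresholds are
purely hypothetical; they stand in for whatever thigh-muscle and
inner-ear senses a humanoid creature would actually have.

#+begin_src clojure
(defn jumping?
  "Hypothetical body-centered predicate: recent experience shows the
   thigh muscles firing hard, followed by a spike in upward
   acceleration. Keys and thresholds are illustrative only."
  [experiences]
  (let [recent   (take-last 10 experiences)
        pushing? (some #(> (get-in % [:muscles :thigh] 0) 0.8) recent)
        rising?  (some #(> (get-in % [:acceleration :up] 0) 9.8) recent)]
    (boolean (and pushing? rising?))))
#+end_src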
| 2761 Of course, the action predicates are not directly applicable to |
| 2762 video data, which lacks the advanced sensory information that they |
| 2763 require! |
2746 | 2764 |
2747 The trick now is to make the action predicates work even when the | 2765 The trick now is to make the action predicates work even when the |
2748 sensory data on which they depend is absent. If I can do that, then | 2766 sensory data on which they depend is absent. If I can do that, then |
2749 I will have gained much. | 2767 I will have gained much. |
2750 | 2768 |
2751 ** COMMENT \Phi-space describes the worm's experiences | 2769 ** \Phi-space describes the worm's experiences |
2752 | 2770 |
2753 As a first step towards building empathy, I need to gather all of | 2771 As a first step towards building empathy, I need to gather all of |
2754 the worm's experiences during free play. I use a simple vector to | 2772 the worm's experiences during free play. I use a simple vector to |
2755 store all the experiences. | 2773 store all the experiences. |
2756 | 2774 |
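A minimal sketch of this kind of record-keeping, assuming each
simulation step produces one map of sense data and that the vector
lives in an atom so it can be grown during simulation, is shown
below. =phi-space= and =record-experience!= are illustrative names,
not the actual definitions used by =EMPATH=.

#+begin_src clojure
(def phi-space
  "Every experience the creature has had during free play, in order."
  (atom []))

(defn record-experience!
  "Append one time-step's complete sense data to phi-space."
  [sense-data]
  (swap! phi-space conj sense-data))
#+end_src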
2792 | 2810 |
2793 There is a simple way of taking \Phi-space and the total ordering | 2811 There is a simple way of taking \Phi-space and the total ordering |
2794 provided by an experience vector and reliably inferring the rest of | 2812 provided by an experience vector and reliably inferring the rest of |
2795 the senses. | 2813 the senses. |
2796 | 2814 |
2797 ** COMMENT Empathy is the process of tracing through \Phi-space | 2815 ** Empathy is the process of tracing through \Phi-space |
2798 | 2816 |
2799 Here is the core of a basic empathy algorithm, starting with an | 2817 Here is the core of a basic empathy algorithm, starting with an |
2800 experience vector: | 2818 experience vector: |
2801 | 2819 |
2802 First, group the experiences into tiered proprioceptive bins. I use | 2820 First, group the experiences into tiered proprioceptive bins. I use |
2863 #+end_listing | 2881 #+end_listing |
2864 | 2882 |
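The tiered binning idea mentioned above can be pictured with the
following sketch, which assumes that proprioception has been
flattened into a sequence of joint angles. The resolutions and the
names =bin-key= and =bin-experiences= are illustrative; they are not
the actual =EMPATH= definitions.

#+begin_src clojure
(defn bin-key
  "Quantize a sequence of joint angles at the given resolution so
   that nearby postures map to the same key."
  [angles resolution]
  (mapv #(Math/round (/ % resolution)) angles))

(defn bin-experiences
  "Group indices into phi-space by quantized posture, once per
   resolution -- coarse bins for rough matches, fine bins for
   exact ones."
  [experiences resolutions]
  (for [r resolutions]
    (group-by #(bin-key (:proprioception (nth experiences %)) r)
              (range (count experiences)))))
#+end_src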
2865 #+caption: =longest-thread= finds the longest path of consecutive | 2883 #+caption: =longest-thread= finds the longest path of consecutive |
2866 #+caption: experiences to explain proprioceptive worm data from | 2884 #+caption: experiences to explain proprioceptive worm data from |
2867 #+caption: previous data. Here, the film strip represents the | 2885 #+caption: previous data. Here, the film strip represents the |
2868 #+caption: creature's previous experience. Short sequences of | 2886 #+caption: creature's previous experience. Short sequences of |
2869 #+caption: memories are spliced together to match the | 2887 #+caption: memories are spliced together to match the |
2870 #+caption: proprioceptive data. They carry the other senses | 2888 #+caption: proprioceptive data. They carry the other senses |
2871 #+caption: along with them. | 2889 #+caption: along with them. |
2872 #+name: phi-space-history-scan | 2890 #+name: phi-space-history-scan |
2873 #+ATTR_LaTeX: :width 10cm | 2891 #+ATTR_LaTeX: :width 10cm |
3271 deduce that the worm has six sides. Note that =learn-touch-regions= | 3289 deduce that the worm has six sides. Note that =learn-touch-regions= |
3272 would work just as well even if the worm's touch sense data were | 3290 would work just as well even if the worm's touch sense data were |
3273 completely scrambled. The cross shape is just for convenience. This | 3291 completely scrambled. The cross shape is just for convenience. This |
3274 example justifies the use of pre-defined touch regions in =EMPATH=. | 3292 example justifies the use of pre-defined touch regions in =EMPATH=. |
3275 | 3293 |
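One way to picture the region-learning idea, though not the actual
=learn-touch-regions= implementation, is to group together touch
sensors that are active in exactly the same frames of experience. The
name =co-active-groups= and the input format are assumptions made
only for this sketch.

#+begin_src clojure
(defn co-active-groups
  "Group touch-sensor indices that fire in exactly the same frames.
   `frames` is a sequence of sets of active sensor indices; the
   result is a collection of sensor groups, one per touch region."
  [frames sensor-count]
  (vals (group-by (fn [sensor]
                    (set (keep-indexed
                          (fn [t active] (when (active sensor) t))
                          frames)))
                  (range sensor-count))))
#+end_src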
3276 * COMMENT Contributions | 3294 * Contributions |
3277 | 3295 |
3278 In this thesis you have seen the =CORTEX= system, a complete | 3296 In this thesis you have seen the =CORTEX= system, a complete |
3279 environment for creating simulated creatures. You have seen how to | 3297 environment for creating simulated creatures. You have seen how to |
3280 implement five senses: touch, proprioception, hearing, vision, and | 3298 implement five senses: touch, proprioception, hearing, vision, and |
3281 muscle tension. You have seen how to create new creatures using | 3299 muscle tension. You have seen how to create new creatures using |