# HG changeset patch
# User Robert McIntyre
# Date 1398557420 14400
# Node ID 749452f063e5411f571010dea5951839a7dcefb6
# Parent 21b8389922eed47ab6f0dd59003d4a46a9b12718
elaborate paragraph.

diff -r 21b8389922ee -r 749452f063e5 thesis/cortex.org
--- a/thesis/cortex.org	Sat Apr 26 19:36:13 2014 -0400
+++ b/thesis/cortex.org	Sat Apr 26 20:10:20 2014 -0400
@@ -2381,7 +2381,7 @@
 
    #+end_listing
 
-   =movement-kernel= creates a function that controlls the movement
+   =movement-kernel= creates a function that controls the movement
    of the nearest physical node to the muscle node. The muscle exerts
    a rotational force dependent on it's orientation to the object in
    the blender file. The function returned by =movement-kernel= is
@@ -2545,7 +2545,7 @@
    #+end_src
    #+end_listing
 
-** COMMENT Embodiment factors action recognition into manageable parts
+** Embodiment factors action recognition into manageable parts
 
    Using empathy, I divide the problem of action recognition into a
    recognition process expressed in the language of a full compliment
@@ -2738,17 +2738,35 @@
    These action predicates satisfy the recognition requirement of an
    empathic recognition system. There is power in the simplicity of
    the action predicates. They describe their actions without getting
-   confused in visual details of the worm. Each one is frame
-   independent, but more than that, they are each independent of
-   irrelevant visual details of the worm and the environment. They
-   will work regardless of whether the worm is a different color or
-   heavily textured, or if the environment has strange lighting.
+   confused in visual details of the worm. Each one is independent of
+   position and rotation, but more than that, they are each
+   independent of irrelevant visual details of the worm and the
+   environment. They will work regardless of whether the worm is a
+   different color or heavily textured, or if the environment has
+   strange lighting.
+
+   Consider how the human act of jumping might be described with
+   body-centered action predicates: You might specify that jumping is
+   mainly the feeling of your knees bending, your thigh muscles
+   contracting, and your inner ear experiencing a certain sort of back
+   and forth acceleration. This representation is a very concrete
+   description of jumping, couched in terms of muscles and senses, but
+   it also has the ability to describe almost all kinds of jumping, a
+   generality that you might think could only be achieved by a very
+   abstract description. The body-centered jumping predicate does not
+   have terms that consider the color of a person's skin or whether
+   they are male or female; instead, it gets right to the meat of what
+   jumping actually /is/.
+
+   Of course, the action predicates are not directly applicable to
+   video data, which lacks the advanced sensory information that they
+   require!
 
    The trick now is to make the action predicates work even when the
    sensory data on which they depend is absent. If I can do that, then
    I will have gained much.
 
-** COMMENT \Phi-space describes the worm's experiences
+** \Phi-space describes the worm's experiences
 
    As a first step towards building empathy, I need to gather all of
    the worm's experiences during free play. I use a simple vector to
@@ -2794,7 +2812,7 @@
    provided by an experience vector and reliably inferring the rest
    of the senses.
 
-** COMMENT Empathy is the process of tracing though \Phi-space
+** Empathy is the process of tracing through \Phi-space
 
    Here is the core of a basic empathy algorithm, starting with an
    experience vector:
@@ -2865,7 +2883,7 @@
    #+caption: =longest-thread= finds the longest path of consecutive
    #+caption: experiences to explain proprioceptive worm data from
    #+caption: previous data. Here, the film strip represents the
-   #+caption: creature's previous experience. Sort sequeuces of
+   #+caption: creature's previous experience. Short sequences of
    #+caption: memories are spliced together to match the
    #+caption: proprioceptive data. Their carry the other senses
    #+caption: along with them.
@@ -3273,7 +3291,7 @@
    completely scrambled. The cross shape is just for convenience. This
    example justifies the use of pre-defined touch regions in =EMPATH=.
 
-* COMMENT Contributions
+* Contributions
 
    In this thesis you have seen the =CORTEX= system, a complete
    environment for creating simulated creatures. You have seen how to
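The body-centered "jumping" description added in the patch above can be sketched as a predicate over a single snapshot of simulated senses. This is an illustrative sketch only: the thesis's real action predicates are written in Clojure against the worm's touch, muscle, and proprioception data, so the field names and thresholds below are hypothetical, chosen just to show the shape of such a predicate.

```python
# Illustrative sketch (not the thesis's actual code): a body-centered
# "jumping" predicate over one moment of sense data. The keys
# "knee-flexion", "thigh-activation", and "vertical-acceleration" are
# hypothetical stand-ins for proprioceptive, muscle, and inner-ear senses.

def jumping(moment):
    """Return True if this snapshot of body-centered senses looks like
    the take-off phase of a jump."""
    knees_bent = moment["knee-flexion"] > 0.5            # proprioception
    thighs_firing = moment["thigh-activation"] > 0.7     # muscle sense
    accelerating_up = moment["vertical-acceleration"] > 9.8  # inner ear
    return knees_bent and thighs_firing and accelerating_up

# The predicate never mentions pixels, color, or lighting -- only
# body-centered senses -- so it is indifferent to appearance.
snapshot = {"knee-flexion": 0.8,
            "thigh-activation": 0.9,
            "vertical-acceleration": 12.0}
print(jumping(snapshot))  # True
```

Note that, exactly as the patch argues, nothing in the predicate refers to visual detail; it would classify the same motion identically under any skin color, texture, or lighting.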