Here I demonstrate the power of using embodied artificial intelligence
to attack the /action recognition/ problem, which is the challenge of
recognizing actions performed by a creature given limited data about
the creature's actions, such as a video recording. I solve this
problem in the case of a worm-like creature performing actions such
as curling and wiggling.

To attack the action recognition problem, I developed a computational
model of empathy (=EMPATH=) which allows me to use simple, embodied
representations of actions (which require rich sensory data), even
when that sensory data is not actually available. The missing sense
data is ``imagined'' by the system by combining previous experiences
gained from unsupervised free play.

In order to build this empathic, action-recognizing system, I created
a program called =CORTEX=, which is a complete platform for embodied
AI research. It provides multiple senses for simulated creatures,
including vision, touch, proprioception, muscle tension, and
hearing. Each of these senses provides a wealth of biologically
inspired parameters. =CORTEX= can simulate any number of creatures
and senses, and provides facilities for easily modeling and creating
new creatures. As a research platform it is more complete than any
other system currently available.
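
To make the empathy mechanism concrete, below is a minimal sketch
under assumptions of my own: the experience records, the =imagine=
function, and the distance measure are hypothetical illustrations,
not =EMPATH='s actual code. During free play the creature banks
moments that pair a cheaply observable signal (here, proprioceptive
joint angles) with the full sensory snapshot felt at the same moment;
recognition then completes a partial observation by recalling the
nearest stored experience.

#+begin_src clojure
;; Hypothetical sketch -- not the real EMPATH implementation.
;; Each free-play experience pairs proprioception (easy to observe)
;; with the richer sense data felt at the same moment.
(def experiences
  [{:proprioception [0.0 0.1] :touch :none  :action :resting}
   {:proprioception [1.4 1.5] :touch :belly :action :curling}
   {:proprioception [0.2 0.9] :touch :flank :action :wiggling}])

(defn sq-dist
  "Squared Euclidean distance between two joint-angle vectors."
  [a b]
  (reduce + (map #(let [d (- %1 %2)] (* d d)) a b)))

(defn imagine
  "Return the full stored experience closest to a partial
  observation -- the system's ``imagined'' completion."
  [experiences observed]
  (apply min-key #(sq-dist observed (:proprioception %)) experiences))

(imagine experiences [1.3 1.6])
;; => {:proprioception [1.4 1.5], :touch :belly, :action :curling}
#+end_src

Nearest-neighbor recall is only one way to combine past experiences;
it is used here because it keeps the sketch self-contained.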