Here I demonstrate the power of using embodied artificial intelligence
to attack the \emph{action recognition} problem, which is the challenge
of recognizing actions performed by a creature given limited data about
the creature's actions, such as a video recording. I solve this problem
in the case of a worm-like creature performing actions such as curling
and wiggling.

To attack the action recognition problem, I developed a computational
model of empathy (\texttt{EMPATH}) which allows me to recognize actions
using simple, embodied representations of actions (which require rich
sensory data), even when that sensory data is not actually available.
The missing sense data is imagined by combining previous experiences
gained from unsupervised free play. The worm is a five-segment creature
equipped with touch, proprioception, and muscle tension senses. It
recognizes actions using only proprioception data.

In order to build this empathic, action-recognizing system, I created a
program called \texttt{CORTEX}, which is a complete platform for
embodied AI research. It provides multiple senses for simulated
creatures, including vision, touch, proprioception, muscle tension, and
hearing. Each of these senses provides a wealth of biologically
inspired parameters. \texttt{CORTEX} is able to simulate any number of
creatures and senses, and provides facilities for easily modeling and
creating new creatures. As a research platform it is more complete than
any other system currently available.
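The empathy scheme sketched above (record rich sensory experience during
unsupervised free play, then at recognition time use proprioception alone to
retrieve and ``imagine'' the missing touch data) can be illustrated in
miniature. The following Python sketch is not the thesis's implementation;
the names \texttt{Experience}, \texttt{imagine\_touch}, and
\texttt{is\_curled} are hypothetical, and it assumes a simple
nearest-neighbor lookup over stored experiences stands in for the empathic
imagination step.

```python
# Illustrative sketch only: Experience, free_play, imagine_touch, and
# is_curled are hypothetical names, not the thesis's actual API.
from dataclasses import dataclass


@dataclass
class Experience:
    proprioception: tuple  # joint angles between the worm's segments
    touch: tuple           # touch-sensor readings (the "rich" sense)


def free_play(database, proprioception, touch):
    """During unsupervised free play, record paired sensory snapshots."""
    database.append(Experience(tuple(proprioception), tuple(touch)))


def imagine_touch(database, proprioception):
    """Imagine the missing touch data: return the touch readings of the
    stored experience whose proprioceptive state is closest to the one
    observed now (a stand-in for the empathic inference step)."""
    def dist(e):
        return sum((a - b) ** 2
                   for a, b in zip(e.proprioception, proprioception))
    return min(database, key=dist).touch


def is_curled(touch):
    """Toy action predicate over imagined touch: the worm counts as
    'curled' when both ends register contact (head touching tail)."""
    return touch[0] > 0 and touch[-1] > 0


# Free play: the worm experiences a straight pose and a curled pose.
db = []
free_play(db, [0.0, 0.0, 0.0, 0.0], (0, 0, 0, 0, 0))  # straight, no contact
free_play(db, [1.5, 1.5, 1.5, 1.5], (1, 0, 0, 0, 1))  # curled, ends touching

# Recognition: only proprioception is observed; touch is imagined.
observed = [1.4, 1.6, 1.5, 1.5]
print(is_curled(imagine_touch(db, observed)))  # → True
```

The key design point this toy preserves is that the action predicate is
defined over the rich sense (touch), yet recognition succeeds with only the
impoverished sense (proprioception), because free-play experience bridges
the gap.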