comparison thesis/cortex.org @ 451:0a4362d1f138

finishing up chapter 3.
author Robert McIntyre <rlm@mit.edu>
date Wed, 26 Mar 2014 20:38:17 -0400
parents 432f2c4646cb
children f339e3d5cc8c
450:432f2c4646cb 451:0a4362d1f138
1 #+title: =CORTEX= 1 #+title: =CORTEX=
2 #+author: Robert McIntyre 2 #+author: Robert McIntyre
3 #+email: rlm@mit.edu 3 #+email: rlm@mit.edu
4 #+description: Using embodied AI to facilitate Artificial Imagination. 4 #+description: Using embodied AI to facilitate Artificial Imagination.
5 #+keywords: AI, clojure, embodiment 5 #+keywords: AI, clojure, embodiment
6 6 #+LaTeX_CLASS_OPTIONS: [nofloat]
7 7
8 * Empathy and Embodiment as problem solving strategies 8 * Empathy and Embodiment as problem solving strategies
9 9
10 By the end of this thesis, you will have seen a novel approach to 10 By the end of this thesis, you will have seen a novel approach to
11 interpreting video using embodiment and empathy. You will have also 11 interpreting video using embodiment and empathy. You will have also
12 seen one way to efficiently implement empathy for embodied 12 seen one way to efficiently implement empathy for embodied
13 creatures. Finally, you will become familiar with =CORTEX=, a system 13 creatures. Finally, you will become familiar with =CORTEX=, a system
295 - evolutionary algorithms involving creature construction 295 - evolutionary algorithms involving creature construction
296 - exploration of exotic senses and effectors that are not possible 296 - exploration of exotic senses and effectors that are not possible
297 in the real world (such as telekinesis or a semantic sense) 297 in the real world (such as telekinesis or a semantic sense)
298 - imagination using subworlds 298 - imagination using subworlds
299 299
300 During one test with =CORTEX=, I created 3,000 entities each with 300 During one test with =CORTEX=, I created 3,000 creatures each with
301 their own independent senses and ran them all at only 1/80 real 301 their own independent senses and ran them all at only 1/80 real
302 time. In another test, I created a detailed model of my own hand, 302 time. In another test, I created a detailed model of my own hand,
303 equipped with a realistic distribution of touch (more sensitive at 303 equipped with a realistic distribution of touch (more sensitive at
304 the fingertips), as well as eyes and ears, and it ran at around 1/4 304 the fingertips), as well as eyes and ears, and it ran at around 1/4
305 real time. 305 real time.
306 306
307 #+BEGIN_LaTeX 307 #+BEGIN_LaTeX
308 \begin{sidewaysfigure} 308 \begin{sidewaysfigure}
309 \includegraphics[width=9.5in]{images/full-hand.png} 309 \includegraphics[width=9.5in]{images/full-hand.png}
310 \caption{Here is the worm from above modeled in Blender, 310 \caption{
311 a free 3D-modeling program. Senses and joints are described 311 I modeled my own right hand in Blender and rigged it with all the
312 using special nodes in Blender. The senses are displayed on 312 senses that {\tt CORTEX} supports. My simulated hand has a
313 the right, and the simulation is displayed on the left. Notice 313 biologically inspired distribution of touch sensors. The senses are
314 that the hand is curling its fingers, that it can see its own 314 displayed on the right, and the simulation is displayed on the
315 finger from the eye in its palm, and thta it can feel its own 315 left. Notice that my hand is curling its fingers, that it can see
316 thumb touching its palm.} 316 its own finger from the eye in its palm, and that it can feel its
317 own thumb touching its palm.}
317 \end{sidewaysfigure} 318 \end{sidewaysfigure}
318 #+END_LaTeX 319 #+END_LaTeX
319 320
320 ** Contributions 321 ** Contributions
321 322
322 I built =CORTEX=, a comprehensive platform for embodied AI 323 - I built =CORTEX=, a comprehensive platform for embodied AI
323 experiments. =CORTEX= many new features lacking in other systems, 324 experiments. =CORTEX= supports many features lacking in other
324 such as sound. It is easy to create new creatures using Blender, a 325 systems, such as proper simulation of hearing. It is easy to create
325 free 3D modeling program. 326 new =CORTEX= creatures using Blender, a free 3D modeling program.
326 327
327 I built =EMPATH=, which uses =CORTEX= to identify the actions of a 328 - I built =EMPATH=, which uses =CORTEX= to identify the actions of
328 worm-like creature using a computational model of empathy. 329 a worm-like creature using a computational model of empathy.
329 330
330 * Building =CORTEX= 331 * Building =CORTEX=
331 332
332 ** To explore embodiment, we need a world, body, and senses 333 ** To explore embodiment, we need a world, body, and senses
333 334
370 371
371 #+caption: Here is the worm with which we will be working. 372 #+caption: Here is the worm with which we will be working.
372 #+caption: It is composed of 5 segments. Each segment has a 373 #+caption: It is composed of 5 segments. Each segment has a
373 #+caption: pair of extensor and flexor muscles. Each of the 374 #+caption: pair of extensor and flexor muscles. Each of the
374 #+caption: worm's four joints is a hinge joint which allows 375 #+caption: worm's four joints is a hinge joint which allows
375 #+caption: 30 degrees of rotation to either side. Each segment 376 #+caption: about 30 degrees of rotation to either side. Each segment
376 #+caption: of the worm is touch-capable and has a uniform 377 #+caption: of the worm is touch-capable and has a uniform
377 #+caption: distribution of touch sensors on each of its faces. 378 #+caption: distribution of touch sensors on each of its faces.
378 #+caption: Each joint has a proprioceptive sense to detect 379 #+caption: Each joint has a proprioceptive sense to detect
379 #+caption: relative positions. The worm segments are all the 380 #+caption: relative positions. The worm segments are all the
380 #+caption: same except for the first one, which has a much 381 #+caption: same except for the first one, which has a much
416 Embodied representations using multiple senses such as touch, 417 Embodied representations using multiple senses such as touch,
417 proprioception, and muscle tension turn out to be exceedingly 418 proprioception, and muscle tension turn out to be exceedingly
418 efficient at describing body-centered actions. It is the ``right 419 efficient at describing body-centered actions. It is the ``right
419 language for the job''. For example, it takes only around 5 lines 420 language for the job''. For example, it takes only around 5 lines
420 of LISP code to describe the action of ``curling'' using embodied 421 of LISP code to describe the action of ``curling'' using embodied
421 primitives. It takes about 8 lines to describe the seemingly 422 primitives. It takes about 10 lines to describe the seemingly
422 complicated action of wiggling. 423 complicated action of wiggling.
423 424
424 The following action predicates each take a stream of sensory 425 The following action predicates each take a stream of sensory
425 experience, observe however much of it they desire, and decide 426 experience, observe however much of it they desire, and decide
426 whether the worm is doing the action they describe. =curled?= 427 whether the worm is doing the action they describe. =curled?=
576 #+end_src 577 #+end_src
577 #+end_listing 578 #+end_listing
578 579
579 #+caption: Using =debug-experience=, the body-centered predicates 580 #+caption: Using =debug-experience=, the body-centered predicates
580 #+caption: work together to classify the behaviour of the worm. 581 #+caption: work together to classify the behaviour of the worm.
581 #+caption: while under manual motor control. 582 #+caption: The predicates are operating with access to the worm's
583 #+caption: full sensory data.
582 #+name: basic-worm-view 584 #+name: basic-worm-view
583 #+ATTR_LaTeX: :width 10cm 585 #+ATTR_LaTeX: :width 10cm
584 [[./images/worm-identify-init.png]] 586 [[./images/worm-identify-init.png]]
585 587
586 These action predicates satisfy the recognition requirement of an 588 These action predicates satisfy the recognition requirement of an
587 empathic recognition system. There is a lot of power in the 589 empathic recognition system. There is power in the simplicity of
588 simplicity of the action predicates. They describe their actions 590 the action predicates. They describe their actions without getting
589 without getting confused by visual details of the worm. Each one is 591 confused by visual details of the worm. Each one is frame
590 frame independent, but more than that, they are each independent of 592 independent, but more than that, they are each independent of
591 irrelevant visual details of the worm and the environment. They 593 irrelevant visual details of the worm and the environment. They
592 will work regardless of whether the worm is a different color or 594 will work regardless of whether the worm is a different color or
593 heavily textured, or of the environment has strange lighting. 595 heavily textured, or if the environment has strange lighting.
594 596
595 The trick now is to make the action predicates work even when the 597 The trick now is to make the action predicates work even when the
596 sensory data on which they depend is absent. If I can do that, then 598 sensory data on which they depend is absent. If I can do that, then
597 I will have gained much. 599 I will have gained much.
598 600
599 ** \Phi-space describes the worm's experiences 601 ** \Phi-space describes the worm's experiences
600 602
601 As a first step towards building empathy, I need to gather all of 603 As a first step towards building empathy, I need to gather all of
602 the worm's experiences during free play. I use a simple vector to 604 the worm's experiences during free play. I use a simple vector to
603 store all the experiences. 605 store all the experiences.
604
605 #+caption: Program to gather the worm's experiences into a vector for
606 #+caption: further processing. The =motor-control-program= line uses
607 #+caption: a motor control script that causes the worm to execute a series
608 #+caption: of ``exercices'' that include all the action predicates.
609 #+name: generate-phi-space
610 #+begin_listing clojure
611 #+begin_src clojure
612 (defn generate-phi-space []
613 (let [experiences (atom [])]
614 (run-world
615 (apply-map
616 worm-world
617 (merge
618 (worm-world-defaults)
619 {:end-frame 700
620 :motor-control
621 (motor-control-program worm-muscle-labels do-all-the-things)
622 :experiences experiences})))
623 @experiences))
624 #+end_src
625 #+end_listing
626 606
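To make the contents of this vector concrete, here is a minimal sketch of the kind of record stored at each time step (the exact keys and shapes are illustrative; only =:proprioception=, which is used throughout this chapter, comes from the actual code):

#+begin_src clojure
;; One hypothetical entry in the experience vector. Each sense is
;; summarized as a vector of activations for that time step.
(def example-experience
  {:proprioception [[0.0 0.0 2.1] [0.0 0.0 1.9]] ; [heading pitch bend] per joint
   :touch          [0.0 0.0 1.0 1.0]             ; tactile sensor activations
   :muscle         [0.3 0.0 0.3 0.0]})           ; muscle contraction levels
#+end_src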
627 Each element of the experience vector exists in the vast space of 607 Each element of the experience vector exists in the vast space of
628 all possible worm-experiences. Most of this vast space is actually 608 all possible worm-experiences. Most of this vast space is actually
629 unreachable due to physical constraints of the worm's body. For 609 unreachable due to physical constraints of the worm's body. For
630 example, the worm's segments are connected by hinge joints that put 610 example, the worm's segments are connected by hinge joints that put
631 a practical limit on the worm's degrees of freedom. Also, the worm 611 a practical limit on the worm's range of motions without limiting
632 cannot be bent into a circle so that its ends are touching and at 612 its degrees of freedom. Some groupings of senses are impossible;
633 the same time not also experience the sensation of touching itself. 613 the worm cannot be bent into a circle so that its ends are
634 614 touching and at the same time not also experience the sensation of
635 As the worm moves around during free play and the vector grows 615 touching itself.
636 larger, the vector begins to define a subspace which is all the 616
637 practical experiences the worm can experience during normal 617 As the worm moves around during free play and its experience vector
638 operation, which I call \Phi-space, short for physical-space. The 618 grows larger, the vector begins to define a subspace which is all
639 vector defines a path through \Phi-space. This path has interesting 619 the sensations the worm can practically experience during normal
640 properties that all derive from embodiment. The proprioceptive 620 operation. I call this subspace \Phi-space, short for
641 components are completely smooth, because in order for the worm to 621 physical-space. The experience vector defines a path through
642 move from one position to another, it must pass through the 622 \Phi-space. This path has interesting properties that all derive
643 intermediate positions. The path invariably forms loops as actions 623 from physical embodiment. The proprioceptive components are
644 are repeated. Finally and most importantly, proprioception actually 624 completely smooth, because in order for the worm to move from one
645 gives very strong inference about the other senses. For example, 625 position to another, it must pass through the intermediate
646 when the worm is flat, you can infer that it is touching the ground 626 positions. The path invariably forms loops as actions are repeated.
647 and that its muscles are not active, because if the muscles were 627 Finally and most importantly, proprioception actually gives very
648 active, the worm would be moving and would not be perfectly flat. 628 strong inference about the other senses. For example, when the worm
649 In order to stay flat, the worm has to be touching the ground, or 629 is flat, you can infer that it is touching the ground and that its
650 it would again be moving out of the flat position due to gravity. 630 muscles are not active, because if the muscles were active, the
651 If the worm is positioned in such a way that it interacts with 631 worm would be moving and would not be perfectly flat. In order to
652 itself, then it is very likely to be feeling the same tactile 632 stay flat, the worm has to be touching the ground, or it would
653 feelings as the last time it was in that position, because it has 633 again be moving out of the flat position due to gravity. If the
654 the same body as then. If you observe multiple frames of 634 worm is positioned in such a way that it interacts with itself,
655 proprioceptive data, then you can become increasingly confident 635 then it is very likely to be feeling the same tactile feelings as
656 about the exact activations of the worm's muscles, because it 636 the last time it was in that position, because it has the same body
657 generally takes a unique combination of muscle contractions to 637 as then. If you observe multiple frames of proprioceptive data,
658 transform the worm's body along a specific path through \Phi-space. 638 then you can become increasingly confident about the exact
639 activations of the worm's muscles, because it generally takes a
640 unique combination of muscle contractions to transform the worm's
641 body along a specific path through \Phi-space.
659 642
660 There is a simple way of taking \Phi-space and the total ordering 643 There is a simple way of taking \Phi-space and the total ordering
661 provided by an experience vector and reliably inferring the rest of 644 provided by an experience vector and reliably inferring the rest of
662 the senses. 645 the senses.
663 646
664 ** Empathy is the process of tracing through \Phi-space 647 ** Empathy is the process of tracing through \Phi-space
665 648
666 Here is the core of a basic empathy algorithm, starting with an 649 Here is the core of a basic empathy algorithm, starting with an
667 experience vector: First, group the experiences into tiered 650 experience vector:
668 proprioceptive bins. I use powers of 10 and 3 bins, and the 651
669 smallest bin has and approximate size of 0.001 radians in all 652 First, group the experiences into tiered proprioceptive bins. I use
670 proprioceptive dimensions. 653 powers of 10 and 3 bins, and the smallest bin has an approximate
654 size of 0.001 radians in all proprioceptive dimensions.
671 655
672 Then, given a sequence of proprioceptive input, generate a set of 656 Then, given a sequence of proprioceptive input, generate a set of
673 matching experience records for each input. 657 matching experience records for each input, using the tiered
658 proprioceptive bins.
674 659
675 Finally, to infer sensory data, select the longest consecutive chain 660 Finally, to infer sensory data, select the longest consecutive chain
676 of experiences as determined by the indexes into the experience 661 of experiences. Consecutive means that the experiences
677 vector. 662 appear next to each other in the experience vector.
678 663
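To make the binning step concrete, here is a minimal sketch (the helper name and rounding scheme are hypothetical; the real =bin= program appears in the listing below): each joint angle is quantized at one resolution tier, so nearby poses share a key.

#+begin_src clojure
;; Sketch: quantize joint angles into a bin key at one resolution
;; tier (0.001, 0.01, or 0.1 radians).
(defn bin-key [resolution joint-angles]
  (mapv #(Math/round (/ % resolution)) joint-angles))

(bin-key 0.1 [0.52 -0.13 1.47]) ;=> [5 -1 15]
(bin-key 0.1 [0.53 -0.12 1.48]) ;=> [5 -1 15] (same bin)
#+end_src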
679 This algorithm has three advantages: 664 This algorithm has three advantages:
680 665
681 1. It's simple 666 1. It's simple
682 667
683 2. It's very fast -- both tracing through possibilities and 668 2. It's very fast -- retrieving possible interpretations takes
684 retrieving possible interpretations take essentially constant 669 constant time. Tracing through chains of interpretations takes
685 time. 670 time proportional to the average number of experiences in a
671 proprioceptive bin. Redundant experiences in \Phi-space can be
672 merged to save computation.
686 673
687 3. It protects from wrong interpretations of transient ambiguous 674 3. It protects from wrong interpretations of transient ambiguous
688 proprioceptive data : for example, if the worm is flat for just 675 proprioceptive data. For example, if the worm is flat for just
689 an instant, this flatness will not be interpreted as implying 676 an instant, this flatness will not be interpreted as implying
690 that the worm has its muscles relaxed, since the flatness is 677 that the worm has its muscles relaxed, since the flatness is
691 part of a longer chain which includes a distinct pattern of 678 part of a longer chain which includes a distinct pattern of
692 muscle activation. A memoryless statistical model such as a 679 muscle activation. Markov chains or other memoryless statistical
693 markov model that operates on individual frames may very well 680 models that operate on individual frames may very well make this
694 make this mistake. 681 mistake.
695 682
696 #+caption: Program to convert an experience vector into a 683 #+caption: Program to convert an experience vector into a
697 #+caption: proprioceptively binned lookup function. 684 #+caption: proprioceptively binned lookup function.
698 #+name: bin 685 #+name: bin
699 #+begin_listing clojure 686 #+begin_listing clojure
723 (fn lookup [proprio-data] 710 (fn lookup [proprio-data]
724 (set (some #(% proprio-data) lookups))))) 711 (set (some #(% proprio-data) lookups)))))
725 #+end_src 712 #+end_src
726 #+end_listing 713 #+end_listing
727 714
715 #+caption: =longest-thread= finds the longest path of consecutive
716 #+caption: experiences to explain proprioceptive worm data.
717 #+name: phi-space-history-scan
718 #+ATTR_LaTeX: :width 10cm
719 [[./images/aurellem-gray.png]]
720
721 =longest-thread= infers sensory data by stitching together pieces
722 from previous experience. It prefers longer chains of previous
723 experience to shorter ones. For example, during training the worm
724 might rest on the ground for one second before it performs its
725 excercises. If during recognition the worm rests on the ground for
726 five seconds, =longest-thread= will accomodate this five second
727 rest period by looping the one second rest chain five times.
728
729 =longest-thread= takes time proportional to the average number of
730 entries in a proprioceptive bin, because for each element in the
731 starting bin it performs a series of set lookups in the preceding
732 bins. If the total history is limited, then this is only a constant
733 multiple times the number of entries in the starting bin. This
734 analysis applies even if the action requires multiple longest
735 chains -- it's still the average number of entries in a
736 proprioceptive bin times the desired chain length. Because
737 =longest-thread= is so efficient and simple, I can interpret
738 worm actions in real time.
728 739
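The consecutive-chain idea behind this analysis can be sketched as follows (a simplified, hypothetical helper; the full program, which also assembles the winning chain, is the =longest-thread= listing below). Given the candidate \Phi-space index sets for successive frames, newest first, it counts how far a chain of consecutive indices extends backward in time:

#+begin_src clojure
;; Sketch: length of the consecutive chain ending at index `start'.
;; phi-index-sets is a seq of sets of candidate indices, newest
;; frame first; each step back in time must contain (dec index).
(defn chain-length [phi-index-sets start]
  (loop [index start, sets (rest phi-index-sets), n 1]
    (if (and (seq sets) (contains? (first sets) (dec index)))
      (recur (dec index) (rest sets) (inc n))
      n)))
#+end_src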
729 #+caption: Program to calculate empathy by tracing through \Phi-space 740 #+caption: Program to calculate empathy by tracing through \Phi-space
730 #+caption: and finding the longest (i.e. most coherent) interpretation 741 #+caption: and finding the longest (i.e. most coherent) interpretation
731 #+caption: of the data. 742 #+caption: of the data.
732 #+name: longest-thread 743 #+name: longest-thread
759 (recur (concat longest-thread result) 770 (recur (concat longest-thread result)
760 (drop (count longest-thread) phi-index-sets)))))) 771 (drop (count longest-thread) phi-index-sets))))))
761 #+end_src 772 #+end_src
762 #+end_listing 773 #+end_listing
763 774
764 775 There is one final piece, which is to replace missing sensory data
765 There is one final piece, which is to replace missing sensory data 776 with a best-guess estimate. While I could fill in missing data by
766 with a best-guess estimate. While I could fill in missing data by 777 using a gradient over the closest known sensory data points,
767 using a gradient over the closest known sensory data points, averages 778 averages can be misleading. It is certainly possible to create an
768 can be misleading. It is certainly possible to create an impossible 779 impossible sensory state by averaging two possible sensory states.
769 sensory state by averaging two possible sensory states. Therefore, I 780 Therefore, I simply replicate the most recent sensory experience to
770 simply replicate the most recent sensory experience to fill in the 781 fill in the gaps.
771 gaps.
772 782
773 #+caption: Fill in blanks in sensory experience by replicating the most 783 #+caption: Fill in blanks in sensory experience by replicating the most
774 #+caption: recent experience. 784 #+caption: recent experience.
775 #+name: infer-nils 785 #+name: infer-nils
776 #+begin_listing clojure 786 #+begin_listing clojure
787 (recur (dec i) v) 797 (recur (dec i) v)
788 (recur (dec i) (assoc! v (dec i) cur))) 798 (recur (dec i) (assoc! v (dec i) cur)))
789 (recur i (assoc! v i 0)))))) 799 (recur i (assoc! v i 0))))))
790 #+end_src 800 #+end_src
791 #+end_listing 801 #+end_listing
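For concreteness, here is how the description above plays out on a few small inputs (a hypothetical REPL session, assuming =nil= marks missing data, values propagate from the more recent side, and gaps with no later value fall back to 0):

#+begin_src clojure
(infer-nils [1 nil 1 1])   ;=> [1 1 1 1]
(infer-nils [nil 2 1 1])   ;=> [2 2 1 1]
(infer-nils [nil nil nil]) ;=> [0 0 0]
#+end_src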
792
793 802
794 ** Efficient action recognition with =EMPATH= 803 ** Efficient action recognition with =EMPATH=
795 804
796 In my exploration with the worm, I can generally infer actions from 805 To use =EMPATH= with the worm, I first need to gather a set of
797 proprioceptive data exactly as well as when I have the complete 806 experiences from the worm that includes the actions I want to
798 sensory data. To reach this level, I have to train the worm with 807 recognize. The =generate-phi-space= program (listing
799 verious exercices for about 1 minute. 808 \ref{generate-phi-space}) runs the worm through a series of
809 exercises and gathers those experiences into a vector. The
810 =do-all-the-things= program is a routine expressed in a simple
811 muscle contraction script language for automated worm control.
812
813 #+caption: Program to gather the worm's experiences into a vector for
814 #+caption: further processing. The =motor-control-program= line uses
815 #+caption: a motor control script that causes the worm to execute a series
816 #+caption: of ``exercises'' that include all the action predicates.
817 #+name: generate-phi-space
818 #+ATTR_LaTeX: :placement [!H]
819 #+begin_listing clojure
820 #+begin_src clojure
821 (def do-all-the-things
822 (concat
823 curl-script
824 [[300 :d-ex 40]
825 [320 :d-ex 0]]
826 (shift-script 280 (take 16 wiggle-script))))
827
828 (defn generate-phi-space []
829 (let [experiences (atom [])]
830 (run-world
831 (apply-map
832 worm-world
833 (merge
834 (worm-world-defaults)
835 {:end-frame 700
836 :motor-control
837 (motor-control-program worm-muscle-labels do-all-the-things)
838 :experiences experiences})))
839 @experiences))
840 #+end_src
841 #+end_listing
842
843 #+caption: Use =longest-thread= and a \Phi-space generated from a short
844 #+caption: exercise routine to interpret actions during free play.
845 #+name: empathy-debug
846 #+begin_listing clojure
847 #+begin_src clojure
848 (defn init []
849 (def phi-space (generate-phi-space))
850 (def phi-scan (gen-phi-scan phi-space)))
851
852 (defn empathy-demonstration []
853 (let [proprio (atom ())]
854 (fn
855 [experiences text]
856 (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
857 (swap! proprio (partial cons phi-indices))
858 (let [exp-thread (longest-thread (take 300 @proprio))
859 empathy (mapv phi-space (infer-nils exp-thread))]
860 (println-repl (vector:last-n exp-thread 22))
861 (cond
862 (grand-circle? empathy) (.setText text "Grand Circle")
863 (curled? empathy) (.setText text "Curled")
864 (wiggling? empathy) (.setText text "Wiggling")
865 (resting? empathy) (.setText text "Resting")
866 :else (.setText text "Unknown")))))))
867
868 (defn empathy-experiment [record]
869 (.start (worm-world :experience-watch (debug-experience-phi)
870 :record record :worm worm*)))
871 #+end_src
872 #+end_listing
873
874 The result of running =empathy-experiment= is that the system is
875 generally able to interpret worm actions using the action predicates
876 on simulated sensory data just as well as with actual data. Figure
877 \ref{empathy-debug-image} was generated using =empathy-experiment=:
878
879 #+caption: From only proprioceptive data, =EMPATH= was able to infer
880 #+caption: the complete sensory experience and classify four poses.
881 #+caption: (The last panel shows a composite image of \emph{wiggling},
882 #+caption: a dynamic pose.)
883 #+name: empathy-debug-image
884 #+ATTR_LaTeX: :width 10cm :placement [H]
885 [[./images/empathy-1.png]]
886
887 One way to measure the performance of =EMPATH= is to check how
888 well the imagined sense experience triggers the same action
889 predicates as the real sensory experience.
890
891 #+caption: Determine how closely empathy approximates actual
892 #+caption: sensory data.
893 #+name: test-empathy-accuracy
894 #+begin_listing clojure
895 #+begin_src clojure
896 (def worm-action-label
897 (juxt grand-circle? curled? wiggling?))
898
899 (defn compare-empathy-with-baseline [matches]
900 (let [proprio (atom ())]
901 (fn
902 [experiences text]
903 (let [phi-indices (phi-scan (:proprioception (peek experiences)))]
904 (swap! proprio (partial cons phi-indices))
905 (let [exp-thread (longest-thread (take 300 @proprio))
906 empathy (mapv phi-space (infer-nils exp-thread))
907 experience-matches-empathy
908 (= (worm-action-label experiences)
909 (worm-action-label empathy))]
910 (println-repl experience-matches-empathy)
911 (swap! matches #(conj % experience-matches-empathy)))))))
912
913 (defn accuracy [v]
914 (float (/ (count (filter true? v)) (count v))))
915
916 (defn test-empathy-accuracy []
917 (let [res (atom [])]
918 (run-world
919 (worm-world :experience-watch
920 (compare-empathy-with-baseline res)
921 :worm worm*))
922 (accuracy @res)))
923 #+end_src
924 #+end_listing
925
926 Running =test-empathy-accuracy= using the very short exercise
927 program defined in listing \ref{generate-phi-space} and then doing
928 a similar pattern of activity manually yields an accuracy of around
929 73%. This is based on very limited worm experience. By training the
930 worm for longer, the accuracy dramatically improves.
931
932 #+caption: Program to generate \Phi-space using manual training.
933 #+name: manual-phi-space
934 #+begin_listing clojure
935 #+begin_src clojure
936 (defn init-interactive []
937 (def phi-space
938 (let [experiences (atom [])]
939 (run-world
940 (apply-map
941 worm-world
942 (merge
943 (worm-world-defaults)
944 {:experiences experiences})))
945 @experiences))
946 (def phi-scan (gen-phi-scan phi-space)))
947 #+end_src
948 #+end_listing
949
950 After about 1 minute of manual training, I was able to achieve 95%
951 accuracy on manual testing of the worm using =init-interactive= and
952 =test-empathy-accuracy=. The ability of the system to infer sensory
953 states is truly impressive.
800 954
801 ** Digression: bootstrapping touch using free exploration 955 ** Digression: bootstrapping touch using free exploration
802 956
803 * Contributions 957 * Contributions
804 958