Mercurial > cortex
changeset 450:432f2c4646cb
sleepig.
author    Robert McIntyre <rlm@mit.edu>
date      Wed, 26 Mar 2014 03:18:57 -0400
parents   09b7c8dd4365
children  0a4362d1f138
files     thesis/Makefile thesis/cortex.org
diffstat  2 files changed, 56 insertions(+), 10 deletions(-)
--- a/thesis/Makefile	Wed Mar 26 02:42:01 2014 -0400
+++ b/thesis/Makefile	Wed Mar 26 03:18:57 2014 -0400
@@ -1,6 +1,7 @@
 #INVOKE_LATEX = pdflatex -shell-escape thesis.tex;
 THESIS_NAME = rlm-cortex-meng
 INVOKE_LATEX = texi2dvi --shell-escape --pdf -V --batch $(THESIS_NAME).tex;
+#INVOKE_LATEX = texi2dvi --shell-escape --pdf -V $(THESIS_NAME).tex;
 
 all:
 	./weave-thesis.sh cortex
--- a/thesis/cortex.org	Wed Mar 26 02:42:01 2014 -0400
+++ b/thesis/cortex.org	Wed Mar 26 03:18:57 2014 -0400
@@ -663,8 +663,41 @@
 
 ** Empathy is the process of tracing through \Phi-space
 
+   Here is the core of a basic empathy algorithm, starting with an
+   experience vector: First, group the experiences into tiered
+   proprioceptive bins. I use powers of 10 and 3 bins, and the
+   smallest bin has an approximate size of 0.001 radians in all
+   proprioceptive dimensions.
+
+   Then, given a sequence of proprioceptive input, generate a set of
+   matching experience records for each input.
 
+   Finally, to infer sensory data, select the longest consecutive
+   chain of experiences as determined by the indexes into the
+   experience vector.
 
+   This algorithm has three advantages:
+
+   1. It's simple.
+
+   2. It's very fast -- both tracing through possibilities and
+      retrieving possible interpretations take essentially constant
+      time.
+
+   3. It protects against wrong interpretations of transient,
+      ambiguous proprioceptive data: for example, if the worm is flat
+      for just an instant, this flatness will not be interpreted as
+      implying that the worm has its muscles relaxed, since the
+      flatness is part of a longer chain which includes a distinct
+      pattern of muscle activation. A memoryless statistical model
+      such as a Markov model that operates on individual frames may
+      very well make this mistake.
+
+   #+caption: Program to convert an experience vector into a
+   #+caption: proprioceptively binned lookup function.
+   #+name: bin
+   #+begin_listing clojure
+   #+begin_src clojure
 (defn bin [digits]
   (fn [angles]
     (->> angles
@@ -674,11 +707,10 @@
      (mapv #(Math/round (* % (Math/pow 10 (dec digits))))))))
 
 (defn gen-phi-scan
-"Nearest-neighbors with spatial binning. Only returns a result if
- the propriceptive data is within 10% of a previously recorded
- result in all dimensions."
-
-[phi-space]
+  "Nearest-neighbors with binning. Only returns a result if
+   the proprioceptive data is within 10% of a previously recorded
+   result in all dimensions."
+  [phi-space]
   (let [bin-keys (map bin [3 2 1])
         bin-maps
         (map (fn [bin-key]
@@ -686,12 +718,20 @@
                (comp bin-key :proprioception phi-space)
                (range (count phi-space)))) bin-keys)
         lookups (map (fn [bin-key bin-map]
-                      (fn [proprio] (bin-map (bin-key proprio))))
-                    bin-keys bin-maps)]
+                       (fn [proprio] (bin-map (bin-key proprio))))
+                     bin-keys bin-maps)]
     (fn lookup [proprio-data]
       (set (some #(% proprio-data) lookups)))))
+   #+end_src
+   #+end_listing
 
 
+   #+caption: Program to calculate empathy by tracing through \Phi-space
+   #+caption: and finding the longest (i.e. most coherent)
+   #+caption: interpretation of the data.
+   #+name: longest-thread
+   #+begin_listing clojure
+   #+begin_src clojure
 (defn longest-thread
   "Find the longest thread from phi-index-sets. The index sets should
    be ordered from most recent to least recent."
@@ -718,6 +758,9 @@
           threads)]
     (recur (concat longest-thread result)
            (drop (count longest-thread) phi-index-sets))))))
+   #+end_src
+   #+end_listing
+
 
   There is one final piece, which is to replace missing sensory data
   with a best-guess estimate. While I could fill in missing data by
@@ -747,12 +790,14 @@
    #+end_src
    #+end_listing
 
-
-
-
 
 ** Efficient action recognition with =EMPATH=
 
+   In my exploration with the worm, I can generally infer actions
+   from proprioceptive data exactly as well as when I have the
+   complete sensory data. To reach this level, I have to train the
+   worm with various exercises for about 1 minute.
+
 ** Digression: bootstrapping touch using free exploration
 
 * Contributions