diff thesis/cortex.org @ 450:432f2c4646cb
sleepig.
| author   | Robert McIntyre <rlm@mit.edu>   |
|----------|---------------------------------|
| date     | Wed, 26 Mar 2014 03:18:57 -0400 |
| parents  | 09b7c8dd4365                    |
| children | 0a4362d1f138                    |
--- a/thesis/cortex.org	Wed Mar 26 02:42:01 2014 -0400
+++ b/thesis/cortex.org	Wed Mar 26 03:18:57 2014 -0400
@@ -663,8 +663,41 @@
 
 ** Empathy is the process of tracing through \Phi-space
 
+   Here is the core of a basic empathy algorithm, starting with an
+   experience vector: First, group the experiences into tiered
+   proprioceptive bins. I use powers of 10 and 3 bins, and the
+   smallest bin has an approximate size of 0.001 radians in all
+   proprioceptive dimensions.
+
+   Then, given a sequence of proprioceptive inputs, generate a set of
+   matching experience records for each input.
 
+   Finally, to infer sensory data, select the longest consecutive chain
+   of experiences as determined by the indexes into the experience
+   vector.
 
+   This algorithm has three advantages:
+
+   1. It's simple.
+
+   2. It's very fast -- both tracing through possibilities and
+      retrieving possible interpretations take essentially constant
+      time.
+
+   3. It protects against wrong interpretations of transient, ambiguous
+      proprioceptive data: for example, if the worm is flat for just
+      an instant, this flatness will not be interpreted as implying
+      that the worm has its muscles relaxed, since the flatness is
+      part of a longer chain which includes a distinct pattern of
+      muscle activation. A memoryless statistical model, such as a
+      Markov model that operates on individual frames, may very well
+      make this mistake.
+
+   #+caption: Program to convert an experience vector into a
+   #+caption: proprioceptively binned lookup function.
+   #+name: bin
+   #+begin_listing clojure
+   #+begin_src clojure
 (defn bin [digits]
   (fn [angles]
     (->> angles
@@ -674,11 +707,10 @@
          (mapv #(Math/round (* % (Math/pow 10 (dec digits))))))))
 
 (defn gen-phi-scan
-"Nearest-neighbors with spatial binning. Only returns a result if
- the proprioceptive data is within 10% of a previously recorded
- result in all dimensions."
-
-[phi-space]
+  "Nearest-neighbors with binning. Only returns a result if
+  the proprioceptive data is within 10% of a previously recorded
+  result in all dimensions."
+  [phi-space]
   (let [bin-keys (map bin [3 2 1])
         bin-maps
         (map (fn [bin-key]
@@ -686,12 +718,20 @@
                (comp bin-key :proprioception phi-space)
               (range (count phi-space)))) bin-keys)
         lookups (map (fn [bin-key bin-map]
-                      (fn [proprio] (bin-map (bin-key proprio))))
-                    bin-keys bin-maps)]
+                       (fn [proprio] (bin-map (bin-key proprio))))
+                     bin-keys bin-maps)]
     (fn lookup [proprio-data]
       (set (some #(% proprio-data) lookups)))))
+   #+end_src
+   #+end_listing
 
 
+   #+caption: Program to calculate empathy by tracing through \Phi-space
+   #+caption: and finding the longest (i.e. most coherent) interpretation
+   #+caption: of the data.
+   #+name: longest-thread
+   #+begin_listing clojure
+   #+begin_src clojure
 (defn longest-thread
   "Find the longest thread from phi-index-sets. The index sets should
   be ordered from most recent to least recent."
@@ -718,6 +758,9 @@
          threads)]
       (recur (concat longest-thread result)
              (drop (count longest-thread) phi-index-sets))))))
+   #+end_src
+   #+end_listing
+
 
    There is one final piece, which is to replace missing sensory data
    with a best-guess estimate. While I could fill in missing data by
@@ -747,12 +790,14 @@
    #+end_src
    #+end_listing
 
-
-
-
 
 ** Efficient action recognition with =EMPATH=
 
+   In my exploration with the worm, I can generally infer actions from
+   proprioceptive data exactly as well as when I have the complete
+   sensory data. To reach this level, I have to train the worm with
+   various exercises for about one minute.
+
 ** Digression: bootstrapping touch using free exploration
 
 * Contributions