changeset 567:7837ca42d82c
final thesis printed!
author    Robert McIntyre <rlm@mit.edu>
date      Mon, 12 May 2014 16:33:34 -0400
parents   b9b8567c14ee
children  807fb1046a98
files     thesis/cortex.bib thesis/cortex.org
diffstat  2 files changed, 14 insertions(+), 12 deletions(-)
--- a/thesis/cortex.bib	Mon May 12 15:01:53 2014 -0400
+++ b/thesis/cortex.bib	Mon May 12 16:33:34 2014 -0400
@@ -81,7 +81,7 @@
     to detect actions in video. I consider this to be
     the wrong language for describing actions, because
     it has no way to completely describe even a simple
-    action like ``curling'' form all points of view.}}
+    action like ``curling'' from all points of view.}}
 }
 
 @book{man-wife-hat,
@@ -170,7 +170,7 @@
     navigate --- they don't just have meaning from our
     own highly advanced imaginations. I want to see if a
     rat can reasonably grow up if it lives its entire
-    live hooked up to the game!}}
+    life hooked up to the game!}}
 }
 
--- a/thesis/cortex.org	Mon May 12 15:01:53 2014 -0400
+++ b/thesis/cortex.org	Mon May 12 16:33:34 2014 -0400
@@ -349,7 +349,7 @@
    popular, free 3D modeling program. Each sense can be specified
    using special Blender nodes with biologically inspired
    parameters. You need not write any code to create a creature, and
-   can use a wide library of pre-existing blender models as a base
+   can use a wide library of pre-existing Blender models as a base
    for your own creatures.
 
 - =CORTEX= implements a wide variety of senses: touch,
@@ -768,7 +768,7 @@
    nodes in the Blender file and creates physical representations for
    any of them that have mass defined in their Blender meta-data.
 
-   #+caption: Program for iterating through the nodes in a blender file
+   #+caption: Program for iterating through the nodes in a Blender file
    #+caption: and generating physical jMonkeyEngine3 objects with mass
    #+caption: and a matching physics shape.
    #+name: physical
@@ -1144,7 +1144,7 @@
   follows the closest physical node in 'creature. The camera will
   point in the X direction and use the Z vector as up as determined
   by the rotation of these vectors in Blender coordinate space. Use
-  XZY rotation for the node in blender."
+  XZY rotation for the node in Blender."
   [#^Node creature #^Spatial eye]
   (let [target (closest-node creature eye)
         [cam-width cam-height]
@@ -1310,7 +1310,7 @@
    The vision code is not much more complicated than the body code,
    and enables multiple further paths for simulated vision. For
    example, it is quite easy to create bifocal vision -- you just
-   make two eyes next to each other in blender! It is also possible
+   make two eyes next to each other in Blender! It is also possible
    to encode vision transforms in the retinal files. For example, the
    human like retina file in figure \ref{retina} approximates a
    log-polar transform.
@@ -1511,7 +1511,7 @@
    bindings that are not worth mentioning, =CORTEX= gains the ability
    to access multiple sound streams from =OpenAL=.
 
-   #+caption: Program to create an ear from a blender empty node. The ear
+   #+caption: Program to create an ear from a Blender empty node. The ear
    #+caption: follows around the nearest physical object and passes
    #+caption: all sensory data to a continuation function.
    #+name: add-ear
@@ -2316,7 +2316,7 @@
    #+begin_listing clojure
    #+BEGIN_SRC clojure
    (defn muscle-profile-image
-     "Get the muscle-profile image from the node's blender meta-data."
+     "Get the muscle-profile image from the node's Blender meta-data."
      [#^Node muscle]
      (if-let [image (meta-data muscle "muscle")]
        (load-image image)))
@@ -2864,6 +2864,8 @@
    its head generates possible interpretations for each frame (the
    numbers are experience-indices):
 
+   \clearpage
+
    #+BEGIN_EXAMPLE
    [ flat, flat, flat, flat, flat, flat, flat, lift-head ]
       1     1     1     1     1     1     1     4
@@ -3144,7 +3146,7 @@
    #+caption: (The last panel shows a composite image of /wiggling/,
    #+caption: a dynamic pose.)
    #+name: empathy-debug-image
-   #+ATTR_LaTeX: :width 10cm :placement [H]
+   #+ATTR_LaTeX: :width 10cm
    [[./images/empathy-1.png]]
 
    One way to measure the performance of =EMPATH= is to compare the
@@ -3708,13 +3710,13 @@
    #+caption: (where there are many white pixels) and regions of low
    #+caption: sensitivity (where white pixels are sparse).
    #+name: guide-fingertip-UV
-   #+ATTR_LaTeX: :width 9cm :placement [H]
+   #+ATTR_LaTeX: :width 9cm
    [[./images/finger-UV.png]]
 
    #+caption: The fingertip UV-image form above applied to a simple
    #+caption: model of a fingertip.
    #+name: guide-fingertip
-   #+ATTR_LaTeX: :width 9cm :placement [H]
+   #+ATTR_LaTeX: :width 9cm
    [[./images/finger-1.png]]
 
 *** Proprioception
@@ -3755,7 +3757,7 @@
    #+caption: has weaker motor neurons at the beginning, just
    #+caption: like human muscle.
    #+name: muscle-recruit
-   #+ATTR_LaTeX: :width 7cm :placement [H]
+   #+ATTR_LaTeX: :width 7cm
    [[./images/basic-muscle.png]]
 
    Muscles twist the nearest physical object about the muscle node's