changeset 268:6446e964810f

merged changes.
author Robert McIntyre <rlm@mit.edu>
date Tue, 14 Feb 2012 05:00:02 -0700
parents d487348c461c (current diff) bee5145ce463 (diff)
children bbd787e12025 aa3641042958
files
diffstat 2 files changed, 37 insertions(+), 20 deletions(-) [+]
     1.1 Binary file images/diagram_rendermanager.png has changed
     2.1 --- a/org/vision.org	Tue Feb 14 04:52:23 2012 -0700
     2.2 +++ b/org/vision.org	Tue Feb 14 05:00:02 2012 -0700
     2.3 @@ -7,8 +7,15 @@
     2.4  #+INCLUDE: ../../aurellem/org/level-0.org
     2.5  #+babel: :mkdirp yes :noweb yes :exports both
     2.6  
     2.7 -* Vision
     2.8 -  
     2.9 +# SUGGEST: Call functions by their name, without
    2.10 +# parentheses. e.g. =add-eye!=, not =(add-eye!)=. The reason for this
    2.11 +# is that it is potentially easy to confuse the /function/ =f= with its
    2.12 +# /value/ at a particular point =(f x)=. Mathematicians have this
    2.13 +# problem with their notation; we don't need it in ours.
    2.14 +
    2.15 +#* Vision
     2.16 +* jMonkeyEngine natively supports multiple views of the same world.
    2.17 + 
    2.18  Vision is one of the most important senses for humans, so I need to
    2.19  build a simulated sense of vision for my AI. I will do this with
     2.20  simulated eyes. Each eye can be independently moved and should see its
    2.21 @@ -25,7 +32,9 @@
    2.22  #+caption: jMonkeyEngine supports multiple views to enable split-screen games, like GoldenEye, which was one of the first games to use split-screen views.
    2.23  [[../images/goldeneye-4-player.png]]
    2.24  
    2.25 -* Brief Description of jMonkeyEngine's Rendering Pipeline
    2.26 +** =ViewPorts=, =SceneProcessors=, and the =RenderManager=. 
     2.27 +# =ViewPorts= are cameras; the =RenderManager= takes snapshots each frame.
    2.28 +#* A Brief Description of jMonkeyEngine's Rendering Pipeline
    2.29  
    2.30  jMonkeyEngine allows you to create a =ViewPort=, which represents a
    2.31  view of the simulated world. You can create as many of these as you
    2.32 @@ -33,6 +42,10 @@
    2.33  =ViewPort=, rendering the scene in the GPU. For each =ViewPort= there
    2.34  is a =FrameBuffer= which represents the rendered image in the GPU.
    2.35  
     2.36 +#+caption: =ViewPorts= are cameras in the world. During each frame, the =RenderManager= records a snapshot of what each view is currently seeing.
    2.37 +#+ATTR_HTML: width="400"
    2.38 +[[../images/diagram_rendermanager.png]]
    2.39 +
    2.40  Each =ViewPort= can have any number of attached =SceneProcessor=
    2.41  objects, which are called every time a new frame is rendered. A
     2.42  =SceneProcessor= receives its =ViewPort's= =FrameBuffer= and can do
    2.43 @@ -40,9 +53,10 @@
    2.44  specific operations on the rendered image.  The =SceneProcessor= can
    2.45  also copy the GPU image data to RAM and process it with the CPU.
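The callback mechanism described above can be sketched as a Clojure proxy. This is a sketch only, assuming jME3's =com.jme3.post.SceneProcessor= interface; =vision-processor= and =continuation= are illustrative names, not part of the jME3 API.

```clojure
(ns cortex.vision-sketch
  (:import (com.jme3.post SceneProcessor)))

(defn vision-processor
  "Sketch: a SceneProcessor that hands each rendered FrameBuffer to an
   arbitrary continuation function."
  [continuation]
  (let [renderer (atom nil)]
    (proxy [SceneProcessor] []
      (initialize [render-manager view-port]
        ;; remember the Renderer so the continuation can read GPU data
        (reset! renderer (.getRenderer render-manager)))
      (isInitialized [] (not (nil? @renderer)))
      (reshape [_ _ _] nil)
      (preFrame [tpf] nil)
      (postQueue [render-queue] nil)
      (postFrame [frame-buffer]
        ;; called once per rendered frame with the ViewPort's FrameBuffer
        (continuation @renderer frame-buffer))
      (cleanup [] nil))))
```

It would be attached with something like =(.addProcessor view-port (vision-processor my-continuation))=.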
    2.46  
    2.47 -* The Vision Pipeline
    2.48 +** From Views to Vision
    2.49 +# Appropriating Views for Vision.
    2.50  
    2.51 -Each eye in the simulated creature needs it's own =ViewPort= so that
    2.52 +Each eye in the simulated creature needs its own =ViewPort= so that
    2.53  it can see the world from its own perspective. To this =ViewPort=, I
     2.54  add a =SceneProcessor= that feeds the visual data to any arbitrary
    2.55  continuation function for further processing.  That continuation
    2.56 @@ -143,8 +157,8 @@
    2.57  They can be queried every cycle, but their information may not
     2.58  necessarily be different every cycle.
    2.59  
    2.60 -* Physical Eyes
    2.61 -
    2.62 +# * Optical sensor arrays are described as images and stored as metadata.
    2.63 +* Optical sensor arrays are described with images and referenced with metadata
    2.64  The vision pipeline described above handles the flow of rendered
    2.65  images. Now, we need simulated eyes to serve as the source of these
    2.66  images. 
    2.67 @@ -278,7 +292,7 @@
    2.68       (apply max (map second dimensions))]))
    2.69  #+end_src
    2.70  
    2.71 -* Eye Creation 
    2.72 +* Importing and parsing descriptions of eyes.
    2.73  First off, get the children of the "eyes" empty node to find all the
    2.74  eyes the creature has.
    2.75  #+name: eye-node
    2.76 @@ -405,7 +419,7 @@
    2.77  simulation or the simulated senses, but can be annoying.
    2.78  =(gen-fix-display)= restores the in-simulation display.
    2.79  
    2.80 -** Vision!
    2.81 +** The =vision!= function creates sensory probes.
    2.82  
    2.83  All the hard work has been done; all that remains is to apply
    2.84  =(vision-kernel)= to each eye in the creature and gather the results
    2.85 @@ -423,8 +437,8 @@
    2.86       (vision-kernel creature eye))))
    2.87  #+end_src
    2.88  
    2.89 -** Visualization of Vision
    2.90 -
    2.91 +** Displaying visual data for debugging.
    2.92 +# Visualization of Vision. Maybe less alliteration would be better.
    2.93  It's vital to have a visual representation for each sense. Here I use
    2.94  =(view-sense)= to construct a function that will create a display for
    2.95  visual data.
    2.96 @@ -448,11 +462,11 @@
    2.97         image))))
    2.98  #+end_src
    2.99  
   2.100 -* Tests
   2.101 -** Basic Test
   2.102 +* Demonstrations
   2.103 +** Demonstrating the vision pipeline.
   2.104  
   2.105  This is a basic test for the vision system.  It only tests the
   2.106 -vision-pipeline and does not deal with loadig eyes from a blender
   2.107 +vision-pipeline and does not deal with loading eyes from a blender
   2.108  file. The code creates two videos of the same rotating cube from
   2.109  different angles. 
   2.110  
   2.111 @@ -512,8 +526,7 @@
   2.112  Creating multiple eyes like this can be used for stereoscopic vision
   2.113  simulation in a single creature or for simulating multiple creatures,
   2.114  each with their own sense of vision.
   2.115 -
   2.116 -** Adding Vision to the Worm
   2.117 +** Demonstrating eye import and parsing.
   2.118  
   2.119  To the worm from the last post, I add a new node that describes its
   2.120  eyes.
   2.121 @@ -683,6 +696,14 @@
   2.122  ffmpeg -r 25 -b 9001k -i out/%07d.png -vcodec libtheora worm-vision.ogg 
   2.123  #+end_src
   2.124     
   2.125 +* Onward!
   2.126 +  - As a neat bonus, this idea behind simulated vision also enables one
   2.127 +    to [[../../cortex/html/capture-video.html][capture live video feeds from jMonkeyEngine]].
   2.128 +  - Now that we have vision, it's time to tackle [[./hearing.org][hearing]].
   2.129 +
   2.130 +
   2.131 +#+appendix
   2.132 +
   2.133  * Headers
   2.134  
   2.135  #+name: vision-header
   2.136 @@ -725,10 +746,6 @@
   2.137    (:import (com.aurellem.capture Capture RatchetTimer)))
   2.138  #+end_src
   2.139  
   2.140 -* Onward!
   2.141 -  - As a neat bonus, this idea behind simulated vision also enables one
   2.142 -    to [[../../cortex/html/capture-video.html][capture live video feeds from jMonkeyEngine]].
   2.143 -  - Now that we have vision, it's time to tackle [[./hearing.org][hearing]].
   2.144  
   2.145  * Source Listing
   2.146    - [[../src/cortex/vision.clj][cortex.vision]]