# HG changeset patch
# User Dylan Holmes
# Date 1329139769 21600
# Node ID f8227f6d4ac6fd10c89c4c0b3dd7273b38d57946
# Parent  e15cd6f60ffebc0a83b7300fc5fdc79f03a7d552
Some section renaming and minor other changes in vision.

diff -r e15cd6f60ffe -r f8227f6d4ac6 org/vision.org
--- a/org/vision.org	Mon Feb 13 06:49:34 2012 -0600
+++ b/org/vision.org	Mon Feb 13 07:29:29 2012 -0600
@@ -7,8 +7,9 @@
 #+INCLUDE: ../../aurellem/org/level-0.org
 #+babel: :mkdirp yes :noweb yes :exports both
 
-* Vision
-
+#* Vision
+* JMonkeyEngine natively supports multiple views of the same world.
+
 Vision is one of the most important senses for humans, so I need to
 build a simulated sense of vision for my AI. I will do this with
 simulated eyes. Each eye can be independely moved and should see its
@@ -25,7 +26,9 @@
 #+caption: jMonkeyEngine supports multiple views to enable split-screen games, like GoldenEye, which was one of the first games to use split-screen views.
 [[../images/goldeneye-4-player.png]]
 
-* Brief Description of jMonkeyEngine's Rendering Pipeline
+** =ViewPorts=, =SceneProcessors=, and the =RenderManager=.
+# =Viewports= are cameras; =RenderManger= takes snapshots each frame.
+#* A Brief Description of jMonkeyEngine's Rendering Pipeline
 
 jMonkeyEngine allows you to create a =ViewPort=, which represents a
 view of the simulated world. You can create as many of these as you
@@ -44,9 +47,10 @@
 specific operations on the rendered image.  The =SceneProcessor= can
 also copy the GPU image data to RAM and process it with the CPU.
 
-* The Vision Pipeline
+** From Views to Vision
+# Appropriating Views for Vision.
 
-Each eye in the simulated creature needs it's own =ViewPort= so that
+Each eye in the simulated creature needs its own =ViewPort= so that
 it can see the world from its own perspective. To this =ViewPort=, I
 add a =SceneProcessor= that feeds the visual data to any arbitray
 continuation function for further processing.  That continuation
@@ -147,7 +151,7 @@
 They can be queried every cycle, but their information may not
 necessairly be different every cycle.
 
-* Physical Eyes
+* Optical sensor arrays are described as images and stored as metadata.
 
 The vision pipeline described above handles the flow of rendered
 images. Now, we need simulated eyes to serve as the source of these
@@ -282,7 +286,7 @@
           (apply max (map second dimensions))]))
 #+end_src
 
-* Eye Creation
+* Putting it all together: Importing and parsing descriptions of eyes.
 First off, get the children of the "eyes" empty node to find all the
 eyes the creature has.
 #+name: eye-node
@@ -452,11 +456,11 @@
       image))))
 #+end_src
 
-* Tests
-** Basic Test
+* Demonstrations
+** Demonstrating the vision pipeline.
 
 This is a basic test for the vision system.  It only tests the
-vision-pipeline and does not deal with loadig eyes from a blender
+vision-pipeline and does not deal with loading eyes from a blender
 file.  The code creates two videos of the same rotating cube from
 different angles.
 
@@ -516,8 +520,7 @@
 Creating multiple eyes like this can be used for stereoscopic vision
 simulation in a single creature or for simulating multiple creatures,
 each with their own sense of vision.
-
-** Adding Vision to the Worm
+** Demonstrating eye import and parsing.
 
 To the worm from the last post, I add a new node that describes its
 eyes.
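
The "From Views to Vision" section renamed by this patch describes a pipeline in which each eye gets its own =ViewPort=, and an attached =SceneProcessor= copies the rendered frame and hands it to an arbitrary continuation function. As a minimal sketch of that callback pattern only — every name below is a hypothetical Python stand-in, not jMonkeyEngine's actual API (the project itself is written in Clojure on jMonkeyEngine):

```python
# Illustrative sketch of the patch's vision pipeline: a per-eye "view
# port" notifies its processors after each frame, and a vision
# processor copies the frame data and passes it to any continuation.
# All class and function names here are hypothetical.

class ViewPort:
    """Stand-in for a per-eye view of the simulated world."""

    def __init__(self, name):
        self.name = name
        self.processors = []

    def add_processor(self, processor):
        self.processors.append(processor)

    def render_frame(self, framebuffer):
        # After "rendering", give every attached processor the frame.
        for processor in self.processors:
            processor.post_frame(framebuffer)


def vision_processor(continuation):
    """Return a processor that copies the frame and calls `continuation`,
    mirroring how the patch's SceneProcessor copies GPU data to RAM."""

    class Processor:
        def post_frame(self, framebuffer):
            continuation(bytes(framebuffer))  # copy, then hand off

    return Processor()


# Usage: each eye owns a ViewPort and feeds its own continuation.
seen = []
eye = ViewPort("left-eye")
eye.add_processor(vision_processor(seen.append))
eye.render_frame(bytearray([0, 128, 255]))
print(seen)  # -> [b'\x00\x80\xff']
```

Because the continuation is an arbitrary function, the same pipeline can drive video recording in the demonstrations or per-eye image processing in a creature, without the view port knowing which.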