Mercurial > cortex
comparison org/vision.org @ 264:f8227f6d4ac6
Some section renaming and minor other changes in vision.
author | Dylan Holmes <ocsenave@gmail.com> |
---|---|
date | Mon, 13 Feb 2012 07:29:29 -0600 |
parents | 0e85237d27a7 |
children | e57d8c52f12f |
263:e15cd6f60ffe | 264:f8227f6d4ac6 |
---|---|
5 #+keywords: computer vision, jMonkeyEngine3, clojure | 5 #+keywords: computer vision, jMonkeyEngine3, clojure |
6 #+SETUPFILE: ../../aurellem/org/setup.org | 6 #+SETUPFILE: ../../aurellem/org/setup.org |
7 #+INCLUDE: ../../aurellem/org/level-0.org | 7 #+INCLUDE: ../../aurellem/org/level-0.org |
8 #+babel: :mkdirp yes :noweb yes :exports both | 8 #+babel: :mkdirp yes :noweb yes :exports both |
9 | 9 |
10 * Vision | 10 #* Vision |
11 | 11 * JMonkeyEngine natively supports multiple views of the same world. |
12 | |
12 Vision is one of the most important senses for humans, so I need to | 13 Vision is one of the most important senses for humans, so I need to |
13 build a simulated sense of vision for my AI. I will do this with | 14 build a simulated sense of vision for my AI. I will do this with |
14 simulated eyes. Each eye can be independently moved and should see its | 15 simulated eyes. Each eye can be independently moved and should see its |
15 own version of the world depending on where it is. | 16 own version of the world depending on where it is. |
16 | 17 |
23 and then projecting it back onto a surface in the 3D world. | 24 and then projecting it back onto a surface in the 3D world. |
24 | 25 |
25 #+caption: jMonkeyEngine supports multiple views to enable split-screen games, like GoldenEye, which was one of the first games to use split-screen views. | 26 #+caption: jMonkeyEngine supports multiple views to enable split-screen games, like GoldenEye, which was one of the first games to use split-screen views. |
26 [[../images/goldeneye-4-player.png]] | 27 [[../images/goldeneye-4-player.png]] |
27 | 28 |
28 * Brief Description of jMonkeyEngine's Rendering Pipeline | 29 ** =ViewPorts=, =SceneProcessors=, and the =RenderManager=. |
30 # =ViewPorts= are cameras; =RenderManager= takes snapshots each frame. | |
31 #* A Brief Description of jMonkeyEngine's Rendering Pipeline | |
29 | 32 |
30 jMonkeyEngine allows you to create a =ViewPort=, which represents a | 33 jMonkeyEngine allows you to create a =ViewPort=, which represents a |
31 view of the simulated world. You can create as many of these as you | 34 view of the simulated world. You can create as many of these as you |
32 want. Every frame, the =RenderManager= iterates through each | 35 want. Every frame, the =RenderManager= iterates through each |
33 =ViewPort=, rendering the scene on the GPU. For each =ViewPort= there | 36 =ViewPort=, rendering the scene on the GPU. For each =ViewPort= there |
42 =SceneProcessor= receives its =ViewPort='s =FrameBuffer= and can do | 45 =SceneProcessor= receives its =ViewPort='s =FrameBuffer= and can do |
43 whatever it wants to the data. Often this consists of invoking GPU | 46 whatever it wants to the data. Often this consists of invoking GPU |
44 specific operations on the rendered image. The =SceneProcessor= can | 47 specific operations on the rendered image. The =SceneProcessor= can |
45 also copy the GPU image data to RAM and process it with the CPU. | 48 also copy the GPU image data to RAM and process it with the CPU. |
46 | 49 |
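The pipeline described in the hunk above can be sketched as a Clojure =SceneProcessor=. This is a hypothetical sketch against the jME3 =SceneProcessor= interface, not code from this changeset; the =continuation= argument stands in for whatever later stage consumes the frame.

#+begin_src clojure
(import '(com.jme3.post SceneProcessor))

(defn frame-processor
  "Sketch: a SceneProcessor that hands every rendered FrameBuffer to
   `continuation`.  Method names follow the jME3 SceneProcessor interface."
  [continuation]
  (let [initialized? (atom false)]
    (proxy [SceneProcessor] []
      (initialize [render-manager view-port] (reset! initialized? true))
      (isInitialized [] @initialized?)
      (reshape [view-port width height])   ; no-op in this sketch
      (preFrame [seconds-per-frame])       ; no-op
      (postQueue [render-queue])           ; no-op
      (postFrame [frame-buffer]
        (continuation frame-buffer))       ; hand the frame downstream
      (cleanup []))))
#+end_src

A processor built this way would be registered on a view with =(.addProcessor view-port (frame-processor f))=.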
47 * The Vision Pipeline | 50 ** From Views to Vision |
48 | 51 # Appropriating Views for Vision. |
49 Each eye in the simulated creature needs it's own =ViewPort= so that | 52 |
53 Each eye in the simulated creature needs its own =ViewPort= so that | |
50 it can see the world from its own perspective. To this =ViewPort=, I | 54 it can see the world from its own perspective. To this =ViewPort=, I |
51 add a =SceneProcessor= that feeds the visual data to any arbitrary | 55 add a =SceneProcessor= that feeds the visual data to any arbitrary |
52 continuation function for further processing. That continuation | 56 continuation function for further processing. That continuation |
53 function may perform both CPU and GPU operations on the data. To make | 57 function may perform both CPU and GPU operations on the data. To make |
54 this easy for the continuation function, the =SceneProcessor= | 58 this easy for the continuation function, the =SceneProcessor= |
145 maintain a reference to sensor-data which is periodically updated | 149 maintain a reference to sensor-data which is periodically updated |
146 by the continuation function established by its init-function. | 150 by the continuation function established by its init-function. |
147 They can be queried every cycle, but their information may not | 151 They can be queried every cycle, but their information may not |
148 necessarily be different every cycle. | 152 necessarily be different every cycle. |
149 | 153 |
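The read-side contract described above (a periodically updated reference that may return the same value on consecutive reads) can be sketched in plain Clojure, with an atom standing in for the sensor-data reference; the names here are illustrative, not taken from the changeset.

#+begin_src clojure
(defn vision-sensor
  "Return [continuation sensor].  The continuation stores each new frame
   in an atom; the sensor dereferences it, and may see the same frame on
   consecutive reads, exactly as described above."
  []
  (let [sensor-data (atom nil)]
    [(fn [frame] (reset! sensor-data frame))  ; called by the SceneProcessor
     (fn [] @sensor-data)]))                  ; queried every cycle

;; (let [[put! see] (vision-sensor)]
;;   (put! :frame-1)
;;   (see))  ; → :frame-1
#+end_src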
150 * Physical Eyes | 154 * Optical sensor arrays are described as images and stored as metadata. |
151 | 155 |
152 The vision pipeline described above handles the flow of rendered | 156 The vision pipeline described above handles the flow of rendered |
153 images. Now, we need simulated eyes to serve as the source of these | 157 images. Now, we need simulated eyes to serve as the source of these |
154 images. | 158 images. |
155 | 159 |
280 (vals (retina-sensor-profile eye)))] | 284 (vals (retina-sensor-profile eye)))] |
281 [(apply max (map first dimensions)) | 285 [(apply max (map first dimensions)) |
282 (apply max (map second dimensions))])) | 286 (apply max (map second dimensions))])) |
283 #+end_src | 287 #+end_src |
284 | 288 |
285 * Eye Creation | 289 * Putting it all together: Importing and parsing descriptions of eyes. |
286 First off, get the children of the "eyes" empty node to find all the | 290 First off, get the children of the "eyes" empty node to find all the |
287 eyes the creature has. | 291 eyes the creature has. |
288 #+name: eye-node | 292 #+name: eye-node |
289 #+begin_src clojure | 293 #+begin_src clojure |
290 (defvar | 294 (defvar |
450 (.setRGB image ((coords i) 0) ((coords i) 1) | 454 (.setRGB image ((coords i) 0) ((coords i) 1) |
451 (gray (int (* 255 (sensor-data i))))))) | 455 (gray (int (* 255 (sensor-data i))))))) |
452 image)))) | 456 image)))) |
453 #+end_src | 457 #+end_src |
454 | 458 |
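The =(gray ...)= helper used with =.setRGB= above is not shown in this hunk; a plausible version packs one 8-bit gray value into the opaque RGB integer that =BufferedImage.setRGB= expects. This is a sketch, and the original helper may differ.

#+begin_src clojure
(defn gray
  "Pack an 8-bit gray value g (0-255) into a single RGB integer with
   equal red, green, and blue components, suitable for .setRGB."
  [g]
  (bit-or (bit-shift-left g 16)   ; red
          (bit-shift-left g 8)    ; green
          g))                     ; blue

;; (gray 255) → 16777215, i.e. 0xFFFFFF (white)
;; (gray 0)   → 0 (black)
#+end_src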
455 * Tests | 459 * Demonstrations |
456 ** Basic Test | 460 ** Demonstrating the vision pipeline. |
457 | 461 |
458 This is a basic test for the vision system. It only tests the | 462 This is a basic test for the vision system. It only tests the |
459 vision-pipeline and does not deal with loadig eyes from a blender | 463 vision-pipeline and does not deal with loading eyes from a blender |
460 file. The code creates two videos of the same rotating cube from | 464 file. The code creates two videos of the same rotating cube from |
461 different angles. | 465 different angles. |
462 | 466 |
463 #+name: test-1 | 467 #+name: test-1 |
464 #+begin_src clojure | 468 #+begin_src clojure |
514 #+end_html | 518 #+end_html |
515 | 519 |
516 Creating multiple eyes like this can be used for stereoscopic vision | 520 Creating multiple eyes like this can be used for stereoscopic vision |
517 simulation in a single creature or for simulating multiple creatures, | 521 simulation in a single creature or for simulating multiple creatures, |
518 each with their own sense of vision. | 522 each with their own sense of vision. |
519 | 523 ** Demonstrating eye import and parsing. |
520 ** Adding Vision to the Worm | |
521 | 524 |
522 To the worm from the last post, I add a new node that describes its | 525 To the worm from the last post, I add a new node that describes its |
523 eyes. | 526 eyes. |
524 | 527 |
525 #+attr_html: width=755 | 528 #+attr_html: width=755 |