comparison org/vision.org @ 265:e57d8c52f12f

More tweaks to vision.
author Dylan Holmes <ocsenave@gmail.com>
date Mon, 13 Feb 2012 21:53:28 -0600
parents f8227f6d4ac6
children aa3641042958
5 #+keywords: computer vision, jMonkeyEngine3, clojure 5 #+keywords: computer vision, jMonkeyEngine3, clojure
6 #+SETUPFILE: ../../aurellem/org/setup.org 6 #+SETUPFILE: ../../aurellem/org/setup.org
7 #+INCLUDE: ../../aurellem/org/level-0.org 7 #+INCLUDE: ../../aurellem/org/level-0.org
8 #+babel: :mkdirp yes :noweb yes :exports both 8 #+babel: :mkdirp yes :noweb yes :exports both
9 9
10 # SUGGEST: Call functions by their name, without
11 # parentheses. e.g. =add-eye!=, not =(add-eye!)=. The reason for this
12 # is that it is potentially easy to confuse the /function/ =f= with its
13 # /value/ at a particular point =(f x)=. Mathematicians have this
14 # problem with their notation; we don't need it in ours.
15
10 #* Vision 16 #* Vision
11 * JMonkeyEngine natively supports multiple views of the same world. 17 * JMonkeyEngine natively supports multiple views of the same world.
12 18
13 Vision is one of the most important senses for humans, so I need to 19 Vision is one of the most important senses for humans, so I need to
14 build a simulated sense of vision for my AI. I will do this with 20 build a simulated sense of vision for my AI. I will do this with
35 want. Every frame, the =RenderManager= iterates through each 41 want. Every frame, the =RenderManager= iterates through each
36 =ViewPort=, rendering the scene in the GPU. For each =ViewPort= there 42 =ViewPort=, rendering the scene in the GPU. For each =ViewPort= there
37 is a =FrameBuffer= which represents the rendered image in the GPU. 43 is a =FrameBuffer= which represents the rendered image in the GPU.
38 44
39 #+caption: =ViewPorts= are cameras in the world. During each frame, the =RenderManager= records a snapshot of what each view is currently seeing. 45 #+caption: =ViewPorts= are cameras in the world. During each frame, the =RenderManager= records a snapshot of what each view is currently seeing.
40 #+attr_html:width="400" 46 #+ATTR_HTML: width="400"
41 [[../images/diagram_rendermanager.png]] 47 [[../images/diagram_rendermanager.png]]
42 48
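For illustration, here is a minimal sketch (not part of this changeset) of how an additional =ViewPort= can be attached from Clojure. The =app= argument and the helper's name are assumptions; =createPreView=, =setClearFlags=, and =attachScene= are standard jMonkeyEngine3 calls.

#+begin_src clojure
;; Hypothetical helper: create an extra ViewPort that renders the
;; application's root node from its own camera. Assumes `app` is a
;; running jME3 SimpleApplication.
(defn add-view!
  [app camera view-name]
  (let [render-manager (.getRenderManager app)
        view (.createPreView render-manager view-name camera)]
    ;; clear the color, depth, and stencil buffers each frame
    (.setClearFlags view true true true)
    (.attachScene view (.getRootNode app))
    view))
#+end_src

Each frame, the =RenderManager= will render this view's =FrameBuffer= along with all the others.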
43 Each =ViewPort= can have any number of attached =SceneProcessor= 49 Each =ViewPort= can have any number of attached =SceneProcessor=
44 objects, which are called every time a new frame is rendered. A 50 objects, which are called every time a new frame is rendered. A
45 =SceneProcessor= receives its =ViewPort's= =FrameBuffer= and can do 51 =SceneProcessor= receives its =ViewPort's= =FrameBuffer= and can do
149 maintain a reference to sensor-data which is periodically updated 155 maintain a reference to sensor-data which is periodically updated
150 by the continuation function established by its init-function. 156 by the continuation function established by its init-function.
151 They can be queried every cycle, but their information may not 157 They can be queried every cycle, but their information may not
152 necessarily be different every cycle. 158 necessarily be different every cycle.
153 159
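The sense-function pattern described above can be sketched as a closure over an atom; the names here are illustrative stand-ins, not code from this changeset.

#+begin_src clojure
;; Sketch of the sense-function pattern: the init step registers a
;; continuation that refreshes a closed-over atom, and the returned
;; sensor function simply dereferences it when queried.
;; `register-continuation!` is a hypothetical stand-in for the real
;; init machinery.
(defn make-sensor
  [register-continuation!]
  (let [sensor-data (atom nil)]
    (register-continuation!
     (fn [new-data] (reset! sensor-data new-data)))
    ;; may be queried every cycle; can return the same data twice
    (fn [] @sensor-data)))
#+end_src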
154 * Optical sensor arrays are described as images and stored as metadata. 160 # * Optical sensor arrays are described as images and stored as metadata.
155 161 * Optical sensor arrays are described with images and referenced with metadata
156 The vision pipeline described above handles the flow of rendered 162 The vision pipeline described above handles the flow of rendered
157 images. Now, we need simulated eyes to serve as the source of these 163 images. Now, we need simulated eyes to serve as the source of these
158 images. 164 images.
159 165
160 Eyes are described in blender in the same way as joints. They are 166 Eyes are described in blender in the same way as joints. They are
284 (vals (retina-sensor-profile eye)))] 290 (vals (retina-sensor-profile eye)))]
285 [(apply max (map first dimensions)) 291 [(apply max (map first dimensions))
286 (apply max (map second dimensions))])) 292 (apply max (map second dimensions))]))
287 #+end_src 293 #+end_src
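As a quick illustration of the calculation above, with made-up retina dimensions, the result is the largest width and largest height found across all of the eye's sensor images:

#+begin_src clojure
;; made-up [width height] pairs for three retina sensor images
(let [dimensions [[512 512] [256 128] [64 64]]]
  [(apply max (map first dimensions))
   (apply max (map second dimensions))])
;; => [512 512]
#+end_src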
288 294
289 * Putting it all together: Importing and parsing descriptions of eyes. 295 * Importing and parsing descriptions of eyes.
290 First off, get the children of the "eyes" empty node to find all the 296 First off, get the children of the "eyes" empty node to find all the
291 eyes the creature has. 297 eyes the creature has.
292 #+name: eye-node 298 #+name: eye-node
293 #+begin_src clojure 299 #+begin_src clojure
294 (defvar 300 (defvar
411 The in-game display can be disrupted by all the viewports that the 417 The in-game display can be disrupted by all the viewports that the
412 functions created by =(vision-kernel)= add. This doesn't affect the 418 functions created by =(vision-kernel)= add. This doesn't affect the
413 simulation or the simulated senses, but can be annoying. 419 simulation or the simulated senses, but can be annoying.
414 =(gen-fix-display)= restores the in-simulation display. 420 =(gen-fix-display)= restores the in-simulation display.
415 421
416 ** Vision! 422 ** The =vision!= function creates sensory probes.
417 423
418 All the hard work has been done; all that remains is to apply 424 All the hard work has been done; all that remains is to apply
419 =(vision-kernel)= to each eye in the creature and gather the results 425 =(vision-kernel)= to each eye in the creature and gather the results
420 into one list of functions. 426 into one list of functions.
421 427
429 concat 435 concat
430 (for [eye (eyes creature)] 436 (for [eye (eyes creature)]
431 (vision-kernel creature eye)))) 437 (vision-kernel creature eye))))
432 #+end_src 438 #+end_src
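Hypothetical usage of the resulting list of functions (the names =vision-fns= and =world= are placeholders, not from this changeset): each element is a sense-function that can be polled every frame.

#+begin_src clojure
;; Sketch: poll every vision sense-function once, gathering the
;; current image data from each eye. `vision-fns` is the list
;; returned by vision!; `world` is the running simulation.
(defn poll-vision
  [vision-fns world]
  (doall (map (fn [sense-fn] (sense-fn world)) vision-fns)))
#+end_src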
433 439
434 ** Visualization of Vision 440 ** Displaying visual data for debugging.
435 441 # Visualization of Vision. Maybe less alliteration would be better.
436 It's vital to have a visual representation for each sense. Here I use 442 It's vital to have a visual representation for each sense. Here I use
437 =(view-sense)= to construct a function that will create a display for 443 =(view-sense)= to construct a function that will create a display for
438 visual data. 444 visual data.
439 445
440 #+name: display 446 #+name: display
688 #+begin_src sh :results silent 694 #+begin_src sh :results silent
689 cd /home/r/proj/cortex/render/worm-vision 695 cd /home/r/proj/cortex/render/worm-vision
690 ffmpeg -r 25 -b 9001k -i out/%07d.png -vcodec libtheora worm-vision.ogg 696 ffmpeg -r 25 -b 9001k -i out/%07d.png -vcodec libtheora worm-vision.ogg
691 #+end_src 697 #+end_src
692 698
699 * Onward!
700 - As a neat bonus, the idea behind simulated vision also enables one
701 to [[../../cortex/html/capture-video.html][capture live video feeds from jMonkeyEngine]].
702 - Now that we have vision, it's time to tackle [[./hearing.org][hearing]].
703
704
705 #+appendix
706
693 * Headers 707 * Headers
694 708
695 #+name: vision-header 709 #+name: vision-header
696 #+begin_src clojure 710 #+begin_src clojure
697 (ns cortex.vision 711 (ns cortex.vision
730 (:import com.jme3.math.Vector3f) 744 (:import com.jme3.math.Vector3f)
731 (:import java.io.File) 745 (:import java.io.File)
732 (:import (com.aurellem.capture Capture RatchetTimer))) 746 (:import (com.aurellem.capture Capture RatchetTimer)))
733 #+end_src 747 #+end_src
734 748
735 * Onward!
736 - As a neat bonus, this idea behind simulated vision also enables one
737 to [[../../cortex/html/capture-video.html][capture live video feeds from jMonkeyEngine]].
738 - Now that we have vision, it's time to tackle [[./hearing.org][hearing]].
739 749
740 * Source Listing 750 * Source Listing
741 - [[../src/cortex/vision.clj][cortex.vision]] 751 - [[../src/cortex/vision.clj][cortex.vision]]
742 - [[../src/cortex/test/vision.clj][cortex.test.vision]] 752 - [[../src/cortex/test/vision.clj][cortex.test.vision]]
743 - [[../src/cortex/video/magick2.clj][cortex.video.magick2]] 753 - [[../src/cortex/video/magick2.clj][cortex.video.magick2]]