comparison org/sense.org @ 201:1c915cc1118b
minor edits for sense.org
| author   | Robert McIntyre <rlm@mit.edu>   |
|----------|---------------------------------|
| date     | Tue, 07 Feb 2012 07:13:45 -0700 |
| parents  | 7eb966144dad                    |
| children | d5c597a7aed4                    |
| 200:7eb966144dad | 201:1c915cc1118b |
|------------------|------------------|
73 | 73 |
74 ** UV-maps | 74 ** UV-maps |
75 | 75 |
76 Blender and jMonkeyEngine already have support for exactly this sort | 76 Blender and jMonkeyEngine already have support for exactly this sort |
77 of data structure because it is used to "skin" models for games. It is | 77 of data structure because it is used to "skin" models for games. It is |
78 called [[http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV][UV-mapping]]. The three-dimensional surface is cut and smooshed | 78 called [[http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV][UV-mapping]]. The three-dimensional surface of a model is cut |
79 until it fits on a two-dimensional image. You paint whatever you want | 79 and smooshed until it fits on a two-dimensional image. You paint |
80 on that image, and when the three-dimensional shape is rendered in a | 80 whatever you want on that image, and when the three-dimensional shape |
81 game that image the smooshing and cutting is reversed and the image | 81 is rendered in a game the smooshing and cutting is reversed and the |
82 appears on the three-dimensional object. | 82 image appears on the three-dimensional object. |
83 | 83 |
84 To make a sense, interpret the UV-image as describing the distribution | 84 To make a sense, interpret the UV-image as describing the distribution |
85 of that sense's sensors. To get different types of sensors, you can | 85 of that sense's sensors. To get different types of sensors, you can |
86 either use a different color for each type of sensor, or use multiple | 86 either use a different color for each type of sensor, or use multiple |
87 UV-maps, each labeled with that sensor type. I generally use a white | 87 UV-maps, each labeled with that sensor type. I generally use a white |
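The next hunk introduces the project's own =(load-image)= and =(filter-pixels)=, which load the image through jMonkeyEngine's asset-manager and pick out the white pixels. As a rough, self-contained sketch of the same idea using plain =javax.imageio= instead (the function name =white-pixel-coordinates= is made up for illustration):

#+begin_src clojure
;; Illustrative sketch only; the project's real =load-image= and
;; =filter-pixels= (shown below) go through the jME asset-manager.
(import '(javax.imageio ImageIO)
        '(java.io File))

(defn white-pixel-coordinates
  "Return the [x y] coordinates of every pure-white pixel in the
   image file at path.  Each coordinate marks one sensor."
  [path]
  (let [image (ImageIO/read (File. path))]
    (vec
     (for [x (range (.getWidth image))
           y (range (.getHeight image))
           :when (= 0xFFFFFF (bit-and 0xFFFFFF (.getRGB image x y)))]
       [x y]))))
#+end_src

Testing for colors other than pure white in the =:when= clause is one way to carry several sensor types on a single UV-map, as the paragraph above suggests.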
109 | 109 |
110 The following code loads images and gets the locations of the white | 110 The following code loads images and gets the locations of the white |
111 pixels so that they can be used to create senses. =(load-image)= finds | 111 pixels so that they can be used to create senses. =(load-image)= finds |
112 images using jMonkeyEngine's asset-manager, so the image path is | 112 images using jMonkeyEngine's asset-manager, so the image path is |
113 expected to be relative to the =assets= directory. Thanks to Dylan | 113 expected to be relative to the =assets= directory. Thanks to Dylan |
114 for the beautiful version of filter-pixels. | 114 for the beautiful version of =(filter-pixels)=. |
115 | 115 |
116 #+name: topology-1 | 116 #+name: topology-1 |
117 #+begin_src clojure | 117 #+begin_src clojure |
118 (defn load-image | 118 (defn load-image |
119 "Load an image as a BufferedImage using the asset-manager system." | 119 "Load an image as a BufferedImage using the asset-manager system." |
350 (if-let [sense-node (.getChild creature parent-name)] | 350 (if-let [sense-node (.getChild creature parent-name)] |
351 (seq (.getChildren sense-node)) | 351 (seq (.getChildren sense-node)) |
352 (do (println-repl "could not find" parent-name "node") [])))) | 352 (do (println-repl "could not find" parent-name "node") [])))) |
353 | 353 |
354 (defn closest-node | 354 (defn closest-node |
355 "Return the node in creature which is closest to the given node." | 355 "Return the physical node in creature which is closest to the given |
| 356 node." |
356 [#^Node creature #^Node empty] | 357 [#^Node creature #^Node empty] |
357 (loop [radius (float 0.01)] | 358 (loop [radius (float 0.01)] |
358 (let [results (CollisionResults.)] | 359 (let [results (CollisionResults.)] |
359 (.collideWith | 360 (.collideWith |
360 creature | 361 creature |
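The =closest-node= hunk above is cut off by the diff at this point. The expanding-radius idea it is built on (grow a bounding volume around the empty node until it hits some geometry of the creature) can be sketched roughly as follows; =closest-geometry-sketch= is a hypothetical name, and this illustrates the technique rather than the changeset's actual body.

#+begin_src clojure
;; Rough sketch of an expanding-radius collision search; not the
;; changeset's exact implementation of =closest-node=.
(import '(com.jme3.scene Node)
        '(com.jme3.bounding BoundingBox)
        '(com.jme3.collision CollisionResults))

(defn closest-geometry-sketch
  "Grow a bounding box centered on the empty node until it collides
   with some part of the creature, then return the closest geometry
   that was hit."
  [#^Node creature #^Node empty]
  (loop [radius (float 0.01)]
    (let [results (CollisionResults.)]
      (.collideWith creature
                    (BoundingBox. (.getWorldTranslation empty)
                                  radius radius radius)
                    results)
      (if (zero? (.size results))
        (recur (float (* 2 radius)))
        (.getGeometry (.getClosestCollision results))))))
#+end_src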
423 java.io.File | 424 java.io.File |
424 (com.jme3.math Vector3f ColorRGBA) | 425 (com.jme3.math Vector3f ColorRGBA) |
425 (com.aurellem.capture RatchetTimer Capture))) | 426 (com.aurellem.capture RatchetTimer Capture))) |
426 | 427 |
427 (defn test-bind-sense | 428 (defn test-bind-sense |
428 "Show a camera that stays in the same relative position to a blue cube." | 429 "Show a camera that stays in the same relative position to a blue |
| 430 cube." |
429 [] | 431 [] |
430 (let [camera-pos (Vector3f. 0 30 0) | 432 (let [eye-pos (Vector3f. 0 30 0) |
431 rock (box 1 1 1 :color ColorRGBA/Blue | 433 rock (box 1 1 1 :color ColorRGBA/Blue |
432 :position (Vector3f. 0 10 0) | 434 :position (Vector3f. 0 10 0) |
433 :mass 30) | 435 :mass 30) |
434 rot (.getWorldRotation rock) | |
435 table (box 3 1 10 :color ColorRGBA/Gray :mass 0 | 436 table (box 3 1 10 :color ColorRGBA/Gray :mass 0 |
436 :position (Vector3f. 0 -3 0))] | 437 :position (Vector3f. 0 -3 0))] |
437 (world | 438 (world |
438 (nodify [rock table]) | 439 (nodify [rock table]) |
439 standard-debug-controls | 440 standard-debug-controls |
440 (fn [world] | 441 (fn init [world] |
441 (let | 442 (let [cam (doto (.clone (.getCamera world)) |
442 [cam (doto (.clone (.getCamera world)) | 443 (.setLocation eye-pos) |
443 (.setLocation camera-pos) | 444 (.lookAt Vector3f/ZERO |
444 (.lookAt Vector3f/ZERO | 445 Vector3f/UNIT_X))] |
445 Vector3f/UNIT_X))] | |
446 (bind-sense rock cam) | 446 (bind-sense rock cam) |
447 (.setTimer world (RatchetTimer. 60)) | 447 (.setTimer world (RatchetTimer. 60)) |
448 (Capture/captureVideo | 448 (Capture/captureVideo |
449 world (File. "/home/r/proj/cortex/render/bind-sense0")) | 449 world (File. "/home/r/proj/cortex/render/bind-sense0")) |
450 (add-camera! | 450 (add-camera! |
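The excerpt stops in the middle of =test-bind-sense=. The =bind-sense= it exercises is defined elsewhere in sense.org; its general idea is to attach a jME3 =AbstractControl= to the spatial and re-derive the camera's position and orientation from the spatial's world transform every frame, preserving the initial offset. A rough sketch of that idea (=bind-sense-sketch= is a made-up name, not the project's definition):

#+begin_src clojure
;; Illustrative sketch of the bind-sense idea; not the project's
;; actual definition.
(import '(com.jme3.scene Spatial)
        '(com.jme3.scene.control AbstractControl)
        '(com.jme3.renderer Camera)
        '(com.jme3.math Quaternion))

(defn bind-sense-sketch
  "Keep cam at its current offset and orientation relative to obj as
   obj moves around the world."
  [#^Spatial obj #^Camera cam]
  (let [offset       (.subtract (.getLocation cam)
                                (.getWorldTranslation obj))
        cam-rotation (Quaternion. (.getRotation cam))
        anti-rotate  (.inverse (.getWorldRotation obj))]
    (.addControl
     obj
     (proxy [AbstractControl] []
       ;; each frame, compute how obj has rotated since binding and
       ;; apply the same change to the camera's offset and rotation
       (controlUpdate [tpf]
         (let [rotation (.mult (.getWorldRotation obj) anti-rotate)]
           (.setLocation cam (.add (.getWorldTranslation obj)
                                   (.mult rotation offset)))
           (.setRotation cam (.mult rotation cam-rotation))))
       (controlRender [_ _])))))
#+end_src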