# HG changeset patch
# User Robert McIntyre
# Date 1328274270 25200
# Node ID 95bf556142116917946f069b48b321eb341bb985
# Parent  bb235258f8357d81f5e8072ad56c418dfe422431
moved goals and deleted old test

diff -r bb235258f835 -r 95bf55614211 org/ideas.org
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/org/ideas.org	Fri Feb 03 06:04:30 2012 -0700
@@ -0,0 +1,99 @@
+* Brainstorming different sensors and effectors.
+
+Every sense that we have should have an effector that changes what
+that sense (or others who have that sense) experiences.
+
+** Classic Senses
+| Sense                        | Effector                         |
+|------------------------------+---------------------------------|
+| Vision                       | Variable Coloration              |
+| Hearing                      | Speech                           |
+| Proprioception               | Movement                         |
+| Smell/Taste (Chemoreception) | Pheromones                       |
+| Touch                        | Movement / Controllable Texture  |
+| Acceleration                 | Movement                         |
+| Balance (sense gravity)      | Movement                         |
+|                              |                                  |
+
+- New Senses/Effectors
+- Levitation
+- Telekinesis
+- control of gravity within a certain radius
+- speed up / slow down time
+- object creation/destruction
+
+- Symbol Sense
+  Where objects in the world can be queried for description /
+  symbols.
+
+- Symbol Marking
+  The ability to mark objects in the world with your own descriptions
+  and symbols.
+
+- Vision
+  Distinguish the polarization of light
+  Color
+  Movement
+
+* project ideas
+ - HACKER for writing muscle-control programs: Presented with a
+   low-level muscle-control / sense API, generate higher-level programs
+   for accomplishing various stated goals. Example goals might be
+   "extend all your fingers" or "move your hand into the area with
+   blue light" or "decrease the angle of this joint". It would be
+   like Sussman's HACKER, except it would operate with much more data
+   in a more realistic world. Start off with "calisthenics" to
+   develop subroutines over the motor-control API. This would be the
+   "spinal cord" of a more intelligent creature. The low-level
+   programming code might be a Turing machine that could develop
+   programs to iterate over a "tape" where each entry in the tape
+   could control recruitment of the fibers in a muscle (see the
+   sketch after this patch).
+ - Make a virtual computer in the virtual world with which the
+   creature interacts using its fingers to press keys on a virtual
+   keyboard. The creature can access the internet, watch videos, take
+   over the world, anything it wants.
+ - Make virtual instruments like pianos, drums, etc. that it learns
+   to play.
+ - make a joint that figures out what type of joint it is (range of
+   motion)
+
+
+
+
+
+* goals
+
+** have to get done before winston
+ - [ ] write an explanation for why greyscale bitmaps for senses are
+   appropriate -- 1/2 day
+ - [X] muscle control -- day
+ - [X] proprioception sensor map in the style of the other senses -- day
+ - [ ] refactor integration code to distribute to each of the senses
+   -- day
+ - [ ] create video showing all the senses for Winston -- 2 days
+ - [ ] send package to friends for critiques -- 2 days
+ - [ ] write summary of project for Winston            \
+ - [ ] project proposals for Winston                   \
+ - [ ] additional senses to be implemented for Winston | -- 2 days
+ - [ ] send Winston package                            /
+
+** would be cool to get done before winston
+ - [X] enable greyscale bitmaps for touch -- 2 hours
+ - [X] use sawfish to auto-tile sense windows -- 6 hours
+ - [X] sawfish keybinding to automatically delete all sense windows
+ - [ ] directly change the UV-pixels to show sensor activation -- 2
+   days
+ - [ ] proof of concept C sense manipulation -- 2 days
+ - [ ] proof of concept GPU sense manipulation -- week
+ - [ ] Fourier view of sound -- 2 or 3 days
+ - [ ] dancing music listener -- 1 day, depends on Fourier
+
+** don't have to get done before winston
+ - [ ] write tests for integration -- 3 days
+ - [ ] usertime/gametime clock HUD display -- day
+ - [ ] find papers for each of the senses justifying my own
+   representation -- week
+ - [ ] show sensor maps in HUD display? -- 4 days
+ - [ ] show sensor maps in AWT display? -- 2 days
+
diff -r bb235258f835 -r 95bf55614211 org/test-creature.org
--- a/org/test-creature.org	Fri Feb 03 06:01:22 2012 -0700
+++ b/org/test-creature.org	Fri Feb 03 06:04:30 2012 -0700
@@ -8,104 +8,6 @@
-* Brainstorming different sensors and effectors.
-
-Every sense that we have should have an effector that changes what
-that sense (or others who have that sense) experiences.
-
-** Classic Senses
-| Sense                        | Effector                         |
-|------------------------------+---------------------------------|
-| Vision                       | Variable Coloration              |
-| Hearing                      | Speech                           |
-| Proprioception               | Movement                         |
-| Smell/Taste (Chemoreception) | Pheremones                       |
-| Touch                        | Movement / Controllable Texture  |
-| Acceleration                 | Movement                         |
-| Balance (sense gravity)      | Movement                         |
-|                              |                                  |
-
-- New Senses/Effectors
-- Levitation
-- Telekenesis
-- control of gravity within a certain radius
-- speed up/slow time
-- object creation/destruction
-
-- Symbol Sense
-  Where objects in the world can be queried for description /
-  symbols.
-
-- Symbol Marking
-  The ability to mark objects in the world with your own descriptions
-  and symbols.
-
-- Vision
-  Distinguish the polarization of light
-  Color
-  Movement
-
-* project ideas
- - HACKER for writing muscle-control programs : Presented with
-   low-level muscle control/ sense API, generate higher level programs
-   for accomplishing various stated goals. Example goals might be
-   "extend all your fingers" or "move your hand into the area with
-   blue light" or "decrease the angle of this joint". It would be
-   like Sussman's HACKER, except it would operate with much more data
-   in a more realistic world. Start off with "calestanthics" to
-   develop subrouitines over the motor control API. This would be the
-   "spinal chord" of a more intelligent creature. The low level
-   programming code might be a turning machine that could develop
-   programs to iterate over a "tape" where each entry in the tape
-   could control recruitment of the fibers in a muscle.
- - Make a virtual computer in the virtual world which with which the
-   creature interacts using its fingers to press keys on a virtual
-   keyboard. The creature can access the internet, watch videos, take
-   over the world, anything it wants.
- - Make virtual insturments like pianos, drumbs, etc that it learns to
-   play.
- - make a joint that figures out what type of joint it is (range of
-   motion)
-
-
-
-
-
-* goals
-
-** have to get done before winston
- - [ ] write an explination for why greyscale bitmaps for senses is
-   appropiate -- 1/2 day
- - [X] muscle control -- day
- - [X] proprioception sensor map in the style of the other senses -- day
- - [ ] refactor integration code to distribute to each of the senses
-   -- day
- - [ ] create video showing all the senses for Winston -- 2 days
- - [ ] send package to friends for critiques -- 2 days
- - [ ] write summary of project for Winston \
- - [ ] project proposals for Winston \
- - [ ] additional senses to be implemented for Winston | -- 2 days
- - [ ] send Winston package /
-
-** would be cool to get done before winston
- - [X] enable greyscale bitmaps for touch -- 2 hours
- - [X] use sawfish to auto-tile sense windows -- 6 hours
- - [X] sawfish keybinding to automatically delete all sense windows
- - [ ] directly change the UV-pixels to show sensor activation -- 2
-   days
- - [ ] proof of concept C sense manipulation -- 2 days
- - [ ] proof of concept GPU sense manipulation -- week
- - [ ] fourier view of sound -- 2 or 3 days
- - [ ] dancing music listener -- 1 day, depends on fourier
-
-** don't have to get done before winston
- - [ ] write tests for integration -- 3 days
- - [ ] usertime/gametime clock HUD display -- day
- - [ ] find papers for each of the senses justifying my own
-   representation -- week
- - [ ] show sensor maps in HUD display? -- 4 days
- - [ ] show sensor maps in AWT display? -- 2 days
-
 * Intro
 So far, I've made the following senses --
@@ -934,64 +836,6 @@
 ;;	    (println-repl (float (/ @timer 60))))))
 	   ))))
-
-
-
-
-
-
-
-
-;;; experiments in collisions
-
-
-
-(defn collision-test []
-  (let [b-radius 1
-        b-position (Vector3f. 0 0 0)
-        obj-b (box 1 1 1 :color ColorRGBA/Blue
-                   :position b-position
-                   :mass 0)
-        node (nodify [obj-b])
-        bounds-b
-        (doto (Picture.)
-          (.setHeight 50)
-          (.setWidth 50)
-          (.setImage (asset-manager)
-                     "Models/creature1/hand.png"
-                     false
-                     ))
-
-        ;;(Ray. (Vector3f. 0 -5 0) (.normalize (Vector3f. 0 1 0)))
-
-        collisions
-        (let [cr (CollisionResults.)]
-          (.collideWith node bounds-b cr)
-          (println (map #(.getContactPoint %) cr))
-          cr)
-
-        ;;collision-points
-        ;;(map #(sphere 0.1 :position (.getContactPoint %))
-        ;;     collisions)
-
-        ;;node (nodify (conj collision-points obj-b))
-
-        sim
-        (world node
-               {"key-space"
-                (fn [_ value]
-                  (if value
-                    (let [cr (CollisionResults.)]
-                      (.collideWith node bounds-b cr)
-                      (println-repl (map #(.getContactPoint %) cr))
-                      cr)))}
-               no-op
-               no-op)
-
-        ]
-    sim
-
-))
 ;; the camera will stay in its initial position/rotation with relation
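
As a concrete illustration of the motor "tape" idea from the HACKER
entry in org/ideas.org: the sketch below is one minimal, hypothetical
reading of it, not code from this repository. It assumes a stand-in
effector function (here called recruit-fibers!) in place of the
project's real muscle API; every name in it is illustrative only.

#+begin_src clojure
;; A motor "tape" is a vector of steps; each step names a muscle and
;; how many of its fibers to recruit on that step.  The effector is
;; passed in as a function, so the sketch runs against a plain stub.
(defn run-motor-tape
  "Iterate over tape, a vector of {:muscle k :fibers n} entries,
  recruiting the requested number of fibers at each step."
  [recruit-fibers! tape]
  (doseq [{:keys [muscle fibers]} tape]
    (recruit-fibers! muscle fibers)))

;; Example: a tiny "calisthenics" routine for an antagonistic pair,
;; run with a stub effector that just reports what it would do.
(run-motor-tape
 (fn [muscle n] (println "recruit" n "fibers of" muscle))
 [{:muscle :flexor   :fibers 10}
  {:muscle :flexor   :fibers  0}
  {:muscle :extensor :fibers 10}])
#+end_src

A HACKER-style program generator would then search over such tapes,
scoring each by whether the resulting movement satisfies a stated
goal like "decrease the angle of this joint".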