Mercurial repository: cortex
changeset 155:95bf55614211
moved goals and deleted old test
author    Robert McIntyre <rlm@mit.edu>
date      Fri, 03 Feb 2012 06:04:30 -0700
parents   bb235258f835
children  e8df6e76c3e5
files     org/ideas.org org/test-creature.org
diffstat  2 files changed, 98 insertions(+), 156 deletions(-)
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/org/ideas.org	Fri Feb 03 06:04:30 2012 -0700
@@ -0,0 +1,98 @@
+* Brainstorming different sensors and effectors.
+
+Every sense that we have should have an effector that changes what
+that sense (or others who have that sense) experiences.
+
+** Classic Senses
+| Sense                        | Effector                        |
+|------------------------------+---------------------------------|
+| Vision                       | Variable Coloration             |
+| Hearing                      | Speech                          |
+| Proprioception               | Movement                        |
+| Smell/Taste (Chemoreception) | Pheromones                      |
+| Touch                        | Movement / Controllable Texture |
+| Balance (sense of gravity)   | Movement                        |
+| Acceleration                 | Movement                        |
+|                              |                                 |
+
+- New Senses/Effectors
+- Levitation
+- Telekinesis
+- control of gravity within a certain radius
+- speed up/slow time
+- object creation/destruction
+
+- Symbol Sense
+  Where objects in the world can be queried for descriptions /
+  symbols.
+
+- Symbol Marking
+  The ability to mark objects in the world with your own descriptions
+  and symbols.
+
+- Vision
+  Distinguish the polarization of light
+  Color
+  Movement
+
+* project ideas
+ - HACKER for writing muscle-control programs : Presented with a
+   low-level muscle-control / sense API, generate higher-level programs
+   for accomplishing various stated goals. Example goals might be
+   "extend all your fingers" or "move your hand into the area with
+   blue light" or "decrease the angle of this joint". It would be
+   like Sussman's HACKER, except it would operate with much more data
+   in a more realistic world. Start off with "calisthenics" to
+   develop subroutines over the motor-control API. This would be the
+   "spinal cord" of a more intelligent creature. The low-level
+   programming code might be a Turing machine that could develop
+   programs to iterate over a "tape" where each entry in the tape
+   could control recruitment of the fibers in a muscle.
+ - Make a virtual computer in the virtual world with which the
+   creature interacts using its fingers to press keys on a virtual
+   keyboard. The creature can access the internet, watch videos, take
+   over the world, anything it wants.
+ - Make virtual instruments like pianos, drums, etc. that it learns to
+   play.
+ - make a joint that figures out what type of joint it is (range of
+   motion)
+
+
+
+
+
+* goals
+
+** have to get done before Winston
+ - [ ] write an explanation for why greyscale bitmaps for senses are
+       appropriate -- 1/2 day
+ - [X] muscle control -- day
+ - [X] proprioception sensor map in the style of the other senses -- day
+ - [ ] refactor integration code to distribute to each of the senses
+       -- day
+ - [ ] create video showing all the senses for Winston -- 2 days
+ - [ ] send package to friends for critiques -- 2 days
+ - [ ] write summary of project for Winston              \
+ - [ ] project proposals for Winston                      \
+ - [ ] additional senses to be implemented for Winston    | -- 2 days
+ - [ ] send Winston package                               /
+
+** would be cool to get done before Winston
+ - [X] enable greyscale bitmaps for touch -- 2 hours
+ - [X] use sawfish to auto-tile sense windows -- 6 hours
+ - [X] sawfish keybinding to automatically delete all sense windows
+ - [ ] directly change the UV-pixels to show sensor activation -- 2
+       days
+ - [ ] proof of concept C sense manipulation -- 2 days
+ - [ ] proof of concept GPU sense manipulation -- week
+ - [ ] Fourier view of sound -- 2 or 3 days
+ - [ ] dancing music listener -- 1 day, depends on Fourier
+
+** don't have to get done before Winston
+ - [ ] write tests for integration -- 3 days
+ - [ ] usertime/gametime clock HUD display -- day
+ - [ ] find papers for each of the senses justifying my own
+       representation -- week
+ - [ ] show sensor maps in HUD display? -- 4 days
+ - [ ] show sensor maps in AWT display? -- 2 days
+
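As a rough illustration of the "tape" idea in the project-ideas list above (a Turing-machine-like tape whose entries control how many fibers of a muscle are recruited on each step), a minimal Clojure sketch might look like the following. This is an editorial sketch, not part of the changeset; the names total-fibers, recruit!, and run-tape are hypothetical stand-ins for whatever muscle API cortex eventually exposes.

;; Hypothetical sketch of the muscle-recruitment "tape" idea.
;; None of these names exist in cortex; they are for illustration only.

(def total-fibers
  "Assumed number of fibers in the example muscle."
  100)

(defn recruit!
  "Stand-in effector: 'activate' n fibers by reporting the activation.
   A real version would apply forces in the physics simulation."
  [n]
  (println "activating" n "of" total-fibers "fibers"))

(defn run-tape
  "Iterate over a tape of recruitment levels (percentages of the total
   fiber count), recruiting the corresponding number of fibers per step."
  [tape]
  (doseq [level tape]
    (recruit! (int (* total-fibers (/ level 100.0))))))

;; Example: ramp the muscle up to full recruitment and back down.
;; (run-tape [0 10 25 50 100 50 25 10 0])

A higher-level program in the HACKER sense would then amount to generating and editing such tapes against stated goals, rather than writing them by hand.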
--- a/org/test-creature.org	Fri Feb 03 06:01:22 2012 -0700
+++ b/org/test-creature.org	Fri Feb 03 06:04:30 2012 -0700
@@ -8,104 +8,6 @@
 
 
 
-* Brainstorming different sensors and effectors.
-
-Every sense that we have should have an effector that changes what
-that sense (or others who have that sense) experiences.
-
-** Classic Senses
-| Sense                        | Effector                        |
-|------------------------------+---------------------------------|
-| Vision                       | Variable Coloration             |
-| Hearing                      | Speech                          |
-| Proprioception               | Movement                        |
-| Smell/Taste (Chemoreception) | Pheromones                      |
-| Touch                        | Movement / Controllable Texture |
-| Balance (sense of gravity)   | Movement                        |
-| Acceleration                 | Movement                        |
-|                              |                                 |
-
-- New Senses/Effectors
-- Levitation
-- Telekinesis
-- control of gravity within a certain radius
-- speed up/slow time
-- object creation/destruction
-
-- Symbol Sense
-  Where objects in the world can be queried for descriptions /
-  symbols.
-
-- Symbol Marking
-  The ability to mark objects in the world with your own descriptions
-  and symbols.
-
-- Vision
-  Distinguish the polarization of light
-  Color
-  Movement
-
-* project ideas
- - HACKER for writing muscle-control programs : Presented with a
-   low-level muscle-control / sense API, generate higher-level programs
-   for accomplishing various stated goals. Example goals might be
-   "extend all your fingers" or "move your hand into the area with
-   blue light" or "decrease the angle of this joint". It would be
-   like Sussman's HACKER, except it would operate with much more data
-   in a more realistic world. Start off with "calisthenics" to
-   develop subroutines over the motor-control API. This would be the
-   "spinal cord" of a more intelligent creature. The low-level
-   programming code might be a Turing machine that could develop
-   programs to iterate over a "tape" where each entry in the tape
-   could control recruitment of the fibers in a muscle.
- - Make a virtual computer in the virtual world with which the
-   creature interacts using its fingers to press keys on a virtual
-   keyboard. The creature can access the internet, watch videos, take
-   over the world, anything it wants.
- - Make virtual instruments like pianos, drums, etc. that it learns to
-   play.
- - make a joint that figures out what type of joint it is (range of
-   motion)
-
-
-
-
-
-* goals
-
-** have to get done before Winston
- - [ ] write an explanation for why greyscale bitmaps for senses are
-       appropriate -- 1/2 day
- - [X] muscle control -- day
- - [X] proprioception sensor map in the style of the other senses -- day
- - [ ] refactor integration code to distribute to each of the senses
-       -- day
- - [ ] create video showing all the senses for Winston -- 2 days
- - [ ] send package to friends for critiques -- 2 days
- - [ ] write summary of project for Winston              \
- - [ ] project proposals for Winston                      \
- - [ ] additional senses to be implemented for Winston    | -- 2 days
- - [ ] send Winston package                               /
-
-** would be cool to get done before Winston
- - [X] enable greyscale bitmaps for touch -- 2 hours
- - [X] use sawfish to auto-tile sense windows -- 6 hours
- - [X] sawfish keybinding to automatically delete all sense windows
- - [ ] directly change the UV-pixels to show sensor activation -- 2
-       days
- - [ ] proof of concept C sense manipulation -- 2 days
- - [ ] proof of concept GPU sense manipulation -- week
- - [ ] Fourier view of sound -- 2 or 3 days
- - [ ] dancing music listener -- 1 day, depends on Fourier
-
-** don't have to get done before Winston
- - [ ] write tests for integration -- 3 days
- - [ ] usertime/gametime clock HUD display -- day
- - [ ] find papers for each of the senses justifying my own
-       representation -- week
- - [ ] show sensor maps in HUD display? -- 4 days
- - [ ] show sensor maps in AWT display? -- 2 days
-
 
 * Intro
 So far, I've made the following senses --
@@ -934,64 +836,6 @@
 ;;          (println-repl (float (/ @timer 60))))))
         ))))
 
-
-
-
-
-
-
-
-
-;;; experiments in collisions
-
-
-
-(defn collision-test []
-  (let [b-radius 1
-        b-position (Vector3f. 0 0 0)
-        obj-b (box 1 1 1 :color ColorRGBA/Blue
-                   :position b-position
-                   :mass 0)
-        node (nodify [obj-b])
-        bounds-b
-        (doto (Picture.)
-          (.setHeight 50)
-          (.setWidth 50)
-          (.setImage (asset-manager)
-                     "Models/creature1/hand.png"
-                     false
-                     ))
-
-        ;; (Ray. (Vector3f. 0 -5 0) (.normalize (Vector3f. 0 1 0)))
-
-        collisions
-        (let [cr (CollisionResults.)]
-          (.collideWith node bounds-b cr)
-          (println (map #(.getContactPoint %) cr))
-          cr)
-
-        ;; collision-points
-        ;; (map #(sphere 0.1 :position (.getContactPoint %))
-        ;;      collisions)
-
-        ;; node (nodify (conj collision-points obj-b))
-
-        sim
-        (world node
-               {"key-space"
                (fn [_ value]
                  (if value
                    (let [cr (CollisionResults.)]
                      (.collideWith node bounds-b cr)
                      (println-repl (map #(.getContactPoint %) cr))
                      cr)))}
-               no-op
-               no-op)
-
-        ]
-    sim
-
-))
 
 
 ;; the camera will stay in its initial position/rotation with relation
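For reference, the core of the collision query exercised by the deleted collision-test can be reduced to a few lines. The sketch below is editorial, not part of the changeset; it uses only the jMonkeyEngine calls that appear in the removed code (CollisionResults, collideWith, getContactPoint), and the helper name contact-points is hypothetical.

;; Minimal sketch of the jME3 collision-query pattern used by the
;; removed test.  contact-points is an illustrative name, not a
;; cortex function.

(import '(com.jme3.collision CollisionResults)
        '(com.jme3.math Ray Vector3f))

(defn contact-points
  "Collide `collidable` (a Ray, bounding volume, or other spatial)
   against `node` and return the contact point of every hit."
  [node collidable]
  (let [results (CollisionResults.)]
    (.collideWith node collidable results)
    (map #(.getContactPoint %) results)))

;; Example: cast a ray upward from below the scene and list where it
;; strikes the geometry (`nodify` and `box` are cortex helpers that
;; appear in the removed test above).
;; (contact-points (nodify [(box 1 1 1)])
;;                 (Ray. (Vector3f. 0 -5 0) (Vector3f. 0 1 0)))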