* Brainstorming different sensors and effectors.

Every sense that we have should have an effector that changes what
that sense (or others who have that sense) experiences.

** Classic Senses
| Sense                        | Effector                        |
|------------------------------+---------------------------------|
| Vision                       | Variable Coloration             |
| Hearing                      | Speech                          |
| Proprioception               | Movement                        |
| Smell/Taste (Chemoreception) | Pheromones                      |
| Touch                        | Movement / Controllable Texture |
| Acceleration                 | Movement                        |
| Balance (sense gravity)      | Movement                        |

- New Senses/Effectors
  - Levitation
  - Telekinesis
  - control of gravity within a certain radius
  - speed up / slow down time
  - object creation/destruction

- Symbol Sense
  Where objects in the world can be queried for descriptions /
  symbols.

- Symbol Marking
  The ability to mark objects in the world with your own descriptions
  and symbols.

- Vision
  - Distinguish the polarization of light
  - Color
  - Movement

* project ideas
- HACKER for writing muscle-control programs : Presented with a
  low-level muscle-control / sense API, generate higher-level programs
  for accomplishing various stated goals. Example goals might be
  "extend all your fingers", "move your hand into the area with blue
  light", or "decrease the angle of this joint". It would be like
  Sussman's HACKER, except it would operate with much more data in a
  more realistic world. Start off with "calisthenics" to develop
  subroutines over the motor-control API. This would be the "spinal
  cord" of a more intelligent creature.
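  A minimal sketch of what that lowest motor layer could look like: a
  "tape" of recruitment levels, with one entry consumed per muscle per
  timestep. Everything here (the function name, the 0.0--1.0
  recruitment scale, the logging) is a hypothetical Python
  illustration, not the project's actual API.

  #+begin_src python
  # Sketch: a motor "tape" whose entries set fiber-recruitment levels
  # (0.0 = relaxed, 1.0 = fully recruited) for a list of muscles.
  # Hypothetical API -- the real thing would drive a physics simulation.

  def run_motor_tape(tape, muscles, steps):
      """Iterate over the tape, one entry per muscle per timestep.

      tape    -- list of recruitment levels in [0.0, 1.0]
      muscles -- list of muscle names
      steps   -- number of timesteps to simulate
      Returns a log of (step, muscle, recruitment) tuples.
      """
      log = []
      head = 0
      for step in range(steps):
          for muscle in muscles:
              recruitment = tape[head % len(tape)]  # wrap around the tape
              log.append((step, muscle, recruitment))
              head += 1
      return log

  # A two-muscle creature alternating full flexion and relaxation:
  log = run_motor_tape([1.0, 0.0], ["flexor", "extensor"], steps=2)
  # step 0: flexor fully recruited, extensor relaxed;
  # step 1: the same pattern repeats.
  #+end_src

  Wrapping the head around the tape makes any finite tape a cyclic
  motor program, which seems like the simplest substrate such a
  HACKER-style system could search over.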
  The low-level programming code might be a Turing machine that could
  develop programs to iterate over a "tape", where each entry in the
  tape controls the recruitment of fibers in a muscle.
- Make a virtual computer in the virtual world with which the creature
  interacts using its fingers to press keys on a virtual keyboard. The
  creature can access the internet, watch videos, take over the world
  -- anything it wants.
- Make virtual instruments like pianos, drums, etc. that it learns to
  play.
- Make a joint that figures out what type of joint it is (range of
  motion).

* goals

** have to get done before winston
- [ ] write an explanation of why greyscale bitmaps are appropriate
  for the senses -- 1/2 day
- [X] muscle control -- day
- [X] proprioception sensor map in the style of the other senses -- day
- [X] refactor integration code to distribute to each of the senses
  -- day
- [ ] create video showing all the senses for Winston -- 2 days
- [ ] send package to friends for critiques -- 2 days
- [ ] write summary of project for Winston            \
- [ ] project proposals for Winston                    \
- [ ] additional senses to be implemented for Winston  | -- 2 days
- [ ] send Winston package                            /

** would be cool to get done before winston
- [X] enable greyscale bitmaps for touch -- 2 hours
- [X] use sawfish to auto-tile sense windows -- 6 hours
- [X] sawfish keybinding to automatically delete all sense windows
- [ ] directly change the UV-pixels to show sensor activation -- 2 days
- [ ] proof-of-concept C sense manipulation -- 2 days
- [ ] proof-of-concept GPU sense manipulation -- week
- [ ] fourier view of sound -- 2 or 3 days
- [ ] dancing music listener -- 1 day, depends on fourier

** don't have to get done before winston
- [ ] write tests for integration -- 3 days
- [ ] usertime/gametime clock HUD display -- day
- [ ] find papers for each of the senses justifying my own
  representation -- week
- [ ] show sensor maps in HUD display? -- 4 days
- [ ] show sensor maps in AWT display? -- 2 days

* refactoring objectives
- [X] consistent, high-quality names
- [X] joint-creation function in the style of the others; kill
  blender-creature
- [X] docstrings for every function
- [X] common image-loading code
- [X] refactor display/debug code
- [X] refactor the "get the XX nodes" functions

These are the sense functions --- they each take a Node which
represents the creature:

- [X] body!
- [X] joints!
- [X] touch!
- [X] hearing!
- [X] vision!
- [X] proprioception!
- [X] movement!

;;In the elder days of Art,
;;Builders wrought with greatest care
;;Each minute and unseen part;
;;For the Gods see everywhere.
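* postscript: fourier view sketch

The "fourier view of sound" goal above is concrete enough to sketch.
Below is a minimal, dependency-free Python illustration of the idea
using a naive DFT; a real implementation would presumably use an FFT
over the simulation's audio buffers, and the function name and
windowing here are assumptions, not the project's code.

#+begin_src python
import cmath
import math

def fourier_view(samples, sample_rate):
    """Naive DFT magnitude spectrum (O(n^2)); fine for tiny windows.

    Returns (frequencies_in_hz, magnitudes) for the n // 2 + 1
    non-negative frequency bins of a real-valued window.
    """
    n = len(samples)
    freqs, mags = [], []
    for k in range(n // 2 + 1):
        coeff = sum(samples[j] * cmath.exp(-2j * math.pi * k * j / n)
                    for j in range(n))
        freqs.append(k * sample_rate / n)
        mags.append(abs(coeff))
    return freqs, mags

# A pure 4 Hz tone sampled at 64 Hz for one second peaks in the 4 Hz bin:
tone = [math.sin(2 * math.pi * 4 * j / 64) for j in range(64)]
freqs, mags = fourier_view(tone, 64)
peak_hz = freqs[mags.index(max(mags))]  # 4.0
#+end_src

The "dancing music listener" could then track the loudest bin (or the
energy in a band) over successive windows and drive movement from it.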