* Brainstorming different sensors and effectors

Every sense that we have should have an effector that changes what
that sense (or others who have that sense) experiences.

** Classic Senses
| Sense                        | Effector                        |
|------------------------------+---------------------------------|
| Vision                       | Variable Coloration             |
| Hearing                      | Speech                          |
| Proprioception               | Movement                        |
| Smell/Taste (Chemoreception) | Pheromones                      |
| Touch                        | Movement / Controllable Texture |
| Acceleration                 | Movement                        |
| Balance (sense of gravity)   | Movement                        |

- New Senses/Effectors
  - Levitation
  - Telekinesis
  - Control of gravity within a certain radius
  - Speed up / slow down time
  - Object creation/destruction

- Symbol Sense
  Objects in the world can be queried for their descriptions /
  symbols (see the sketch after this list).

- Symbol Marking
  The ability to mark objects in the world with your own descriptions
  and symbols.

- Vision
  - Distinguish the polarization of light
  - Color
  - Movement

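Purely as an illustration of the Symbol Sense / Symbol Marking idea, here is a
minimal sketch of how a queryable/markable object might look.  Python is used
only for brevity, and every name in it (WorldObject, query_symbols,
mark_symbol) is an assumption for this sketch, not existing project code.

#+begin_src python
# Hypothetical sketch: every world object carries a symbol store that a
# creature can read (Symbol Sense) and append to (Symbol Marking).

class WorldObject:
    def __init__(self, name):
        self.name = name
        self._symbols = []           # descriptions attached to this object

    def query_symbols(self):
        """Symbol Sense: return all descriptions attached to this object."""
        return list(self._symbols)

    def mark_symbol(self, author, description):
        """Symbol Marking: attach a new description, tagged by its author."""
        self._symbols.append((author, description))

# Example: one creature marks a rock, another queries it later.
rock = WorldObject("rock-17")
rock.mark_symbol("creature-a", "too heavy to lift")
print(rock.query_symbols())   # => [('creature-a', 'too heavy to lift')]
#+end_src
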
* project ideas
- HACKER for writing muscle-control programs: Presented with a
  low-level muscle-control / sense API, generate higher-level programs
  for accomplishing various stated goals.  Example goals might be
  "extend all your fingers", "move your hand into the area with
  blue light", or "decrease the angle of this joint".  It would be
  like Sussman's HACKER, except it would operate with much more data
  in a more realistic world.  Start off with "calisthenics" to
  develop subroutines over the motor-control API.  This would be the
  "spinal cord" of a more intelligent creature.  The low-level
  programming code might be a Turing machine that develops programs
  to iterate over a "tape", where each entry in the tape controls
  recruitment of the fibers in a muscle (see the first sketch after
  this list).
- Make a virtual computer in the virtual world with which the
  creature interacts by using its fingers to press keys on a virtual
  keyboard.  The creature can access the internet, watch videos, take
  over the world, anything it wants.
- Make virtual instruments like pianos, drums, etc. that it learns to
  play.
- Make a joint that figures out what type of joint it is from its
  range of motion (see the second sketch after this list).
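
First sketch, for the HACKER muscle-tape idea: a toy interpreter that walks a
circular tape whose entries each recruit a fraction of one muscle's fibers.
This only illustrates the idea; Muscle, run_tape, and the rest are invented
names, not part of any existing motor-control API.

#+begin_src python
# Hypothetical sketch only -- none of these names exist in the project.
from dataclasses import dataclass

@dataclass
class Muscle:
    total_fibers: int
    recruited: int = 0                    # fibers currently contracting

    def recruit(self, fraction):
        """Contract the given fraction of this muscle's fibers."""
        fraction = max(0.0, min(1.0, fraction))
        self.recruited = int(self.total_fibers * fraction)

    def force(self):
        """Assume output force is proportional to recruited fibers."""
        return self.recruited / self.total_fibers

def run_tape(tape, muscles, steps):
    """Walk the tape like a circular Turing-machine tape: each entry
    (muscle-index, fraction) sets recruitment for one muscle per step."""
    head = 0
    for _ in range(steps):
        muscle_index, fraction = tape[head]
        muscles[muscle_index].recruit(fraction)
        head = (head + 1) % len(tape)
    return [m.force() for m in muscles]

# Example: two muscles driven by a four-entry tape.
biceps, triceps = Muscle(100), Muscle(80)
tape = [(0, 1.0), (1, 0.0), (0, 0.5), (1, 1.0)]
print(run_tape(tape, [biceps, triceps], steps=4))   # => [0.5, 1.0]
#+end_src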
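Second sketch, for the joint-identification idea: classify a joint from its
measured range of motion about each axis.  Again an assumed interface; a real
version would probe the physics engine for the ranges rather than taking them
as a dictionary.

#+begin_src python
# Hypothetical sketch only: infer a joint's type from its measured
# range of motion about each axis (in degrees).

def classify_joint(range_of_motion, threshold=5.0):
    """range_of_motion maps axis name -> rotation range in degrees.
    Axes with a range below `threshold` are treated as locked."""
    free_axes = [axis for axis, degrees in range_of_motion.items()
                 if degrees > threshold]
    if not free_axes:
        return "fixed"
    if len(free_axes) == 1:
        return "hinge ({} axis)".format(free_axes[0])
    if len(free_axes) == 2:
        return "universal"
    return "ball-and-socket"

# Example: an elbow-like joint only rotates meaningfully about one axis.
print(classify_joint({"x": 150.0, "y": 2.0, "z": 1.0}))   # => hinge (x axis)
print(classify_joint({"x": 90.0, "y": 90.0, "z": 90.0}))  # => ball-and-socket
#+end_src
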
* goals

** have to get done before winston
- [ ] write an explanation for why greyscale bitmaps for senses are
      appropriate -- 1/2 day
- [X] muscle control -- day
- [X] proprioception sensor map in the style of the other senses -- day
- [ ] refactor integration code to distribute to each of the senses
      -- day
- [ ] create video showing all the senses for Winston -- 2 days
- [ ] send package to friends for critiques -- 2 days
- [ ] write summary of project for Winston             \
- [ ] project proposals for Winston                     \
- [ ] additional senses to be implemented for Winston   | -- 2 days
- [ ] send Winston package                              /

** would be cool to get done before winston
- [X] enable greyscale bitmaps for touch -- 2 hours
- [X] use sawfish to auto-tile sense windows -- 6 hours
- [X] sawfish keybinding to automatically delete all sense windows
- [ ] directly change the UV-pixels to show sensor activation -- 2 days
- [ ] proof of concept C sense manipulation -- 2 days
- [ ] proof of concept GPU sense manipulation -- week
- [ ] Fourier view of sound -- 2 or 3 days
- [ ] dancing music listener -- 1 day, depends on the Fourier view

** don't have to get done before winston
- [ ] write tests for integration -- 3 days
- [ ] usertime/gametime clock HUD display -- day
- [ ] find papers for each of the senses justifying my own
      representation -- week
- [ ] show sensor maps in HUD display? -- 4 days
- [ ] show sensor maps in AWT display? -- 2 days

* refactoring objectives
- [ ] consistent, high-quality names
- [ ] joint-creation function in the style of others, kill blender-creature
- [ ] docstrings for every function
- [ ] common image-loading code
- [ ] refactor display/debug code


for each sense,