* Brainstorming different sensors and effectors.

** toys
Make toys in the world with "docking points," which would attach
together in physically stable ways. Could make analogs to Legos,
Tinker Toys, and Lincoln Logs.

Every sense that we have should have an effector that changes what
that sense (or others who have that sense) experiences.

** Classic Senses
| Sense                        | Effector                        |
|------------------------------+---------------------------------|
| Vision                       | Variable Coloration             |
| Hearing                      | Speech                          |
| Proprioception               | Movement                        |
| Smell/Taste (Chemoreception) | Pheromones                      |
| Touch                        | Movement / Controllable Texture |
| Acceleration                 | Movement                        |
| Balance (sense gravity)      | Movement                        |

- New Senses/Effectors
  - Levitation
  - Telekinesis
  - control of gravity within a certain radius
  - speed up/slow time
  - object creation/destruction
  - prescience -- step the simulation forward a few ticks, gather
    sensory data, then supply this data to the creature as one of its
    actual senses (see the sketch below).

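Prescience could be prototyped without touching the physics engine at
all if the simulation step were a pure function. A minimal sketch,
assuming hypothetical =step= and =sense= functions of that shape; the
real Bullet/jME world is stateful, so an actual implementation would
need a way to snapshot and restore simulation state:

#+begin_src clojure
(ns cortex.ideas.prescience
  "Sketch: a 'prescience' sense built from an ordinary sense.
   Assumes a *pure* simulation step function, so looking ahead
   does not disturb the real world state.")

(defn prescience
  "Given a pure step function (world -> world) and a sense function
   (world -> sense-data), return a new sense that reports what the
   ordinary sense *will* report n ticks in the future."
  [step sense n]
  (fn [world]
    (sense (nth (iterate step world) n))))

;; Toy example: a ball drifting along one axis.
(def toy-step   (fn [world] (update-in world [:ball :x] + (:dx world))))
(def toy-vision (fn [world] (get-in world [:ball :x])))

(def see-future (prescience toy-step toy-vision 5))

(see-future {:ball {:x 0} :dx 2})  ; => 10, the ball's position 5 ticks ahead
#+end_src

With a snapshot/restore facility, the same =prescience= wrapper would
work on the real simulation.
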
- Symbol Sense
  Objects in the world can be queried for description / symbols.

- Symbol Marking
  The ability to mark objects in the world with your own descriptions
  and symbols. (A small sketch covering both ideas follows.)

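Both of these might fall out of jMonkeyEngine's per-Spatial user data.
A rough sketch, assuming jME3's =Spatial.setUserData= /
=getUserData=, storing symbols as a space-separated string under an
arbitrary "symbols" key; the function names here are made up, not
existing cortex code:

#+begin_src clojure
(ns cortex.ideas.symbols
  "Sketch: symbol marking / symbol sense on top of jMonkeyEngine's
   per-Spatial user data.  The \"symbols\" key is an arbitrary choice."
  (:require [clojure.string :as string])
  (:import (com.jme3.scene Spatial)))

(defn mark!
  "Symbol-marking effector: attach a symbol (a plain string) to an object."
  [^Spatial object symbol]
  (let [old (or (.getUserData object "symbols") "")]
    (.setUserData object "symbols" (string/trim (str old " " symbol)))))

(defn symbols-of
  "Symbol sense: query an object for the symbols it has been marked with."
  [^Spatial object]
  (when-let [data (.getUserData object "symbols")]
    (set (string/split data #"\s+"))))
#+end_src
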
- Vision
  - Distinguish the polarization of light
  - Color
  - Movement

* project ideas
- HACKER for writing muscle-control programs: Presented with a
  low-level muscle-control / sense API, generate higher-level programs
  for accomplishing various stated goals. Example goals might be
  "extend all your fingers" or "move your hand into the area with
  blue light" or "decrease the angle of this joint". It would be
  like Sussman's HACKER, except it would operate with much more data
  in a more realistic world. Start off with "calisthenics" to
  develop subroutines over the motor-control API. This would be the
  "spinal cord" of a more intelligent creature. The low-level
  programming code might be a Turing machine that could develop
  programs to iterate over a "tape" where each entry in the tape
  could control recruitment of the fibers in a muscle. (A tiny
  sketch of the tape idea follows this list.)
- Make a virtual computer in the virtual world with which the
  creature interacts using its fingers to press keys on a virtual
  keyboard. The creature can access the internet, watch videos, take
  over the world, anything it wants.
- Make virtual instruments like pianos, drums, etc. that it learns to
  play.
- Make a joint that figures out what type of joint it is (range of
  motion)

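A toy version of the "tape" layer, assuming a placeholder =actuate!=
effector standing in for whatever low-level muscle API cortex exposes;
everything here is illustrative, not existing code:

#+begin_src clojure
(ns cortex.ideas.muscle-tape
  "Sketch: the lowest level of a HACKER-style motor system -- a 'tape'
   of recruitment levels stepped once per physics tick.")

(defn actuate!
  "Placeholder effector: recruit the given fraction [0,1] of a muscle's
   fibers.  In the real system this would talk to the physics engine."
  [creature muscle recruitment]
  (println muscle "recruited at" recruitment))

(defn tape-program
  "Return a stateful per-tick controller that walks a tape of
   recruitment levels for one muscle, wrapping around at the end."
  [muscle tape]
  (let [head (atom 0)]
    (fn [creature]
      (let [level (nth tape @head)]
        (actuate! creature muscle level)
        (swap! head #(mod (inc %) (count tape)))))))

;; A 'calisthenics' routine: slowly tense, then release, the left bicep.
(def flex (tape-program :bicep-left [0.0 0.25 0.5 0.75 1.0 0.5 0.0]))

;; Called once per simulation tick:
;; (flex creature)
#+end_src
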
* goals

** have to get done before Winston
- [X] write an explanation for why greyscale bitmaps for senses are
  appropriate -- 1/2 day
- [X] muscle control -- day
- [X] proprioception sensor map in the style of the other senses -- day
- [X] refactor integration code to distribute to each of the senses
  -- day
- [X] create video showing all the senses for Winston -- 2 days
- [X] spell checking !!!!!
- [X] send package to friends for critiques -- 2 days
- [X] fix videos that were encoded wrong, test on Internet Explorer.
- [X] redo vision videos with new collapse code
- [X] find a topology that looks good. (maybe nil topology?)
- [X] fix red part of touch cube in video and image
- [X] write summary of project for Winston              \
- [X] project proposals for Winston                     \
- [X] additional senses to be implemented for Winston   | -- 2 days
- [X] send Winston package                              /

** would be cool to get done before Winston
- [X] enable greyscale bitmaps for touch -- 2 hours
- [X] use sawfish to auto-tile sense windows -- 6 hours
- [X] sawfish keybinding to automatically delete all sense windows
- [ ] proof of concept C sense manipulation -- 2 days
- [ ] proof of concept GPU sense manipulation -- week
- [ ] Fourier view of sound -- 2 or 3 days
- [ ] dancing music listener -- 1 day, depends on Fourier
- [ ] uberjar cortex demo

** don't have to get done before Winston
- [X] write tests for integration -- 3 days
- [X] usertime/gametime clock HUD display -- day
- [X] show sensor maps in HUD display? -- 4 days
- [X] show sensor maps in AWT display? -- 2 days
- [X] upgrade to clojure 1.3, replace all defvars with new def
- [ ] common video creation code.

* jMonkeyEngine ideas
- [ ] video showing bullet joints problem
- [ ] add mult for Matrix to Ray (sketch below)
- [ ] add iterator constructs to Vector3f, Vector2f, Triangle,
  Matrix3f, Matrix4f, etc.

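For the Matrix-times-Ray item, a sketch of what the operation could
look like from Clojure, using only =Matrix4f= and =Vector3f= calls
that already exist in jME3; =mult-ray= itself is a hypothetical
helper, not jME API:

#+begin_src clojure
(ns cortex.ideas.ray-mult
  "Sketch: the missing Matrix4f x Ray multiplication, built from calls
   jMonkeyEngine already has.  Transform the origin as a point and the
   direction as the difference of two transformed points."
  (:import (com.jme3.math Matrix4f Ray Vector3f)))

(defn mult-ray
  "Return a new Ray equal to `ray` transformed by the 4x4 matrix `mat`."
  [^Matrix4f mat ^Ray ray]
  (let [origin     (.getOrigin ray)
        direction  (.getDirection ray)
        new-origin (.mult mat origin)
        ;; transform a second point along the ray, then re-derive the
        ;; direction so the matrix's translation cancels out
        new-tip    (.mult mat (.add origin direction))]
    (Ray. new-origin (.normalize (.subtract new-tip new-origin)))))
#+end_src
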
;;In the elder days of Art,
;;Builders wrought with greatest care
;;Each minute and unseen part;
;;For the Gods see everywhere.

* misc
- use object tracking on moving objects to derive good static
  detectors and achieve background separation
- temporal scale pyramids. this can help in verb recognition by
  making verb identification time-scale independent (up to a certain
  factor). A small sketch follows.
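
A minimal sketch of a temporal scale pyramid: each level keeps every
2nd frame of the level below, so a fixed-length verb detector can be
matched against the same motion at several speeds. (A real version
would probably average or blur adjacent frames rather than plain
subsampling.)

#+begin_src clojure
(ns cortex.ideas.temporal-pyramid
  "Sketch: a temporal scale pyramid over a sequence of sensory frames.")

(defn temporal-pyramid
  "Return `levels` versions of `frames`, each at half the time
   resolution of the previous one."
  [levels frames]
  (take levels (iterate #(take-nth 2 %) frames)))

;; Example: 16 frames of some sense, indexed 0..15.
(temporal-pyramid 3 (range 16))
;; => ((0 1 2 ... 15) (0 2 4 ... 14) (0 4 8 12))
#+end_src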