In order for this to be a reasonable thesis that I can be proud of,
what is the /minimum/ number of things I need to get done?

* worm OR hand registration
  - training from a few examples (2 to start out)
  - aligning the body with the scene
  - generating sensory data
  - matching previous labeled examples using dot-products or some
    other simple similarity measure (see the sketch after this list)
  - showing that it works with different views

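A minimal sketch of that matching step, assuming each labeled example
is stored as a flat vector of sensor activations; =best-match= and the
example labels are made up for illustration and are not existing
CORTEX functions.

#+begin_src clojure
;; Hypothetical sketch: match a new sensory vector against labeled
;; examples by normalized dot-product (cosine similarity).

(defn dot [a b] (reduce + (map * a b)))

(defn normalize [v]
  (let [m (Math/sqrt (dot v v))]
    (if (zero? m) v (mapv #(/ % m) v))))

(defn best-match
  "Label of the stored example most similar to the new sensory vector."
  [labeled-examples new-vector]
  (let [n (normalize new-vector)]
    (->> labeled-examples
         (map (fn [[label v]] [label (dot n (normalize v))]))
         (apply max-key second)
         first)))

;; (best-match {:curled [1 0 1 0] :straight [0 1 0 1]} [1 0 0.9 0])
;; => :curled
#+end_src
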
* first draft
  - draft of thesis without bibliography or formatting
  - should have the basic experiment and a full description of the
    framework, with code
  - review with Winston

* final draft
  - implement stretch goals from Winston if possible
  - complete final formatting and submit

* DONE CORTEX
  CLOSED: [2014-05-29 Thu 17:01] DEADLINE: <2014-05-09 Fri>
  SHIT THAT'S IN 67 DAYS!!!

** program simple feature matching code for the worm's segments

Subgoals:
*** DONE Get cortex working again, run tests, no jMonkeyEngine updates
    CLOSED: [2014-03-03 Mon 22:07] SCHEDULED: <2014-03-03 Mon>
*** DONE get blender working again
    CLOSED: [2014-03-03 Mon 22:43] SCHEDULED: <2014-03-03 Mon>
*** DONE make sparse touch worm segment in blender
    CLOSED: [2014-03-03 Mon 23:16] SCHEDULED: <2014-03-03 Mon>
    CLOCK: [2014-03-03 Mon 22:44]--[2014-03-03 Mon 23:16] => 0:32
*** DONE make multi-segment touch worm with touch sensors and display
    CLOSED: [2014-03-03 Mon 23:54] SCHEDULED: <2014-03-03 Mon>
*** DONE Make a worm wiggle and curl
    CLOSED: [2014-03-04 Tue 23:03] SCHEDULED: <2014-03-04 Tue>

** First draft

Subgoals:
*** Write up new worm experiments.
*** Triage implementation code and get it into chapter form.

** for today

- guided worm :: control the worm with the keyboard. Useful for
     testing the body-centered recognition scripts, and for
     preparing a cool demo video.

- body-centered recognition :: detect actions using hard-coded
     body-centered scripts (see the sketch at the end of this list).

- cool demo video of the worm being moved and recognizing things ::
     will be a neat part of the thesis.

- thesis export :: refactoring and organization of code so that it
     spits out a thesis in addition to the web page.

- video alignment :: analyze the frames of a video in order to align
     the worm. Requires body-centered recognition. Can "cheat".

- smoother actions :: use debugging controls to directly influence the
     demo actions, and to generate recognition procedures.

- degenerate video demonstration :: show the system recognizing a
     curled worm from dead on. Crowning achievement of the thesis.

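A minimal sketch of what one of those hard-coded, body-centered
predicates could look like, assuming proprioception arrives as a
vector of per-joint bend angles in radians; =curled?= and the 2.0
radian threshold here are illustrative, not the actual thesis
predicates.

#+begin_src clojure
;; Hypothetical sketch of a body-centered action predicate.  Because
;; it only looks at joint angles in the worm's own frame, it is
;; independent of where the worm sits in the scene or how the camera
;; views it.

(defn total-bend
  "Sum of absolute bend angles across all joints."
  [joint-angles]
  (reduce + (map #(Math/abs (double %)) joint-angles)))

(defn curled?
  "Crude test: the worm counts as curled when its total bend exceeds
  a hand-picked threshold."
  [joint-angles]
  (> (total-bend joint-angles) 2.0))

;; (curled? [0.1 0.05 0.2])     => false
;; (curled? [0.9 1.1 0.8 0.7])  => true
#+end_src
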
** Ordered from easiest to hardest

Just report the positions of everything. I don't think that this
necessarily shows anything useful.

Worm-segment vision -- you initialize a view of the worm, but instead
of pixels you use labels via ray tracing. Has the advantage of still
allowing for visual occlusion, but reliably identifies the objects,
even without rainbow coloring. You can encode this as an image.

Same as above, except just with worm/non-worm labels.

Color code each worm segment and then recognize them using blob
detectors. Then you solve for the perspective and the action
simultaneously.

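A minimal sketch of the color-coded variant, assuming each rendered
pixel is already tagged with its segment's color label; =segment-blobs=
is a made-up name, and real blob detection would do more than take
centroids.

#+begin_src clojure
;; Hypothetical sketch: with one color per segment, "blob detection"
;; collapses to grouping pixels by color label and taking the
;; centroid of each group.  A pixel is {:color <label> :x <int> :y <int>}.

(defn mean [xs] (/ (reduce + xs) (double (count xs))))

(defn segment-blobs
  "Map from segment color to the centroid [x y] of its pixels."
  [pixels]
  (into {}
        (for [[color ps] (group-by :color pixels)]
          [color [(mean (map :x ps)) (mean (map :y ps))]])))

;; (segment-blobs [{:color :red :x 0 :y 0} {:color :red :x 2 :y 2}
;;                 {:color :blue :x 5 :y 5}])
;; => {:red [1.0 1.0], :blue [5.0 5.0]}
#+end_src
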
The entire worm can be colored the same high-contrast color against a
nearly black background.

"Rooted" vision. You give the exact coordinates of ONE piece of the
worm, but the algorithm figures out the rest.

More rooted vision -- start the entire worm off in one known position.

The right way to do alignment is to use motion over multiple frames to
snap individual pieces of the model into place, sharing and
propagating the individual alignments over the whole model. We also
want to limit the alignment search to just those actions we are
prepared to identify. This might mean that I need some small "micro
actions" such as the individual movements of the worm pieces.

Get just the centers of each segment projected onto the imaging
plane (best so far). See the projection sketch below.
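
A minimal sketch of that centers-only representation, assuming an
ideal pinhole camera at the origin looking down the +z axis;
=focal-length= and =project-center= are made-up names for
illustration.

#+begin_src clojure
;; Hypothetical sketch: perspective-project the 3D center of each
;; worm segment onto a 2D imaging plane.

(def focal-length 1.0)  ; illustrative value

(defn project-center
  "Project a segment center [x y z] (camera coordinates, z > 0) onto
  the imaging plane, giving [u v]."
  [[x y z]]
  [(* focal-length (/ x z))
   (* focal-length (/ y z))])

(defn project-segments
  "Project every segment center; this 2D point set is the entire
  'image' handed to the recognizer."
  [centers]
  (mapv project-center centers))

;; (project-segments [[0.0 0.0 2.0] [1.0 0.5 2.0]])
;; => [[0.0 0.0] [0.5 0.25]]
#+end_src
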
Repertoire of actions + video frames -->
     directed multi-frame-search algorithm

!! Could also have a bounding box around the worm, provided by
filtering the worm/non-worm render, and use bbbgs. As a bonus, I get
to include bbbgs in my thesis! Could finally do that recursive thing
where I make bounding boxes be those things that give results that
give good bounding boxes. If I did this I could use a disruptive
pattern on the worm.

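A minimal sketch of getting that bounding box from the worm/non-worm
render, assuming the worm-labeled pixels arrive as a collection of
[x y] coordinates; =bounding-box= is a made-up helper name.

#+begin_src clojure
;; Hypothetical sketch: the axis-aligned bounding box of the worm is
;; just the min/max of the worm-labeled pixel coordinates.

(defn bounding-box
  "Axis-aligned bounding box [[x-min y-min] [x-max y-max]] of the
  worm-labeled pixels."
  [worm-pixels]
  (let [xs (map first worm-pixels)
        ys (map second worm-pixels)]
    [[(apply min xs) (apply min ys)]
     [(apply max xs) (apply max ys)]]))

;; (bounding-box [[3 4] [10 2] [7 9]])
;; => [[3 2] [10 9]]
#+end_src
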
Re-imagining using default textures is very simple for this system,
but hard for others.

Want to demonstrate, at minimum, alignment of some model of the worm
to the video, and a lookup of the action by simulated perception.

note: the purple/white dot pattern is a very beautiful texture,
because when it moves slightly, the white dots look like they're
twinkling. Would look even better if it were a darker purple. Also
would look better more spread out.

Embed the assumption of one frame of view, and search by moving
around in the simulated world.

Allowed to limit the search by restricting it to a hemisphere around
the imagined worm! This limits scale also.
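
A minimal sketch of that hemisphere restriction, assuming candidate
camera positions are sampled on a hemisphere of fixed radius around
the imagined worm; =hemisphere-point= and =candidate-cameras= are
made-up names for illustration.

#+begin_src clojure
;; Hypothetical sketch: sample candidate camera positions on the
;; upper hemisphere of radius r around the imagined worm, so the
;; alignment search never wanders far away (which also bounds the
;; worm's apparent scale).

(defn hemisphere-point
  "Point on the upper hemisphere of radius r around `center`, for an
  azimuth theta in [0, 2*pi) and elevation phi in [0, pi/2)."
  [[cx cy cz] r theta phi]
  [(+ cx (* r (Math/cos phi) (Math/cos theta)))
   (+ cy (* r (Math/cos phi) (Math/sin theta)))
   (+ cz (* r (Math/sin phi)))])

(defn candidate-cameras
  "n-by-n grid of candidate camera positions on the hemisphere."
  [center r n]
  (for [i (range n), j (range n)]
    (hemisphere-point center r
                      (* 2 Math/PI (/ i n))
                      (* (/ Math/PI 2) (/ j n)))))

;; (count (candidate-cameras [0 0 0] 5.0 8))  => 64
#+end_src
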
!! Limited search with worm/non-worm rendering.

How much inverse kinematics do we have to do? What about cached
(allowed state-space) paths, derived from labeled training? You have
to lead from one to another.

What about the initial state? Could start the input videos at a
specific state, then just match that explicitly.

!! The training doesn't have to be labeled -- you can just move around
for a while!!

!! Limited search with motion-based alignment.

"Play arounds" can establish a chain of linked sensoriums. Future
matches must fall into one of the already-experienced states, and once
they do, it greatly limits the things that are possible in the future.

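A minimal sketch of that chain idea, assuming a "play around" is
recorded as a plain sequence of experienced states (keywords standing
in for whole sensoriums); =chain->transitions= and =possible-next=
are made-up names.

#+begin_src clojure
;; Hypothetical sketch: map each experienced state to the states that
;; have ever followed it.  Once the current state is matched, only
;; its recorded successors need to be considered next.

(defn chain->transitions
  "Map from each state to the set of states observed to follow it."
  [play-around]
  (reduce (fn [m [a b]] (update m a (fnil conj #{}) b))
          {}
          (partition 2 1 play-around)))

(defn possible-next
  "States allowed after `state`, given recorded experience."
  [transitions state]
  (get transitions state #{}))

;; (def experience [:rest :curl :rest :wiggle :rest :curl])
;; (possible-next (chain->transitions experience) :rest)
;; => #{:curl :wiggle}
#+end_src
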
Frame differences help to detect muscle exertion.

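A minimal sketch of that frame-difference measure, assuming frames are
flat vectors of grayscale values; =exertion-profile= is a made-up
name, and real muscle exertion would need more than raw pixel change.

#+begin_src clojure
;; Hypothetical sketch: total absolute per-pixel change between
;; consecutive frames as a crude proxy for how much motion (and so,
;; roughly, muscle exertion) happened between them.

(defn frame-difference
  "Total absolute per-pixel change between two frames."
  [frame-a frame-b]
  (reduce + (map (fn [a b] (Math/abs (double (- a b)))) frame-a frame-b)))

(defn exertion-profile
  "Frame-to-frame difference for an entire video."
  [frames]
  (mapv (partial apply frame-difference) (partition 2 1 frames)))

;; (exertion-profile [[0 0 0] [0 10 0] [0 10 0]])
;; => [10.0 0.0]
#+end_src
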
Can try to match on a few "representative" frames. Can also just have
a few "bodies" in various states which we try to match.

Paths through state-space have the exact same signature as
simulation. BUT, these can be searched in parallel and don't interfere
with each other.

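A minimal sketch of scoring cached state-space paths in parallel,
assuming both paths and observations are sequences of state labels;
=path-score= and =best-path= are made-up names, and =pmap= stands in
for whatever parallelism the real search would use.

#+begin_src clojure
;; Hypothetical sketch: each cached path is scored against the
;; observed sequence independently.  Because the scoring is pure, the
;; paths can be compared in parallel without interfering.

(defn path-score
  "Number of positions where the observed sequence agrees with a path."
  [observed path]
  (count (filter true? (map = observed path))))

(defn best-path
  "Cached path that best explains the observation, scored in parallel."
  [cached-paths observed]
  (->> cached-paths
       (pmap (fn [p] [p (path-score observed p)]))
       (apply max-key second)
       first))

;; (best-path [[:rest :curl :rest] [:rest :wiggle :rest]]
;;            [:rest :curl :curl])
;; => [:rest :curl :rest]
#+end_src
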
** Final stretch up to First Draft

*** DONE complete debug control of worm
    CLOSED: [2014-03-17 Mon 17:29] SCHEDULED: <2014-03-17 Mon>
    CLOCK: [2014-03-17 Mon 14:01]--[2014-03-17 Mon 17:29] => 3:28
*** DONE add phi-space output to debug control
    CLOSED: [2014-03-17 Mon 17:42] SCHEDULED: <2014-03-17 Mon>
    CLOCK: [2014-03-17 Mon 17:31]--[2014-03-17 Mon 17:42] => 0:11
*** DONE complete automatic touch partitioning
    CLOSED: [2014-03-18 Tue 21:43] SCHEDULED: <2014-03-18 Tue>
*** DONE complete cyclic predicate
    CLOSED: [2014-03-19 Wed 16:34] SCHEDULED: <2014-03-18 Tue>
    CLOCK: [2014-03-19 Wed 13:16]--[2014-03-19 Wed 16:34] => 3:18
*** DONE complete three phi-stream action predicates; test them with debug control
    CLOSED: [2014-03-19 Wed 16:35] SCHEDULED: <2014-03-17 Mon>
    CLOCK: [2014-03-18 Tue 18:36]--[2014-03-18 Tue 21:43] => 3:07
    CLOCK: [2014-03-18 Tue 18:34]--[2014-03-18 Tue 18:36] => 0:02
    CLOCK: [2014-03-17 Mon 19:19]--[2014-03-17 Mon 21:19] => 2:00
*** DONE build an automatic "do all the things" sequence.
    CLOSED: [2014-03-19 Wed 16:55] SCHEDULED: <2014-03-19 Wed>
    CLOCK: [2014-03-19 Wed 16:53]--[2014-03-19 Wed 16:55] => 0:02
*** DONE implement proprioception-based movement lookup in phi-space
    CLOSED: [2014-03-19 Wed 22:04] SCHEDULED: <2014-03-19 Wed>
    CLOCK: [2014-03-19 Wed 19:32]--[2014-03-19 Wed 22:04] => 2:32
*** DONE make proprioception reference phi-space indexes
    CLOSED: [2014-03-19 Wed 22:47] SCHEDULED: <2014-03-19 Wed>

*** DONE create test videos, also record positions of worm segments
    CLOSED: [2014-03-20 Thu 22:02] SCHEDULED: <2014-03-19 Wed>

*** TODO Collect intro, worm-learn and cortex creation into draft thesis.