cortex: diff winston-intro.txt @ 305:19c43ec6958d
modified cover letter, wonston-letter, and began uploading files to youtube for backup
author    Robert McIntyre <rlm@mit.edu>
date      Sat, 18 Feb 2012 10:28:14 -0700
parents   7e3938f40c52
children  71946ec07be9
--- a/winston-intro.txt Sat Feb 18 02:07:40 2012 -0600
+++ b/winston-intro.txt Sat Feb 18 10:28:14 2012 -0700
@@ -1,103 +1,85 @@
 Dear Professor Winston,

-I'm ready for you to look through the work that I've done so far. It's
-a sequence of posts describing the different simulated senses I've
-implemented, with videos.
+I've finished the first part of my project, building a framework for
+virtual sensate creatures; I would like your help evaluating what I've
+done so far, and deciding what to do next.

-It's "blocks world reloaded", because like you say, you need multiple
-senses to enable intelligence.
+For the work I've done so far, I compiled the results into short
+articles that explain how I implemented each sense, with videos that
+show each sense in action. Please look through the articles, in
+particular the video showcase, and tell me what you think.

-Please look through the videos and skim the text and tell me what
-you think:
+Video Showcase : http://aurellem.localhost/cortex/org/cover.html

 Introduction:
 http://aurellem.org/cortex/html/intro.html
 http://aurellem.org/cortex/html/sense.html

-http://aurellem.org/cortex/html/body.html -- simulated physical bodies
-http://aurellem.org/cortex/html/vision.html -- simulated eyes
-http://aurellem.org/cortex/html/hearing.html -- simulated ears
-http://aurellem.org/cortex/html/touch.html -- simulated skin/hairs
-http://aurellem.org/cortex/html/proprioception.html -- simulated proprioception
-http://aurellem.org/cortex/html/movement.html -- simulated muscles
-http://aurellem.org/cortex/html/integration.html -- full demonstration
+Physical Bodies : http://aurellem.org/cortex/html/body.html
+Vision : http://aurellem.org/cortex/html/vision.html
+Hearing : http://aurellem.org/cortex/html/hearing.html
+Touch : http://aurellem.org/cortex/html/touch.html
+Proprioception : http://aurellem.org/cortex/html/proprioception.html
+Muscles : http://aurellem.org/cortex/html/movement.html
+Full Demonstration : http://aurellem.org/cortex/html/integration.html

-In particular, look at the video at
-http://aurellem.org/cortex/html/integration.html. It shows a
-simulated hand equipped with all of the senses I've built so far.
+I think this work could be a fruitful foundation for a Master's
+thesis, so in particular, I'd like critiques, suggestions, and project
+ideas. For example, here are some projects I think would be worthy, in
+increasing order of complexity:

-There's some more background information and full source code at
-http://aurellem.org
+ * Create a self-powered joint that can determine its range of
+   motion and joint type (hinge, cone, point-to-point, etc.) by
+   making exploratory muscle movements and observing their effect.

-If you can't see a video, let me know and I'll upload it to YouTube so
-you can see it.
+ * Develop an agent that writes and debugs low-level motor control
+   programs to achieve simple goals like "look at the light" or
+   "extend all of your fingers". These simple "calisthenic"
+   programs could then be combined to form more elaborate
+   procedures of motion, which in turn could be the basic
+   instinctive reflexes in the "spinal cord" of some more advanced
+   creature. (like Sussman's HACKER program but in a richer world)

+ * Program a group of creatures that cooperate with each
+   other. Because the creatures would be simulated, I could
+   investigate computationally complex rules of behavior which
+   still, from the group's point of view, would happen in "real
+   time". Interactions could be as simple as cellular organisms
+   communicating via flashing lights, or as complex as humanoids
+   completing social tasks, etc.

+ * Simulated Imagination -- this would involve a creature with an
+   effector which creates an entire new sub-simulation where the
+   creature has direct control over placement/creation of objects
+   via simulated telekinesis. The creature observes this sub-world
+   through it's normal senses and uses its observations to make
+   predictions about it's top level world.

+ * Integrate the simulated world with Genesis, so that Genesis
+   could use the simulated world to answer questions about a
+   proposed physical scenario. For example "You stack two blocks
+   together, then hit the bottom block with your hand. Does the top
+   block move?". This project is complicated and very large in
+   scope, but it could be narrowed to focus on a single key
+   aspect. For example, one key aspect of turning a scenario into a
+   simulation is knowing when you're constructing "typical" or
+   "atypical" examples of the scenario. So, a narrower project
+   might simply learn about the edge cases of different scenarios
+   (e.g. "A block stacked on top of another block is usually
+   stable, provided the bottom block is large enough, and is not
+   moving, and is level, etc."). With this knowledge, this kind of
+   program could aid Genesis not only in answering common-sense
+   questions, but in refining them: "A block is stacked on top of
+   another block. Is it stable?"; "Usually, but do you know if the
+   bottom block is slanted?", etc.

-Now, I need your help moving forward. Can I use this work as a base
-for a Masters thesis with you when I come back to MIT this coming Fall?
-What critiques and project ideas do you have after looking through
-what I've done so far?
+These are some ideas, but I think you can come up with better ones. I
+can't wait to hear your critiques and suggestions.

-I have some ideas on where I can go with this project but I think you
-will have some better ones.
+Finally, regarding next year at MIT, can I be considered for the
+position of TA for 6.034 or 6.xxx? Also, do you want me to return Fall
+or Summer?

-Here are some possible projects I might do with this as a base that I
-think would be worthy Masters projects.
+Sincerely,

- - HACKER for writing muscle-control programs : Presented with
-   low-level muscle control/ sense API, generate higher level programs
-   for accomplishing various stated goals. Example goals might be
-   "extend all your fingers" or "move your hand into the area with
-   blue light" or "decrease the angle of this joint". It would be
-   like Sussman's HACKER, except it would operate with much more data
-   in a more realistic world. Start off with "calisthenics" to
-   develop subroutines over the motor control API. This would be the
-   "spinal chord" of a more intelligent creature.
-
- - Create hundreds of small creatures and have them do simple
-   simulated swarm tasks.
-
- - A joint that learns what sort of joint it (cone, point-to-point,
-   hinge, etc.) is by correlating exploratory muscle movements with
-   vision.
-
- - Something with cross-modal clustering using the rich sense
-   data. This might prove difficult due to the higher dimensionality
-   of my senses.
-
- - Simulated Imagination --- this would involve a creature with an
-   effector which creates an /entire new sub-simulation/ where the
-   creature has direct control over placement/creation of objects via
-   simulated telekinesis. The creature observes this sub-world through
-   it's normal senses and uses its observations to make predictions
-   about it's top level world.
-
- - Hook it up with Genesis --- I could make a "semantic effector"
-   which marks objects/sensory states with semantic information. In
-   conjunction with Simulated Imagination, and HACKER-like motor
-   control, Genesis might be able to ask simple questions like "stack
-   two blocks together and hit the top one with your hand; does the
-   bottom block move?" and the system could answer "yes". This would
-   be rather complicated to do and involves many of the above
-   projects, but there may be a way to scale it down to Master's
-   thesis size.
-
- - Make a virtual computer in the virtual world which with which the
-   creature interacts using its fingers to press keys on a virtual
-   keyboard. The creature can access the internet, watch videos, take
-   over the world, anything it wants. (This is probably not worthy of
-   a Masters project, I just thought it was a neat idea. It's possible
-   to use videos/etc in the simulated world at any rate.)
-
-
-I can't wait to hear your critiques and ideas. If you think I
-shouldn't use this system as a base and should instead do something
-else, that's fine too.
-
-On a related note, can I be considered for the position of TA next
-year for 6.034 or 6.xxx?
-
-sincerely,
--Robert McIntyre
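
For reference, a rough sketch of how the first project idea in the new version of the letter (a self-powered joint that determines its own range of motion and joint type through exploratory muscle movements) might be approached. This is not part of the letter or of the cortex code, which is written in Clojure; all names here (the joint handle, apply_muscle_impulse, relative_rotation, reset) are hypothetical placeholders for whatever the simulator actually exposes, and the rank-counting rule is a deliberately crude heuristic.

    import numpy as np

    def explore_joint(joint, n_trials=50, strength=0.1):
        """Apply random exploratory muscle impulses and record the resulting
        relative rotation (as axis-angle vectors) between the two bodies the
        joint connects.  `joint` is a hypothetical simulator handle."""
        rotations = []
        for _ in range(n_trials):
            direction = np.random.randn(3)
            direction /= np.linalg.norm(direction)
            joint.apply_muscle_impulse(strength * direction)  # hypothetical effector call
            rotations.append(joint.relative_rotation())       # hypothetical sensor: axis-angle vector
            joint.reset()                                      # return to rest pose before the next trial
        return np.array(rotations)

    def classify_joint(rotations, tol=1e-3):
        """Guess the joint type from how many independent rotation axes moved."""
        moved = rotations[np.linalg.norm(rotations, axis=1) > tol]
        if len(moved) == 0:
            return "fixed"
        # The number of significant singular values of the observed axis-angle
        # vectors estimates the joint's rotational degrees of freedom.
        # Range of motion could similarly be read off the largest observed
        # angles, e.g. np.max(np.abs(moved), axis=0).
        s = np.linalg.svd(moved, compute_uv=False)
        dof = int(np.sum(s > tol * s[0]))
        return {1: "hinge", 2: "cone", 3: "point-to-point"}.get(dof, "unknown")

The intuition is that a hinge joint only ever produces rotations about a single axis, so its observed axis-angle vectors have rank one, while a ball (point-to-point) joint produces rank three; a real cone joint also allows limited twist, so a full solution would need to look at motion limits as well, not just rank.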