cortex: changeset 305:19c43ec6958d
modified cover letter, winston-letter, and began uploading files to youtube for backup
author   | Robert McIntyre <rlm@mit.edu>
date     | Sat, 18 Feb 2012 10:28:14 -0700
parents  | 2dfebf71053c
children | 7e7f8d6d9ec5
files    | org/cover.html org/youtube.org winston-intro.txt
diffstat | 3 files changed, 118 insertions(+), 106 deletions(-)
--- a/org/cover.html	Sat Feb 18 02:07:40 2012 -0600
+++ b/org/cover.html	Sat Feb 18 10:28:14 2012 -0700
@@ -35,50 +35,59 @@
 <br><br>
 
 <p>
-The purpose of this project is to create a framework for sensate
-creatures in a virtual environment. This approach is advantageous
-because of timing.
 
-You can see a showcase of each sense below, as well as a short article
-explaining how I implemented it.
+<video controls="controls" width="755">
+  <source src="../video/hand.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+</video>
 
-<h2>Vision</h2>
-<a href="./vision.html">Read the article on implementing vision »</a>
+Here are videos which showcase each of the senses shown in the video
+above, as well as links to full source and an article explaining how I
+implemented each of them.
+
+<h2>Physical Bodies</h2>
+<a href="../html/body.html">Read the article on implementing physical bodies »</a>
 <video controls="controls" width="755">
-  <source src="./video/worm-vision.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+  <source src="../video/full-hand.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
 </video>
 
 <h2>Vision</h2>
-<a href="./vision.html">Read the article on implementing vision »</a>
+<a href="../html/vision.html">Read the article on implementing vision »</a>
 <video controls="controls" width="755">
-  <source src="./video/worm-vision.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+  <source src="../video/worm-vision.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
 </video>
 
-<h2>Vision</h2>
-<a href="./vision.html">Read the article on implementing vision »</a>
+
+<h2>Hearing</h2>
+<a href="../html/hearing.html">Read the article on implementing hearing »</a>
 <video controls="controls" width="755">
-  <source src="./video/worm-vision.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+  <source src="../video/java-hearing-test.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+</video>
+<video controls="controls" width="755">
+  <source src="../video/worm-hearing.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
 </video>
 
-<h2>Vision</h2>
-<a href="./vision.html">Read the article on implementing vision »</a>
+
+<h2>Touch</h2>
+<a href="../html/touch.html">Read the article on implementing touch »</a>
 <video controls="controls" width="755">
-  <source src="./video/worm-vision.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+  <source src="../video/basic-touch.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+</video>
+<video controls="controls" width="755">
+  <source src="../video/worm-touch.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
 </video>
 
-<h2>Vision</h2>
-<a href="./vision.html">Read the article on implementing vision »</a>
+<h2>Proprioception</h2>
+<a href="../html/proprioception.html">Read the article on implementing proprioception »</a>
 <video controls="controls" width="755">
-  <source src="./video/worm-vision.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+  <source src="../video/test-proprioception.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
 </video>
 
-<h2>Vision</h2>
-<a href="./vision.html">Read the article on implementing vision »</a>
+<h2>Movement</h2>
+<a href="../html/movement.html">Read the article on implementing muscles »</a>
 <video controls="controls" width="755">
-  <source src="./video/worm-vision.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
+  <source src="../video/worm-muscles.ogg" type="video/ogg" preload="none" poster="../images/aurellem-1280x480.png">
 </video>
 
-
 </p>
 
 </body>
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/org/youtube.org	Sat Feb 18 10:28:14 2012 -0700
@@ -0,0 +1,21 @@
+| local-file              | youtube-url |
+|-------------------------+-------------|
+| basic-touch.ogg         |             |
+| hand.ogg                |             |
+| worm-hearing.ogg        |             |
+| bind-sense.ogg          |             |
+| java-hearing-test.ogg   |             |
+| worm-muscles.ogg        |             |
+| crumbly-hand.ogg        |             |
+| spinning-cube.ogg       |             |
+| worm-touch.ogg          |             |
+| cube.ogg                |             |
+| test-proprioception.ogg |             |
+| worm-vision.ogg         |             |
+| full-hand.ogg           |             |
+| touch-cube.ogg          |             |
+| ghost-hand.ogg          |             |
+| worm-1.ogg              |             |
+|-------------------------+-------------|
+
+
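The youtube-url column above is still empty; per the commit message, the uploads have only just begun. A minimal sketch of how the column could be filled in afterwards, assuming a hypothetical `upload` callable that stands in for whatever uploader is actually used (nothing in this changeset specifies the upload mechanism):

import re

# Matches an org-mode table row whose second column is still empty,
# e.g. "| basic-touch.ogg         |             |"
EMPTY_ROW = re.compile(r"^\|\s*(?P<file>\S+\.ogg)\s*\|\s*\|\s*$")

def fill_youtube_column(org_text, upload):
    """Return org_text with the empty youtube-url cells filled in.

    `upload` is a hypothetical callable: it takes a local file name
    and returns the URL of the uploaded video.
    """
    out = []
    for line in org_text.splitlines():
        m = EMPTY_ROW.match(line)
        if m:
            name = m.group("file")
            line = "| %-23s | %s |" % (name, upload(name))
        out.append(line)
    return "\n".join(out)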
--- a/winston-intro.txt	Sat Feb 18 02:07:40 2012 -0600
+++ b/winston-intro.txt	Sat Feb 18 10:28:14 2012 -0700
@@ -1,103 +1,85 @@
 Dear Professor Winston,
 
-I'm ready for you to look through the work that I've done so far. It's
-a sequence of posts describing the different simulated senses I've
-implemented, with videos.
+I've finished the first part of my project, building a framework for
+virtual sensate creatures; I would like your help evaluating what I've
+done so far, and deciding what to do next.
 
-It's "blocks world reloaded", because like you say, you need multiple
-senses to enable intelligence.
+For the work I've done so far, I compiled the results into short
+articles that explain how I implemented each sense, with videos that
+show each sense in action. Please look through the articles, in
+particular the video showcase, and tell me what you think.
 
-Please look through the videos and skim the text and tell me what
-you think:
+Video Showcase : http://aurellem.localhost/cortex/org/cover.html
 
 Introduction:
 http://aurellem.org/cortex/html/intro.html
 http://aurellem.org/cortex/html/sense.html
 
-http://aurellem.org/cortex/html/body.html -- simulated physical bodies
-http://aurellem.org/cortex/html/vision.html -- simulated eyes
-http://aurellem.org/cortex/html/hearing.html -- simulated ears
-http://aurellem.org/cortex/html/touch.html -- simulated skin/hairs
-http://aurellem.org/cortex/html/proprioception.html -- simulated proprioception
-http://aurellem.org/cortex/html/movement.html -- simulated muscles
-http://aurellem.org/cortex/html/integration.html -- full demonstration
+Physical Bodies    : http://aurellem.org/cortex/html/body.html
+Vision             : http://aurellem.org/cortex/html/vision.html
+Hearing            : http://aurellem.org/cortex/html/hearing.html
+Touch              : http://aurellem.org/cortex/html/touch.html
+Proprioception     : http://aurellem.org/cortex/html/proprioception.html
+Muscles            : http://aurellem.org/cortex/html/movement.html
+Full Demonstration : http://aurellem.org/cortex/html/integration.html
 
-In particular, look at the video at
-http://aurellem.org/cortex/html/integration.html. It shows a
-simulated hand equipped with all of the senses I've built so far.
+I think this work could be a fruitful foundation for a Master's
+thesis, so in particular, I'd like critiques, suggestions, and project
+ideas. For example, here are some projects I think would be worthy, in
+increasing order of complexity:
 
-There's some more background information and full source code at
-http://aurellem.org
+   * Create a self-powered joint that can determine its range of
+     motion and joint type (hinge, cone, point-to-point, etc.) by
+     making exploratory muscle movements and observing their effect.
 
-If you can't see a video, let me know and I'll upload it to YouTube so
-you can see it.
+   * Develop an agent that writes and debugs low-level motor control
+     programs to achieve simple goals like "look at the light" or
+     "extend all of your fingers". These simple "calisthenic"
+     programs could then be combined to form more elaborate
+     procedures of motion, which in turn could be the basic
+     instinctive reflexes in the "spinal cord" of some more advanced
+     creature. (like Sussman's HACKER program but in a richer world)
 
+   * Program a group of creatures that cooperate with each
+     other. Because the creatures would be simulated, I could
+     investigate computationally complex rules of behavior which
+     still, from the group's point of view, would happen in "real
+     time". Interactions could be as simple as cellular organisms
+     communicating via flashing lights, or as complex as humanoids
+     completing social tasks, etc.
 
+   * Simulated Imagination -- this would involve a creature with an
+     effector which creates an entire new sub-simulation where the
+     creature has direct control over placement/creation of objects
+     via simulated telekinesis. The creature observes this sub-world
+     through its normal senses and uses its observations to make
+     predictions about its top-level world.
 
+   * Integrate the simulated world with Genesis, so that Genesis
+     could use the simulated world to answer questions about a
+     proposed physical scenario. For example "You stack two blocks
+     together, then hit the bottom block with your hand. Does the top
+     block move?". This project is complicated and very large in
+     scope, but it could be narrowed to focus on a single key
+     aspect. For example, one key aspect of turning a scenario into a
+     simulation is knowing when you're constructing "typical" or
+     "atypical" examples of the scenario. So, a narrower project
+     might simply learn about the edge cases of different scenarios
+     (e.g. "A block stacked on top of another block is usually
+     stable, provided the bottom block is large enough, and is not
+     moving, and is level, etc."). With this knowledge, this kind of
+     program could aid Genesis not only in answering common-sense
+     questions, but in refining them: "A block is stacked on top of
+     another block. Is it stable?"; "Usually, but do you know if the
+     bottom block is slanted?", etc.
 
-Now, I need your help moving forward. Can I use this work as a base
-for a Masters thesis with you when I come back to MIT this coming Fall?
-What critiques and project ideas do you have after looking through
-what I've done so far?
+These are some ideas, but I think you can come up with better ones. I
+can't wait to hear your critiques and suggestions.
 
-I have some ideas on where I can go with this project but I think you
-will have some better ones.
+Finally, regarding next year at MIT, can I be considered for the
+position of TA for 6.034 or 6.xxx? Also, do you want me to return in
+the Fall or the Summer?
 
-Here are some possible projects I might do with this as a base that I
-think would be worthy Masters projects.
+Sincerely,
 
- - HACKER for writing muscle-control programs : Presented with
-   low-level muscle control/ sense API, generate higher level programs
-   for accomplishing various stated goals. Example goals might be
-   "extend all your fingers" or "move your hand into the area with
-   blue light" or "decrease the angle of this joint". It would be
-   like Sussman's HACKER, except it would operate with much more data
-   in a more realistic world. Start off with "calisthenics" to
-   develop subroutines over the motor control API. This would be the
-   "spinal chord" of a more intelligent creature.
-
- - Create hundreds of small creatures and have them do simple
-   simulated swarm tasks.
-
- - A joint that learns what sort of joint it (cone, point-to-point,
-   hinge, etc.) is by correlating exploratory muscle movements with
-   vision.
-
- - Something with cross-modal clustering using the rich sense
-   data. This might prove difficult due to the higher dimensionality
-   of my senses.
-
- - Simulated Imagination --- this would involve a creature with an
-   effector which creates an /entire new sub-simulation/ where the
-   creature has direct control over placement/creation of objects via
-   simulated telekinesis. The creature observes this sub-world through
-   it's normal senses and uses its observations to make predictions
-   about it's top level world.
-
- - Hook it up with Genesis --- I could make a "semantic effector"
-   which marks objects/sensory states with semantic information. In
-   conjunction with Simulated Imagination, and HACKER-like motor
-   control, Genesis might be able to ask simple questions like "stack
-   two blocks together and hit the top one with your hand; does the
-   bottom block move?" and the system could answer "yes". This would
-   be rather complicated to do and involves many of the above
-   projects, but there may be a way to scale it down to Master's
-   thesis size.
-
- - Make a virtual computer in the virtual world which with which the
-   creature interacts using its fingers to press keys on a virtual
-   keyboard. The creature can access the internet, watch videos, take
-   over the world, anything it wants. (This is probably not worthy of
-   a Masters project, I just thought it was a neat idea. It's possible
-   to use videos/etc in the simulated world at any rate.)
-
-
-I can't wait to hear your critiques and ideas. If you think I
-shouldn't use this system as a base and should instead do something
-else, that's fine too.
-
-On a related note, can I be considered for the position of TA next
-year for 6.034 or 6.xxx?
-
-sincerely,
--Robert McIntyre
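The first project idea in the new letter (a joint that discovers its own type from exploratory movements) is concrete enough to sketch. The cortex project itself is Clojure on jMonkeyEngine; the Python below is only a hypothetical illustration of the idea, assuming the creature can record the relative rotation of the two connected bodies after each small exploratory twitch:

import numpy as np

def rotation_axis(R, eps=1e-9):
    """Unit rotation axis of a 3x3 rotation matrix (None if R ~ identity).

    Uses the antisymmetric part of R; adequate for the small exploratory
    twitches assumed here (it degenerates near 180-degree rotations).
    """
    w = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]])
    n = np.linalg.norm(w)
    return w / n if n > eps else None

def rotation_angle(R):
    """Rotation angle of a 3x3 rotation matrix, in radians."""
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def classify_joint(relative_rotations, tol=1e-2):
    """Guess a joint's type from the rotation of body B in body A's
    frame, one rotation matrix per exploratory muscle movement."""
    axes = [a for a in map(rotation_axis, relative_rotations)
            if a is not None]
    if not axes:
        return "rigid"  # nothing ever moved
    # The number of linearly independent axes observed equals the
    # rotational degrees of freedom: 1 -> hinge, 2 -> universal,
    # 3 -> point-to-point (ball) joint.
    s = np.linalg.svd(np.asarray(axes), compute_uv=False)
    dof = int(np.sum(s > tol * s[0]))
    return {1: "hinge", 2: "universal", 3: "point-to-point"}[dof]

The range of motion mentioned in the same bullet could be estimated in the same spirit, for example by recording the largest rotation_angle observed about each independent axis.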