# HG changeset patch
# User Robert McIntyre
# Date 1329509038 25200
# Node ID 7e3938f40c52ca9f02b945892e4dc17bb2102445
# Parent  4203c2140b95ac364699b94c2af4fef78ff1be27
added winston letter, corrected source listing for integration

diff -r 4203c2140b95 -r 7e3938f40c52 org/ideas.org
--- a/org/ideas.org	Fri Feb 17 11:10:55 2012 -0700
+++ b/org/ideas.org	Fri Feb 17 13:03:58 2012 -0700
@@ -70,10 +70,11 @@
  - [X] proprioception sensor map in the style of the other senses -- day
  - [X] refactor integration code to distribute to each of the senses -- day
- - [ ] create video showing all the senses for Winston -- 2 days
+ - [X] create video showing all the senses for Winston -- 2 days
+ - [ ] spellchecking !!!!!
  - [ ] send package to friends for critiques -- 2 days
- - [ ] fix videos that were encoded wrong, test on Internet Explorer.
- - [ ] redo vision videos with new collapse code
+ - [X] fix videos that were encoded wrong, test on Internet Explorer.
+ - [X] redo vision videos with new collapse code
  - [X] find a topology that looks good. (maybe nil topology?)
  - [X] fix red part of touch cube in video and image
  - [ ] write summary of project for Winston
\ No newline at end of file
diff -r 4203c2140b95 -r 7e3938f40c52 org/integration.org
--- a/org/integration.org	Fri Feb 17 11:10:55 2012 -0700
+++ b/org/integration.org	Fri Feb 17 13:03:58 2012 -0700
@@ -6,7 +6,7 @@
 #+SETUPFILE: ../../aurellem/org/setup.org
 #+INCLUDE: ../../aurellem/org/level-0.org
 
-* Intro
+* Integration
 
 This is the ultimate test which features all of the senses that
 I've made so far. The blender file for the creature serves as an example of
@@ -23,7 +23,7 @@
 #+end_html
 
-
+* Generating the Video
 
 #+name: integration
 #+begin_src clojure
@@ -305,7 +305,7 @@
            (fix-display world))))))
 #+end_src
 
-* Generating the Video
+** ImageMagick / ffmpeg
 
 Just a bunch of calls to imagemagick to arrange the data that
 =test-everything!= produces.
@@ -700,7 +700,7 @@
 * Source Listing
    - [[../src/cortex/test/integration.clj][cortex.test.integration]]
-   - [[../src/cortex/video/magick2.clj][cortex.video.magick2]]
+   - [[../src/cortex/video/magick8.clj][cortex.video.magick8]]
    - [[../assets/Models/subtitles/hand.blend][subtitles/hand.blend]]
 #+html:
   - [[http://hg.bortreb.com ][source-repository]]
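The "just a bunch of calls to imagemagick" step above amounts to
shelling out from Clojure: each sense writes one image per frame,
ImageMagick's =montage= tiles those images into a single composite
frame, and ffmpeg strings the composites into the final video. Below
is a minimal sketch of that pattern; the filenames, tiling geometry,
and encoder flags are illustrative assumptions, not the actual
=cortex.video.magick8= source.

#+begin_src clojure
;; Sketch only -- paths, geometry, and flags are assumptions,
;; not the real cortex.video.magick8 code.
(require '[clojure.java.shell :as shell])

(defn montage-frame
  "Tile the per-sense images for one frame into a single composite
   image using ImageMagick's montage."
  [frame-files out-file]
  (apply shell/sh "montage"
         (concat frame-files ["-geometry" "+2+2" out-file])))

(defn encode-video
  "Encode numbered composite frames (e.g. \"out/%07d.png\") into a
   video using ffmpeg."
  [frame-pattern out-video]
  (shell/sh "ffmpeg" "-y" "-r" "60" "-i" frame-pattern
            "-b:v" "9000k" out-video))
#+end_src

Presumably the real pipeline repeats this montage step for every
frame of data that =test-everything!= writes out before handing the
sequence to ffmpeg.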
diff -r 4203c2140b95 -r 7e3938f40c52 winston-intro.txt
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/winston-intro.txt	Fri Feb 17 13:03:58 2012 -0700
@@ -0,0 +1,103 @@
+Dear Professor Winston,
+
+I'm ready for you to look through the work that I've done so far. It's
+a sequence of posts describing the different simulated senses I've
+implemented, with videos.
+
+It's "blocks world reloaded", because, as you say, you need multiple
+senses to enable intelligence.
+
+Please look through the videos, skim the text, and tell me what
+you think:
+
+Introduction:
+http://aurellem.org/cortex/html/intro.html
+http://aurellem.org/cortex/html/sense.html
+
+http://aurellem.org/cortex/html/body.html -- simulated physical bodies
+http://aurellem.org/cortex/html/vision.html -- simulated eyes
+http://aurellem.org/cortex/html/hearing.html -- simulated ears
+http://aurellem.org/cortex/html/touch.html -- simulated skin/hairs
+http://aurellem.org/cortex/html/proprioception.html -- simulated proprioception
+http://aurellem.org/cortex/html/movement.html -- simulated muscles
+http://aurellem.org/cortex/html/integration.html -- full demonstration
+
+In particular, look at the video at
+http://aurellem.org/cortex/html/integration.html. It shows a
+simulated hand equipped with all of the senses I've built so far.
+
+There's some more background information and full source code at
+http://aurellem.org
+
+If you can't see a video, let me know and I'll upload it to YouTube so
+you can see it.
+
+
+
+Now, I need your help moving forward. Can I use this work as a base
+for a Master's thesis with you when I come back to MIT this coming
+Fall? What critiques and project ideas do you have after looking
+through what I've done so far?
+
+I have some ideas about where I can go with this project, but I think
+you will have better ones.
+
+Here are some possible projects I might do with this as a base that I
+think would make worthy Master's projects.
+
+ - HACKER for writing muscle-control programs : Presented with a
+   low-level muscle-control/sense API, generate higher-level programs
+   for accomplishing various stated goals. Example goals might be
+   "extend all your fingers", "move your hand into the area with
+   blue light", or "decrease the angle of this joint". It would be
+   like Sussman's HACKER, except it would operate with much more data
+   in a more realistic world. Start off with "calisthenics" to
+   develop subroutines over the motor-control API. This would be the
+   "spinal cord" of a more intelligent creature.
+
+ - Create hundreds of small creatures and have them do simple
+   simulated swarm tasks.
+
+ - A joint that learns what sort of joint it is (cone, point-to-point,
+   hinge, etc.) by correlating exploratory muscle movements with
+   vision.
+
+ - Something with cross-modal clustering using the rich sense
+   data. This might prove difficult due to the higher dimensionality
+   of my senses.
+
+ - Simulated Imagination --- this would involve a creature with an
+   effector which creates an /entire new sub-simulation/ where the
+   creature has direct control over placement/creation of objects via
+   simulated telekinesis. The creature observes this sub-world through
+   its normal senses and uses its observations to make predictions
+   about its top-level world.
+
+ - Hook it up with Genesis --- I could make a "semantic effector"
+   which marks objects/sensory states with semantic information. In
+   conjunction with Simulated Imagination and HACKER-like motor
+   control, Genesis might be able to ask simple questions like "stack
+   two blocks together and hit the top one with your hand; does the
+   bottom block move?" and the system could answer "yes". This would
+   be rather complicated to do and involves many of the above
+   projects, but there may be a way to scale it down to Master's
+   thesis size.
+
+ - Make a virtual computer in the virtual world with which the
+   creature interacts using its fingers to press keys on a virtual
+   keyboard. The creature can access the internet, watch videos, take
+   over the world, anything it wants. (This is probably not worthy of
+   a Master's project; I just thought it was a neat idea. It's
+   possible to use videos/etc. in the simulated world at any rate.)
+
+I can't wait to hear your critiques and ideas. If you think I
+shouldn't use this system as a base and should instead do something
+else, that's fine too.
+
+On a related note, can I be considered for a TA position next year
+for 6.034 or 6.xxx?
+
+Sincerely,
+--Robert McIntyre