changeset 302:7e3938f40c52

added winston letter, corrected source listing for integration
author Robert McIntyre <rlm@mit.edu>
date Fri, 17 Feb 2012 13:03:58 -0700
parents 4203c2140b95
children 2dfebf71053c
files org/ideas.org org/integration.org winston-intro.txt
diffstat 3 files changed, 111 insertions(+), 7 deletions(-)
     1.1 --- a/org/ideas.org	Fri Feb 17 11:10:55 2012 -0700
     1.2 +++ b/org/ideas.org	Fri Feb 17 13:03:58 2012 -0700
     1.3 @@ -70,10 +70,11 @@
     1.4   - [X] proprioception sensor map in the style of the other senses -- day
     1.5   - [X] refactor integration code to distribute to each of the senses
     1.6         -- day
     1.7 - - [ ] create video showing all the senses for Winston -- 2 days
     1.8 + - [X] create video showing all the senses for Winston -- 2 days
     1.9 + - [ ] spellchecking !!!!!
    1.10   - [ ] send package to friends for critiques -- 2 days
    1.11 - - [ ] fix videos that were encoded wrong, test on Internet Explorer.
    1.12 - - [ ] redo vision videos with new collapse code
    1.13 + - [X] fix videos that were encoded wrong, test on Internet Explorer.
    1.14 + - [X] redo vision videos with new collapse code
    1.15   - [X] find a topology that looks good. (maybe nil topology?)
    1.16   - [X] fix red part of touch cube in video and image
    1.17   - [ ] write summary of project for Winston            \
     2.1 --- a/org/integration.org	Fri Feb 17 11:10:55 2012 -0700
     2.2 +++ b/org/integration.org	Fri Feb 17 13:03:58 2012 -0700
     2.3 @@ -6,7 +6,7 @@
     2.4  #+SETUPFILE: ../../aurellem/org/setup.org
     2.5  #+INCLUDE: ../../aurellem/org/level-0.org
     2.6  
     2.7 -* Intro 
     2.8 +* Integration 
     2.9  
    2.10  This is the ultimate test which features all of the senses that I've
    2.11  made so far. The blender file for the creature serves as an example of
    2.12 @@ -23,7 +23,7 @@
    2.13  </div>
    2.14  #+end_html
    2.15  
    2.16 -
    2.17 +* Generating the Video
    2.18  
    2.19  #+name: integration
    2.20  #+begin_src clojure 
    2.21 @@ -305,7 +305,7 @@
    2.22         (fix-display world))))))
    2.23  #+end_src
    2.24  
    2.25 -* Generating the Video
    2.26 +** ImageMagick / ffmpeg
    2.27  
    2.28  A series of ImageMagick and ffmpeg calls that arrange and encode
    2.29  the frame data that =test-everything!= produces.
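
For flavor, here is a rough sketch of the sort of shelling-out this
involves. The paths, tile geometry, and encoder settings are
illustrative assumptions, not the project's actual values; the real
argument lists live in =cortex.video.magick8= (see the source listing
below).

#+begin_src clojure
;; Hedged sketch -- paths, geometry, and encoder settings here are
;; assumptions, not the values cortex.video.magick8 actually uses.
(require '[clojure.java.shell :refer [sh]])

(defn montage-frame
  "Tile the per-sense images for one frame into a single frame of
   the final video."
  [n]
  (let [frame (format "%07d.png" n)]
    (sh "montage"
        (str "vision/" frame) (str "hearing/" frame)
        (str "touch/"  frame) (str "proprioception/" frame)
        "-tile" "2x2" "-geometry" "+2+2"
        (str "out/" frame))))

(defn encode-video
  "Stitch the montaged frames together into a video with ffmpeg."
  []
  (sh "ffmpeg" "-r" "60" "-i" "out/%07d.png"
      "-c:v" "libtheora" "-b:v" "9000k" "integration.ogg"))
#+end_src
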
    2.30 @@ -700,7 +700,7 @@
    2.31  
    2.32  * Source Listing
    2.33    - [[../src/cortex/test/integration.clj][cortex.test.integration]]
    2.34 -  - [[../src/cortex/video/magick2.clj][cortex.video.magick2]]
    2.35 +  - [[../src/cortex/video/magick8.clj][cortex.video.magick8]]
    2.36    - [[../assets/Models/subtitles/hand.blend][subtitles/hand.blend]]
    2.37  #+html: <ul> <li> <a href="../org/integration.org">This org file</a> </li> </ul>
    2.38    - [[http://hg.bortreb.com ][source-repository]]
     3.1 --- /dev/null	Thu Jan 01 00:00:00 1970 +0000
     3.2 +++ b/winston-intro.txt	Fri Feb 17 13:03:58 2012 -0700
     3.3 @@ -0,0 +1,103 @@
     3.4 +Dear Professor Winston,
     3.5 +
     3.6 +I'm ready for you to look through the work that I've done so far. It's
     3.7 +a sequence of posts describing the different simulated senses I've
     3.8 +implemented, with videos. 
     3.9 +
    3.10 +It's "blocks world reloaded", because, as you say, you need multiple
    3.11 +senses to enable intelligence.
    3.12 +
    3.13 +Please look through the videos and skim the text and tell me what
    3.14 +you think:
    3.15 +
    3.16 +Introduction:
    3.17 +http://aurellem.org/cortex/html/intro.html
    3.18 +http://aurellem.org/cortex/html/sense.html
    3.19 +
    3.20 +http://aurellem.org/cortex/html/body.html  -- simulated physical bodies
    3.21 +http://aurellem.org/cortex/html/vision.html -- simulated eyes
    3.22 +http://aurellem.org/cortex/html/hearing.html -- simulated ears
    3.23 +http://aurellem.org/cortex/html/touch.html   -- simulated skin/hairs
    3.24 +http://aurellem.org/cortex/html/proprioception.html -- simulated proprioception
    3.25 +http://aurellem.org/cortex/html/movement.html       -- simulated muscles
    3.26 +http://aurellem.org/cortex/html/integration.html  -- full demonstration 
    3.27 +
    3.28 +In particular, look at the video at
    3.29 +http://aurellem.org/cortex/html/integration.html.  It shows a
    3.30 +simulated hand equipped with all of the senses I've built so far.
    3.31 +
    3.32 +There's some more background information and full source code at 
    3.33 +http://aurellem.org
    3.34 +
    3.35 +If you can't see a video, let me know and I'll upload it to YouTube so
    3.36 +you can see it.
    3.37 +
    3.38 +
    3.39 +
    3.40 +
    3.41 +Now, I need your help moving forward. Can I use this work as a base
    3.42 +for a Master's thesis with you when I come back to MIT this coming fall?
    3.43 +What critiques and project ideas do you have after looking through
    3.44 +what I've done so far?
    3.45 +
    3.46 +I have some ideas on where I can go with this project, but I think
    3.47 +you will have better ones.
    3.48 +
    3.49 +Here are some possible projects I might build on this base that I
    3.50 +think would make worthy Master's projects.
    3.51 +
    3.52 + - HACKER for writing muscle-control programs: presented with a
    3.53 +   low-level muscle-control / sense API, generate higher-level
    3.54 +   programs for accomplishing stated goals like "extend all your
    3.55 +   fingers", "move your hand into the area with blue light", or
    3.56 +   "decrease the angle of this joint".  It would be like Sussman's
    3.57 +   HACKER, except operating with much more data in a more realistic
    3.58 +   world.  Start off with "calisthenics" to develop subroutines over
    3.59 +   the motor-control API (see the sketch after this list).  This
    3.60 +   would be the "spinal cord" of a more intelligent creature.
    3.61 +
    3.62 + - Create hundreds of small creatures and have them do simple
    3.63 +   simulated swarm tasks.
    3.64 + 
    3.65 + - A joint that learns what sort of joint it is (cone,
    3.66 +   point-to-point, hinge, etc.) by correlating exploratory muscle
    3.67 +   movements with vision.
    3.68 +
    3.69 + - Something with cross-modal clustering using the rich sense
    3.70 +   data. This might prove difficult given how high-dimensional my
    3.71 +   senses are.
    3.72 +
    3.73 + - Simulated Imagination --- this would involve a creature with an
    3.74 +   effector which creates an /entire new sub-simulation/ where the
    3.75 +   creature has direct control over placement/creation of objects via
    3.76 +   simulated telekinesis. The creature observes this sub-world through
    3.77 +   its normal senses and uses its observations to make predictions
    3.78 +   about its top-level world.
    3.79 +
    3.80 + - Hook it up with Genesis --- I could make a "semantic effector"
    3.81 +   which marks objects/sensory states with semantic information. In
    3.82 +   conjunction with Simulated Imagination, and HACKER-like motor
    3.83 +   control, Genesis might be able to ask simple questions like "stack
    3.84 +   two blocks together and hit the top one with your hand; does the
    3.85 +   bottom block move?" and the system could answer "yes".  This would
    3.86 +   be rather complicated to do and involves many of the above
    3.87 +   projects, but there may be a way to scale it down to Master's
    3.88 +   thesis size.
    3.89 +
    3.90 + - Make a virtual computer in the virtual world with which the
    3.91 +   creature interacts using its fingers to press keys on a virtual
    3.92 +   keyboard. The creature can access the internet, watch videos, take
    3.93 +   over the world, anything it wants. (This is probably not worthy of
    3.94 +   a Master's project; I just thought it was a neat idea. At any rate,
    3.95 +   it's possible to use videos/etc. in the simulated world.)
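
(To make the "calisthenics" idea concrete, here is a toy sketch of
the kind of subroutine I have in mind. The names set-muscle-force!,
joint-angle, and finger-joints are placeholders, not cortex's actual
API.)

#+begin_src clojure
;; Hypothetical sketch only -- set-muscle-force!, joint-angle, and
;; finger-joints are placeholder names, not cortex's real API.
(declare set-muscle-force! joint-angle finger-joints)

(defn flex-toward
  "Drive one joint toward a target angle with a crude proportional
   controller over an antagonistic muscle pair."
  [creature joint target]
  (let [error (- target (joint-angle creature joint))
        force (* 0.5 (Math/abs (double error)))]
    (if (pos? error)
      (set-muscle-force! creature joint :flexor force)
      (set-muscle-force! creature joint :extensor force))))

(defn extend-all-fingers
  "One calisthenics exercise: straighten every finger joint."
  [creature]
  (doseq [joint (finger-joints creature)]
    (flex-toward creature joint 0.0)))
#+end_src
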
    3.96 +
    3.97 +
    3.98 +I can't wait to hear your critiques and ideas.  If you think I
    3.99 +shouldn't use this system as a base and should instead do something
   3.100 +else, that's fine too. 
   3.101 +
   3.102 +On a related note, can I be considered for the position of TA next
   3.103 +year for 6.034 or 6.xxx?
   3.104 +
   3.105 +Sincerely,
   3.106 +--Robert McIntyre