diff winston-intro.txt @ 302:7e3938f40c52

added winston letter, corrected source listing for integration
author Robert McIntyre <rlm@mit.edu>
date Fri, 17 Feb 2012 13:03:58 -0700
parents
children 19c43ec6958d
     1.1 --- /dev/null	Thu Jan 01 00:00:00 1970 +0000
     1.2 +++ b/winston-intro.txt	Fri Feb 17 13:03:58 2012 -0700
     1.3 @@ -0,0 +1,103 @@
     1.4 +Dear Professor Winston,
     1.5 +
     1.6 +I'm ready for you to look through the work that I've done so far. It's
     1.7 +a sequence of posts describing the different simulated senses I've
     1.8 +implemented, with videos. 
     1.9 +
     1.10 +It's "blocks world reloaded", because, as you say, you need multiple
    1.11 +senses to enable intelligence.
    1.12 +
    1.13 +Please look through the videos and skim the text and tell me what
    1.14 +you think:
    1.15 +
    1.16 +Introduction:
    1.17 +http://aurellem.org/cortex/html/intro.html
    1.18 +http://aurellem.org/cortex/html/sense.html
    1.19 +
    1.20 +http://aurellem.org/cortex/html/body.html  -- simulated physical bodies
    1.21 +http://aurellem.org/cortex/html/vision.html -- simulated eyes
    1.22 +http://aurellem.org/cortex/html/hearing.html -- simulated ears
    1.23 +http://aurellem.org/cortex/html/touch.html   -- simulated skin/hairs
    1.24 +http://aurellem.org/cortex/html/proprioception.html -- simulated proprioception
    1.25 +http://aurellem.org/cortex/html/movement.html       -- simulated muscles
    1.26 +http://aurellem.org/cortex/html/integration.html  -- full demonstration 
    1.27 +
    1.28 +In particular, look at the video at
    1.29 +http://aurellem.org/cortex/html/integration.html.  It shows a
    1.30 +simulated hand equipped with all of the senses I've built so far.
    1.31 +
    1.32 +There's some more background information and full source code at 
    1.33 +http://aurellem.org
    1.34 +
    1.35 +If you can't see a video, let me know and I'll upload it to YouTube so
    1.36 +you can see it.
    1.37 +
    1.38 +
    1.39 +
    1.40 +
    1.41 +Now, I need your help moving forward. Can I use this work as a base
     1.42 +for a Master's thesis with you when I come back to MIT this coming fall?
    1.43 +What critiques and project ideas do you have after looking through
    1.44 +what I've done so far?
    1.45 +
     1.46 +I have some ideas on where I can go with this project, but I think you
    1.47 +will have some better ones.
    1.48 +
     1.49 +Here are some possible projects I might build on this base, each of
     1.50 +which I think would make a worthy Master's project.
    1.51 +
     1.52 + - HACKER for writing muscle-control programs: Presented with a
     1.53 +   low-level muscle-control/sense API, generate higher-level programs
    1.54 +   for accomplishing various stated goals. Example goals might be
    1.55 +   "extend all your fingers" or "move your hand into the area with
    1.56 +   blue light" or "decrease the angle of this joint".  It would be
    1.57 +   like Sussman's HACKER, except it would operate with much more data
    1.58 +   in a more realistic world.  Start off with "calisthenics" to
    1.59 +   develop subroutines over the motor control API.  This would be the
    1.60 +   "spinal chord" of a more intelligent creature.
    1.61 +
    1.62 + - Create hundreds of small creatures and have them do simple
    1.63 +   simulated swarm tasks.
    1.64 + 
     1.65 + - A joint that learns what sort of joint it is (cone, point-to-point,
     1.66 +   hinge, etc.) by correlating exploratory muscle movements with
    1.67 +   vision.
    1.68 +
    1.69 + - Something with cross-modal clustering using the rich sense
    1.70 +   data. This might prove difficult due to the higher dimensionality
    1.71 +   of my senses.
    1.72 +
    1.73 + - Simulated Imagination --- this would involve a creature with an
    1.74 +   effector which creates an /entire new sub-simulation/ where the
    1.75 +   creature has direct control over placement/creation of objects via
    1.76 +   simulated telekinesis. The creature observes this sub-world through
     1.77 +   its normal senses and uses its observations to make predictions
     1.78 +   about its top-level world.
    1.79 +
    1.80 + - Hook it up with Genesis --- I could make a "semantic effector"
    1.81 +   which marks objects/sensory states with semantic information. In
    1.82 +   conjunction with Simulated Imagination, and HACKER-like motor
    1.83 +   control, Genesis might be able to ask simple questions like "stack
    1.84 +   two blocks together and hit the top one with your hand; does the
    1.85 +   bottom block move?" and the system could answer "yes".  This would
     1.86 +   be rather complicated to do and would involve many of the above
    1.87 +   projects, but there may be a way to scale it down to Master's
    1.88 +   thesis size.
    1.89 +
     1.90 + - Make a virtual computer in the virtual world with which the
    1.91 +   creature interacts using its fingers to press keys on a virtual
    1.92 +   keyboard. The creature can access the internet, watch videos, take
    1.93 +   over the world, anything it wants. (This is probably not worthy of
     1.94 +   a Master's project; I just thought it was a neat idea. It is at
     1.95 +   least possible to use videos, etc., in the simulated world.)
    1.96 +
    1.97 +
    1.98 +I can't wait to hear your critiques and ideas.  If you think I
    1.99 +shouldn't use this system as a base and should instead do something
   1.100 +else, that's fine too. 
   1.101 +
   1.102 +On a related note, can I be considered for the position of TA next
   1.103 +year for 6.034 or 6.xxx?
   1.104 +
    1.105 +Sincerely,
   1.106 +--Robert McIntyre