changeset 65:1c6af9dd64d5

merge.
author Robert McIntyre <rlm@mit.edu>
date Fri, 27 Sep 2013 13:07:34 -0400
parents be36b7325a47 (diff) b838bd76b081 (current diff)
children eae81fa3a8e0
files
diffstat 1 files changed, 92 insertions(+), 0 deletions(-)
     1.1 --- /dev/null	Thu Jan 01 00:00:00 1970 +0000
     1.2 +++ b/org/minds-eye.org	Fri Sep 27 13:07:34 2013 -0400
     1.3 @@ -0,0 +1,92 @@
     1.4 +#+title: Ethics of Mind's Eye
     1.5 +#+author: Robert McIntyre
     1.6 +#+email: rlm@mit.edu
     1.7 +#+description: 
     1.8 +#+keywords: 
     1.9 +#+SETUPFILE: ../../aurellem/org/setup.org
    1.10 +#+INCLUDE: ../../aurellem/org/level-0.org
    1.11 +#+babel: :mkdirp yes :noweb yes :exports both
    1.12 +
    1.13 +* COMMENT Ethical Considerations Regarding DARPA's Mind's Eye Program
    1.14 +
     1.15 +As scientists and engineers, it is our sacred duty to explore the
     1.16 +boundaries of human knowledge in a responsible way. We are part of the
     1.17 +larger organism of humanity, and we are tasked with discovering new
     1.18 +things that help the human race first, and all life second.
    1.19 +
     1.20 +While knowledge and non-sentient technology are neither morally good
     1.21 +nor evil, the things we discover are embedded in a wider cultural
     1.22 +context, and in many cases we can foresee the uses and abuses our new
     1.23 +technology will enable.
    1.24 +
    1.25 +It is naive to think that the government or any group of humans is
    1.26 +either wholly good or evil, but by reasoning from the
    1.27 +motivations/power of such groups, we can try to infer whether a
    1.28 +technology will improve the lot of humanity or not.
    1.29 +
     1.30 +It is possible to give an institution or culture a technology that
     1.31 +they will enthusiastically accept, but which will greatly diminish
     1.32 +their quality of life.
    1.33 +
    1.34 +Some examples:
    1.35 +
     1.36 +PGP -- an encryption suite that helps individuals send private messages
    1.37 +
    1.38 +
    1.39 +
    1.40 +* Questions
    1.41 +
     1.42 +- What should our ethical role be in deciding whether or not to
     1.43 +  pursue research?
    1.44 +
    1.45 +  - Follow orders, assume our overall culture will use things
    1.46 +    responsibly, and leave the ethical considerations to the
    1.47 +    government/people.
    1.48 +    
    1.49 +    - Many scientists who worked on the atomic bomb later questioned
    1.50 +      their decisions.
     1.51 +    - Yet, an advanced society /should/ have atomic weapons, if for
     1.52 +      no other reason than to defend itself from meteors, blow shit
     1.53 +      up, etc.
    1.54 +   
    1.55 +  - We are each morally responsible for the things we help create. We
    1.56 +    are responsible for the misuses of technology we develop and the
    1.57 +    pain and suffering it causes. Ultimately our contributions to
    1.58 +    society will be judged by whether our technology did more good
    1.59 +    than bad.
    1.60 +
    1.61 +- If we decide that we *are* morally responsible for the technology we
    1.62 +  develop, then is the Mind's Eye project in particular something we
    1.63 +  should be doing?
    1.64 +  
    1.65 +  - Here are some things you can build with Mind's Eye tech:
    1.66 +
     1.67 +    - A system to monitor people living alone and call for help in
     1.68 +      the case of an emergency (such as a fall)
    1.69 +
    1.70 +    - Same system for hospitals and nursing homes (of course, this
    1.71 +      also makes them even more impersonal)
    1.72 +
     1.73 +    - The equivalent of an FBI special agent watching everything you
     1.74 +      do from the moment you step out your door to the moment you go
     1.75 +      back to your house to sleep, building a dossier of every move
     1.76 +      you make, everyone you talk to, where you shop, etc.
    1.77 +
    1.78 +    - A concentration camp / prison that is ABSOLUTELY impossible to
    1.79 +      escape from.
    1.80 +
     1.81 +    - Many other countries that don't care as much about human
     1.82 +      rights as America does will use this technology to monitor
     1.83 +      their citizens 24/7.
    1.84 +
     1.85 +    - Protection from surveillance in your own home does not apply
     1.86 +      unless you *own* your house. Say goodbye to privacy in school
     1.87 +      dorms, apartments, rented houses, etc.
    1.88 +    
     1.89 +  - People are OK with current surveillance because it is
     1.90 +    *dumb*. Mostly, security camera footage is only reviewed after a
     1.91 +    crime, and reviewing it is a painstaking process because it
     1.92 +    requires manual human intervention. With AI, this will soon not
     1.93 +    be the case.
    1.94 +
    1.95 +  
    1.96 \ No newline at end of file