Mercurial > thoughts

comparison: org/minds-eye.org @ 63:df7950667f58 — "add thoughts on mind's eye"

author:     Robert McIntyre <rlm@mit.edu>
date:       Fri, 27 Sep 2013 13:06:26 -0400
comparison: 58:82cfd2b29db6 → 63:df7950667f58
children:   be36b7325a47
#+title: Ethics of Mind's Eye
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description:
#+keywords:
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
#+babel: :mkdirp yes :noweb yes :exports both

* COMMENT Ethical Considerations Regarding DARPA's Mind's Eye Program

As scientists and engineers, it is our sacred duty to explore the
boundaries of human knowledge in a responsible way. We are part of the
larger organism of humanity, tasked with discovering new things that
help the human race first, and all life second.

While knowledge and non-sentient technology are neither morally good
nor evil, the things we discover are embedded in a wider cultural
context, and in many cases it is possible to foresee the uses and
abuses our new technology will enable.

It is naive to think that the government or any group of humans is
either wholly good or wholly evil, but by reasoning from the
motivations and power of such groups, we can try to infer whether a
technology will improve the lot of humanity or not.

It is possible to give an institution or culture a technology that it
will enthusiastically accept, but which will greatly diminish its
quality of life.

Some examples:

PGP -- an encryption suite that can help individuals send messages
securely, free from eavesdropping.

* Questions

- What should our ethical role be in deciding whether or not to
  pursue research?

  - Follow orders, assume our overall culture will use things
    responsibly, and leave the ethical considerations to the
    government/people.

    - Many scientists who worked on the atomic bomb later questioned
      their decisions.
    - Yet, an advanced society /should/ have atomic weapons, if for
      no other reason than to defend itself from meteors, blow shit
      up, etc.

  - We are each morally responsible for the things we help create. We
    are responsible for the misuses of technology we develop and the
    pain and suffering it causes. Ultimately our contributions to
    society will be judged by whether our technology did more good
    than bad.

- If we decide that we *are* morally responsible for the technology we
  develop, then is the Mind's Eye project in particular something we
  should be doing?

  - Here are some things you can build with Mind's Eye tech:

    - A system to monitor single humans living alone and call for
      help in the case of emergencies (such as a fall).

    - The same system for hospitals and nursing homes (of course,
      this also makes them even more impersonal).

    - The equivalent of an FBI special agent watching everything you
      do from the moment you step out your door to the moment you go
      back to your house to sleep, building a dossier of every move
      you make, everyone you talk to, where you shop, etc.

    - A concentration camp / prison that is ABSOLUTELY impossible to
      escape from.

    - Many other countries that don't care as much about human rights
      as America does will use this technology to monitor their
      citizens 24/7.

    - Protection from surveillance in your own home does not apply
      unless you *own* your house. Say goodbye to privacy in school
      dorms, apartments, rented houses, etc.

  - People are OK with current surveillance because it is
    *dumb*. Mostly, people only use security camera footage in the
    case of crimes, and reviewing it is a painstaking process because
    it requires manual human intervention. With AI, this will soon no
    longer be the case.