comparison thesis/cortex.org @ 563:3c3dc4dbb973
adding changes from winston
author | Robert McIntyre <rlm@mit.edu> |
date | Mon, 12 May 2014 12:49:18 -0400 |
parents | 6299450f9618 |
children | ecae29320b00 |
562:6299450f9618 | 563:3c3dc4dbb973 |
---|---|
322 - For expediency's sake, I relied on direct knowledge of joint | 322 - For expediency's sake, I relied on direct knowledge of joint |
323 positions in this proof of concept. However, I believe that the | 323 positions in this proof of concept. However, I believe that the |
324 structure of =EMPATH= and =CORTEX= will make future work to | 324 structure of =EMPATH= and =CORTEX= will make future work to |
325 enable video analysis much easier than it would otherwise be. | 325 enable video analysis much easier than it would otherwise be. |
326 | 326 |
327 ** COMMENT =EMPATH= is built on =CORTEX=, a creature builder. | 327 ** =EMPATH= is built on =CORTEX=, a creature builder. |
328 | 328 |
329 I built =CORTEX= to be a general AI research platform for doing | 329 I built =CORTEX= to be a general AI research platform for doing |
330 experiments involving multiple rich senses and a wide variety and | 330 experiments involving multiple rich senses and a wide variety and |
331 number of creatures. I intend it to be useful as a library for many | 331 number of creatures. I intend it to be useful as a library for many |
332 more projects than just this thesis. =CORTEX= was necessary to meet | 332 more projects than just this thesis. =CORTEX= was necessary to meet |
575 and when the three-dimensional shape is rendered in a game the | 575 and when the three-dimensional shape is rendered in a game the |
576 smooshing and cutting is reversed and the image appears on the | 576 smooshing and cutting is reversed and the image appears on the |
577 three-dimensional object. | 577 three-dimensional object. |
578 | 578 |
579 To make a sense, interpret the UV-image as describing the | 579 To make a sense, interpret the UV-image as describing the |
580 distribution of that senses sensors. To get different types of | 580 distribution of that sense's sensors. To get different types of |
581 sensors, you can either use a different color for each type of | 581 sensors, you can either use a different color for each type of |
582 sensor, or use multiple UV-maps, each labeled with that sensor | 582 sensor, or use multiple UV-maps, each labeled with that sensor |
583 type. I generally use a white pixel to mean the presence of a | 583 type. I generally use a white pixel to mean the presence of a |
584 sensor and a black pixel to mean the absence of a sensor, and use | 584 sensor and a black pixel to mean the absence of a sensor, and use |
585 one UV-map for each sensor-type within a given sense. | 585 one UV-map for each sensor-type within a given sense. |
586 | 586 |
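As a concrete illustration of the convention above, the following sketch scans a sensor-profile image and collects the UV coordinates of every white pixel. It is not =CORTEX='s actual sensor-reading code (which loads images through jMonkeyEngine's asset system), and the helper name =white-sensor-coords= is hypothetical.

#+begin_src clojure
(import java.awt.image.BufferedImage)
(import javax.imageio.ImageIO)
(import java.io.File)

(defn white-sensor-coords
  "Collect the UV coordinates of every white pixel in a sensor-profile
   image, scaled to [0,1] so they can be matched against the model's
   UV-map."
  [#^BufferedImage image]
  (let [w (.getWidth image) h (.getHeight image)]
    (for [x (range w) y (range h)
          :when (= 0xFFFFFF (bit-and 0xFFFFFF (.getRGB image x y)))]
      [(/ x (double w)) (/ y (double h))])))

;; usage sketch:
;; (white-sensor-coords (ImageIO/read (File. "finger-touch-profile.png")))
#+end_src
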
587 #+CAPTION: The UV-map for an elongated icosphere. The white | 587 #+CAPTION: The UV-map for an elongated icosphere. The white |
588 #+caption: dots each represent a touch sensor. They are dense | 588 #+caption: dots each represent a touch sensor. They are dense |
589 #+caption: in the regions that describe the tip of the finger, | 589 #+caption: in the regions that describe the tip of the finger, |
590 #+caption: and less dense along the dorsal side of the finger | 590 #+caption: and less dense along the dorsal side of the finger |
591 #+caption: opposite the tip. | 591 #+caption: opposite the tip. |
592 #+name: finger-UV | 592 #+name: finger-UV |
593 #+ATTR_latex: :width 10cm | 593 #+ATTR_latex: :width 10cm |
594 [[./images/finger-UV.png]] | 594 [[./images/finger-UV.png]] |
595 | 595 |
596 #+caption: Ventral side of the UV-mapped finger. Notice the | 596 #+caption: Ventral side of the UV-mapped finger. Note the |
597 #+caption: density of touch sensors at the tip. | 597 #+caption: density of touch sensors at the tip. |
598 #+name: finger-side-view | 598 #+name: finger-side-view |
599 #+ATTR_LaTeX: :width 10cm | 599 #+ATTR_LaTeX: :width 10cm |
600 [[./images/finger-1.png]] | 600 [[./images/finger-1.png]] |
601 | 601 |
610 building systems themselves. | 610 building systems themselves. |
611 | 611 |
612 First off, general purpose video game engines come with a physics | 612 First off, general purpose video game engines come with a physics |
613 engine and lighting / sound system. The physics system provides | 613 engine and lighting / sound system. The physics system provides |
614 tools that can be co-opted to serve as touch, proprioception, and | 614 tools that can be co-opted to serve as touch, proprioception, and |
615 muscles. Since some games support split screen views, a good video | 615 muscles. Because some games support split screen views, a good |
616 game engine will allow you to efficiently create multiple cameras | 616 video game engine will allow you to efficiently create multiple |
617 in the simulated world that can be used as eyes. Video game systems | 617 cameras in the simulated world that can be used as eyes. Video game |
618 offer integrated asset management for things like textures and | 618 systems offer integrated asset management for things like textures |
619 creature models, providing an avenue for defining creatures. They | 619 and creature models, providing an avenue for defining creatures. |
620 also understand UV-mapping, since this technique is used to apply a | 620 They also understand UV-mapping, because this technique is used to |
621 texture to a model. Finally, because video game engines support a | 621 apply a texture to a model. Finally, because video game engines |
622 large number of developers, as long as =CORTEX= doesn't stray too | 622 support a large number of developers, as long as =CORTEX= doesn't |
623 far from the base system, other researchers can turn to this | 623 stray too far from the base system, other researchers can turn to |
624 community for help when doing their research. | 624 this community for help when doing their research. |
625 | 625 |
626 ** =CORTEX= is based on jMonkeyEngine3 | 626 ** =CORTEX= is based on jMonkeyEngine3 |
627 | 627 |
628 While preparing to build =CORTEX= I studied several video game | 628 While preparing to build =CORTEX= I studied several video game |
629 engines to see which would best serve as a base. The top contenders | 629 engines to see which would best serve as a base. The top contenders |
822 The higher order function =sense-nodes= from =cortex.sense= | 822 The higher order function =sense-nodes= from =cortex.sense= |
823 simplifies finding the joints based on their parent ``joints'' | 823 simplifies finding the joints based on their parent ``joints'' |
824 node. | 824 node. |
825 | 825 |
826 #+caption: Retrieving the children empty nodes from a single | 826 #+caption: Retrieving the children empty nodes from a single |
827 #+caption: named empty node is a common pattern in =CORTEX= | 827 #+caption: named empty node is a common pattern in =CORTEX=. |
828 #+caption: further instances of this technique for the senses | 828 #+caption: Further instances of this technique for the senses |
829 #+caption: will be omitted. | 829 #+caption: will be omitted. |
830 #+name: get-empty-nodes | 830 #+name: get-empty-nodes |
831 #+begin_listing clojure | 831 #+begin_listing clojure |
832 #+begin_src clojure | 832 #+begin_src clojure |
833 (defn sense-nodes | 833 (defn sense-nodes |
851 To find a joint's targets, =CORTEX= creates a small cube, centered | 851 To find a joint's targets, =CORTEX= creates a small cube, centered |
852 around the empty-node, and grows the cube exponentially until it | 852 around the empty-node, and grows the cube exponentially until it |
853 intersects two physical objects. The objects are ordered according | 853 intersects two physical objects. The objects are ordered according |
854 to the joint's rotation, with the first one being the object that | 854 to the joint's rotation, with the first one being the object that |
855 has more negative coordinates in the joint's reference frame. | 855 has more negative coordinates in the joint's reference frame. |
856 Since the objects must be physical, the empty-node itself escapes | 856 Because the objects must be physical, the empty-node itself |
857 detection. Because the objects must be physical, =joint-targets= | 857 escapes detection. For the same reason, =joint-targets= must be |
858 must be called /after/ =physical!= is called. | 858 called /after/ =physical!= is called. |
859 | 859 |
860 #+caption: Program to find the targets of a joint node by | 860 #+caption: Program to find the targets of a joint node by |
861 #+caption: exponential growth of a search cube. | 861 #+caption: exponential growth of a search cube. |
862 #+name: joint-targets | 862 #+name: joint-targets |
863 #+begin_listing clojure | 863 #+begin_listing clojure |
1268 register-eye!))) | 1268 register-eye!))) |
1269 retinal-map)))) | 1269 retinal-map)))) |
1270 #+END_SRC | 1270 #+END_SRC |
1271 #+end_listing | 1271 #+end_listing |
1272 | 1272 |
1273 Note that since each of the functions generated by =vision-kernel= | 1273 Note that because each of the functions generated by |
1274 shares the same =register-eye!= function, the eye will be | 1274 =vision-kernel= shares the same =register-eye!= function, the eye |
1275 registered only once the first time any of the functions from the | 1275 will be registered only once the first time any of the functions |
1276 list returned by =vision-kernel= is called. Each of the functions | 1276 from the list returned by =vision-kernel= is called. Each of the |
1277 returned by =vision-kernel= also allows access to the =Viewport= | 1277 functions returned by =vision-kernel= also allows access to the |
1278 through which it receives images. | 1278 =Viewport= through which it receives images. |
1279 | 1279 |
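One way to get this register-on-first-use behavior is to have every function in the list close over the same run-once wrapper around =register-eye!=. The sketch below shows the pattern under the hypothetical name =run-once=; it is not necessarily the helper =CORTEX= itself uses.

#+begin_src clojure
(defn run-once
  "Wrap f so that its body executes only on the first call; later
   calls return the cached result of that first run."
  [f]
  (let [sentinel (Object.)
        result   (atom sentinel)]
    (fn [& args]
      (locking sentinel
        (when (identical? @result sentinel)
          (reset! result (apply f args)))
        @result))))

;; usage sketch: every per-channel function shares one wrapped
;; register-eye!, so the eye is registered exactly once.
;; (def register-eye-once! (run-once register-eye!))
#+end_src
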
1280 All the hard work has been done; all that remains is to apply | 1280 All the hard work has been done; all that remains is to apply |
1281 =vision-kernel= to each eye in the creature and gather the results | 1281 =vision-kernel= to each eye in the creature and gather the results |
1282 into one list of functions. | 1282 into one list of functions. |
1283 | 1283 |
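That gathering step can be a simple map-and-concatenate over the creature's eye nodes. The sketch below assumes an =eyes= helper that returns a creature's eye nodes and uses the hypothetical name =vision!=; the listing in the full =CORTEX= source may differ in detail.

#+begin_src clojure
(defn vision!
  "Return one flat list of vision functions -- one per color channel
   per eye -- for every eye node found in the creature."
  [creature]
  (apply concat
         (for [eye (eyes creature)]
           (vision-kernel creature eye))))
#+end_src
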
1370 implement support for simulated hearing there. | 1370 implement support for simulated hearing there. |
1371 | 1371 |
1372 *** Extending =OpenAL= | 1372 *** Extending =OpenAL= |
1373 | 1373 |
1374 Extending =OpenAL= to support multiple listeners requires 500 | 1374 Extending =OpenAL= to support multiple listeners requires 500 |
1375 lines of =C= code and is too hairy to mention here. Instead, I | 1375 lines of =C= code and is too complicated to mention here. Instead, |
1376 will show a small amount of extension code and go over the high | 1376 I will show a small amount of extension code and go over the high |
1377 level strategy. Full source is of course available with the | 1377 level strategy. Full source is of course available with the |
1378 =CORTEX= distribution if you're interested. | 1378 =CORTEX= distribution if you're interested. |
1379 | 1379 |
1380 =OpenAL= goes to great lengths to support many different systems, | 1380 =OpenAL= goes to great lengths to support many different systems, |
1381 all with different sound capabilities and interfaces. It | 1381 all with different sound capabilities and interfaces. It |