comparison thesis/cortex.org @ 465:e4104ce9105c
working on body/joints.
author | Robert McIntyre <rlm@mit.edu> |
date | Fri, 28 Mar 2014 11:08:32 -0400 |
parents | 8bf4bb02ed05 |
children | da311eefbb09 |
464:8bf4bb02ed05 | 465:e4104ce9105c |
---|---|
3 #+email: rlm@mit.edu | 3 #+email: rlm@mit.edu |
4 #+description: Using embodied AI to facilitate Artificial Imagination. | 4 #+description: Using embodied AI to facilitate Artificial Imagination. |
5 #+keywords: AI, clojure, embodiment | 5 #+keywords: AI, clojure, embodiment |
6 #+LaTeX_CLASS_OPTIONS: [nofloat] | 6 #+LaTeX_CLASS_OPTIONS: [nofloat] |
7 | 7 |
8 * Empathy and Embodiment as problem solving strategies | 8 * COMMENT templates |
9 #+caption: | |
10 #+caption: | |
11 #+caption: | |
12 #+caption: | |
13 #+name: name | |
14 #+begin_listing clojure | |
15 #+begin_src clojure | |
16 #+end_src | |
17 #+end_listing | |
18 | |
19 #+caption: | |
20 #+caption: | |
21 #+caption: | |
22 #+name: name | |
23 #+ATTR_LaTeX: :width 10cm | |
24 [[./images/Eve.jpg]] | |
25 | |
26 | |
27 | |
28 * COMMENT Empathy and Embodiment as problem solving strategies | |
9 | 29 |
10 By the end of this thesis, you will have seen a novel approach to | 30 By the end of this thesis, you will have seen a novel approach to |
11 interpreting video using embodiment and empathy. You will have also | 31 interpreting video using embodiment and empathy. You will have also |
12 seen one way to efficiently implement empathy for embodied | 32 seen one way to efficiently implement empathy for embodied |
13 creatures. Finally, you will become familiar with =CORTEX=, a system | 33 creatures. Finally, you will become familiar with =CORTEX=, a system |
337 own. To this end, wherever I have had to make architectural choices | 357 own. To this end, wherever I have had to make architectural choices |
338 about =CORTEX=, I have chosen to give as much freedom to the user as | 358 about =CORTEX=, I have chosen to give as much freedom to the user as |
339 possible, so that =CORTEX= may be used for things I have not | 359 possible, so that =CORTEX= may be used for things I have not |
340 foreseen. | 360 foreseen. |
341 | 361 |
342 ** Simulation or Reality? | 362 ** COMMENT Simulation or Reality? |
343 | 363 |
344 The most important architectural decision of all is the choice to | 364 The most important architectural decision of all is the choice to |
345 use a computer-simulated environment in the first place! The world | 365 use a computer-simulated environment in the first place! The world |
346 is a vast and rich place, and for now simulations are a very poor | 366 is a vast and rich place, and for now simulations are a very poor |
347 reflection of its complexity. It may be that there is a significant | 367 reflection of its complexity. It may be that there is a significant |
400 time in the simulated world can be slowed down to accommodate the | 420 time in the simulated world can be slowed down to accommodate the |
401 limitations of the character's programming. In terms of cost, | 421 limitations of the character's programming. In terms of cost, |
402 doing everything in software is far cheaper than building custom | 422 doing everything in software is far cheaper than building custom |
403 real-time hardware. All you need is a laptop and some patience. | 423 real-time hardware. All you need is a laptop and some patience. |
404 | 424 |
405 ** Because of Time, simulation is preferable to reality | 425 ** COMMENT Because of Time, simulation is preferable to reality |
406 | 426 |
407 I envision =CORTEX= being used to support rapid prototyping and | 427 I envision =CORTEX= being used to support rapid prototyping and |
408 iteration of ideas. Even if I could put together a well-constructed | 428 iteration of ideas. Even if I could put together a well-constructed |
409 kit for creating robots, it would still not be enough because of | 429 kit for creating robots, it would still not be enough because of |
410 the scourge of real-time processing. Anyone who wants to test their | 430 the scourge of real-time processing. Anyone who wants to test their |
411 ideas in the real world must always worry about getting their | 431 ideas in the real world must always worry about getting their |
412 algorithms to run fast enough to process information in real | 432 algorithms to run fast enough to process information in real time. |
413 time. The need for real-time processing only increases if multiple | 433 The need for real-time processing only increases if multiple senses |
414 senses are involved. In the extreme case, even simple algorithms | 434 are involved. In the extreme case, even simple algorithms will have |
415 will have to be accelerated by ASIC chips or FPGAs, turning what | 435 to be accelerated by ASIC chips or FPGAs, turning what would |
416 would otherwise be a few lines of code and a 10x speed penalty | 436 otherwise be a few lines of code and a 10x speed penalty into a |
417 into a multi-month ordeal. For this reason, =CORTEX= supports | 437 multi-month ordeal. For this reason, =CORTEX= supports |
418 /time-dilation/, which scales back the framerate of the | 438 /time-dilation/, which scales back the framerate of the |
419 simulation in proportion to the amount of processing each | 439 simulation in proportion to the amount of processing each frame requires. |
420 frame requires. From the perspective of the creatures inside the simulation, | 440 From the perspective of the creatures inside the simulation, time |
421 time always appears to flow at a constant rate, regardless of how | 441 always appears to flow at a constant rate, regardless of how |
422 complicated the environment becomes or how many creatures are in | 442 complicated the environment becomes or how many creatures are in |
423 the simulation. The cost is that =CORTEX= can sometimes run slower | 443 the simulation. The cost is that =CORTEX= can sometimes run slower |
424 than real time. This can also be an advantage, however --- | 444 than real time. This can also be an advantage, however --- |
425 simulations of very simple creatures in =CORTEX= generally run at | 445 simulations of very simple creatures in =CORTEX= generally run at |
426 40x real time on my machine! | 446 40x real time on my machine! |
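
The listing below is only an illustrative sketch of the time-dilation
idea, not the actual =CORTEX= implementation: the per-frame work is
handed to a placeholder =step-frame= function, and the simulated clock
advances by a fixed =dt= no matter how long each frame really takes.

#+caption: Illustrative sketch of time-dilation (assumes a hypothetical
#+caption: =step-frame= function; not the actual =CORTEX= loop). The
#+caption: simulated timestep is constant, so creatures never notice
#+caption: how long a frame took to compute.
#+name: time-dilation-sketch
#+begin_listing clojure
#+begin_src clojure
;; Sketch only. `step-frame` stands in for whatever physics and sense
;; processing a frame needs; it is a hypothetical placeholder.
(defn simulate
  "Advance `world` by `n` frames, each of fixed simulated length `dt`
   seconds, calling (step-frame world dt) once per frame. The simulated
   clock always advances by exactly `dt`, however long the real
   computation takes."
  [step-frame world n dt]
  (reduce
   (fn [w frame]
     (let [start (System/nanoTime)
           w'    (step-frame w dt)
           real  (/ (- (System/nanoTime) start) 1e9)]
       ;; Report speed relative to real time: simple creatures may run
       ;; far above 1x, complicated ones well below it.
       (println (format "frame %d: %.2fx real time" frame (/ dt real)))
       w'))
   world
   (range n)))

;; Toy example: a "world" that just accumulates simulated time.
(simulate (fn [w dt] (update w :sim-time + dt))
          {:sim-time 0.0} 3 (/ 1.0 60))
#+end_src
#+end_listing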
427 | 447 |
428 ** Video game engines are a great starting point | 448 ** COMMENT Video game engines are a great starting point |
429 | 449 |
430 I did not need to write my own physics simulation code or shader to | 450 I did not need to write my own physics simulation code or shader to |
431 build =CORTEX=. Doing so would lead to a system that is impossible | 451 build =CORTEX=. Doing so would lead to a system that is impossible |
432 for anyone but myself to use anyway. Instead, I use a video game | 452 for anyone but myself to use anyway. Instead, I use a video game |
433 engine as a base and modify it to accommodate the additional needs | 453 engine as a base and modify it to accommodate the additional needs |
446 Finally, because video game engines support a large number of | 466 Finally, because video game engines support a large number of |
447 users, if I don't stray too far from the base system, other | 467 users, if I don't stray too far from the base system, other |
448 researchers can turn to this community for help when doing their | 468 researchers can turn to this community for help when doing their |
449 research. | 469 research. |
450 | 470 |
451 ** =CORTEX= is based on jMonkeyEngine3 | 471 ** COMMENT =CORTEX= is based on jMonkeyEngine3 |
452 | 472 |
453 While preparing to build =CORTEX= I studied several video game | 473 While preparing to build =CORTEX= I studied several video game |
454 engines to see which would best serve as a base. The top contenders | 474 engines to see which would best serve as a base. The top contenders |
455 were: | 475 were: |
456 | 476 |
527 movie WALL-E. | 547 movie WALL-E. |
528 | 548 |
529 #+caption: =EVE= from the movie WALL-E. This body plan turns | 549 #+caption: =EVE= from the movie WALL-E. This body plan turns |
530 #+caption: out to be much better suited to my purposes than a more | 550 #+caption: out to be much better suited to my purposes than a more |
531 #+caption: human-like one. | 551 #+caption: human-like one. |
552 #+ATTR_LaTeX: :width 10cm | |
532 [[./images/Eve.jpg]] | 553 [[./images/Eve.jpg]] |
533 | 554 |
534 =EVE='s body is composed of several rigid components that are held | 555 =EVE='s body is composed of several rigid components that are held |
535 together by invisible joint constraints. This is what I mean by | 556 together by invisible joint constraints. This is what I mean by |
536 ``eve-like''. The main reason that I use eve-style bodies is for | 557 ``eve-like''. The main reason that I use eve-style bodies is for |
544 joints. Sections do not have to stay as one piece forever; they can | 565 joints. Sections do not have to stay as one piece forever; they can |
545 be dynamically replaced with multiple sections to simulate | 566 be dynamically replaced with multiple sections to simulate |
546 splitting in two. This could be used to simulate retractable claws | 567 splitting in two. This could be used to simulate retractable claws |
547 or =EVE='s hands, which are able to coalesce into one object in the | 568 or =EVE='s hands, which are able to coalesce into one object in the |
548 movie. | 569 movie. |
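
As a rough illustration of what ``held together by invisible joint
constraints'' means at the level of the physics engine, the sketch
below pins two jMonkeyEngine3 rigid bodies together with a
point-to-point joint. It is an assumed interop example, not the
actual =CORTEX= joint code.

#+caption: Sketch (not the actual =CORTEX= code) of connecting two
#+caption: rigid sections of an eve-style body with an invisible
#+caption: point-to-point joint constraint in jMonkeyEngine3.
#+name: joint-sketch
#+begin_listing clojure
#+begin_src clojure
(ns example.joints
  (:import (com.jme3.bullet PhysicsSpace)
           (com.jme3.bullet.control RigidBodyControl)
           (com.jme3.bullet.joints Point2PointJoint)
           (com.jme3.math Vector3f)))

;; Sketch only: `body-a` and `body-b` are assumed to be RigidBodyControls
;; already attached to two solid sections of a creature.
(defn pin-together!
  "Connect two rigid sections at a shared pivot, given in each body's
   local coordinates, and register the joint with the physics space.
   The joint has no geometry of its own -- it is the `invisible'
   constraint that holds an eve-style body together."
  [^PhysicsSpace physics-space
   ^RigidBodyControl body-a ^RigidBodyControl body-b
   ^Vector3f pivot-a ^Vector3f pivot-b]
  (let [joint (Point2PointJoint. body-a body-b pivot-a pivot-b)]
    (.add physics-space joint)
    joint))
#+end_src
#+end_listing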
570 | |
571 *** Solidifying/Connecting the body | |
572 | |
573 Importing bodies from Blender into =CORTEX= involves encoding | |
574 metadata into the Blender file that specifies the mass of each | |
575 component and the joints by which those components are connected. I | |
576 do this in Blender in two ways. First is by using the ``metadata'' | |
577 field of each solid object to specify the mass. Second is by using | |
578 Blender ``empty nodes'' to specify the position and type of each | |
579 joint. Empty nodes have no mass, physical presence, or appearance, | |
580 but they can hold metadata and have names. I use a tree structure | |
581 of empty nodes to specify joints. There is a parent node named | |
582 ``joints'', and a series of empty child nodes of the ``joints'' | |
583 node that each represent a single joint. | |
584 | |
585 #+caption: View of the hand model in Blender showing the main ``joints'' | |
586 #+caption: node (highlighted in yellow) and its children, which each | |
587 #+caption: represent a joint in the hand. Each joint node has metadata | |
588 #+caption: specifying what sort of joint it is. | |
589 #+ATTR_LaTeX: :width 10cm | |
590 [[./images/hand-screenshot1.png]] | |
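
The sketch below shows how such a tree of empty nodes might be read
back once the Blender file has been loaded into the jMonkeyEngine3
scene graph. It is illustrative only, not the actual =CORTEX= code:
the parent node is assumed to be named ``joints'' as described above,
and the joint type is assumed to live in a user-data field called
"joint".

#+caption: Sketch (not the actual =CORTEX= code) of collecting joint
#+caption: metadata from the empty child nodes of the ``joints'' node
#+caption: after the Blender file has been loaded into jMonkeyEngine3.
#+name: joint-metadata-sketch
#+begin_listing clojure
#+begin_src clojure
(ns example.joint-metadata
  (:import (com.jme3.scene Node Spatial)))

;; Sketch only. `creature` is assumed to be the jME3 Node produced by
;; loading the creature's Blender file; the metadata key "joint" is an
;; assumed convention for storing the joint type on each empty node.
(defn joint-specs
  "Return a seq of {:name :position :type} maps, one for each empty
   child of the parent node named \"joints\"."
  [^Node creature]
  (when-let [joints (.getChild creature "joints")]
    (for [^Spatial child (.getChildren ^Node joints)]
      {:name     (.getName child)
       :position (.getWorldTranslation child)
       :type     (.getUserData child "joint")})))
#+end_src
#+end_listing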
591 | |
592 | |
593 | |
594 | |
549 | 595 |
550 | 596 |
551 | 597 |
552 ** Eyes reuse standard video game components | 598 ** Eyes reuse standard video game components |
553 | 599 |
561 | 607 |
562 ** =CORTEX= brings complex creatures to life! | 608 ** =CORTEX= brings complex creatures to life! |
563 | 609 |
564 ** =CORTEX= enables many possibilities for further research | 610 ** =CORTEX= enables many possibilities for further research |
565 | 611 |
566 * Empathy in a simulated worm | 612 * COMMENT Empathy in a simulated worm |
567 | 613 |
568 Here I develop a computational model of empathy, using =CORTEX= as a | 614 Here I develop a computational model of empathy, using =CORTEX= as a |
569 base. Empathy in this context is the ability to observe another | 615 base. Empathy in this context is the ability to observe another |
570 creature and infer what sorts of sensations that creature is | 616 creature and infer what sorts of sensations that creature is |
571 feeling. My empathy algorithm involves multiple phases. First is | 617 feeling. My empathy algorithm involves multiple phases. First is |
1341 deduce that the worm has six sides. Note that =learn-touch-regions= | 1387 deduce that the worm has six sides. Note that =learn-touch-regions= |
1342 would work just as well even if the worm's touch sense data were | 1388 would work just as well even if the worm's touch sense data were |
1343 completely scrambled. The cross shape is just for convenience. This | 1389 completely scrambled. The cross shape is just for convenience. This |
1344 example justifies the use of pre-defined touch regions in =EMPATH=. | 1390 example justifies the use of pre-defined touch regions in =EMPATH=. |
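
One way to see why such regions are learnable at all is sketched
below. It is a toy stand-in, not the actual =learn-touch-regions=
implementation: each frame of experience is assumed to be the set of
touch-sensor indices active during that frame, and sensors are simply
grouped by their activation history, which is unchanged by any
scrambling of the sensor indices.

#+caption: Toy sketch (not the actual =learn-touch-regions=) of
#+caption: recovering touch regions by grouping sensors that are
#+caption: active during exactly the same frames of experience.
#+name: touch-region-sketch
#+begin_listing clojure
#+begin_src clojure
;; Toy sketch only. `history` is assumed to be a sequence of sets, one
;; per frame, each holding the indices of the touch sensors that were
;; active during that frame.
(defn touch-regions
  "Group sensors by activation history: sensors active in exactly the
   same frames end up in the same region. Permuting (`scrambling') the
   sensor indices does not change the grouping."
  [history]
  (let [sensors   (reduce into #{} history)
        signature (fn [s] (mapv #(contains? % s) history))]
    (vals (group-by signature sensors))))

;; Example: three frames over six sensors yield two regions.
(touch-regions [#{0 1 2} #{3 4 5} #{0 1 2}])
;; => two groups, e.g. ([0 1 2] [3 4 5]) (ordering may vary)
#+end_src
#+end_listing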
1345 | 1391 |
1346 * Contributions | 1392 * COMMENT Contributions |
1347 | 1393 |
1348 In this thesis you have seen the =CORTEX= system, a complete | 1394 In this thesis you have seen the =CORTEX= system, a complete |
1349 environment for creating simulated creatures. You have seen how to | 1395 environment for creating simulated creatures. You have seen how to |
1350 implement five senses including touch, proprioception, hearing, | 1396 implement five senses including touch, proprioception, hearing, |
1351 vision, and muscle tension. You have seen how to create new creatures | 1397 vision, and muscle tension. You have seen how to create new creatures |