* Summary of Senses
vision -- list of functions, each of which must be called with the
world as its argument and returns [topology data]. Each element of
data is a number between 0 and 255 representing the intensity of the
light received at that sensor. Each element of topology is a pair of
numbers [x, y] such that data elements whose pairs are separated by a
short euclidean distance are generally physically close on the actual
sensor.
proprioception -- list of nullary functions, one for each joint, which
return [heading pitch roll].
movement -- list of functions, one for each muscle, which must be
called with an integer between 0 and the total number of muscle fibers
in the muscle. Each function returns a float which is
(current-force / total-possible-force).
touch -- list of functions which must each be called with a Node
(normally the root node of the simulation) as the argument, each of
which returns [topology data]. Each element of data is [length limit]
where limit is the length of that particular "hair" and length is the
amount of the hair that has been activated so far. (= limit length)
means that nothing is touching the hair.
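
As a minimal sketch of how these four interfaces fit together, the
following hypothetical helper polls each sense once per simulation
step. The name =sense-snapshot= and its arguments are illustrative
only; nothing here is part of cortex itself.

#+begin_src clojure
(in-ns 'cortex.joint)

;; Hypothetical helper -- not part of cortex -- showing how the four
;; sense interfaces above might be polled during one simulation step.
(defn sense-snapshot
  "Collect one frame of sense data. `world` is the simulation world,
  `root` the root Node of the simulation, and the remaining arguments
  are the function lists described above."
  [world root vision prop muscles touch]
  {:vision         (map (fn [eye] (eye world)) vision)    ; seq of [topology data]
   :proprioception (map (fn [joint] (joint)) prop)        ; seq of [heading pitch roll]
   :touch          (map (fn [hairs] (hairs root)) touch)  ; seq of [topology data]
   ;; ask every muscle for zero force and read back its force ratio
   :muscle-force   (map (fn [muscle] (muscle 0)) muscles)})
#+end_src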
* A Flower
A flower is a basic creature that tries to maximize the amount of
light that it sees. It can have one or more eyes, with one eye being
"special" in that it is this eye which must receive maximum light. It
can have multiple articulated joints and muscles.
I want an algorithm that uses the sense data of =vision=,
=proprioception=, and =movement= to maximum benefit in order to look
at the light source.
The light source will move from place to place and the flower will
have to follow it.
The algorithm should generalize to any number of eyes and muscles,
and should become /more/ performant the more sensory data is
available.
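
One very crude starting point, assuming only the sense function lists
from the summary above, is a per-frame hill climber that randomly
perturbs muscle activations and keeps a perturbation only when the
special eye reports more total light on the following frame. The names
below (=eye-brightness=, =make-light-seeker=) do not exist in cortex;
this is a sketch of the idea, not the algorithm I actually want.

#+begin_src clojure
(in-ns 'cortex.joint)

;; Hypothetical sketch, not part of cortex: a naive hill climber that
;; perturbs muscle activations each frame and keeps a perturbation
;; only if the special eye got brighter since the previous frame.
;; Brightness must be compared across frames because muscle commands
;; only take effect after the physics advances, so state lives in an
;; atom.
(defn eye-brightness
  "Total light intensity seen by one eye this frame."
  [world eye]
  (let [[_topology data] (eye world)]
    (reduce + data)))

(defn make-light-seeker
  "Return a per-frame update function (fn [world tpf] ...) that drives
  `muscles` so as to increase the light seen by `special-eye`."
  [special-eye muscles]
  (let [state (atom {:current  (vec (repeat (count muscles) 0))
                     :fallback (vec (repeat (count muscles) 0))
                     :light    0})]
    (fn [world _tpf]
      (let [light (eye-brightness world special-eye)
            {:keys [current fallback] prev-light :light} @state
            ;; keep last frame's perturbation only if it helped
            base  (if (> light prev-light) current fallback)
            trial (mapv #(max 0 (+ % (- (rand-int 11) 5))) base)]
        (dorun (map (fn [muscle a] (muscle a)) muscles trial))
        (reset! state {:current trial :fallback base :light light})))))
#+end_src

A function like this could be handed to =world= as the per-frame
callback, in the same slot that the fix-display call occupies in
=test-joint-creature= below.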
I will punt on working out an elegant model of motivation for the
flower which makes it want to go to the light.
Maybe I need a motivationless entity first, which just learns how its
own body works? But then, wouldn't that just be a motivation itself?
#+name: load-creature
#+begin_src clojure
(in-ns 'cortex.joint)

(def joint "Models/joint/joint.blend")

(defn joint-creature []
  (load-blender-model joint))

(defn test-joint-creature []
  (let [me (sphere 0.5 :color ColorRGBA/Blue :physical? false)
        creature (doto (joint-creature) (body!))

        ;;;;;;;;;;;; Sensors/Effectors ;;;;;;;;;;;;;;;;;;;;;;;;;;;;
        touch (touch! creature)
        touch-display (view-touch)

        vision (vision! creature)
        vision-display (view-vision)

        ;;hearing (hearing! creature)
        ;;hearing-display (view-hearing)

        prop (proprioception! creature)
        prop-display (view-proprioception)

        muscles (movement! creature)
        muscle-display (view-movement)
        ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

        fix-display (gen-fix-display)

        floor (box 10 2 10 :position (Vector3f. 0 -9 0)
                   :color ColorRGBA/Gray :mass 0)]
    (world
     (nodify [floor me creature])
     standard-debug-controls
     (fn [world]
       ;;(speed-up world)
       (light-up-everything world)
       (let [timer (RatchetTimer. 60)]
         (.setTimer world timer)
         (display-dilated-time world timer)))
     (fn [world tpf]
       (.setLocalTranslation me (.getLocation (.getCamera world)))
       (fix-display world)))))
#+end_src
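
Assuming =world= returns the jMonkeyEngine application without
starting it, as elsewhere in cortex, the joint creature can presumably
be viewed from the REPL with something like:

#+begin_src clojure
(in-ns 'cortex.joint)
;; assumes `world` hands back an un-started jMonkeyEngine Application
(.start (test-joint-creature))
#+end_src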
* Headers
#+name: joint-header
#+begin_src clojure
(ns cortex.joint
  (:require cortex.import)
  (:use (cortex world util import body sense
                hearing touch vision proprioception movement))
  (:import java.io.File)
  (:import (com.aurellem.capture RatchetTimer IsoTimer)))

(cortex.import/mega-import-jme3)
(rlm.rlm-commands/help)
#+end_src
* COMMENT Generate Source
#+begin_src clojure :tangle ../src/cortex/joint.clj
<<joint-header>>
<<load-creature>>
#+end_src