* Summary of Senses

vision -- list of functions which must each be called with the world
as their argument, each of which returns [topology data]. Each element
of data is a number between 0 and 255 representing the intensity of
the light received at that sensor. Each element of topology is a pair
of numbers [x, y] such that pairs separated by a short euclidean
distance correspond to points that are physically close on the actual
sensor.

proprioception -- list of nullary functions, one for each joint, which
return [heading pitch roll].

movement -- list of functions, one for each muscle, which must be
called with an integer between 0 and the total number of muscle fibers
in the muscle. Each function returns a float which is
(current-force / total-possible-force).

touch -- list of functions which must each be called with a Node
(normally the root node of the simulation) as the argument, each of
which returns [topology data]. Each element of data is [length limit],
where limit is the full length of that particular "hair" and length is
the amount of the hair that has been activated so far; (= limit length)
means that nothing is touching the hair.


* A Flower

A flower is a basic creature that tries to maximize the amount of
light that it sees. It can have one or more eyes, with one eye being
"special" in that it is this eye which must receive maximum light. It
can have multiple articulated joints and muscles.

I want an algorithm that uses the sense data of =vision=,
=proprioception=, and =movement= to maximum benefit in order to look
at the light source.

The light source will move from place to place, and the flower will
have to follow it.

The algorithm should generalize to any number of eyes and muscles,
and should become /more/ performant the more sensory data is
available.

I will punt on working out an elegant model of motivation for the
flower which makes it want to go to the light.

Maybe I need a motivationless entity first, which just learns how its
own body works? But then, wouldn't that just be a motivation itself?
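As a placeholder until that algorithm exists, here is a minimal sketch
of how the sense formats above could be wired into a naive "follow the
light" step. None of these names (=eye-brightness=,
=follow-light-step=) are part of cortex; they only illustrate pulling
apart the [topology data] pairs from =vision= and driving the
=movement= effectors, and the fiber count is an arbitrary small
number.

#+begin_src clojure
(in-ns 'cortex.joint)

;; Hypothetical helpers -- not part of cortex; they only illustrate
;; the sense formats described in "Summary of Senses".

(defn eye-brightness
  "Total light intensity seen by one eye. `eye` is a single element of
   the vision sense: a function of the world that returns
   [topology data], where each element of data lies between 0 and 255."
  [eye world]
  (let [[_topology data] (eye world)]
    (reduce + data)))

(defn follow-light-step
  "One naive step toward the light: if the special eye sees less light
   than it did on the previous frame, twitch one randomly chosen
   muscle; otherwise relax and keep the current pose. `muscles` is the
   list returned by movement!, and `last-brightness` is an atom holding
   the brightness seen on the previous frame."
  [special-eye muscles last-brightness world]
  (let [now     (eye-brightness special-eye world)
        dimmer? (< now @last-brightness)
        lucky   (rand-nth muscles)]
    (reset! last-brightness now)
    (doseq [muscle muscles]
      ;; each movement function takes an integer number of fibers to
      ;; recruit; 30 is an arbitrary small twitch
      (muscle (if (and dimmer? (= muscle lucky)) 30 0)))
    now))
#+end_src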
The scaffold below loads the jointed creature and attaches every sense
and effector to it:

#+name: load-creature
#+begin_src clojure
(in-ns 'cortex.joint)

(def joint "Models/joint/joint.blend")

(defn joint-creature []
  (load-blender-model joint))

(defn test-joint-creature []
  (let [me (sphere 0.5 :color ColorRGBA/Blue :physical? false)
        creature (doto (joint-creature) (body!))

        ;;;;;;;;;;;; Sensors/Effectors ;;;;;;;;;;;;;;;;;;;;;;;;;;;;
        touch (touch! creature)
        touch-display (view-touch)

        vision (vision! creature)
        vision-display (view-vision)

        ;;hearing (hearing! creature)
        ;;hearing-display (view-hearing)

        prop (proprioception! creature)
        prop-display (view-proprioception)

        muscles (movement! creature)
        muscle-display (view-movement)
        ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

        fix-display (gen-fix-display)

        floor (box 10 2 10 :position (Vector3f. 0 -9 0)
                   :color ColorRGBA/Gray :mass 0)]
    (world
     (nodify [floor me creature])
     standard-debug-controls
     (fn [world]
       ;;(speed-up world)
       (light-up-everything world)
       (let [timer (RatchetTimer. 60)]
         (.setTimer world timer)
         (display-dilated-time world timer)))
     (fn [world tpf]
       (.setLocalTranslation me (.getLocation (.getCamera world)))
       (fix-display world)))))
#+end_src

* Headers
#+name: joint-header
#+begin_src clojure
(ns cortex.joint
  (:require cortex.import)
  (:use (cortex world util import body sense
                hearing touch vision proprioception movement))
  (:import java.io.File)
  (:import (com.aurellem.capture RatchetTimer IsoTimer)))

(cortex.import/mega-import-jme3)
(rlm.rlm-commands/help)
#+end_src


* COMMENT Generate Source

#+begin_src clojure :tangle ../src/cortex/joint.clj
<<joint-header>>
<<load-creature>>
#+end_src
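For completeness, here is a rough sketch of launching the scaffold
above from a REPL once this file has been tangled. It assumes, as in
the other cortex test functions, that =world= hands back a
jMonkeyEngine =Application=, so =.start= is what opens the window and
begins the simulation loop.

#+begin_src clojure
;; assuming ../src/cortex/joint.clj has been tangled and is on the
;; classpath
(require 'cortex.joint)
(in-ns 'cortex.joint)

;; `world` is assumed to return a jMonkeyEngine Application;
;; .start opens the window and starts the simulation.
(.start (test-joint-creature))
#+end_src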