diff org/hearing.org @ 276:54ec231dec4c

I changed Capture Video, then merged with Robert.
author Dylan Holmes <ocsenave@gmail.com>
date Wed, 15 Feb 2012 01:16:54 -0600
parents c39b8b29a79e
children 23aadf376e9d
line wrap: on
line diff
     1.1 --- a/org/hearing.org	Wed Feb 15 01:15:15 2012 -0600
     1.2 +++ b/org/hearing.org	Wed Feb 15 01:16:54 2012 -0600
     1.3 @@ -776,12 +776,12 @@
     1.4  
     1.5  ** Hearing Pipeline
     1.6  
     1.7 -All sound rendering is done in the CPU, so =(hearing-pipeline)= is
     1.8 -much less complicated than =(vision-pipelie)= The bytes available in
     1.9 +All sound rendering is done in the CPU, so =hearing-pipeline= is
    1.10 +much less complicated than =vision-pipeline=. The bytes available in
    1.11  the ByteBuffer obtained from the =send= Device have different meanings
    1.12  dependent upon the particular hardware of your system.  That is why
    1.13  the =AudioFormat= object is necessary to provide the meaning that the
    1.14 -raw bytes lack. =(byteBuffer->pulse-vector)= uses the excellent
    1.15 +raw bytes lack. =byteBuffer->pulse-vector= uses the excellent
    1.16  conversion facilities from [[http://www.tritonus.org/ ][tritonus]] ([[http://tritonus.sourceforge.net/apidoc/org/tritonus/share/sampled/FloatSampleTools.html#byte2floatInterleaved%2528byte%5B%5D,%2520int,%2520float%5B%5D,%2520int,%2520int,%2520javax.sound.sampled.AudioFormat%2529][javadoc]]) to generate a clojure vector of
    1.17  floats which represent the linear PCM encoded waveform of the
    1.18  sound. With linear PCM (pulse code modulation) -1.0 represents maximum
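The byte-to-float conversion described above can be sketched without tritonus. Below is a minimal, hypothetical Java helper (names are illustrative, not the project's actual code) that converts 16-bit little-endian linear PCM bytes into floats in [-1.0, 1.0] — the same representation =byteBuffer->pulse-vector= obtains via =FloatSampleTools.byte2floatInterleaved=:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PulseVector {
    // Convert 16-bit signed little-endian linear PCM samples into
    // floats in [-1.0, 1.0]. Dividing by 32768 maps Short.MIN_VALUE
    // (-32768) to exactly -1.0 and Short.MAX_VALUE (32767) to just
    // under 1.0, matching the linear PCM convention in the text.
    static float[] pcm16ToFloats(byte[] raw) {
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        float[] out = new float[raw.length / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = buf.getShort() / 32768.0f;
        }
        return out;
    }

    public static void main(String[] args) {
        // three samples: 0x0000, 0x8000 (min), 0x7FFF (max)
        byte[] raw = {0x00, 0x00, 0x00, (byte) 0x80, (byte) 0xFF, 0x7F};
        float[] v = pcm16ToFloats(raw);
        System.out.println(v[0] + " " + v[1] + " " + v[2]);
    }
}
```

In practice the sample width, signedness, and byte order all come from the =AudioFormat= object, which is exactly why the text stresses that the raw bytes are meaningless without it.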
    1.19 @@ -822,14 +822,14 @@
    1.20  
    1.21  Together, these three functions define how ears found in a specially
    1.22  prepared blender file will be translated to =Listener= objects in a
    1.23 -simulation. =(ears)= extracts all the children of to top level node
    1.24 -named "ears".  =(add-ear!)= and =(update-listener-velocity!)= use
    1.25 -=(bind-sense)= to bind a =Listener= object located at the initial
    1.26 +simulation. =ears= extracts all the children of the top level node
    1.27 +named "ears".  =add-ear!= and =update-listener-velocity!= use
    1.28 +=bind-sense= to bind a =Listener= object located at the initial
    1.29  position of an "ear" node to the closest physical object in the
    1.30  creature. That =Listener= will stay in the same orientation to the
    1.31  object with which it is bound, just as the camera in the [[http://aurellem.localhost/cortex/html/sense.html#sec-4-1][sense binding
    1.32  demonstration]].  =OpenAL= simulates the doppler effect for moving
    1.33 -listeners, =(update-listener-velocity!)= ensures that this velocity
    1.34 +listeners; =update-listener-velocity!= ensures that this velocity
    1.35  information is always up-to-date.
    1.36  
    1.37  #+name: hearing-ears
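The velocity bookkeeping described above amounts to a per-frame finite difference. A hypothetical Java sketch of that idea (illustrative names only, not the cortex or jMonkeyEngine API): each frame, the listener's velocity is re-derived from how far its bound node moved, so OpenAL always has current data for the doppler computation.

```java
public class ListenerVelocity {
    // Finite-difference velocity estimate: compare the bound node's
    // position on successive frames and divide the displacement by the
    // frame duration. This is the quantity a Listener needs so that
    // OpenAL can simulate the doppler effect for moving listeners.
    static float[] frameVelocity(float[] prevPos, float[] curPos, float dt) {
        float[] v = new float[3];
        for (int i = 0; i < 3; i++) {
            v[i] = (curPos[i] - prevPos[i]) / dt;
        }
        return v;
    }

    public static void main(String[] args) {
        // moved 1 unit along x during a 0.5 s frame -> 2 units/s along x
        float[] v = frameVelocity(new float[]{0, 0, 0},
                                  new float[]{1, 0, 0}, 0.5f);
        System.out.println(v[0] + " " + v[1] + " " + v[2]);
    }
}
```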
    1.38 @@ -905,7 +905,7 @@
    1.39      (hearing-kernel creature ear)))
    1.40  #+end_src
    1.41  
    1.42 -Each function returned by =(hearing-kernel!)= will register a new
    1.43 +Each function returned by =hearing-kernel= will register a new
    1.44  =Listener= with the simulation the first time it is called.  Each time
    1.45  it is called, the hearing-function will return a vector of linear PCM
    1.46  encoded sound data that was heard since the last frame. The size of