# HG changeset patch # User Robert McIntyre # Date 1320357738 25200 # Node ID e8ae40c9848cbe1f03bd1eef4a410ff3daba361a # Parent 22ac5a0367cdd51466ae8db5bddf839fdb8d6648 fixed 1,000,000 spelling errors diff -r 22ac5a0367cd -r e8ae40c9848c org/ear.org --- a/org/ear.org Thu Nov 03 14:54:45 2011 -0700 +++ b/org/ear.org Thu Nov 03 15:02:18 2011 -0700 @@ -7,14 +7,11 @@ #+INCLUDE: ../../aurellem/org/level-0.org #+BABEL: :exports both :noweb yes :cache no :mkdirp yes - - - * Hearing -I want to be able to place ears in a similiar manner to how I place +I want to be able to place ears in a similar manner to how I place the eyes. I want to be able to place ears in a unique spatial -position, and recieve as output at every tick the FFT of whatever +position, and receive as output at every tick the F.F.T. of whatever signals are happening at that point. Hearing is one of the more difficult senses to simulate, because there @@ -23,13 +20,13 @@ jMonkeyEngine's sound system works as follows: - - jMonkeyEngine uese the =AppSettings= for the particular application + - jMonkeyEngine uses the =AppSettings= for the particular application to determine what sort of =AudioRenderer= should be used. - although some support is provided for multiple AudioRendering backends, jMonkeyEngine at the time of this writing will either - pick no AudioRender at all, or the =LwjglAudioRenderer= + pick no AudioRenderer at all, or the =LwjglAudioRenderer= - jMonkeyEngine tries to figure out what sort of system you're - running and extracts the appropiate native libraries. + running and extracts the appropriate native libraries. - the =LwjglAudioRenderer= uses the [[http://lwjgl.org/][=LWJGL=]] (LightWeight Java Game Library) bindings to interface with a C library called [[http://kcat.strangesoft.net/openal.html][=OpenAL=]] - =OpenAL= calculates the 3D sound localization and feeds a stream of @@ -37,7 +34,7 @@ how to communicate. A consequence of this is that there's no way to access the actual -sound data produced by =OpenAL=. Even worse, =OpanAL= only supports +sound data produced by =OpenAL=. Even worse, =OpenAL= only supports one /listener/, which normally isn't a problem for games, but becomes a problem when trying to make multiple AI creatures that can each hear the world from a different perspective. @@ -50,21 +47,21 @@ ** =OpenAL= Devices =OpenAL= goes to great lengths to support many different systems, all -with different sound capabilities and interfaces. It acomplishes this +with different sound capabilities and interfaces. It accomplishes this difficult task by providing code for many different sound backends in pseudo-objects called /Devices/. There's a device for the Linux Open -Sound System and the Advanced Linxu Sound Architechture, there's one +Sound System and the Advanced Linux Sound Architecture, there's one for Direct Sound on Windows, there's even one for Solaris. =OpenAL= solves the problem of platform independence by providing all these Devices. Wrapper libraries such as LWJGL are free to examine the system on -which they are running and then select an appropiate device for that +which they are running and then select an appropriate device for that system. There are also a few "special" devices that don't interface with any particular system. These include the Null Device, which doesn't do -anything, and the Wave Device, which writes whatever sound it recieves +anything, and the Wave Device, which writes whatever sound it receives to a file, if everything has been set up correctly when configuring =OpenAL=. 
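# For context, here is a minimal sketch of what a wrapper such as LWJGL does when it selects and opens one of these =OpenAL= Devices. None of this code appears in ear.org itself; it assumes only a standard =OpenAL= installation that exposes the =ALC_ENUMERATION_EXT= extension, and the device names it prints depend entirely on the local backend.
#+begin_src C
/* Sketch: enumerate the available OpenAL devices and open one,
   roughly what a wrapper like LWJGL does at startup.
   Assumes ALC_ENUMERATION_EXT is available; link with -lopenal. */
#include <stdio.h>
#include <string.h>
#include <AL/alc.h>

int main(void){
  /* The specifier list is a run of NUL-terminated names,
     terminated by an empty string. */
  if (alcIsExtensionPresent(NULL, "ALC_ENUMERATION_EXT")){
    const ALCchar *name = alcGetString(NULL, ALC_DEVICE_SPECIFIER);
    while (name && *name){
      printf("device: %s\n", name);
      name += strlen(name) + 1;
    }
  }
  /* NULL requests the default device; a specific backend (the Wave
     device, for example) could be requested by passing its name. */
  ALCdevice *device = alcOpenDevice(NULL);
  if (!device) return 1;
  ALCcontext *context = alcCreateContext(device, NULL);
  alcMakeContextCurrent(context);
  /* ... generate sources and play sounds here ... */
  alcMakeContextCurrent(NULL);
  alcDestroyContext(context);
  alcCloseDevice(device);
  return 0;
}
#+end_src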
@@ -131,7 +128,7 @@ #define UNUSED(x) (void)(x) #+end_src -The main idea behing the Send device is to take advantage of the fact +The main idea behind the Send device is to take advantage of the fact that LWJGL only manages one /context/ when using OpenAL. A /context/ is like a container that holds samples and keeps track of where the listener is. In order to support multiple listeners, the Send device @@ -146,7 +143,7 @@ - Set the LWJGL context as "master" in the =init()= method. - Create any number of additional contexts via =addContext()= - At every call to =renderData()= sync the master context with the - slave contexts vit =syncContexts()= + slave contexts via =syncContexts()= - =syncContexts()= calls =syncSources()= to sync all the sources which are in the master context. - =limitContext()= and =unLimitContext()= make it possible to render @@ -216,11 +213,11 @@ #+end_src -Setting the state of an =OpenAl= source is done with the =alSourcei=, +Setting the state of an =OpenAL= source is done with the =alSourcei=, =alSourcef=, =alSource3i=, and =alSource3f= functions. In order to -complely synchronize two sources, it is necessary to use all of +completely synchronize two sources, it is necessary to use all of them. These macros help to condense the otherwise repetitive -synchronization code involving these simillar low-level =OpenAL= functions. +synchronization code involving these similar low-level =OpenAL= functions. ** Source Synchronization #+begin_src C @@ -288,7 +285,7 @@ alcMakeContextCurrent(current); } #+end_src -This function is long because it has to exaustively go through all the +This function is long because it has to exhaustively go through all the possible state that a source can have and make sure that it is the same between the master and slave sources. I'd like to take this moment to salute the [[http://connect.creativelabs.com/openal/Documentation/Forms/AllItems.aspx][=OpenAL= Reference Manual]], which provides a very @@ -313,7 +310,7 @@ alGenSources(numMissingSources, newSources); } - /* Now, slave is gauranteed to have at least as many sources + /* Now, slave is guaranteed to have at least as many sources as master. Sync each source from master to the corresponding source in slave. */ int i; @@ -328,7 +325,7 @@ Most of the hard work in Context Synchronization is done in =syncSources()=. The only thing that =syncContexts()= has to worry -about is automoatically creating new sources whenever a slave context +about is automatically creating new sources whenever a slave context does not have the same number of sources as the master context. ** Context Creation @@ -380,7 +377,7 @@ static ALuint currentNumContext; /* By default, all contexts are rendered at once for each call to aluMixData. - * This function uses the internals of the ALCdecice struct to temporarly + * This function uses the internals of the ALCdevice struct to temporarily * cause aluMixData to only render the chosen context. */ static void limitContext(ALCdevice *Device, ALCcontext *ctx){ @@ -396,10 +393,10 @@ } #+end_src -=OpenAL= normally reneders all Contexts in parallel, outputting the +=OpenAL= normally renders all Contexts in parallel, outputting the whole result to the buffer. It does this by iterating over the Device->Contexts array and rendering each context to the buffer in -turn. By temporarly setting Device->NumContexts to 1 and adjusting +turn. 
By temporarily setting the Device->NumContexts to 1 and adjusting the Device's context list to put the desired context-to-be-rendered into position 0, we can trick =OpenAL= into rendering each slave context separately from all the others. @@ -576,12 +573,12 @@ } #+end_src -*** Initilazation +*** Initialization =initDevice= is called from the Java side after LWJGL has created its context, and before any calls to =addListener=. It establishes the LWJGL context as the master context. -=getAudioFormat= is a convienence function that uses JNI to build up a +=getAudioFormat= is a convenience function that uses JNI to build up a =javax.sound.sampled.AudioFormat= object from data in the Device. This way, there is no ambiguity about what the bits created by =step= and returned by =getSamples= mean. @@ -640,7 +637,7 @@ backends. It's the basis for =OpenAL='s primitive object system. #+begin_src C -//////////////////// Device Initilization / Management +//////////////////// Device Initialization / Management static const ALCchar sendDevice[] = "Multiple Audio Send"; @@ -649,7 +646,7 @@ { send_data *data; // stop any buffering for stdout, so that I can - // see the printf statements in my terminal immediatley + // see the printf statements in my terminal immediately setbuf(stdout, NULL); if(!deviceName) @@ -730,14 +727,14 @@ The Java interface to the Send Device follows naturally from the JNI definitions. It is included here for completeness. The only thing here of note is the =deviceID=. This is available from LWJGL, but the only -way to get it is reflection. Unfornatuently, there is no other way to +way to get it is reflection. Unfortunately, there is no other way to control the Send device than to obtain a pointer to it. #+include: "../java/src/com/aurellem/send/AudioSend.java" src java :exports code * Finally, Ears in clojure! -Now that the infastructure is complete (modulo a few patches to +Now that the infrastructure is complete (modulo a few patches to jMonkeyEngine3 to support accessing this modified version of =OpenAL= that are not worth discussing), the clojure ear abstraction is rather simple. Just as there were =SceneProcessors= for vision, there are @@ -750,7 +747,7 @@ (ns cortex.hearing "Simulate the sense of hearing in jMonkeyEngine3. Enables multiple listeners at different positions in the same world. Passes vectors - of floats in the range [-1.0 -- 1.0] in PCM format to any arbitray + of floats in the range [-1.0 -- 1.0] in PCM format to any arbitrary function." {:author "Robert McIntyre"} (:use (cortex world util))