#+title: Simulated Sense of Hearing
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulating multiple listeners and the sense of hearing in jMonkeyEngine3
#+keywords: simulated hearing, openal, clojure, jMonkeyEngine3, LWJGL, AI
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
#+BABEL: :exports both :noweb yes :cache no :mkdirp yes

* Hearing

I want to be able to place ears in a similar manner to how I place
the eyes. I want to be able to place ears in a unique spatial
position, and receive as output at every tick the FFT of whatever
signals are happening at that point.

Hearing is one of the more difficult senses to simulate, because there
is less support for obtaining the actual sound data that is processed
by jMonkeyEngine3.

jMonkeyEngine's sound system works as follows:

- jMonkeyEngine uses the =AppSettings= for the particular application
  to determine what sort of =AudioRenderer= should be used.
- although some support is provided for multiple audio rendering
  backends, jMonkeyEngine at the time of this writing will either
  pick no =AudioRenderer= at all, or the =LwjglAudioRenderer=.
- jMonkeyEngine tries to figure out what sort of system you're
  running and extracts the appropriate native libraries.
- the =LwjglAudioRenderer= uses the [[http://lwjgl.org/][=LWJGL=]] (LightWeight Java Game
  Library) bindings to interface with a C library called [[http://kcat.strangesoft.net/openal.html][=OpenAL=]].
- =OpenAL= calculates the 3D sound localization and feeds a stream of
  sound to any of various sound output devices with which it knows
  how to communicate.

A consequence of this is that there's no way to access the actual
sound data produced by =OpenAL=. Even worse, =OpenAL= only supports
one /listener/, which normally isn't a problem for games, but becomes
a problem when trying to make multiple AI creatures that can each hear
the world from a different perspective.

To make many AI creatures in jMonkeyEngine that can each hear the
world from their own perspective, it is necessary to go all the way
back to =OpenAL= and implement support for simulated hearing there.

** =OpenAL= Devices

=OpenAL= goes to great lengths to support many different systems, all
with different sound capabilities and interfaces. It accomplishes this
difficult task by providing code for many different sound backends in
pseudo-objects called /Devices/. There's a device for the Linux Open
Sound System and the Advanced Linux Sound Architecture, there's one
for Direct Sound on Windows, and there's even one for Solaris.
=OpenAL= solves the problem of platform independence by providing all
these Devices.

Wrapper libraries such as LWJGL are free to examine the system on
which they are running and then select an appropriate device for that
system.

There are also a few "special" devices that don't interface with any
particular system. These include the Null Device, which doesn't do
anything, and the Wave Device, which writes whatever sound it receives
to a file, if everything has been set up correctly when configuring
=OpenAL=.

Actual mixing of the sound data happens in the Devices, and they are
the only point in the sound rendering process where this data is
available.

Therefore, in order to support multiple listeners, and to get the
sound data in a form that the AIs can use, it is necessary to create a
new Device which supports these features.

** The Send Device
Adding a device to OpenAL is rather tricky -- there are five separate
files in the =OpenAL= source tree that must be modified to do so. I've
documented this process [[./add-new-device.org][here]] for anyone who is interested.

Onward to the actual Device!

Again, my objectives are:

- Support multiple listeners from jMonkeyEngine3.
- Get access to the rendered sound data for further processing from
  clojure.

** =send.c=

** Header
#+srcname: send-header
#+begin_src C
#include "config.h"
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include "alMain.h"
#include "AL/al.h"
#include "AL/alc.h"
#include "alSource.h"

//////////////////// Summary

struct send_data;
struct context_data;

static void addContext(ALCdevice *, ALCcontext *);
static void syncContexts(ALCcontext *master, ALCcontext *slave);
static void syncSources(ALsource *master, ALsource *slave,
                        ALCcontext *masterCtx, ALCcontext *slaveCtx);

static void syncSourcei(ALuint master, ALuint slave,
                        ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);
static void syncSourcef(ALuint master, ALuint slave,
                        ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);
static void syncSource3f(ALuint master, ALuint slave,
                         ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);

static void swapInContext(ALCdevice *, struct context_data *);
static void saveContext(ALCdevice *, struct context_data *);
static void limitContext(ALCdevice *, ALCcontext *);
static void unLimitContext(ALCdevice *);

static void init(ALCdevice *);
static void renderData(ALCdevice *, int samples);

#define UNUSED(x) (void)(x)
#+end_src

The main idea behind the Send device is to take advantage of the fact
that LWJGL only manages one /context/ when using OpenAL. A /context/
is like a container that holds samples and keeps track of where the
listener is. In order to support multiple listeners, the Send device
identifies the LWJGL context as the master context, and creates any
number of slave contexts to represent additional listeners. Every
time the device renders sound, it synchronizes every source from the
master LWJGL context to the slave contexts. Then, it renders each
context separately, using a different listener for each one. The
rendered sound is made available via JNI to jMonkeyEngine.

To recap, the process is:
- Set the LWJGL context as "master" in the =init()= method.
- Create any number of additional contexts via =addContext()=.
- At every call to =renderData()=, sync the master context with the
  slave contexts via =syncContexts()=.
- =syncContexts()= calls =syncSources()= to sync all the sources
  which are in the master context.
- =limitContext()= and =unLimitContext()= make it possible to render
  only one context at a time.

** Necessary State
#+begin_src C
//////////////////// State

typedef struct context_data {
  ALfloat ClickRemoval[MAXCHANNELS];
  ALfloat PendingClicks[MAXCHANNELS];
  ALvoid *renderBuffer;
  ALCcontext *ctx;
} context_data;

typedef struct send_data {
  ALuint size;
  context_data **contexts;
  ALuint numContexts;
  ALuint maxContexts;
} send_data;
#+end_src

Switching between contexts is not the normal operation of a Device,
and one of the problems with doing so is that a Device normally keeps
around a few pieces of state, such as the =ClickRemoval= array above,
which will become corrupted if the contexts are not rendered in
parallel. The solution is to create a copy of this normally global
device state for each context, and copy it back and forth into and
out of the actual device state whenever a context is rendered.

** Synchronization Macros

#+begin_src C
//////////////////// Context Creation / Synchronization

#define _MAKE_SYNC(NAME, INIT_EXPR, GET_EXPR, SET_EXPR)  \
  void NAME (ALuint sourceID1, ALuint sourceID2,         \
             ALCcontext *ctx1, ALCcontext *ctx2,         \
             ALenum param){                              \
    INIT_EXPR;                                           \
    ALCcontext *current = alcGetCurrentContext();        \
    alcMakeContextCurrent(ctx1);                         \
    GET_EXPR;                                            \
    alcMakeContextCurrent(ctx2);                         \
    SET_EXPR;                                            \
    alcMakeContextCurrent(current);                      \
  }

#define MAKE_SYNC(NAME, TYPE, GET, SET)                  \
  _MAKE_SYNC(NAME,                                       \
             TYPE value,                                 \
             GET(sourceID1, param, &value),              \
             SET(sourceID2, param, value))

#define MAKE_SYNC3(NAME, TYPE, GET, SET)                       \
  _MAKE_SYNC(NAME,                                             \
             TYPE value1; TYPE value2; TYPE value3;,           \
             GET(sourceID1, param, &value1, &value2, &value3), \
             SET(sourceID2, param, value1, value2, value3))

MAKE_SYNC( syncSourcei,  ALint,   alGetSourcei,  alSourcei);
MAKE_SYNC( syncSourcef,  ALfloat, alGetSourcef,  alSourcef);
MAKE_SYNC3(syncSource3i, ALint,   alGetSource3i, alSource3i);
MAKE_SYNC3(syncSource3f, ALfloat, alGetSource3f, alSource3f);

#+end_src

Setting the state of an =OpenAL= source is done with the =alSourcei=,
=alSourcef=, =alSource3i=, and =alSource3f= functions. In order to
completely synchronize two sources, it is necessary to use all of
them. These macros help to condense the otherwise repetitive
synchronization code involving these similar low-level =OpenAL=
functions.

** Source Synchronization
#+begin_src C
void syncSources(ALsource *masterSource, ALsource *slaveSource,
                 ALCcontext *masterCtx, ALCcontext *slaveCtx){
  ALuint master = masterSource->source;
  ALuint slave = slaveSource->source;
  ALCcontext *current = alcGetCurrentContext();

  syncSourcef(master,slave,masterCtx,slaveCtx,AL_PITCH);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_DISTANCE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_ROLLOFF_FACTOR);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_REFERENCE_DISTANCE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MIN_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_INNER_ANGLE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_ANGLE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_SEC_OFFSET);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_SAMPLE_OFFSET);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_BYTE_OFFSET);

  syncSource3f(master,slave,masterCtx,slaveCtx,AL_POSITION);
  syncSource3f(master,slave,masterCtx,slaveCtx,AL_VELOCITY);
  syncSource3f(master,slave,masterCtx,slaveCtx,AL_DIRECTION);

  syncSourcei(master,slave,masterCtx,slaveCtx,AL_SOURCE_RELATIVE);
  syncSourcei(master,slave,masterCtx,slaveCtx,AL_LOOPING);

  alcMakeContextCurrent(masterCtx);
  ALint source_type;
  alGetSourcei(master, AL_SOURCE_TYPE, &source_type);

  // Only static sources are currently synchronized!
  if (AL_STATIC == source_type){
    ALint master_buffer;
    ALint slave_buffer;
    alGetSourcei(master, AL_BUFFER, &master_buffer);
    alcMakeContextCurrent(slaveCtx);
    alGetSourcei(slave, AL_BUFFER, &slave_buffer);
    if (master_buffer != slave_buffer){
      alSourcei(slave, AL_BUFFER, master_buffer);
    }
  }

  // Synchronize the state of the two sources.
  alcMakeContextCurrent(masterCtx);
  ALint masterState;
  ALint slaveState;

  alGetSourcei(master, AL_SOURCE_STATE, &masterState);
  alcMakeContextCurrent(slaveCtx);
  alGetSourcei(slave, AL_SOURCE_STATE, &slaveState);

  if (masterState != slaveState){
    switch (masterState){
    case AL_INITIAL : alSourceRewind(slave); break;
    case AL_PLAYING : alSourcePlay(slave);   break;
    case AL_PAUSED  : alSourcePause(slave);  break;
    case AL_STOPPED : alSourceStop(slave);   break;
    }
  }
  // Restore whatever context was previously active.
  alcMakeContextCurrent(current);
}
#+end_src
This function is long because it has to exhaustively go through all
the possible state that a source can have and make sure that it is the
same between the master and slave sources. I'd like to take this
moment to salute the [[http://connect.creativelabs.com/openal/Documentation/Forms/AllItems.aspx][=OpenAL= Reference Manual]], which provides a very
good description of =OpenAL='s internals.

** Context Synchronization
#+begin_src C
void syncContexts(ALCcontext *master, ALCcontext *slave){
  /* If there aren't sufficient sources in slave to mirror
     the sources in master, create them.
  */
  ALCcontext *current = alcGetCurrentContext();

  UIntMap *masterSourceMap = &(master->SourceMap);
  UIntMap *slaveSourceMap = &(slave->SourceMap);
  ALuint numMasterSources = masterSourceMap->size;
  ALuint numSlaveSources = slaveSourceMap->size;

  alcMakeContextCurrent(slave);
  if (numSlaveSources < numMasterSources){
    ALuint numMissingSources = numMasterSources - numSlaveSources;
    ALuint newSources[numMissingSources];
    alGenSources(numMissingSources, newSources);
  }

  /* Now, slave is guaranteed to have at least as many sources
     as master. Sync each source from master to the corresponding
     source in slave. */
  int i;
  for(i = 0; i < masterSourceMap->size; i++){
    syncSources((ALsource*)masterSourceMap->array[i].value,
                (ALsource*)slaveSourceMap->array[i].value,
                master, slave);
  }
  alcMakeContextCurrent(current);
}
#+end_src

Most of the hard work in context synchronization is done in
=syncSources()=. The only thing that =syncContexts()= has to worry
about is automatically creating new sources whenever a slave context
does not have the same number of sources as the master context.

** Context Creation
#+begin_src C
static void addContext(ALCdevice *Device, ALCcontext *context){
  send_data *data = (send_data*)Device->ExtraData;
  // expand array if necessary
  if (data->numContexts >= data->maxContexts){
    ALuint newMaxContexts = data->maxContexts*2 + 1;
    data->contexts =
      realloc(data->contexts, newMaxContexts*sizeof(context_data*));
    data->maxContexts = newMaxContexts;
  }
  // create context_data and add it to the main array
  context_data *ctxData;
  ctxData = (context_data*)calloc(1, sizeof(*ctxData));
  ctxData->renderBuffer =
    malloc(BytesFromDevFmt(Device->FmtType) *
           Device->NumChan * Device->UpdateSize);
  ctxData->ctx = context;

  data->contexts[data->numContexts] = ctxData;
  data->numContexts++;
}
#+end_src

Here, the slave context is created, and its data is stored in the
device-wide =ExtraData= structure. The =renderBuffer= that is created
here is where the rendered sound samples for this slave context will
eventually go.

** Context Switching
#+begin_src C
//////////////////// Context Switching

/* A device brings along with it two pieces of state
 * which have to be swapped in and out with each context.
 */
static void swapInContext(ALCdevice *Device, context_data *ctxData){
  memcpy(Device->ClickRemoval, ctxData->ClickRemoval,
         sizeof(ALfloat)*MAXCHANNELS);
  memcpy(Device->PendingClicks, ctxData->PendingClicks,
         sizeof(ALfloat)*MAXCHANNELS);
}

static void saveContext(ALCdevice *Device, context_data *ctxData){
  memcpy(ctxData->ClickRemoval, Device->ClickRemoval,
         sizeof(ALfloat)*MAXCHANNELS);
  memcpy(ctxData->PendingClicks, Device->PendingClicks,
         sizeof(ALfloat)*MAXCHANNELS);
}

static ALCcontext **currentContext;
static ALuint currentNumContext;

/* By default, all contexts are rendered at once for each call to
 * aluMixData.  This function uses the internals of the ALCdevice
 * struct to temporarily cause aluMixData to only render the chosen
 * context.
 */
static void limitContext(ALCdevice *Device, ALCcontext *ctx){
  currentContext = Device->Contexts;
  currentNumContext = Device->NumContexts;
  Device->Contexts = &ctx;
  Device->NumContexts = 1;
}

static void unLimitContext(ALCdevice *Device){
  Device->Contexts = currentContext;
  Device->NumContexts = currentNumContext;
}
#+end_src

=OpenAL= normally renders all contexts in parallel, outputting the
whole result to the buffer. It does this by iterating over the
=Device->Contexts= array and rendering each context to the buffer in
turn. By temporarily setting =Device->NumContexts= to 1 and adjusting
the Device's context list to put the desired context-to-be-rendered
into position 0, we can trick =OpenAL= into rendering each slave
context separately from all the others.

** Main Device Loop
#+begin_src C
//////////////////// Main Device Loop

/* Establish the LWJGL context as the master context, which will
 * be synchronized to all the slave contexts.
 */
static void init(ALCdevice *Device){
  ALCcontext *masterContext = alcGetCurrentContext();
  addContext(Device, masterContext);
}


static void renderData(ALCdevice *Device, int samples){
  if(!Device->Connected){return;}
  send_data *data = (send_data*)Device->ExtraData;
  ALCcontext *current = alcGetCurrentContext();

  ALuint i;
  for (i = 1; i < data->numContexts; i++){
    syncContexts(data->contexts[0]->ctx, data->contexts[i]->ctx);
  }

  if ((ALuint) samples > Device->UpdateSize){
    printf("exceeding internal buffer size; dropping samples\n");
    printf("requested %d; available %d\n", samples, Device->UpdateSize);
    samples = (int) Device->UpdateSize;
  }

  for (i = 0; i < data->numContexts; i++){
    context_data *ctxData = data->contexts[i];
    ALCcontext *ctx = ctxData->ctx;
    alcMakeContextCurrent(ctx);
    limitContext(Device, ctx);
    swapInContext(Device, ctxData);
    aluMixData(Device, ctxData->renderBuffer, samples);
    saveContext(Device, ctxData);
    unLimitContext(Device);
  }
  alcMakeContextCurrent(current);
}
#+end_src

The main loop synchronizes the master LWJGL context with all the
slave contexts, then walks through each context, rendering just that
context to its audio-sample storage buffer.

** JNI Methods
#+begin_src C
//////////////////// JNI Methods

#include "com_aurellem_send_AudioSend.h"

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nstep
 * Signature: (JI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nstep
(JNIEnv *env, jclass clazz, jlong device, jint samples){
  UNUSED(env);UNUSED(clazz);
  renderData((ALCdevice*)((intptr_t)device), samples);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ngetSamples
 * Signature: (JLjava/nio/ByteBuffer;III)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_ngetSamples
(JNIEnv *env, jclass clazz, jlong device, jobject buffer, jint position,
 jint samples, jint n){
  UNUSED(clazz);

  ALvoid *buffer_address =
    ((ALbyte *)(((char*)(*env)->GetDirectBufferAddress(env, buffer)) + position));
  ALCdevice *recorder = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)recorder->ExtraData;
  if ((ALuint)n >= data->numContexts){return;}
  memcpy(buffer_address, data->contexts[n]->renderBuffer,
         BytesFromDevFmt(recorder->FmtType) * recorder->NumChan * samples);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    naddListener
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_naddListener
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(env); UNUSED(clazz);
  //printf("creating new context via naddListener\n");
  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  ALCcontext *new = alcCreateContext(Device, NULL);
  addContext(Device, new);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nsetNthListener3f
 * Signature: (IFFFJI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nsetNthListener3f
(JNIEnv *env, jclass clazz, jint param,
 jfloat v1, jfloat v2, jfloat v3, jlong device, jint contextNum){
  UNUSED(env);UNUSED(clazz);

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)Device->ExtraData;

  ALCcontext *current = alcGetCurrentContext();
  if ((ALuint)contextNum >= data->numContexts){return;}
  alcMakeContextCurrent(data->contexts[contextNum]->ctx);
  alListener3f(param, v1, v2, v3);
  alcMakeContextCurrent(current);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nsetNthListenerf
 * Signature: (IFJI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nsetNthListenerf
(JNIEnv *env, jclass clazz, jint param, jfloat v1, jlong device,
 jint contextNum){
  UNUSED(env);UNUSED(clazz);

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)Device->ExtraData;

  ALCcontext *current = alcGetCurrentContext();
  if ((ALuint)contextNum >= data->numContexts){return;}
  alcMakeContextCurrent(data->contexts[contextNum]->ctx);
  alListenerf(param, v1);
  alcMakeContextCurrent(current);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ninitDevice
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_ninitDevice
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(env);UNUSED(clazz);

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  init(Device);
}


/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ngetAudioFormat
 * Signature: (J)Ljavax/sound/sampled/AudioFormat;
 */
JNIEXPORT jobject JNICALL Java_com_aurellem_send_AudioSend_ngetAudioFormat
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(clazz);
  jclass AudioFormatClass =
    (*env)->FindClass(env, "javax/sound/sampled/AudioFormat");
  jmethodID AudioFormatConstructor =
    (*env)->GetMethodID(env, AudioFormatClass, "<init>", "(FIIZZ)V");

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);

  int isSigned;
  switch (Device->FmtType)
    {
    case DevFmtUByte:
    case DevFmtUShort: isSigned = 0; break;
    default           : isSigned = 1;
    }
  float frequency = Device->Frequency;
  int bitsPerFrame = (8 * BytesFromDevFmt(Device->FmtType));
  int channels = Device->NumChan;

  //printf("freq = %f, bpf = %d, channels = %d, signed? = %d\n",
  //       frequency, bitsPerFrame, channels, isSigned);

  jobject format = (*env)->
    NewObject(
              env, AudioFormatClass, AudioFormatConstructor,
              frequency,
              bitsPerFrame,
              channels,
              isSigned,
              0);
  return format;
}

//////////////////// Device Initialization / Management

static const ALCchar sendDevice[] = "Multiple Audio Send";

static ALCboolean send_open_playback(ALCdevice *device,
                                     const ALCchar *deviceName)
{
  send_data *data;
  // stop any buffering for stdout, so that I can
  // see the printf statements in my terminal immediately
  setbuf(stdout, NULL);

  if(!deviceName)
    deviceName = sendDevice;
  else if(strcmp(deviceName, sendDevice) != 0)
    return ALC_FALSE;
  data = (send_data*)calloc(1, sizeof(*data));
  device->szDeviceName = strdup(deviceName);
  device->ExtraData = data;
  return ALC_TRUE;
}

static void send_close_playback(ALCdevice
                                *device)
{
  send_data *data = (send_data*)device->ExtraData;
  alcMakeContextCurrent(NULL);
  ALuint i;
  // Destroy all slave contexts. LWJGL will take care of
  // its own context.
  for (i = 1; i < data->numContexts; i++){
    context_data *ctxData = data->contexts[i];
    alcDestroyContext(ctxData->ctx);
    free(ctxData->renderBuffer);
    free(ctxData);
  }
  free(data);
  device->ExtraData = NULL;
}

static ALCboolean send_reset_playback(ALCdevice *device)
{
  SetDefaultWFXChannelOrder(device);
  return ALC_TRUE;
}

static void send_stop_playback(ALCdevice *Device){
  UNUSED(Device);
}

static const BackendFuncs send_funcs = {
  send_open_playback,
  send_close_playback,
  send_reset_playback,
  send_stop_playback,
  NULL,
  NULL, /* These would be filled with functions to     */
  NULL, /* handle capturing audio if we were into that */
  NULL, /* sort of thing...
        */
  NULL,
  NULL
};

ALCboolean alc_send_init(BackendFuncs *func_list){
  *func_list = send_funcs;
  return ALC_TRUE;
}

void alc_send_deinit(void){}

void alc_send_probe(enum DevProbe type)
{
  switch(type)
    {
    case DEVICE_PROBE:
      AppendDeviceList(sendDevice);
      break;
    case ALL_DEVICE_PROBE:
      AppendAllDeviceList(sendDevice);
      break;
    case CAPTURE_DEVICE_PROBE:
      break;
    }
}
#+end_src


#+srcname: ears
#+begin_src clojure
(ns cortex.hearing)
(use 'cortex.world)
(use 'cortex.import)
(use 'clojure.contrib.def)
(cortex.import/mega-import-jme3)
(rlm.rlm-commands/help)
(import java.nio.ByteBuffer)
(import java.awt.image.BufferedImage)
(import java.awt.Color)
(import java.awt.Dimension)
(import java.awt.Graphics)
(import java.awt.Graphics2D)
(import java.awt.event.WindowAdapter)
(import java.awt.event.WindowEvent)
(import javax.swing.JFrame)
(import javax.swing.JPanel)
(import javax.swing.SwingUtilities)
(import javax.swing.ImageIcon)
(import javax.swing.JOptionPane)
(import java.awt.image.ImageObserver)

(import 'com.jme3.capture.SoundProcessor)


(defn sound-processor
  "Deals with converting ByteBuffers into arrays of bytes so that the
  continuation functions can be defined in terms of immutable stuff."
  [continuation]
  (proxy [SoundProcessor] []
    (cleanup [])
    (process
      [#^ByteBuffer audioSamples numSamples]
      (no-exceptions
       (let [byte-array (byte-array numSamples)]
         (.get audioSamples byte-array 0 numSamples)
         (continuation
          (vec byte-array)))))))

(defn add-ear
  "Add an ear to the world. The continuation function will be called
  on the FFT of the sounds which the ear hears in the given
  timeframe. Sound is 3D."
  [world listener continuation]
  (let [renderer (.getAudioRenderer world)]
    (.addListener renderer listener)
    (.registerSoundProcessor renderer listener
                             (sound-processor continuation))
    listener))
#+end_src


#+srcname: test-hearing
#+begin_src clojure :results silent
(ns test.hearing)
(use 'cortex.world)
(use 'cortex.import)
(use 'clojure.contrib.def)
(use 'body.ear)
(cortex.import/mega-import-jme3)
(rlm.rlm-commands/help)

(defn setup-fn [world]
  (let [listener (Listener.)]
    (add-ear world listener #(println (nth % 0)))))

(defn play-sound [node world value]
  (if (not value)
    (do
      (.playSource (.getAudioRenderer world) node))))

(defn test-world []
  (let [node1 (AudioNode. (asset-manager) "Sounds/pure.wav" false false)]
    (world
     (Node.)
     {"key-space" (partial play-sound node1)}
     setup-fn
     no-op)))
#+end_src


* Example

* COMMENT Code Generation

#+begin_src clojure :tangle ../../cortex/src/cortex/hearing.clj
<<ears>>
#+end_src

#+begin_src clojure :tangle ../../cortex/src/test/hearing.clj
<<test-hearing>>
#+end_src

#+begin_src C :tangle ../Alc/backends/send.c
<<send-header>>
#+end_src