Mercurial > audio-send
view =org/ear.org= @ 20:e8ae40c9848c
"fixed 1,000,000 spelling errors"

| author   | Robert McIntyre <rlm@mit.edu>   |
| date     | Thu, 03 Nov 2011 15:02:18 -0700 |
| parents  | 22ac5a0367cd                    |
| children | 0ee04505a37f                    |
#+title: Simulated Sense of Hearing
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulating multiple listeners and the sense of hearing in jMonkeyEngine3
#+keywords: simulated hearing, openal, clojure, jMonkeyEngine3, LWJGL, AI
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
#+BABEL: :exports both :noweb yes :cache no :mkdirp yes

* Hearing

I want to be able to place ears in a similar manner to how I place
the eyes. I want to be able to place ears in a unique spatial
position, and receive as output at every tick the F.F.T. of whatever
signals are happening at that point.

Hearing is one of the more difficult senses to simulate, because there
is less support for obtaining the actual sound data that is processed
by jMonkeyEngine3.

jMonkeyEngine's sound system works as follows:

- jMonkeyEngine uses the =AppSettings= for the particular application
  to determine what sort of =AudioRenderer= should be used.
- Although some support is provided for multiple audio rendering
  backends, jMonkeyEngine at the time of this writing will either
  pick no =AudioRenderer= at all, or the =LwjglAudioRenderer=.
- jMonkeyEngine tries to figure out what sort of system you're
  running and extracts the appropriate native libraries.
- The =LwjglAudioRenderer= uses the [[http://lwjgl.org/][=LWJGL=]] (LightWeight Java Game
  Library) bindings to interface with a C library called [[http://kcat.strangesoft.net/openal.html][=OpenAL=]].
- =OpenAL= calculates the 3D sound localization and feeds a stream of
  sound to any of various sound output devices with which it knows
  how to communicate.

A consequence of this is that there's no way to access the actual
sound data produced by =OpenAL=.
Even worse, =OpenAL= only supports
one /listener/, which normally isn't a problem for games, but becomes
a problem when trying to make multiple AI creatures that can each hear
the world from a different perspective.

To make many AI creatures in jMonkeyEngine that can each hear the
world from their own perspective, it is necessary to go all the way
back to =OpenAL= and implement support for simulated hearing there.

* Extending =OpenAL=
** =OpenAL= Devices

=OpenAL= goes to great lengths to support many different systems, all
with different sound capabilities and interfaces. It accomplishes this
difficult task by providing code for many different sound backends in
pseudo-objects called /Devices/. There's a device for the Linux Open
Sound System and the Advanced Linux Sound Architecture, there's one
for Direct Sound on Windows, there's even one for Solaris. =OpenAL=
solves the problem of platform independence by providing all these
Devices.

Wrapper libraries such as LWJGL are free to examine the system on
which they are running and then select an appropriate device for that
system.

There are also a few "special" devices that don't interface with any
particular system. These include the Null Device, which doesn't do
anything, and the Wave Device, which writes whatever sound it receives
to a file, if everything has been set up correctly when configuring
=OpenAL=.

Actual mixing of the sound data happens in the Devices, and they are
the only point in the sound rendering process where this data is
available.

Therefore, in order to support multiple listeners, and get the sound
data in a form that the AIs can use, it is necessary to create a new
Device which supports these features.

** The Send Device
Adding a device to OpenAL is rather tricky -- there are five separate
files in the =OpenAL= source tree that must be modified to do so.
I've
documented this process [[./add-new-device.org][here]] for anyone who is interested.

Onward to the actual Device!

Again, my objectives are:

- Support multiple listeners from jMonkeyEngine3.
- Get access to the rendered sound data for further processing from
  clojure.

** =send.c=

** Header
#+srcname: send-header
#+begin_src C
#include "config.h"
#include <stdlib.h>
#include "alMain.h"
#include "AL/al.h"
#include "AL/alc.h"
#include "alSource.h"
#include <jni.h>

//////////////////// Summary

struct send_data;
struct context_data;

static void addContext(ALCdevice *, ALCcontext *);
static void syncContexts(ALCcontext *master, ALCcontext *slave);
static void syncSources(ALsource *master, ALsource *slave,
                        ALCcontext *masterCtx, ALCcontext *slaveCtx);

static void syncSourcei(ALuint master, ALuint slave,
                        ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);
static void syncSourcef(ALuint master, ALuint slave,
                        ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);
static void syncSource3f(ALuint master, ALuint slave,
                         ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);

static void swapInContext(ALCdevice *, struct context_data *);
static void saveContext(ALCdevice *, struct context_data *);
static void limitContext(ALCdevice *, ALCcontext *);
static void unLimitContext(ALCdevice *);

static void init(ALCdevice *);
static void renderData(ALCdevice *, int samples);

#define UNUSED(x) (void)(x)
#+end_src

The main idea behind the Send device is to take advantage of the fact
that LWJGL only manages one /context/ when using OpenAL. A /context/
is like a container that holds samples and keeps track of where the
listener is. In order to support multiple listeners, the Send device
identifies the LWJGL context as the master context, and creates any
number of slave contexts to represent additional listeners.
Every
time the device renders sound, it synchronizes every source from the
master LWJGL context to the slave contexts. Then, it renders each
context separately, using a different listener for each one. The
rendered sound is made available via JNI to jMonkeyEngine.

To recap, the process is:
- Set the LWJGL context as "master" in the =init()= method.
- Create any number of additional contexts via =addContext()=.
- At every call to =renderData()=, sync the master context with the
  slave contexts with =syncContexts()=.
- =syncContexts()= calls =syncSources()= to sync all the sources
  which are in the master context.
- =limitContext()= and =unLimitContext()= make it possible to render
  only one context at a time.

** Necessary State
#+begin_src C
//////////////////// State

typedef struct context_data {
  ALfloat ClickRemoval[MAXCHANNELS];
  ALfloat PendingClicks[MAXCHANNELS];
  ALvoid *renderBuffer;
  ALCcontext *ctx;
} context_data;

typedef struct send_data {
  ALuint size;
  context_data **contexts;
  ALuint numContexts;
  ALuint maxContexts;
} send_data;
#+end_src

Switching between contexts is not the normal operation of a Device,
and one of the problems with doing so is that a Device normally keeps
around a few pieces of state, such as the =ClickRemoval= array above,
which will become corrupted if the contexts are not rendered in
parallel.
The solution is to create a copy of this normally global
device state for each context, and copy it back and forth into and out
of the actual device state whenever a context is rendered.

** Synchronization Macros

#+begin_src C
//////////////////// Context Creation / Synchronization

#define _MAKE_SYNC(NAME, INIT_EXPR, GET_EXPR, SET_EXPR)  \
  void NAME (ALuint sourceID1, ALuint sourceID2,         \
             ALCcontext *ctx1, ALCcontext *ctx2,         \
             ALenum param){                              \
    INIT_EXPR;                                           \
    ALCcontext *current = alcGetCurrentContext();        \
    alcMakeContextCurrent(ctx1);                         \
    GET_EXPR;                                            \
    alcMakeContextCurrent(ctx2);                         \
    SET_EXPR;                                            \
    alcMakeContextCurrent(current);                      \
  }

#define MAKE_SYNC(NAME, TYPE, GET, SET)                  \
  _MAKE_SYNC(NAME,                                       \
             TYPE value,                                 \
             GET(sourceID1, param, &value),              \
             SET(sourceID2, param, value))

#define MAKE_SYNC3(NAME, TYPE, GET, SET)                          \
  _MAKE_SYNC(NAME,                                                \
             TYPE value1; TYPE value2; TYPE value3;,              \
             GET(sourceID1, param, &value1, &value2, &value3),    \
             SET(sourceID2, param, value1, value2, value3))

MAKE_SYNC( syncSourcei,  ALint,   alGetSourcei,  alSourcei);
MAKE_SYNC( syncSourcef,  ALfloat, alGetSourcef,  alSourcef);
MAKE_SYNC3(syncSource3i, ALint,   alGetSource3i, alSource3i);
MAKE_SYNC3(syncSource3f, ALfloat, alGetSource3f, alSource3f);
#+end_src

Setting the state of an =OpenAL= source is done with the =alSourcei=,
=alSourcef=, =alSource3i=, and =alSource3f= functions. In order to
completely synchronize two sources, it is necessary to use all of
them.
These macros help to condense the otherwise repetitive
synchronization code involving these similar low-level =OpenAL= functions.

** Source Synchronization
#+begin_src C
void syncSources(ALsource *masterSource, ALsource *slaveSource,
                 ALCcontext *masterCtx, ALCcontext *slaveCtx){
  ALuint master = masterSource->source;
  ALuint slave = slaveSource->source;
  ALCcontext *current = alcGetCurrentContext();

  syncSourcef(master,slave,masterCtx,slaveCtx,AL_PITCH);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_DISTANCE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_ROLLOFF_FACTOR);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_REFERENCE_DISTANCE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MIN_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_INNER_ANGLE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_ANGLE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_SEC_OFFSET);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_SAMPLE_OFFSET);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_BYTE_OFFSET);

  syncSource3f(master,slave,masterCtx,slaveCtx,AL_POSITION);
  syncSource3f(master,slave,masterCtx,slaveCtx,AL_VELOCITY);
  syncSource3f(master,slave,masterCtx,slaveCtx,AL_DIRECTION);

  syncSourcei(master,slave,masterCtx,slaveCtx,AL_SOURCE_RELATIVE);
  syncSourcei(master,slave,masterCtx,slaveCtx,AL_LOOPING);

  alcMakeContextCurrent(masterCtx);
  ALint source_type;
  alGetSourcei(master, AL_SOURCE_TYPE, &source_type);

  // Only static sources are currently synchronized!
  if (AL_STATIC == source_type){
    ALint master_buffer;
    ALint slave_buffer;
    alGetSourcei(master, AL_BUFFER, &master_buffer);
    alcMakeContextCurrent(slaveCtx);
    alGetSourcei(slave, AL_BUFFER, &slave_buffer);
    if (master_buffer != slave_buffer){
      alSourcei(slave, AL_BUFFER, master_buffer);
    }
  }

  // Synchronize the state of the two sources.
  alcMakeContextCurrent(masterCtx);
  ALint masterState;
  ALint slaveState;

  alGetSourcei(master, AL_SOURCE_STATE, &masterState);
  alcMakeContextCurrent(slaveCtx);
  alGetSourcei(slave, AL_SOURCE_STATE, &slaveState);

  if (masterState != slaveState){
    switch (masterState){
    case AL_INITIAL : alSourceRewind(slave); break;
    case AL_PLAYING : alSourcePlay(slave);   break;
    case AL_PAUSED  : alSourcePause(slave);  break;
    case AL_STOPPED : alSourceStop(slave);   break;
    }
  }
  // Restore whatever context was previously active.
  alcMakeContextCurrent(current);
}
#+end_src

This function is long because it has to exhaustively go through all the
possible state that a source can have and make sure that it is the
same between the master and slave sources. I'd like to take this
moment to salute the [[http://connect.creativelabs.com/openal/Documentation/Forms/AllItems.aspx][=OpenAL= Reference Manual]], which provides a very
good description of =OpenAL='s internals.

** Context Synchronization
#+begin_src C
void syncContexts(ALCcontext *master, ALCcontext *slave){
  /* If there aren't sufficient sources in slave to mirror
     the sources in master, create them. */
  ALCcontext *current = alcGetCurrentContext();

  UIntMap *masterSourceMap = &(master->SourceMap);
  UIntMap *slaveSourceMap = &(slave->SourceMap);
  ALuint numMasterSources = masterSourceMap->size;
  ALuint numSlaveSources = slaveSourceMap->size;

  alcMakeContextCurrent(slave);
  if (numSlaveSources < numMasterSources){
    ALuint numMissingSources = numMasterSources - numSlaveSources;
    ALuint newSources[numMissingSources];
    alGenSources(numMissingSources, newSources);
  }

  /* Now, slave is guaranteed to have at least as many sources
     as master. Sync each source from master to the corresponding
     source in slave.
  */
  int i;
  for(i = 0; i < masterSourceMap->size; i++){
    syncSources((ALsource*)masterSourceMap->array[i].value,
                (ALsource*)slaveSourceMap->array[i].value,
                master, slave);
  }
  alcMakeContextCurrent(current);
}
#+end_src

Most of the hard work in Context Synchronization is done in
=syncSources()=. The only thing that =syncContexts()= has to worry
about is automatically creating new sources whenever a slave context
does not have the same number of sources as the master context.

** Context Creation
#+begin_src C
static void addContext(ALCdevice *Device, ALCcontext *context){
  send_data *data = (send_data*)Device->ExtraData;
  // expand the array of contexts if necessary
  if (data->numContexts >= data->maxContexts){
    ALuint newMaxContexts = data->maxContexts*2 + 1;
    // the array holds pointers, so grow it in units of context_data*
    data->contexts =
      realloc(data->contexts, newMaxContexts*sizeof(context_data*));
    data->maxContexts = newMaxContexts;
  }
  // create a context_data and add it to the main array
  context_data *ctxData;
  ctxData = (context_data*)calloc(1, sizeof(*ctxData));
  ctxData->renderBuffer =
    malloc(BytesFromDevFmt(Device->FmtType) *
           Device->NumChan * Device->UpdateSize);
  ctxData->ctx = context;

  data->contexts[data->numContexts] = ctxData;
  data->numContexts++;
}
#+end_src

Here, the data for the slave context is created and stored in the
device-wide =ExtraData= structure.
The =renderBuffer= that is created
here is where the rendered sound samples for this slave context will
eventually go.

** Context Switching
#+begin_src C
//////////////////// Context Switching

/* A device brings along with it two pieces of state
 * which have to be swapped in and out with each context.
 */
static void swapInContext(ALCdevice *Device, context_data *ctxData){
  memcpy(Device->ClickRemoval, ctxData->ClickRemoval, sizeof(ALfloat)*MAXCHANNELS);
  memcpy(Device->PendingClicks, ctxData->PendingClicks, sizeof(ALfloat)*MAXCHANNELS);
}

static void saveContext(ALCdevice *Device, context_data *ctxData){
  memcpy(ctxData->ClickRemoval, Device->ClickRemoval, sizeof(ALfloat)*MAXCHANNELS);
  memcpy(ctxData->PendingClicks, Device->PendingClicks, sizeof(ALfloat)*MAXCHANNELS);
}

static ALCcontext **currentContext;
static ALuint currentNumContext;
static ALCcontext *limitedContext;

/* By default, all contexts are rendered at once for each call to aluMixData.
 * This function uses the internals of the ALCdevice struct to temporarily
 * cause aluMixData to only render the chosen context.
 */
static void limitContext(ALCdevice *Device, ALCcontext *ctx){
  currentContext = Device->Contexts;
  currentNumContext = Device->NumContexts;
  // Stash ctx in a static so that Device->Contexts points at storage
  // which outlives this call; &ctx itself would dangle on return.
  limitedContext = ctx;
  Device->Contexts = &limitedContext;
  Device->NumContexts = 1;
}

static void unLimitContext(ALCdevice *Device){
  Device->Contexts = currentContext;
  Device->NumContexts = currentNumContext;
}
#+end_src

=OpenAL= normally renders all Contexts in parallel, outputting the
whole result to the buffer. It does this by iterating over the
Device->Contexts array and rendering each context to the buffer in
turn.
By temporarily setting =Device->NumContexts= to 1 and adjusting
the Device's context list to put the desired context-to-be-rendered
into position 0, we can trick =OpenAL= into rendering each slave
context separately from all the others.

** Main Device Loop
#+begin_src C
//////////////////// Main Device Loop

/* Establish the LWJGL context as the master context, which will
 * be synchronized to all the slave contexts.
 */
static void init(ALCdevice *Device){
  ALCcontext *masterContext = alcGetCurrentContext();
  addContext(Device, masterContext);
}

static void renderData(ALCdevice *Device, int samples){
  if(!Device->Connected){return;}
  send_data *data = (send_data*)Device->ExtraData;
  ALCcontext *current = alcGetCurrentContext();

  ALuint i;
  for (i = 1; i < data->numContexts; i++){
    syncContexts(data->contexts[0]->ctx, data->contexts[i]->ctx);
  }

  if ((ALuint)samples > Device->UpdateSize){
    printf("exceeding internal buffer size; dropping samples\n");
    printf("requested %d; available %d\n", samples, (int)Device->UpdateSize);
    samples = (int)Device->UpdateSize;
  }

  for (i = 0; i < data->numContexts; i++){
    context_data *ctxData = data->contexts[i];
    ALCcontext *ctx = ctxData->ctx;
    alcMakeContextCurrent(ctx);
    limitContext(Device, ctx);
    swapInContext(Device, ctxData);
    aluMixData(Device, ctxData->renderBuffer, samples);
    saveContext(Device, ctxData);
    unLimitContext(Device);
  }
  alcMakeContextCurrent(current);
}
#+end_src

The main loop synchronizes the master LWJGL context with all the slave
contexts, then walks each context, rendering just that context to its
audio-sample storage buffer.

** JNI Methods

At this point, we have the ability to create multiple listeners by
using the master/slave context trick, and the rendered audio data is
waiting patiently in internal buffers, one for each listener.
We need
a way to transport this information to Java, and also a way to drive
this device from Java. The following JNI interface code is inspired
by the way LWJGL interfaces with =OpenAL=.

*** step
#+begin_src C
//////////////////// JNI Methods

#include "com_aurellem_send_AudioSend.h"

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nstep
 * Signature: (JI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nstep
(JNIEnv *env, jclass clazz, jlong device, jint samples){
  UNUSED(env);UNUSED(clazz);
  renderData((ALCdevice*)((intptr_t)device), samples);
}
#+end_src

This device, unlike most of the other devices in =OpenAL=, does not
render sound unless asked. This enables the system to slow down or
speed up depending on the needs of the AIs who are using it to
listen. If the device tried to render samples in real-time, a
complicated AI whose mind takes 100 seconds of computer time to
simulate 1 second of AI-time would miss almost all of the sound in
its environment.

*** getSamples
#+begin_src C
/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ngetSamples
 * Signature: (JLjava/nio/ByteBuffer;III)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_ngetSamples
(JNIEnv *env, jclass clazz, jlong device, jobject buffer, jint position,
 jint samples, jint n){
  UNUSED(clazz);

  ALvoid *buffer_address =
    ((ALbyte *)(((char*)(*env)->GetDirectBufferAddress(env, buffer)) + position));
  ALCdevice *recorder = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)recorder->ExtraData;
  // n is an index, so n == numContexts is also out of bounds
  if ((ALuint)n >= data->numContexts){return;}
  memcpy(buffer_address, data->contexts[n]->renderBuffer,
         BytesFromDevFmt(recorder->FmtType) * recorder->NumChan * samples);
}
#+end_src

This is the transport layer between C and Java that will eventually
allow us to access rendered sound data from clojure.

*** Listener Management
=addListener=, =setNthListenerf=, and =setNthListener3f= are
necessary to change the properties of any listener other than the
master one, since only the listener of the currently active context is
affected by the normal =OpenAL= listener calls.

#+begin_src C
/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    naddListener
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_naddListener
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(env); UNUSED(clazz);
  //printf("creating new context via naddListener\n");
  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  ALCcontext *new = alcCreateContext(Device, NULL);
  addContext(Device, new);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nsetNthListener3f
 * Signature: (IFFFJI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nsetNthListener3f
(JNIEnv *env, jclass clazz, jint param,
 jfloat v1, jfloat v2, jfloat v3, jlong device, jint contextNum){
  UNUSED(env);UNUSED(clazz);

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)Device->ExtraData;

  ALCcontext *current = alcGetCurrentContext();
  if ((ALuint)contextNum >= data->numContexts){return;}
  alcMakeContextCurrent(data->contexts[contextNum]->ctx);
  alListener3f(param, v1, v2, v3);
  alcMakeContextCurrent(current);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nsetNthListenerf
 * Signature: (IFJI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nsetNthListenerf
(JNIEnv *env, jclass clazz, jint param, jfloat v1, jlong device,
 jint contextNum){
  UNUSED(env);UNUSED(clazz);

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)Device->ExtraData;

  ALCcontext *current = alcGetCurrentContext();
  if ((ALuint)contextNum >= data->numContexts){return;}
  alcMakeContextCurrent(data->contexts[contextNum]->ctx);
  alListenerf(param, v1);
  alcMakeContextCurrent(current);
}
#+end_src

*** Initialization
=initDevice= is called from the Java side after LWJGL has created its
context, and before any calls to =addListener=. It establishes the
LWJGL context as the master context.

=getAudioFormat= is a convenience function that uses JNI to build up a
=javax.sound.sampled.AudioFormat= object from data in the Device. This
way, there is no ambiguity about what the bits created by =step= and
returned by =getSamples= mean.

#+begin_src C
/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ninitDevice
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_ninitDevice
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(env);UNUSED(clazz);
  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  init(Device);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ngetAudioFormat
 * Signature: (J)Ljavax/sound/sampled/AudioFormat;
 */
JNIEXPORT jobject JNICALL Java_com_aurellem_send_AudioSend_ngetAudioFormat
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(clazz);
  jclass AudioFormatClass =
    (*env)->FindClass(env, "javax/sound/sampled/AudioFormat");
  jmethodID AudioFormatConstructor =
    (*env)->GetMethodID(env, AudioFormatClass, "<init>", "(FIIZZ)V");

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  int isSigned;
  switch (Device->FmtType){
  case DevFmtUByte:
  case DevFmtUShort: isSigned = 0; break;
  default:           isSigned = 1;
  }
  float frequency = Device->Frequency;
  int bitsPerFrame = (8 * BytesFromDevFmt(Device->FmtType));
  int channels = Device->NumChan;
  jobject format = (*env)->
    NewObject(
      env, AudioFormatClass, AudioFormatConstructor,
      frequency,
      bitsPerFrame,
      channels,
      isSigned,
      0);
  return format;
}
#+end_src

*** Boring Device management stuff
This code is more-or-less copied verbatim from the other =OpenAL=
backends.
It is the basis for =OpenAL='s primitive object system.

#+begin_src C
//////////////////// Device Initialization / Management

static const ALCchar sendDevice[] = "Multiple Audio Send";

static ALCboolean send_open_playback(ALCdevice *device,
                                     const ALCchar *deviceName)
{
  send_data *data;
  // stop any buffering for stdout, so that I can
  // see the printf statements in my terminal immediately
  setbuf(stdout, NULL);

  if(!deviceName)
    deviceName = sendDevice;
  else if(strcmp(deviceName, sendDevice) != 0)
    return ALC_FALSE;
  data = (send_data*)calloc(1, sizeof(*data));
  device->szDeviceName = strdup(deviceName);
  device->ExtraData = data;
  return ALC_TRUE;
}

static void send_close_playback(ALCdevice *device)
{
  send_data *data = (send_data*)device->ExtraData;
  alcMakeContextCurrent(NULL);
  ALuint i;
  // Destroy all slave contexts. LWJGL will take care of
  // its own context.
  for (i = 1; i < data->numContexts; i++){
    context_data *ctxData = data->contexts[i];
    alcDestroyContext(ctxData->ctx);
    free(ctxData->renderBuffer);
    free(ctxData);
  }
  free(data);
  device->ExtraData = NULL;
}

static ALCboolean send_reset_playback(ALCdevice *device)
{
  SetDefaultWFXChannelOrder(device);
  return ALC_TRUE;
}

static void send_stop_playback(ALCdevice *Device){
  UNUSED(Device);
}

static const BackendFuncs send_funcs = {
  send_open_playback,
  send_close_playback,
  send_reset_playback,
  send_stop_playback,
  NULL,
  NULL, /* These would be filled with functions to */
  NULL, /* handle capturing audio if we were into that */
  NULL, /* sort of thing...
        */
  NULL,
  NULL
};

ALCboolean alc_send_init(BackendFuncs *func_list){
  *func_list = send_funcs;
  return ALC_TRUE;
}

void alc_send_deinit(void){}

void alc_send_probe(enum DevProbe type)
{
  switch(type){
  case DEVICE_PROBE:
    AppendDeviceList(sendDevice);
    break;
  case ALL_DEVICE_PROBE:
    AppendAllDeviceList(sendDevice);
    break;
  case CAPTURE_DEVICE_PROBE:
    break;
  }
}
#+end_src

* The Java interface, =AudioSend=

The Java interface to the Send Device follows naturally from the JNI
definitions. It is included here for completeness. The only thing here
of note is the =deviceID=. This is available from LWJGL, but the only
way to get it is via reflection. Unfortunately, there is no other way
to control the Send device than to obtain a pointer to it.

#+include: "../java/src/com/aurellem/send/AudioSend.java" src java :exports code

* Finally, Ears in clojure!

Now that the infrastructure is complete (modulo a few patches to
jMonkeyEngine3 to support accessing this modified version of =OpenAL=
that are not worth discussing), the clojure ear abstraction is rather
simple. Just as there were =SceneProcessors= for vision, there are
now =SoundProcessors= for hearing.

#+include: "../../jmeCapture/src/com/aurellem/capture/audio/SoundProcessor.java" src java

#+srcname: ears
#+begin_src clojure
(ns cortex.hearing
  "Simulate the sense of hearing in jMonkeyEngine3. Enables multiple
  listeners at different positions in the same world.
  Passes vectors
  of floats in the range [-1.0 -- 1.0] in PCM format to any arbitrary
  function."
  {:author "Robert McIntyre"}
  (:use (cortex world util))
  (:import java.nio.ByteBuffer)
  (:import org.tritonus.share.sampled.FloatSampleTools)
  (:import com.aurellem.capture.audio.SoundProcessor)
  (:import javax.sound.sampled.AudioFormat))

(defn sound-processor
  "Deals with converting ByteBuffers into Vectors of floats so that
  the continuation functions can be defined in terms of immutable
  stuff."
  [continuation]
  (proxy [SoundProcessor] []
    (cleanup [])
    (process
      [#^ByteBuffer audioSamples numSamples #^AudioFormat audioFormat]
      (let [bytes (byte-array numSamples)
            floats (float-array numSamples)]
        (.get audioSamples bytes 0 numSamples)
        (FloatSampleTools/byte2floatInterleaved
         bytes 0 floats 0
         (/ numSamples (.getFrameSize audioFormat)) audioFormat)
        (continuation
         (vec floats))))))

(defn add-ear
  "Add an ear to the world. The continuation function will be called
  on the FFT of the sounds which the ear hears in the given
  timeframe. Sound is 3D."
  [world listener continuation]
  (let [renderer (.getAudioRenderer world)]
    (.addListener renderer listener)
    (.registerSoundProcessor renderer listener
                             (sound-processor continuation))
    listener))
#+end_src

* Example

#+srcname: test-hearing
#+begin_src clojure :results silent
(ns test.hearing
  (:use (cortex world util hearing))
  (:import (com.jme3.audio AudioNode Listener))
  (:import com.jme3.scene.Node))

(defn setup-fn [world]
  (let [listener (Listener.)]
    (add-ear world listener #(println-repl (nth % 0)))))

(defn play-sound [node world value]
  (if (not value)
    (do
      (.playSource (.getAudioRenderer world) node))))

(defn test-basic-hearing []
  (.start
   (let [node1 (AudioNode.
                (asset-manager) "Sounds/pure.wav" false false)]
     (world
      (Node.)
      {"key-space" (partial play-sound node1)}
      setup-fn
      no-op))))
#+end_src

This extremely basic program prints out the first sample it encounters
at every timestamp. You can see the rendered sound being printed at
the REPL.

* COMMENT Code Generation

#+begin_src clojure :tangle ../../cortex/src/cortex/hearing.clj
<<ears>>
#+end_src

#+begin_src clojure :tangle ../../cortex/src/test/hearing.clj
<<test-hearing>>
#+end_src

#+begin_src C :tangle ../Alc/backends/send.c
<<send>>
#+end_src