#+title: Simulated Sense of Hearing
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Simulating multiple listeners and the sense of hearing in jMonkeyEngine3
#+keywords: simulated hearing, openal, clojure, jMonkeyEngine3, LWJGL, AI
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org
#+BABEL: :exports both :noweb yes :cache no :mkdirp yes

* Hearing

I want to be able to place ears in a similar manner to how I place
the eyes. I want to be able to place ears in a unique spatial
position, and receive as output at every tick the FFT of whatever
signals are happening at that point.

Hearing is one of the more difficult senses to simulate, because there
is less support for obtaining the actual sound data that is processed
by jMonkeyEngine3.

jMonkeyEngine's sound system works as follows:

- jMonkeyEngine uses the =AppSettings= for the particular application
  to determine what sort of =AudioRenderer= should be used.
- Although some support is provided for multiple audio rendering
  backends, at the time of this writing jMonkeyEngine will either
  pick no =AudioRenderer= at all, or the =LwjglAudioRenderer=.
- jMonkeyEngine tries to figure out what sort of system you're
  running and extracts the appropriate native libraries.
- The =LwjglAudioRenderer= uses the [[http://lwjgl.org/][=LWJGL=]] (LightWeight Java Game
  Library) bindings to interface with a C library called [[http://kcat.strangesoft.net/openal.html][=OpenAL=]].
- =OpenAL= calculates the 3D sound localization and feeds a stream of
  sound to any of various sound output devices with which it knows
  how to communicate.

A consequence of this is that there's no way to access the actual
sound data produced by =OpenAL=. Even worse, =OpenAL= only supports
one /listener/, which normally isn't a problem for games, but becomes
a problem when trying to make multiple AI creatures that can each hear
the world from a different perspective.

To make many AI creatures in jMonkeyEngine that can each hear the
world from their own perspective, it is necessary to go all the way
back to =OpenAL= and implement support for simulated hearing there.

* Extending =OpenAL=
** =OpenAL= Devices

=OpenAL= goes to great lengths to support many different systems, all
with different sound capabilities and interfaces. It accomplishes this
difficult task by providing code for many different sound backends in
pseudo-objects called /Devices/. There's a device for the Linux Open
Sound System and the Advanced Linux Sound Architecture, there's one
for DirectSound on Windows, there's even one for Solaris. =OpenAL=
solves the problem of platform independence by providing all these
Devices.

Wrapper libraries such as LWJGL are free to examine the system on
which they are running and then select an appropriate device for that
system.

There are also a few "special" devices that don't interface with any
particular system. These include the Null Device, which doesn't do
anything, and the Wave Device, which writes whatever sound it receives
to a file, if everything has been set up correctly when configuring
=OpenAL=.
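An application chooses among these Devices by name when it opens
=OpenAL=. As a minimal sketch of how that looks from the application
side (the only assumption is the name string, which is whatever a
backend registers during probing -- the Send device developed below
registers itself as "Multiple Audio Send"):

#+begin_src C
#include <AL/alc.h>

/* Open a specific backend by its registered name, falling back to
   whatever default device the library would normally choose. */
ALCdevice *open_send_device(void){
  ALCdevice *device = alcOpenDevice("Multiple Audio Send");
  if (device == NULL){
    device = alcOpenDevice(NULL);
  }
  return device;
}
#+end_src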
Actual mixing of the sound data happens in the Devices, and they are
the only point in the sound rendering process where this data is
available.

Therefore, in order to support multiple listeners, and to get the sound
data in a form that the AIs can use, it is necessary to create a new
Device that supports these features.

** The Send Device
Adding a device to OpenAL is rather tricky -- there are five separate
files in the =OpenAL= source tree that must be modified to do so. I've
documented this process [[./add-new-device.org][here]] for anyone who is interested.

Onward to that actual Device!

Again, my objectives are:

- Support multiple listeners from jMonkeyEngine3
- Get access to the rendered sound data for further processing from
  clojure.

** =send.c=

** Header
#+srcname: send-header
#+begin_src C
#include "config.h"
#include <stdlib.h>
#include "alMain.h"
#include "AL/al.h"
#include "AL/alc.h"
#include "alSource.h"
#include <jni.h>

//////////////////// Summary

struct send_data;
struct context_data;

static void addContext(ALCdevice *, ALCcontext *);
static void syncContexts(ALCcontext *master, ALCcontext *slave);
static void syncSources(ALsource *master, ALsource *slave,
                        ALCcontext *masterCtx, ALCcontext *slaveCtx);

static void syncSourcei(ALuint master, ALuint slave,
                        ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);
static void syncSourcef(ALuint master, ALuint slave,
                        ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);
static void syncSource3f(ALuint master, ALuint slave,
                         ALCcontext *masterCtx, ALCcontext *ctx2, ALenum param);

static void swapInContext(ALCdevice *, struct context_data *);
static void saveContext(ALCdevice *, struct context_data *);
static void limitContext(ALCdevice *, ALCcontext *);
static void unLimitContext(ALCdevice *);

static void init(ALCdevice *);
static void renderData(ALCdevice *, int samples);

#define UNUSED(x) (void)(x)
#+end_src

The main idea behind the Send device is to take advantage of the fact
that LWJGL only manages one /context/ when using OpenAL. A /context/
is like a container that holds samples and keeps track of where the
listener is. In order to support multiple listeners, the Send device
identifies the LWJGL context as the master context, and creates any
number of slave contexts to represent additional listeners. Every
time the device renders sound, it synchronizes every source from the
master LWJGL context to the slave contexts. Then, it renders each
context separately, using a different listener for each one. The
rendered sound is made available via JNI to jMonkeyEngine.

To recap, the process is:
- Set the LWJGL context as "master" in the =init()= method.
- Create any number of additional contexts via =addContext()=.
- At every call to =renderData()=, sync the master context with the
  slave contexts via =syncContexts()=.
- =syncContexts()= calls =syncSources()= to sync all the sources
  which are in the master context.
- =limitContext()= and =unLimitContext()= make it possible to render
  only one context at a time.

** Necessary State
#+begin_src C
//////////////////// State

typedef struct context_data {
  ALfloat ClickRemoval[MAXCHANNELS];
  ALfloat PendingClicks[MAXCHANNELS];
  ALvoid *renderBuffer;
  ALCcontext *ctx;
} context_data;

typedef struct send_data {
  ALuint size;
  context_data **contexts;
  ALuint numContexts;
  ALuint maxContexts;
} send_data;
#+end_src

Switching between contexts is not the normal operation of a Device,
and one of the problems with doing so is that a Device normally keeps
around a few pieces of state, such as the =ClickRemoval= array above,
which will become corrupted if the contexts are not rendered in
parallel. The solution is to create a copy of this normally global
device state for each context, and to copy it back and forth into and
out of the actual device state whenever a context is rendered.

** Synchronization Macros

#+begin_src C
//////////////////// Context Creation / Synchronization

#define _MAKE_SYNC(NAME, INIT_EXPR, GET_EXPR, SET_EXPR)  \
  void NAME (ALuint sourceID1, ALuint sourceID2,         \
             ALCcontext *ctx1, ALCcontext *ctx2,         \
             ALenum param){                              \
    INIT_EXPR;                                           \
    ALCcontext *current = alcGetCurrentContext();        \
    alcMakeContextCurrent(ctx1);                         \
    GET_EXPR;                                            \
    alcMakeContextCurrent(ctx2);                         \
    SET_EXPR;                                            \
    alcMakeContextCurrent(current);                      \
  }

#define MAKE_SYNC(NAME, TYPE, GET, SET)                  \
  _MAKE_SYNC(NAME,                                       \
             TYPE value,                                 \
             GET(sourceID1, param, &value),              \
             SET(sourceID2, param, value))

#define MAKE_SYNC3(NAME, TYPE, GET, SET)                         \
  _MAKE_SYNC(NAME,                                               \
             TYPE value1; TYPE value2; TYPE value3;,             \
             GET(sourceID1, param, &value1, &value2, &value3),   \
             SET(sourceID2, param, value1, value2, value3))

MAKE_SYNC( syncSourcei,  ALint,   alGetSourcei,  alSourcei);
MAKE_SYNC( syncSourcef,  ALfloat, alGetSourcef,  alSourcef);
MAKE_SYNC3(syncSource3i, ALint,   alGetSource3i, alSource3i);
MAKE_SYNC3(syncSource3f, ALfloat, alGetSource3f, alSource3f);
#+end_src

Setting the state of an =OpenAL= source is done with the =alSourcei=,
=alSourcef=, =alSource3i=, and =alSource3f= functions. In order to
completely synchronize two sources, it is necessary to use all of
them. These macros help to condense the otherwise repetitive
synchronization code involving these similar low-level =OpenAL=
functions.
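For clarity, here is (modulo whitespace) the function that
=MAKE_SYNC(syncSourcef, ...)= expands to: read the parameter from the
first source while its context is current, write it to the second
source while /its/ context is current, then restore whatever context
was active before.

#+begin_src C
// Illustration only: the expansion of
// MAKE_SYNC(syncSourcef, ALfloat, alGetSourcef, alSourcef);
void syncSourcef (ALuint sourceID1, ALuint sourceID2,
                  ALCcontext *ctx1, ALCcontext *ctx2,
                  ALenum param){
  ALfloat value;
  ALCcontext *current = alcGetCurrentContext();
  alcMakeContextCurrent(ctx1);
  alGetSourcef(sourceID1, param, &value);
  alcMakeContextCurrent(ctx2);
  alSourcef(sourceID2, param, value);
  alcMakeContextCurrent(current);
}
#+end_src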
** Source Synchronization
#+begin_src C
void syncSources(ALsource *masterSource, ALsource *slaveSource,
                 ALCcontext *masterCtx, ALCcontext *slaveCtx){
  ALuint master = masterSource->source;
  ALuint slave = slaveSource->source;
  ALCcontext *current = alcGetCurrentContext();

  syncSourcef(master,slave,masterCtx,slaveCtx,AL_PITCH);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_DISTANCE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_ROLLOFF_FACTOR);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_REFERENCE_DISTANCE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MIN_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_MAX_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_GAIN);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_INNER_ANGLE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_CONE_OUTER_ANGLE);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_SEC_OFFSET);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_SAMPLE_OFFSET);
  syncSourcef(master,slave,masterCtx,slaveCtx,AL_BYTE_OFFSET);

  syncSource3f(master,slave,masterCtx,slaveCtx,AL_POSITION);
  syncSource3f(master,slave,masterCtx,slaveCtx,AL_VELOCITY);
  syncSource3f(master,slave,masterCtx,slaveCtx,AL_DIRECTION);

  syncSourcei(master,slave,masterCtx,slaveCtx,AL_SOURCE_RELATIVE);
  syncSourcei(master,slave,masterCtx,slaveCtx,AL_LOOPING);

  alcMakeContextCurrent(masterCtx);
  ALint source_type;
  alGetSourcei(master, AL_SOURCE_TYPE, &source_type);

  // Only static sources are currently synchronized!
  if (AL_STATIC == source_type){
    ALint master_buffer;
    ALint slave_buffer;
    alGetSourcei(master, AL_BUFFER, &master_buffer);
    alcMakeContextCurrent(slaveCtx);
    alGetSourcei(slave, AL_BUFFER, &slave_buffer);
    if (master_buffer != slave_buffer){
      alSourcei(slave, AL_BUFFER, master_buffer);
    }
  }

  // Synchronize the state of the two sources.
  alcMakeContextCurrent(masterCtx);
  ALint masterState;
  ALint slaveState;

  alGetSourcei(master, AL_SOURCE_STATE, &masterState);
  alcMakeContextCurrent(slaveCtx);
  alGetSourcei(slave, AL_SOURCE_STATE, &slaveState);

  if (masterState != slaveState){
    switch (masterState){
    case AL_INITIAL : alSourceRewind(slave); break;
    case AL_PLAYING : alSourcePlay(slave);   break;
    case AL_PAUSED  : alSourcePause(slave);  break;
    case AL_STOPPED : alSourceStop(slave);   break;
    }
  }
  // Restore whatever context was previously active.
  alcMakeContextCurrent(current);
}
#+end_src
This function is long because it has to exhaustively go through all the
possible state that a source can have and make sure that it is the
same between the master and slave sources. I'd like to take this
moment to salute the [[http://connect.creativelabs.com/openal/Documentation/Forms/AllItems.aspx][=OpenAL= Reference Manual]], which provides a very
good description of =OpenAL='s internals.

** Context Synchronization
#+begin_src C
void syncContexts(ALCcontext *master, ALCcontext *slave){
  /* If there aren't sufficient sources in slave to mirror
     the sources in master, create them. */
  ALCcontext *current = alcGetCurrentContext();

  UIntMap *masterSourceMap = &(master->SourceMap);
  UIntMap *slaveSourceMap = &(slave->SourceMap);
  ALuint numMasterSources = masterSourceMap->size;
  ALuint numSlaveSources = slaveSourceMap->size;

  alcMakeContextCurrent(slave);
  if (numSlaveSources < numMasterSources){
    ALuint numMissingSources = numMasterSources - numSlaveSources;
    ALuint newSources[numMissingSources];
    alGenSources(numMissingSources, newSources);
  }

  /* Now, slave is guaranteed to have at least as many sources
     as master. Sync each source from master to the corresponding
     source in slave. */
  int i;
  for(i = 0; i < masterSourceMap->size; i++){
    syncSources((ALsource*)masterSourceMap->array[i].value,
                (ALsource*)slaveSourceMap->array[i].value,
                master, slave);
  }
  alcMakeContextCurrent(current);
}
#+end_src

Most of the hard work in Context Synchronization is done in
=syncSources()=. The only thing that =syncContexts()= has to worry
about is automatically creating new sources whenever a slave context
does not have the same number of sources as the master context.

** Context Creation
#+begin_src C
static void addContext(ALCdevice *Device, ALCcontext *context){
  send_data *data = (send_data*)Device->ExtraData;
  // expand the array if necessary
  if (data->numContexts >= data->maxContexts){
    ALuint newMaxContexts = data->maxContexts*2 + 1;
    // the elements are pointers, so allocate pointer-sized slots
    data->contexts = realloc(data->contexts,
                             newMaxContexts*sizeof(context_data*));
    data->maxContexts = newMaxContexts;
  }
  // create a context_data and add it to the main array
  context_data *ctxData;
  ctxData = (context_data*)calloc(1, sizeof(*ctxData));
  ctxData->renderBuffer =
    malloc(BytesFromDevFmt(Device->FmtType) *
           Device->NumChan * Device->UpdateSize);
  ctxData->ctx = context;

  data->contexts[data->numContexts] = ctxData;
  data->numContexts++;
}
#+end_src

Here, the slave context is created, and its data is stored in the
device-wide =ExtraData= structure. The =renderBuffer= that is created
here is where the rendered sound samples for this slave context will
eventually go.
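For a concrete sense of scale, here is the allocation arithmetic from
the =malloc= above as a standalone program, using assumed values
rather than anything read from a real device:

#+begin_src C
#include <stdio.h>

/* A 16-bit stereo device with an UpdateSize of 1024 sample frames
   gives each slave context a 4096-byte render buffer. */
int main(void){
  int bytesPerSample = 2;    /* e.g. a DevFmtShort device        */
  int channels       = 2;    /* stereo                           */
  int updateSize     = 1024; /* frames per update; device-chosen */
  printf("render buffer: %d bytes\n",
         bytesPerSample * channels * updateSize);
  return 0;
}
#+end_src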
** Context Switching
#+begin_src C
//////////////////// Context Switching

/* A device brings along with it two pieces of state
 * which have to be swapped in and out with each context.
 */
static void swapInContext(ALCdevice *Device, context_data *ctxData){
  memcpy(Device->ClickRemoval, ctxData->ClickRemoval, sizeof(ALfloat)*MAXCHANNELS);
  memcpy(Device->PendingClicks, ctxData->PendingClicks, sizeof(ALfloat)*MAXCHANNELS);
}

static void saveContext(ALCdevice *Device, context_data *ctxData){
  memcpy(ctxData->ClickRemoval, Device->ClickRemoval, sizeof(ALfloat)*MAXCHANNELS);
  memcpy(ctxData->PendingClicks, Device->PendingClicks, sizeof(ALfloat)*MAXCHANNELS);
}

static ALCcontext **currentContext;
static ALuint currentNumContext;

/* By default, all contexts are rendered at once for each call to aluMixData.
 * This function uses the internals of the ALCdevice struct to temporarily
 * cause aluMixData to render only the chosen context.
 */
static void limitContext(ALCdevice *Device, ALCcontext *ctx){
  currentContext = Device->Contexts;
  currentNumContext = Device->NumContexts;
  Device->Contexts = &ctx;
  Device->NumContexts = 1;
}

static void unLimitContext(ALCdevice *Device){
  Device->Contexts = currentContext;
  Device->NumContexts = currentNumContext;
}
#+end_src

=OpenAL= normally renders all Contexts in parallel, outputting the
whole result to the buffer. It does this by iterating over the
Device->Contexts array and rendering each context to the buffer in
turn. By temporarily setting Device->NumContexts to 1 and adjusting
the Device's context list to put the desired context-to-be-rendered
into position 0, we can trick =OpenAL= into rendering each slave
context separately from all the others.

** Main Device Loop
#+begin_src C
//////////////////// Main Device Loop

/* Establish the LWJGL context as the master context, which will
 * be synchronized to all the slave contexts.
 */
static void init(ALCdevice *Device){
  ALCcontext *masterContext = alcGetCurrentContext();
  addContext(Device, masterContext);
}

static void renderData(ALCdevice *Device, int samples){
  if(!Device->Connected){return;}
  send_data *data = (send_data*)Device->ExtraData;
  ALCcontext *current = alcGetCurrentContext();

  ALuint i;
  for (i = 1; i < data->numContexts; i++){
    syncContexts(data->contexts[0]->ctx, data->contexts[i]->ctx);
  }

  if ((ALuint) samples > Device->UpdateSize){
    printf("exceeding internal buffer size; dropping samples\n");
    printf("requested %d; available %u\n", samples, Device->UpdateSize);
    samples = (int) Device->UpdateSize;
  }

  for (i = 0; i < data->numContexts; i++){
    context_data *ctxData = data->contexts[i];
    ALCcontext *ctx = ctxData->ctx;
    alcMakeContextCurrent(ctx);
    limitContext(Device, ctx);
    swapInContext(Device, ctxData);
    aluMixData(Device, ctxData->renderBuffer, samples);
    saveContext(Device, ctxData);
    unLimitContext(Device);
  }
  alcMakeContextCurrent(current);
}
#+end_src

The main loop synchronizes the master LWJGL context with all the slave
contexts, then walks each context, rendering just that context to its
audio-sample storage buffer.

** JNI Methods

At this point, we have the ability to create multiple listeners by
using the master/slave context trick, and the rendered audio data is
waiting patiently in internal buffers, one for each listener. We need
a way to transport this information to Java, and also a way to drive
this device from Java. The following JNI interface code is inspired
by the way LWJGL interfaces with =OpenAL=.

*** step

#+begin_src C
//////////////////// JNI Methods

#include "com_aurellem_send_AudioSend.h"

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nstep
 * Signature: (JI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nstep
(JNIEnv *env, jclass clazz, jlong device, jint samples){
  UNUSED(env);UNUSED(clazz);
  renderData((ALCdevice*)((intptr_t)device), samples);
}
#+end_src
This device, unlike most of the other devices in =OpenAL=, does not
render sound unless asked. This enables the system to slow down or
speed up depending on the needs of the AIs who are using it to
listen. If the device tried to render samples in real-time, a
complicated AI whose mind takes 100 seconds of computer time to
simulate 1 second of AI-time would miss almost all of the sound in
its environment.
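For instance, a host that advances its simulation in fixed time steps
could render exactly one step's worth of sound per call, no matter how
long each step takes to compute. A minimal sketch, assuming a
hypothetical =tick= helper that is not part of =send.c=:

#+begin_src C
/* Render one simulation step's worth of sample frames on demand.
   Device->Frequency is the device sample rate in Hz; renderData
   itself clamps requests that exceed the internal buffer size.
   The Java side would reach this through AudioSend's step method. */
static void tick(ALCdevice *device, float secondsPerStep){
  int samples = (int)(device->Frequency * secondsPerStep);
  renderData(device, samples);
}
#+end_src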
*** getSamples
#+begin_src C
/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ngetSamples
 * Signature: (JLjava/nio/ByteBuffer;III)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_ngetSamples
(JNIEnv *env, jclass clazz, jlong device, jobject buffer, jint position,
 jint samples, jint n){
  UNUSED(clazz);

  ALvoid *buffer_address =
    ((ALbyte *)(((char*)(*env)->GetDirectBufferAddress(env, buffer)) + position));
  ALCdevice *recorder = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)recorder->ExtraData;
  // bounds check: context indices run from 0 to numContexts - 1
  if ((ALuint)n >= data->numContexts){return;}
  memcpy(buffer_address, data->contexts[n]->renderBuffer,
         BytesFromDevFmt(recorder->FmtType) * recorder->NumChan * samples);
}
#+end_src

This is the transport layer between C and Java that will eventually
allow us to access rendered sound data from clojure.
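Note that the caller is responsible for sizing the destination buffer:
the =memcpy= above copies =BytesFromDevFmt(FmtType) * NumChan *
samples= bytes, so the direct ByteBuffer handed in from Java must have
at least that many bytes available after =position=. A hypothetical
helper makes the per-frame arithmetic explicit:

#+begin_src C
/* Bytes occupied by one sample frame on this device: the sample
   width implied by the device format, times the channel count.
   A buffer for ngetSamples needs frameBytes(dev) * samples bytes. */
static ALuint frameBytes(ALCdevice *device){
  return BytesFromDevFmt(device->FmtType) * device->NumChan;
}
#+end_src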
*** Listener Management

=addListener=, =setNthListenerf=, and =setNthListener3f= are
necessary to change the properties of any listener other than the
master one, since only the listener of the currently active context
is affected by the normal =OpenAL= listener calls.

#+begin_src C
/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    naddListener
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_naddListener
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(env); UNUSED(clazz);
  //printf("creating new context via naddListener\n");
  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  ALCcontext *new = alcCreateContext(Device, NULL);
  addContext(Device, new);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nsetNthListener3f
 * Signature: (IFFFJI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nsetNthListener3f
(JNIEnv *env, jclass clazz, jint param,
 jfloat v1, jfloat v2, jfloat v3, jlong device, jint contextNum){
  UNUSED(env);UNUSED(clazz);

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)Device->ExtraData;

  ALCcontext *current = alcGetCurrentContext();
  // bounds check: context indices run from 0 to numContexts - 1
  if ((ALuint)contextNum >= data->numContexts){return;}
  alcMakeContextCurrent(data->contexts[contextNum]->ctx);
  alListener3f(param, v1, v2, v3);
  alcMakeContextCurrent(current);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    nsetNthListenerf
 * Signature: (IFJI)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_nsetNthListenerf
(JNIEnv *env, jclass clazz, jint param, jfloat v1, jlong device,
 jint contextNum){
  UNUSED(env);UNUSED(clazz);

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  send_data *data = (send_data*)Device->ExtraData;

  ALCcontext *current = alcGetCurrentContext();
  // bounds check: context indices run from 0 to numContexts - 1
  if ((ALuint)contextNum >= data->numContexts){return;}
  alcMakeContextCurrent(data->contexts[contextNum]->ctx);
  alListenerf(param, v1);
  alcMakeContextCurrent(current);
}
#+end_src
*** Initialization

=initDevice= is called from the Java side after LWJGL has created its
context, and before any calls to =addListener=. It establishes the
LWJGL context as the master context.

=getAudioFormat= is a convenience function that uses JNI to build up a
=javax.sound.sampled.AudioFormat= object from data in the Device. This
way, there is no ambiguity about what the bits created by =step= and
returned by =getSamples= mean. For example, a stereo =DevFmtShort=
device running at 44100 Hz comes back as 16-bit, signed,
little-endian PCM with two channels.

#+begin_src C
/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ninitDevice
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_aurellem_send_AudioSend_ninitDevice
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(env);UNUSED(clazz);
  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  init(Device);
}

/*
 * Class:     com_aurellem_send_AudioSend
 * Method:    ngetAudioFormat
 * Signature: (J)Ljavax/sound/sampled/AudioFormat;
 */
JNIEXPORT jobject JNICALL Java_com_aurellem_send_AudioSend_ngetAudioFormat
(JNIEnv *env, jclass clazz, jlong device){
  UNUSED(clazz);
  jclass AudioFormatClass =
    (*env)->FindClass(env, "javax/sound/sampled/AudioFormat");
  jmethodID AudioFormatConstructor =
    (*env)->GetMethodID(env, AudioFormatClass, "<init>", "(FIIZZ)V");

  ALCdevice *Device = (ALCdevice*) ((intptr_t)device);
  int isSigned;
  switch (Device->FmtType)
    {
    case DevFmtUByte:
    case DevFmtUShort: isSigned = 0; break;
    default : isSigned = 1;
    }
  float frequency = Device->Frequency;
  int bitsPerSample = (8 * BytesFromDevFmt(Device->FmtType));
  int channels = Device->NumChan;
  jobject format = (*env)->
    NewObject(
              env,AudioFormatClass,AudioFormatConstructor,
              frequency,
              bitsPerSample,
              channels,
              isSigned,
              0);
  return format;
}
#+end_src
*** Boring Device management stuff
This code is more-or-less copied verbatim from the other =OpenAL=
backends. It's the basis for =OpenAL='s primitive object system.

#+begin_src C
//////////////////// Device Initialization / Management

static const ALCchar sendDevice[] = "Multiple Audio Send";

static ALCboolean send_open_playback(ALCdevice *device,
                                     const ALCchar *deviceName)
{
  send_data *data;
  // stop any buffering for stdout, so that I can
  // see the printf statements in my terminal immediately
  setbuf(stdout, NULL);

  if(!deviceName)
    deviceName = sendDevice;
  else if(strcmp(deviceName, sendDevice) != 0)
    return ALC_FALSE;
  data = (send_data*)calloc(1, sizeof(*data));
  device->szDeviceName = strdup(deviceName);
  device->ExtraData = data;
  return ALC_TRUE;
}

static void send_close_playback(ALCdevice *device)
{
  send_data *data = (send_data*)device->ExtraData;
  alcMakeContextCurrent(NULL);
  ALuint i;
  // Destroy all slave contexts. LWJGL will take care of
  // its own context.
  for (i = 1; i < data->numContexts; i++){
    context_data *ctxData = data->contexts[i];
    alcDestroyContext(ctxData->ctx);
    free(ctxData->renderBuffer);
    free(ctxData);
  }
  // contexts[0] wraps the master context; free our bookkeeping
  // for it, but leave the context itself to LWJGL.
  if (data->numContexts > 0){
    free(data->contexts[0]->renderBuffer);
    free(data->contexts[0]);
  }
  free(data->contexts);
  free(data);
  device->ExtraData = NULL;
}

static ALCboolean send_reset_playback(ALCdevice *device)
{
  SetDefaultWFXChannelOrder(device);
  return ALC_TRUE;
}

static void send_stop_playback(ALCdevice *Device){
  UNUSED(Device);
}

static const BackendFuncs send_funcs = {
  send_open_playback,
  send_close_playback,
  send_reset_playback,
  send_stop_playback,
  NULL,
  NULL, /* These would be filled with functions to */
  NULL, /* handle capturing audio if we were into  */
  NULL, /* that sort of thing...                   */
  NULL,
  NULL
};

ALCboolean alc_send_init(BackendFuncs *func_list){
  *func_list = send_funcs;
  return ALC_TRUE;
}

void alc_send_deinit(void){}

void alc_send_probe(enum DevProbe type)
{
  switch(type)
    {
    case DEVICE_PROBE:
      AppendDeviceList(sendDevice);
      break;
    case ALL_DEVICE_PROBE:
      AppendAllDeviceList(sendDevice);
      break;
    case CAPTURE_DEVICE_PROBE:
      break;
    }
}
#+end_src
LWJGL will take care of671 // its own context.672 for (i = 1; i < data->numContexts; i++){673 context_data *ctxData = data->contexts[i];674 alcDestroyContext(ctxData->ctx);675 free(ctxData->renderBuffer);676 free(ctxData);677 }678 free(data);679 device->ExtraData = NULL;680 }682 static ALCboolean send_reset_playback(ALCdevice *device)683 {684 SetDefaultWFXChannelOrder(device);685 return ALC_TRUE;686 }688 static void send_stop_playback(ALCdevice *Device){689 UNUSED(Device);690 }692 static const BackendFuncs send_funcs = {693 send_open_playback,694 send_close_playback,695 send_reset_playback,696 send_stop_playback,697 NULL,698 NULL, /* These would be filled with functions to */699 NULL, /* handle capturing audio if we we into that */700 NULL, /* sort of thing... */701 NULL,702 NULL703 };705 ALCboolean alc_send_init(BackendFuncs *func_list){706 *func_list = send_funcs;707 return ALC_TRUE;708 }710 void alc_send_deinit(void){}712 void alc_send_probe(enum DevProbe type)713 {714 switch(type)715 {716 case DEVICE_PROBE:717 AppendDeviceList(sendDevice);718 break;719 case ALL_DEVICE_PROBE:720 AppendAllDeviceList(sendDevice);721 break;722 case CAPTURE_DEVICE_PROBE:723 break;724 }725 }726 #+end_src728 * The Java interface, =AudioSend=730 The Java interface to the Send Device follows naturally from the JNI731 definitions. It is included here for completeness. The only thing here732 of note is the =deviceID=. This is available from LWJGL, but to only733 way to get it is reflection. Unfornatuently, there is no other way to734 control the Send device than to obtain a pointer to it.736 #+include: "../java/src/com/aurellem/send/AudioSend.java" src java :exports code738 * Finally, Ears in clojure!740 Now that the infastructure is complete (modulo a few patches to741 jMonkeyEngine3 to support accessing this modified version of =OpenAL=742 that are not worth discussing), the clojure ear abstraction is rather743 simple. Just as there were =SceneProcessors= for vision, there are744 now =SoundProcessors= for hearing.746 #+include "../../jmeCapture/src/com/aurellem/capture/audio/SoundProcessor.java" src java748 #+srcname: ears749 #+begin_src clojure750 (ns cortex.hearing751 "Simulate the sense of hearing in jMonkeyEngine3. Enables multiple752 listeners at different positions in the same world. Passes vectors753 of floats in the range [-1.0 -- 1.0] in PCM format to any arbitray754 function."755 {:author "Robert McIntyre"}756 (:use (cortex world util))757 (:import java.nio.ByteBuffer)758 (:import org.tritonus.share.sampled.FloatSampleTools)759 (:import com.aurellem.capture.audio.SoundProcessor)760 (:import javax.sound.sampled.AudioFormat))762 (defn sound-processor763 "Deals with converting ByteBuffers into Vectors of floats so that764 the continuation functions can be defined in terms of immutable765 stuff."766 [continuation]767 (proxy [SoundProcessor] []768 (cleanup [])769 (process770 [#^ByteBuffer audioSamples numSamples #^AudioFormat audioFormat]771 (let [bytes (byte-array numSamples)772 floats (float-array numSamples)]773 (.get audioSamples bytes 0 numSamples)774 (FloatSampleTools/byte2floatInterleaved775 bytes 0 floats 0776 (/ numSamples (.getFrameSize audioFormat)) audioFormat)777 (continuation778 (vec floats))))))780 (defn add-ear781 "Add an ear to the world. The continuation function will be called782 on the FFT or the sounds which the ear hears in the given783 timeframe. 
* Example

#+srcname: test-hearing
#+begin_src clojure :results silent
(ns test.hearing
  (:use (cortex world util hearing))
  (:import (com.jme3.audio AudioNode Listener))
  (:import com.jme3.scene.Node))

(defn setup-fn [world]
  (let [listener (Listener.)]
    (add-ear world listener #(println-repl (nth % 0)))))

(defn play-sound [node world value]
  (if (not value)
    (do
      (.playSource (.getAudioRenderer world) node))))

(defn test-basic-hearing []
  (.start
   (let [node1 (AudioNode. (asset-manager) "Sounds/pure.wav" false false)]
     (world
      (Node.)
      {"key-space" (partial play-sound node1)}
      setup-fn
      no-op))))
#+end_src

This extremely basic program prints out the first sample it encounters
at every time stamp. Run =(test-basic-hearing)= and press the space
bar to play the sound; you can see the rendered sound being printed
at the REPL.

* COMMENT Code Generation

#+begin_src clojure :tangle ../../cortex/src/cortex/hearing.clj
<<ears>>
#+end_src

#+begin_src clojure :tangle ../../cortex/src/test/hearing.clj
<<test-hearing>>
#+end_src

#+begin_src C :tangle ../Alc/backends/send.c
<<send>>
#+end_src