author Robert McIntyre <rlm@mit.edu>
date Sat, 11 Feb 2012 12:25:26 -0700
======Capture Audio/Video to a File======
<note>A/V recording is still in development. It works for all of jMonkeyEngine's test cases. If you experience any problems or if something isn't clear, please let me know. -- bortreb</note>
So you've made your cool new JMonkeyEngine3 game and you want to
create a demo video to show off your hard work. Or maybe you want to
make a cutscene for your game using the physics and characters in the
game itself. Screen capturing is the most straightforward way to do
this, but it can slow down your game and produce low-quality video and
audio as a result. A better way is to record video and audio directly
from the game while it is running.

<note tip>Combine this method with jMonkeyEngine's [[:jme3:advanced:Cinematics]] feature to record high-quality game trailers!</note>
===== Simple Way =====
First off, if all you want is to record video at 30fps with no sound,
look no further than jMonkeyEngine3's built-in ''VideoRecorderAppState''
class.
Add the following code to your ''simpleInitApp()'' method.

<code java>
stateManager.attach(new VideoRecorderAppState()); //start recording
</code>
The game will run slowly, but the recording will be high-quality and
play at normal speed. The video files will be stored in your user home
directory; if you want to save to another file, specify it in the
''VideoRecorderAppState'' constructor. Recording starts when the state is
attached and ends when the application quits or the state is detached.

That's all!
===== Advanced Way =====
If you want to record audio as well, record at different framerates,
or record from multiple viewpoints at once, then there's a full
solution already made for you here:

http://www.aurellem.com/releases/jmeCapture-latest.zip

http://www.aurellem.com/releases/jmeCapture-latest.tar.bz2

Download the archive in your preferred format, extract it,
add the jars to your project, and you are ready to go.

The javadoc is here:
http://www.aurellem.com/jmeCapture/docs/
To capture video and audio you use the
''com.aurellem.capture.Capture'' class, which has two methods,
''captureAudio()'' and ''captureVideo()'', and the
''com.aurellem.capture.IsoTimer'' class, which sets the audio and
video framerate.

The steps are as simple as:

<code java>
yourApp.setTimer(new IsoTimer(desiredFramesPerSecond));
</code>
This causes jMonkeyEngine to take as much time as it needs to fully
calculate every frame of the video and audio. You will see your game
speed up and slow down depending on how computationally demanding your
game is, but the final recorded audio and video will be perfectly
synchronized and will run at exactly the fps you specified.
<code java>
captureVideo(yourApp, targetVideoFile);
captureAudio(yourApp, targetAudioFile);
</code>

These will cause the app to record audio and video when it is run.
Audio and video will stop being recorded when the app stops. Your
audio will be recorded as a 44,100 Hz linear PCM wav file, while the
video will be recorded according to the following rules:
1.) (Preferred) If you supply an empty directory as the file, then
    the video will be saved as a sequence of .png files, one file per
    frame. The files start at 0000000.png and increment from there.
    You can then combine the frames into your preferred
    container/codec. If the directory is not empty, then writing
    video frames to it will fail, and nothing will be written.
2.) If the filename ends in ".avi" then the frames will be encoded as
    a RAW stream inside an AVI 1.0 container. The resulting file
    will be quite large and you will probably want to re-encode it to
    your preferred container/codec format. Be advised that some
    video players cannot process AVI with a RAW stream, and that AVI
    1.0 files generated by this method that exceed 2.0GB are invalid
    according to the AVI 1.0 spec (but many programs can still deal
    with them). Thanks to Werner Randelshofer for his excellent work
    which made the AVI file writer option possible.
3.) Any non-directory file ending in anything other than ".avi" will
    be processed through Xuggle. Xuggle provides the option to use
    many codecs/containers, but you will have to install it on your
    system yourself in order to use this option. Please visit
    http://www.xuggle.com/ to learn how to do this.

Note that you will not hear any sound if you choose to record sound to
a file.
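As a side note on rule 1, the zero-padded frame names described above are easy to generate and consume. Here is a small sketch of that naming scheme (the ''FrameNames'' helper class is an illustrative assumption, not part of jmeCapture):

```java
public class FrameNames {
    // Zero-padded, seven-digit frame names: 0000000.png, 0000001.png, ...
    // matching the sequence described in rule 1 above.
    static String frameName(int frame) {
        return String.format("%07d.png", frame);
    }

    public static void main(String[] args) {
        System.out.println(frameName(0));     // 0000000.png
        System.out.println(frameName(1234));  // 0001234.png
    }
}
```

To combine such a sequence into a video, one typical invocation (just an example; pick your own codec and framerate) is ''ffmpeg -r 30 -i %07d.png -c:v libx264 out.mp4''.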
==== Basic Example ====
Here is a complete example showing how to capture both audio and video
from one of jMonkeyEngine3's advanced demo applications.
<code java>
import java.io.File;
import java.io.IOException;

import jme3test.water.TestPostWater;

import com.aurellem.capture.Capture;
import com.aurellem.capture.IsoTimer;
import com.jme3.app.SimpleApplication;

/**
 * Demonstrates how to use basic Audio/Video capture with a
 * jMonkeyEngine application. You can use these techniques to make
 * high quality cutscenes or demo videos, even on very slow laptops.
 *
 * @author Robert McIntyre
 */
public class Basic {

    public static void main(String[] ignore) throws IOException {
        File video = File.createTempFile("JME-water-video", ".avi");
        File audio = File.createTempFile("JME-water-audio", ".wav");

        SimpleApplication app = new TestPostWater();
        app.setTimer(new IsoTimer(60));
        app.setShowSettings(false);

        Capture.captureVideo(app, video);
        Capture.captureAudio(app, audio);

        app.start();

        System.out.println(video.getCanonicalPath());
        System.out.println(audio.getCanonicalPath());
    }
}
</code>
==== How it works ====
A standard JME3 application that extends ''SimpleApplication'' or
''Application'' tries as hard as it can to keep in sync with
//user-time//. If a ball is rolling at 1 game-mile per game-hour in the
game, and you wait for one user-hour as measured by the clock on your
wall, then the ball should have traveled exactly one game-mile. In
order to keep sync with the real world, the game throttles its physics
engine and graphics display. If the computations involved in running
the game are too intense, then the game will first skip frames, then
sacrifice physics accuracy. If there are particularly demanding
computations, then you may only get 1 fps, and the ball may tunnel
through the floor or obstacles due to inaccurate physics simulation,
but after the end of one user-hour, that ball will have traveled one
game-mile.
When we're recording video, we don't care if the game-time syncs with
user-time, but instead whether the time in the recorded video
(video-time) syncs with user-time. To continue the analogy, if we
recorded the ball rolling at 1 game-mile per game-hour and watched the
video later, we would want to see 30 fps video of the ball rolling at
1 video-mile per //user-hour//. It doesn't matter how much user-time it
took to simulate that hour of game-time to make the high-quality
recording.
The IsoTimer ignores real-time and always reports that the same amount
of time has passed every time it is called. That way, one can put code
to write each video/audio frame to a file without worrying about that
code itself slowing down the game to the point where the recording
would be useless.
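The idea can be sketched in a few lines of plain Java. This ''FixedStepTimer'' is a simplified stand-in for IsoTimer (not the actual jmeCapture source, which implements jMonkeyEngine's ''Timer'' interface): every update advances video-time by exactly 1/fps seconds, no matter how long the frame actually took to compute.

```java
public class FixedStepTimer {
    private final float framerate;
    private int ticks = 0;

    public FixedStepTimer(float framerate) { this.framerate = framerate; }

    // Called once per frame: always advances by exactly one frame of
    // video-time, regardless of wall-clock time.
    public void update() { ticks++; }

    // Seconds of video-time per frame -- a constant, unlike a real-time timer.
    public float getTimePerFrame() { return 1.0f / framerate; }

    // Total video-time elapsed so far.
    public float getTimeInSeconds() { return ticks / framerate; }

    public static void main(String[] args) {
        FixedStepTimer t = new FixedStepTimer(60);
        for (int i = 0; i < 120; i++) t.update();  // 120 frames at 60 fps
        System.out.println(t.getTimeInSeconds());  // exactly 2.0 seconds of video
    }
}
```

Because ''getTimePerFrame()'' is constant, physics and animation advance by the same amount every frame, which is why the recording plays back smoothly even when the game itself stutters.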
==== Advanced Example ====
The package from aurellem.com was made for AI research and can do more
than just record a single stream of audio and video. You can use it
to:

1.) Create multiple independent listeners that each hear the world
    from their own perspective.

2.) Process the sound data in any way you wish.

3.) Do the same for visual data.

Here is a more advanced example, which can also be found along with
other examples in the jmeCapture.jar file included in the
distribution.
<code java>
package com.aurellem.capture.examples;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.lang.reflect.Field;
import java.nio.ByteBuffer;

import javax.sound.sampled.AudioFormat;

import org.tritonus.share.sampled.FloatSampleTools;

import com.aurellem.capture.AurellemSystemDelegate;
import com.aurellem.capture.Capture;
import com.aurellem.capture.IsoTimer;
import com.aurellem.capture.audio.CompositeSoundProcessor;
import com.aurellem.capture.audio.MultiListener;
import com.aurellem.capture.audio.SoundProcessor;
import com.aurellem.capture.audio.WaveFileWriter;
import com.jme3.app.SimpleApplication;
import com.jme3.audio.AudioNode;
import com.jme3.audio.Listener;
import com.jme3.cinematic.MotionPath;
import com.jme3.cinematic.events.AbstractCinematicEvent;
import com.jme3.cinematic.events.MotionTrack;
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.math.FastMath;
import com.jme3.math.Quaternion;
import com.jme3.math.Vector3f;
import com.jme3.scene.Geometry;
import com.jme3.scene.Node;
import com.jme3.scene.shape.Box;
import com.jme3.scene.shape.Sphere;
import com.jme3.system.AppSettings;
import com.jme3.system.JmeSystem;

/**
 * Demonstrates advanced use of the audio capture and recording
 * features. Multiple perspectives of the same scene are
 * simultaneously rendered to different sound files.
 *
 * A key limitation of the way multiple listeners are implemented is
 * that only 3D positioning effects are realized for listeners other
 * than the main LWJGL listener. This means that audio effects such
 * as environment settings will *not* be heard on any auxiliary
 * listeners, though sound attenuation will work correctly.
 *
 * Multiple listeners as realized here might be used to make AI
 * entities that can each hear the world from their own perspective.
 *
 * @author Robert McIntyre
 */
public class Advanced extends SimpleApplication {

    /**
     * You will see three grey cubes, a blue sphere, and a path which
     * circles each cube. The blue sphere is generating a constant
     * monotone sound as it moves along the track. Each cube is
     * listening for sound; when a cube hears sound whose intensity is
     * greater than a certain threshold, it changes its color from
     * grey to green.
     *
     * Each cube is also saving whatever it hears to a file. The
     * scene from the perspective of the viewer is also saved to a
     * video file. When you listen to each of the sound files
     * alongside the video, the sound will get louder when the sphere
     * approaches the cube that generated that sound file. This
     * shows that each listener is hearing the world from its own
     * perspective.
     */
    public static void main(String[] args) {
        Advanced app = new Advanced();
        AppSettings settings = new AppSettings(true);
        settings.setAudioRenderer(AurellemSystemDelegate.SEND);
        JmeSystem.setSystemDelegate(new AurellemSystemDelegate());
        app.setSettings(settings);
        app.setShowSettings(false);
        app.setPauseOnLostFocus(false);

        try {
            Capture.captureVideo(app, File.createTempFile("advanced", ".avi"));
            Capture.captureAudio(app, File.createTempFile("advanced", ".wav"));
        }
        catch (IOException e) {e.printStackTrace();}

        app.start();
    }

    private Geometry bell;
    private Geometry ear1;
    private Geometry ear2;
    private Geometry ear3;
    private AudioNode music;
    private MotionTrack motionControl;

    private Geometry makeEar(Node root, Vector3f position){
        Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        Geometry ear = new Geometry("ear", new Box(1.0f, 1.0f, 1.0f));
        ear.setLocalTranslation(position);
        mat.setColor("Color", ColorRGBA.Green);
        ear.setMaterial(mat);
        root.attachChild(ear);
        return ear;
    }

    private Vector3f[] path = new Vector3f[]{
        // loop 1
        new Vector3f(0, 0, 0),
        new Vector3f(0, 0, -10),
        new Vector3f(-2, 0, -14),
        new Vector3f(-6, 0, -20),
        new Vector3f(0, 0, -26),
        new Vector3f(6, 0, -20),
        new Vector3f(0, 0, -14),
        new Vector3f(-6, 0, -20),
        new Vector3f(0, 0, -26),
        new Vector3f(6, 0, -20),
        // loop 2
        new Vector3f(5, 0, -5),
        new Vector3f(7, 0, 1.5f),
        new Vector3f(14, 0, 2),
        new Vector3f(20, 0, 6),
        new Vector3f(26, 0, 0),
        new Vector3f(20, 0, -6),
        new Vector3f(14, 0, 0),
        new Vector3f(20, 0, 6),
        new Vector3f(26, 0, 0),
        new Vector3f(20, 0, -6),
        new Vector3f(14, 0, 0),
        // loop 3
        new Vector3f(8, 0, 7.5f),
        new Vector3f(7, 0, 10.5f),
        new Vector3f(6, 0, 20),
        new Vector3f(0, 0, 26),
        new Vector3f(-6, 0, 20),
        new Vector3f(0, 0, 14),
        new Vector3f(6, 0, 20),
        new Vector3f(0, 0, 26),
        new Vector3f(-6, 0, 20),
        new Vector3f(0, 0, 14),
        // begin ellipse
        new Vector3f(16, 5, 20),
        new Vector3f(0, 0, 26),
        new Vector3f(-16, -10, 20),
        new Vector3f(0, 0, 14),
        new Vector3f(16, 20, 20),
        new Vector3f(0, 0, 26),
        new Vector3f(-10, -25, 10),
        new Vector3f(-10, 0, 0),
        // come at me!
        new Vector3f(-28.00242f, 48.005623f, -34.648228f),
        new Vector3f(0, 0, -20),
    };

    private void createScene() {
        Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        bell = new Geometry("sound-emitter", new Sphere(15, 15, 1));
        mat.setColor("Color", ColorRGBA.Blue);
        bell.setMaterial(mat);
        rootNode.attachChild(bell);

        ear1 = makeEar(rootNode, new Vector3f(0, 0, -20));
        ear2 = makeEar(rootNode, new Vector3f(0, 0, 20));
        ear3 = makeEar(rootNode, new Vector3f(20, 0, 0));

        MotionPath track = new MotionPath();

        for (Vector3f v : path){
            track.addWayPoint(v);
        }
        track.setCurveTension(0.80f);

        motionControl = new MotionTrack(bell, track);

        // for now, use reflection to change the timer...
        // motionControl.setTimer(new IsoTimer(60));
        try {
            Field timerField;
            timerField = AbstractCinematicEvent.class.getDeclaredField("timer");
            timerField.setAccessible(true);
            try {timerField.set(motionControl, new IsoTimer(60));}
            catch (IllegalArgumentException e) {e.printStackTrace();}
            catch (IllegalAccessException e) {e.printStackTrace();}
        }
        catch (SecurityException e) {e.printStackTrace();}
        catch (NoSuchFieldException e) {e.printStackTrace();}

        motionControl.setDirectionType(MotionTrack.Direction.PathAndRotation);
        motionControl.setRotation(new Quaternion().fromAngleNormalAxis(-FastMath.HALF_PI, Vector3f.UNIT_Y));
        motionControl.setInitialDuration(20f);
        motionControl.setSpeed(1f);

        track.enableDebugShape(assetManager, rootNode);
        positionCamera();
    }

    private void positionCamera(){
        this.cam.setLocation(new Vector3f(-28.00242f, 48.005623f, -34.648228f));
        this.cam.setRotation(new Quaternion(0.3359635f, 0.34280345f, -0.13281013f, 0.8671653f));
    }

    private void initAudio() {
        org.lwjgl.input.Mouse.setGrabbed(false);
        music = new AudioNode(assetManager, "Sound/Effects/Beep.ogg", false);

        rootNode.attachChild(music);
        audioRenderer.playSource(music);
        music.setPositional(true);
        music.setVolume(1f);
        music.setReverbEnabled(false);
        music.setDirectional(false);
        music.setMaxDistance(200.0f);
        music.setRefDistance(1f);
        //music.setRolloffFactor(1f);
        music.setLooping(false);
        audioRenderer.pauseSource(music);
    }

    public class Dancer implements SoundProcessor {
        Geometry entity;
        float scale = 2;
        public Dancer(Geometry entity){
            this.entity = entity;
        }

        /**
         * This method is irrelevant since there is no state to clean up.
         */
        public void cleanup() {}

        /**
         * Respond to sound! This is the brain of an AI entity that
         * hears its surroundings and reacts to them.
         */
        public void process(ByteBuffer audioSamples, int numSamples, AudioFormat format) {
            audioSamples.clear();
            byte[] data = new byte[numSamples];
            float[] out = new float[numSamples];
            audioSamples.get(data);
            FloatSampleTools.byte2floatInterleaved(data, 0, out, 0,
                numSamples/format.getFrameSize(), format);

            float max = Float.NEGATIVE_INFINITY;
            for (float f : out){if (f > max) max = f;}
            audioSamples.clear();

            if (max > 0.1){entity.getMaterial().setColor("Color", ColorRGBA.Green);}
            else {entity.getMaterial().setColor("Color", ColorRGBA.Gray);}
        }
    }

    private void prepareEar(Geometry ear, int n){
        if (this.audioRenderer instanceof MultiListener){
            MultiListener rf = (MultiListener)this.audioRenderer;

            Listener auxListener = new Listener();
            auxListener.setLocation(ear.getLocalTranslation());

            rf.addListener(auxListener);
            WaveFileWriter aux = null;

            try {aux = new WaveFileWriter(new File("/home/r/tmp/ear"+n+".wav"));}
            catch (FileNotFoundException e) {e.printStackTrace();}

            rf.registerSoundProcessor(auxListener,
                new CompositeSoundProcessor(new Dancer(ear), aux));
        }
    }

    public void simpleInitApp() {
        this.setTimer(new IsoTimer(60));
        initAudio();

        createScene();

        prepareEar(ear1, 1);
        prepareEar(ear2, 2);
        prepareEar(ear3, 3);

        motionControl.play();
    }

    public void simpleUpdate(float tpf) {
        if (music.getStatus() != AudioNode.Status.Playing){
            music.play();
        }
        Vector3f loc = cam.getLocation();
        Quaternion rot = cam.getRotation();
        listener.setLocation(loc);
        listener.setRotation(rot);
        music.setLocalTranslation(bell.getLocalTranslation());
    }

}
</code>
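The heart of the ''Dancer'' processor above, deciding whether a listener currently "hears" anything, is just a peak-amplitude check over the incoming PCM samples. Here is a self-contained sketch of that check for 16-bit little-endian PCM, without the Tritonus helper (the ''PeakDetector'' class is an illustrative assumption; the 0.1 threshold matches the example, but the real capture pipeline's buffer layout may differ):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PeakDetector {
    // Convert 16-bit little-endian PCM samples to floats in [-1, 1)
    // and return the loudest sample in the buffer.
    static float peak(byte[] pcm) {
        ByteBuffer buf = ByteBuffer.wrap(pcm).order(ByteOrder.LITTLE_ENDIAN);
        float max = Float.NEGATIVE_INFINITY;
        while (buf.remaining() >= 2) {
            float sample = buf.getShort() / 32768f;
            if (sample > max) max = sample;
        }
        return max;
    }

    // A Dancer-style reaction: "hear" anything louder than the threshold.
    static boolean hears(byte[] pcm) { return peak(pcm) > 0.1f; }

    public static void main(String[] args) {
        byte[] quiet = {10, 0, -10, 0};  // tiny samples, well under the threshold
        byte[] loud  = {0, 0x40, 0, 0};  // 0x4000 = 16384 -> 0.5
        System.out.println(hears(quiet)); // false
        System.out.println(hears(loud));  // true
    }
}
```

A real ''SoundProcessor'' would do this per callback, as ''Dancer.process()'' does, and could react in any way you like: change a material, steer an AI entity, or log the level.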
{{http://www.youtube.com/v/oCEfK0yhDrY?.swf?400×333}}
===== More Information =====
This is the old page showing the first version of this idea:

http://aurellem.org/cortex/html/capture-video.html

All source code can be found here:

http://hg.bortreb.com/audio-send

http://hg.bortreb.com/jmeCapture

More information on the modifications to OpenAL to support multiple
listeners can be found here:

http://aurellem.org/audio-send/html/ear.html