Mercurial > jmeCapture
changeset 56:afc437f637bd
improved formatting
| | |
|---|---|
| author | Robert McIntyre <rlm@mit.edu> |
| date | Sat, 03 Dec 2011 19:25:27 -0600 |
| parents | b05f629fc296 |
| children | 19831877eaed |
| files | src/com/aurellem/capture/Capture.java src/com/aurellem/capture/IsoTimer.java src/com/aurellem/capture/examples/Advanced.java src/com/aurellem/capture/examples/Basic.java src/com/aurellem/capture/examples/HelloAudioRecording.java src/com/aurellem/capture/examples/HelloVideoRecording.java |
| diffstat | 6 files changed, 414 insertions(+), 402 deletions(-) |
1.1 --- a/src/com/aurellem/capture/Capture.java Sat Dec 03 19:18:38 2011 -0600 1.2 +++ b/src/com/aurellem/capture/Capture.java Sat Dec 03 19:25:27 2011 -0600 1.3 @@ -18,122 +18,135 @@ 1.4 import com.jme3.system.JmeSystem; 1.5 1.6 /** 1.7 - * Use the methods in this class for capturing consistent, high quality video and audio from a 1.8 - * jMonkeyEngine3 application. To capture audio or video do the following: 1.9 + * Use the methods in this class for capturing consistent, high quality video 1.10 + * and audio from a jMonkeyEngine3 application. To capture audio or video do 1.11 + * the following: 1.12 * 1.13 - * 1.) Set your application's timer to an IsoTimer. Create the IsoTimer with the desired video 1.14 - * frames-per-second. 1.15 - * 2.) Call captureAudio and/or captureVideo on the Application as desired before starting the Application. 1.16 + * 1.) Set your application's timer to an IsoTimer. Create the IsoTimer with the 1.17 + * desired video frames-per-second. 1.18 + * 1.19 + * 2.) Call captureAudio and/or captureVideo on the Application as desired 1.20 + * before starting the Application. 1.21 * 1.22 - * See the Basic and Advanced demos in the examples section for more information. If you have any trouble, 1.23 - * please PM me on the jMonkeyEngine forums. My username is bortreb. 1.24 + * See the Basic and Advanced demos in the examples section for more 1.25 + * information. If you have any trouble, please PM me on the jMonkeyEngine 1.26 + * forums. My username is bortreb. 1.27 * 1.28 * @author Robert McIntyre 1.29 */ 1.30 1.31 public class Capture { 1.32 1.33 - /** 1.34 - * Use this function to capture video from your application. You specify the framerate at which 1.35 - * the video will be recording by setting the Application's timer to an IsoTimer created with the 1.36 - * desired frames-per-second. 1.37 - * 1.38 - * There are three ways to record and they are selected by the properties of the file that you 1.39 - * specify. 1.40 - * 1.41 - * 1.) 
(Preferred) 1.42 - * If you supply an empty directory as the file, then the video will 1.43 - * be saved as a sequence of .png files, one file per frame. The files start at 1.44 - * 0000000.png and increment from there. You can then combine the frames into your 1.45 - * preferred container/codec. If the directory is not empty, then writing video frames 1.46 - * to it will fail, and nothing will be written. 1.47 - * 1.48 - * 2.) If the filename ends in ".avi" then the frames will be encoded as a RAW stream 1.49 - * inside an AVI 1.0 container. The resulting file will be quite large and you will 1.50 - * probably want to re-encode it to your preferred container/codec format. Be advised 1.51 - * that some video payers cannot process AVI with a RAW stream, and that AVI 1.0 files 1.52 - * generated by this method that exceed 2.0GB are invalid according to the AVI 1.0 spec 1.53 - * (but many programs can still deal with them.) Thanks to Werner Randelshofer for his 1.54 - * excellent work which made AVI file writer option possible. 1.55 - * 1.56 - * 3.) Any non-directory file ending in anything other than ".avi" will be processed through 1.57 - * Xuggle. Xuggle provides the option to use many codecs/containers, but you will have to 1.58 - * install it on your system yourself in order to use this option. Please visit 1.59 - * http://www.xuggle.com/ to learn how to do this. 1.60 - * 1.61 - * @param app The Application from which you wish to record Video. 1.62 - * @param file The file to which the video will be captured. 1.63 - * @throws IOException 1.64 - */ 1.65 + /** 1.66 + * Use this function to capture video from your application. You 1.67 + * specify the framerate at which the video will be recording by setting 1.68 + * the Application's timer to an IsoTimer created with the desired 1.69 + * frames-per-second. 1.70 + * 1.71 + * There are three ways to record and they are selected by the 1.72 + * properties of the file that you specify. 1.73 + * 1.74 + * 1.) 
(Preferred) If you supply an empty directory as the file, then 1.75 + * the video will be saved as a sequence of .png files, one file per 1.76 + * frame. The files start at 0000000.png and increment from there. 1.77 + * You can then combine the frames into your preferred 1.78 + * container/codec. If the directory is not empty, then writing 1.79 + * video frames to it will fail, and nothing will be written. 1.80 + * 1.81 + * 2.) If the filename ends in ".avi" then the frames will be encoded as 1.82 + * a RAW stream inside an AVI 1.0 container. The resulting file 1.83 + * will be quite large and you will probably want to re-encode it to 1.84 + * your preferred container/codec format. Be advised that some 1.85 + * video payers cannot process AVI with a RAW stream, and that AVI 1.86 + * 1.0 files generated by this method that exceed 2.0GB are invalid 1.87 + * according to the AVI 1.0 spec (but many programs can still deal 1.88 + * with them.) Thanks to Werner Randelshofer for his excellent work 1.89 + * which made AVI file writer option possible. 1.90 + * 1.91 + * 3.) Any non-directory file ending in anything other than ".avi" will 1.92 + * be processed through Xuggle. Xuggle provides the option to use 1.93 + * many codecs/containers, but you will have to install it on your 1.94 + * system yourself in order to use this option. Please visit 1.95 + * http://www.xuggle.com/ to learn how to do this. 1.96 + * 1.97 + * @param app The Application from which you wish to record Video. 1.98 + * @param file The file to which the video will be captured. 
1.99 + * @throws IOException 1.100 + */ 1.101 1.102 - public static void captureVideo(final Application app, final File file) throws IOException{ 1.103 - final AbstractVideoRecorder videoRecorder; 1.104 + public static void captureVideo(final Application app, final File file) throws IOException{ 1.105 + final AbstractVideoRecorder videoRecorder; 1.106 1.107 - if (file.getCanonicalPath().endsWith(".avi")){ 1.108 - videoRecorder = new AVIVideoRecorder(file);} 1.109 - else if (file.isDirectory()){ 1.110 - videoRecorder = new FileVideoRecorder(file);} 1.111 - else { videoRecorder = new XuggleVideoRecorder(file);} 1.112 + if (file.getCanonicalPath().endsWith(".avi")){ 1.113 + videoRecorder = new AVIVideoRecorder(file);} 1.114 + else if (file.isDirectory()){ 1.115 + videoRecorder = new FileVideoRecorder(file);} 1.116 + else { videoRecorder = new XuggleVideoRecorder(file);} 1.117 1.118 - Callable<Object> thunk = new Callable<Object>(){ 1.119 - public Object call(){ 1.120 + Callable<Object> thunk = new Callable<Object>(){ 1.121 + public Object call(){ 1.122 1.123 - ViewPort viewPort = 1.124 - app.getRenderManager() 1.125 - .createPostView("aurellem video record", app.getCamera()); 1.126 + ViewPort viewPort = 1.127 + app.getRenderManager() 1.128 + .createPostView("aurellem video record", app.getCamera()); 1.129 1.130 - viewPort.setClearFlags(false, false, false); 1.131 + viewPort.setClearFlags(false, false, false); 1.132 1.133 - // get GUI node stuff 1.134 - for (Spatial s : app.getGuiViewPort().getScenes()){ 1.135 - viewPort.attachScene(s); 1.136 - } 1.137 + // get GUI node stuff 1.138 + for (Spatial s : app.getGuiViewPort().getScenes()){ 1.139 + viewPort.attachScene(s); 1.140 + } 1.141 1.142 - app.getStateManager().attach(videoRecorder); 1.143 - viewPort.addProcessor(videoRecorder); 1.144 - return null; 1.145 - } 1.146 - }; 1.147 - app.enqueue(thunk); 1.148 - } 1.149 + app.getStateManager().attach(videoRecorder); 1.150 + viewPort.addProcessor(videoRecorder); 1.151 + 
return null; 1.152 + } 1.153 + }; 1.154 + app.enqueue(thunk); 1.155 + } 1.156 1.157 1.158 - /** 1.159 - * Use this function to capture audio from your application. Audio data will be saved in linear 1.160 - * PCM format at 44,100 hertz in the wav container format to the file that you specify. 1.161 - * 1.162 - * Note that you *have* to use an IsoTimer for your Application's timer while recording audio or 1.163 - * the recording will fail. Also ensure that your IsoTimer obeys the following constraints: 1.164 - * 1.165 - * - The frames-per-second value of the IsoTimer cannot be lower than 10 frames-per-second. 1.166 - * - The frames-per-second value of the IsoTimer must evenly divide 44,100. 1.167 - * 1.168 - * @param app The Application from which you wish to record Audio. 1.169 - * @param file the target file to which you want to record audio data. 1.170 - * @throws IOException 1.171 - */ 1.172 + /** 1.173 + * Use this function to capture audio from your application. Audio data 1.174 + * will be saved in linear PCM format at 44,100 hertz in the wav container 1.175 + * format to the file that you specify. 1.176 + * 1.177 + * Note that you *have* to use an IsoTimer for your Application's timer 1.178 + * while recording audio or the recording will fail. Also ensure that your 1.179 + * IsoTimer obeys the following constraints: 1.180 + * 1.181 + * 1.) The frames-per-second value of the IsoTimer cannot be lower than 10 1.182 + * frames-per-second. 1.183 + * 1.184 + * 2.) The frames-per-second value of the IsoTimer must evenly divide 1.185 + * 44,100. 1.186 + * 1.187 + * @param app The Application from which you wish to record Audio. 1.188 + * @param file the target file to which you want to record audio data. 
1.189 + * @throws IOException 1.190 + */ 1.191 1.192 - public static void captureAudio(final Application app, final File file) throws IOException{ 1.193 - AppSettings settings = null; 1.194 - if (app.getContext() != null){settings = app.getContext().getSettings();} 1.195 - if (settings == null){settings = new AppSettings(true);} 1.196 - settings.setAudioRenderer("Send"); 1.197 - app.setSettings(settings); 1.198 + public static void captureAudio(final Application app, final File file) throws IOException{ 1.199 + AppSettings settings = null; 1.200 + if (app.getContext() != null){settings = app.getContext().getSettings();} 1.201 + if (settings == null){settings = new AppSettings(true);} 1.202 + settings.setAudioRenderer("Send"); 1.203 + app.setSettings(settings); 1.204 1.205 - JmeSystem.setSystemDelegate(new AurellemSystemDelegate()); 1.206 + JmeSystem.setSystemDelegate(new AurellemSystemDelegate()); 1.207 1.208 - final WaveFileWriter writer = new WaveFileWriter(file); 1.209 + final WaveFileWriter writer = new WaveFileWriter(file); 1.210 1.211 - Callable<Object> thunk = new Callable<Object>(){ 1.212 - public Object call(){ 1.213 - AudioRenderer ar = app.getAudioRenderer(); 1.214 - if (ar instanceof MultiListener){ 1.215 - MultiListener ml = (MultiListener)ar; 1.216 - ml.registerSoundProcessor(writer); 1.217 - } 1.218 - return null; 1.219 - } 1.220 - }; 1.221 - app.enqueue(thunk); 1.222 - } 1.223 + Callable<Object> thunk = new Callable<Object>(){ 1.224 + public Object call(){ 1.225 + AudioRenderer ar = app.getAudioRenderer(); 1.226 + if (ar instanceof MultiListener){ 1.227 + MultiListener ml = (MultiListener)ar; 1.228 + ml.registerSoundProcessor(writer); 1.229 + } 1.230 + return null; 1.231 + } 1.232 + }; 1.233 + app.enqueue(thunk); 1.234 + } 1.235 }
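The Capture.java Javadoc above fixes two numeric constraints for audio capture: the IsoTimer's frames-per-second must be at least 10 and must evenly divide the 44,100 Hz sample rate. A minimal standalone sketch of that check (the class and method names here are hypothetical illustrations, not part of the jmeCapture API):

```java
// Hypothetical validator for the captureAudio constraints documented above:
// fps >= 10, and fps must evenly divide the 44,100 Hz PCM sample rate.
public class AudioFpsCheck {
    static final int SAMPLE_RATE = 44_100;

    static boolean isValidFps(int fps) {
        return fps >= 10 && SAMPLE_RATE % fps == 0;
    }

    public static void main(String[] args) {
        System.out.println(isValidFps(60)); // true: 44100 / 60 = 735 samples per frame
        System.out.println(isValidFps(25)); // true: 44100 / 25 = 1764
        System.out.println(isValidFps(24)); // false: 44100 % 24 leaves a remainder
        System.out.println(isValidFps(7));  // false: below the 10 fps floor
    }
}
```

The divisibility requirement ensures each video frame maps to a whole number of audio samples, so the two streams stay in lockstep.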
2.1 --- a/src/com/aurellem/capture/IsoTimer.java Sat Dec 03 19:18:38 2011 -0600 2.2 +++ b/src/com/aurellem/capture/IsoTimer.java Sat Dec 03 19:25:27 2011 -0600 2.3 @@ -12,9 +12,9 @@ 2.4 * throttles its physics engine and graphics display. If the 2.5 * computations involved in running the game are too intense, then the 2.6 * game will first skip frames, then sacrifice physics accuracy. If 2.7 - * there are particularly demanding computations, then you may only get 2.8 - * 1 fps, and the ball may tunnel through the floor or obstacles due 2.9 - * to inaccurate physics simulation, but after the end of one 2.10 + * there are particularly demanding computations, then you may only 2.11 + * get 1 fps, and the ball may tunnel through the floor or obstacles 2.12 + * due to inaccurate physics simulation, but after the end of one 2.13 * user-hour, that ball will have traveled one game-mile. 2.14 * 2.15 * When we're recording video or audio, we don't care if the game-time 2.16 @@ -36,32 +36,32 @@ 2.17 2.18 public class IsoTimer extends Timer { 2.19 2.20 - private float framerate; 2.21 - private int ticks; 2.22 + private float framerate; 2.23 + private int ticks; 2.24 2.25 - public IsoTimer(float framerate){ 2.26 - this.framerate = framerate; 2.27 - this.ticks = 0; 2.28 - } 2.29 + public IsoTimer(float framerate){ 2.30 + this.framerate = framerate; 2.31 + this.ticks = 0; 2.32 + } 2.33 2.34 - public long getTime() { 2.35 - return (long) (this.ticks / this.framerate); 2.36 - } 2.37 + public long getTime() { 2.38 + return (long) (this.ticks / this.framerate); 2.39 + } 2.40 2.41 - public long getResolution() { 2.42 - return 1000000000L; 2.43 - } 2.44 + public long getResolution() { 2.45 + return 1000000000L; 2.46 + } 2.47 2.48 - public float getFrameRate() { 2.49 - return this.framerate; 2.50 - } 2.51 + public float getFrameRate() { 2.52 + return this.framerate; 2.53 + } 2.54 2.55 - public float getTimePerFrame() { 2.56 - return (float) (1.0f / this.framerate); 2.57 - } 2.58 + 
public float getTimePerFrame() { 2.59 + return (float) (1.0f / this.framerate); 2.60 + } 2.61 2.62 - public void update() {this.ticks++;} 2.63 + public void update() {this.ticks++;} 2.64 2.65 - public void reset() {this.ticks = 0;} 2.66 + public void reset() {this.ticks = 0;} 2.67 2.68 }
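The IsoTimer diff above is whitespace-only, but the class's behavior is worth spelling out: game-time advances by exactly one tick per update() call, fully decoupled from wall-clock time. A standalone re-derivation of that accounting (this sketch does not extend jME3's Timer and is only an illustration of the arithmetic shown in the diff):

```java
// Standalone sketch of IsoTimer's fixed-increment accounting. Unlike a
// wall-clock timer, getTime() depends only on how many update() calls have
// happened, which is what makes frame-exact recording possible.
public class IsoTickSketch {
    private final float framerate;
    private int ticks = 0;

    IsoTickSketch(float framerate) { this.framerate = framerate; }

    void update()           { ticks++; }                       // one call per frame
    void reset()            { ticks = 0; }
    float getTimePerFrame() { return 1.0f / framerate; }
    long getTime()          { return (long) (ticks / framerate); } // whole game-seconds

    public static void main(String[] args) {
        IsoTickSketch t = new IsoTickSketch(60);
        for (int i = 0; i < 120; i++) t.update(); // simulate 120 rendered frames
        System.out.println(t.getTime());          // 2 -- two game-seconds at 60 fps
    }
}
```

However slowly the frames actually render, 120 ticks at 60 fps always report two elapsed seconds, which is the "one user-hour equals one game-mile" guarantee the comment describes.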
3.1 --- a/src/com/aurellem/capture/examples/Advanced.java Sat Dec 03 19:18:38 2011 -0600 3.2 +++ b/src/com/aurellem/capture/examples/Advanced.java Sat Dec 03 19:25:27 2011 -0600 3.3 @@ -37,266 +37,266 @@ 3.4 3.5 /** 3.6 * 3.7 - * Demonstrates advanced use of the audio capture and recording features. 3.8 - * Multiple perspectives of the same scene are simultaneously rendered to 3.9 - * different sound files. 3.10 + * Demonstrates advanced use of the audio capture and recording 3.11 + * features. Multiple perspectives of the same scene are 3.12 + * simultaneously rendered to different sound files. 3.13 * 3.14 - * A key limitation of the way multiple listeners are implemented is that 3.15 - * only 3D positioning effects are realized for listeners other than the 3.16 - * main LWJGL listener. This means that audio effects such as environment 3.17 - * settings will *not* be heard on any auxiliary listeners, though sound 3.18 - * attenuation will work correctly. 3.19 + * A key limitation of the way multiple listeners are implemented is 3.20 + * that only 3D positioning effects are realized for listeners other 3.21 + * than the main LWJGL listener. This means that audio effects such 3.22 + * as environment settings will *not* be heard on any auxiliary 3.23 + * listeners, though sound attenuation will work correctly. 3.24 * 3.25 - * Multiple listeners as realized here might be used to make AI entities 3.26 - * that can each hear the world from their own perspective. 3.27 + * Multiple listeners as realized here might be used to make AI 3.28 + * entities that can each hear the world from their own perspective. 3.29 * 3.30 * @author Robert McIntyre 3.31 */ 3.32 3.33 public class Advanced extends SimpleApplication { 3.34 3.35 - /** 3.36 - * You will see three grey cubes, a blue sphere, and a path 3.37 - * which circles each cube. The blue sphere is generating a 3.38 - * constant monotone sound as it moves along the track. 
Each 3.39 - * cube is listening for sound; when a cube hears sound whose 3.40 - * intensity is greater than a certain threshold, it changes 3.41 - * its color from grey to green. 3.42 - * 3.43 - * Each cube is also saving whatever it hears to a file. The 3.44 - * scene from the perspective of the viewer is also saved to 3.45 - * a video file. When you listen to each of the sound files 3.46 - * alongside the video, the sound will get louder when the 3.47 - * sphere approaches the cube that generated that sound file. 3.48 - * This shows that each listener is hearing the world from 3.49 - * its own perspective. 3.50 - * 3.51 - */ 3.52 - public static void main(String[] args) { 3.53 - Advanced app = new Advanced(); 3.54 - AppSettings settings = new AppSettings(true); 3.55 - settings.setAudioRenderer(AurellemSystemDelegate.SEND); 3.56 - JmeSystem.setSystemDelegate(new AurellemSystemDelegate()); 3.57 - app.setSettings(settings); 3.58 - app.setShowSettings(false); 3.59 - app.setPauseOnLostFocus(false); 3.60 + /** 3.61 + * You will see three grey cubes, a blue sphere, and a path which 3.62 + * circles each cube. The blue sphere is generating a constant 3.63 + * monotone sound as it moves along the track. Each cube is 3.64 + * listening for sound; when a cube hears sound whose intensity is 3.65 + * greater than a certain threshold, it changes its color from 3.66 + * grey to green. 3.67 + * 3.68 + * Each cube is also saving whatever it hears to a file. The 3.69 + * scene from the perspective of the viewer is also saved to a 3.70 + * video file. When you listen to each of the sound files 3.71 + * alongside the video, the sound will get louder when the sphere 3.72 + * approaches the cube that generated that sound file. This 3.73 + * shows that each listener is hearing the world from its own 3.74 + * perspective. 
3.75 + * 3.76 + */ 3.77 + public static void main(String[] args) { 3.78 + Advanced app = new Advanced(); 3.79 + AppSettings settings = new AppSettings(true); 3.80 + settings.setAudioRenderer(AurellemSystemDelegate.SEND); 3.81 + JmeSystem.setSystemDelegate(new AurellemSystemDelegate()); 3.82 + app.setSettings(settings); 3.83 + app.setShowSettings(false); 3.84 + app.setPauseOnLostFocus(false); 3.85 3.86 - try { 3.87 - Capture.captureVideo(app, File.createTempFile("advanced",".avi")); 3.88 - Capture.captureAudio(app, File.createTempFile("advanced", ".wav")); 3.89 - } 3.90 - catch (IOException e) {e.printStackTrace();} 3.91 + try { 3.92 + Capture.captureVideo(app, File.createTempFile("advanced",".avi")); 3.93 + Capture.captureAudio(app, File.createTempFile("advanced", ".wav")); 3.94 + } 3.95 + catch (IOException e) {e.printStackTrace();} 3.96 3.97 - app.start(); 3.98 + app.start(); 3.99 + } 3.100 + 3.101 + 3.102 + private Geometry bell; 3.103 + private Geometry ear1; 3.104 + private Geometry ear2; 3.105 + private Geometry ear3; 3.106 + private AudioNode music; 3.107 + private MotionTrack motionControl; 3.108 + 3.109 + private Geometry makeEar(Node root, Vector3f position){ 3.110 + Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md"); 3.111 + Geometry ear = new Geometry("ear", new Box(1.0f, 1.0f, 1.0f)); 3.112 + ear.setLocalTranslation(position); 3.113 + mat.setColor("Color", ColorRGBA.Green); 3.114 + ear.setMaterial(mat); 3.115 + root.attachChild(ear); 3.116 + return ear; 3.117 + } 3.118 + 3.119 + private Vector3f[] path = new Vector3f[]{ 3.120 + // loop 1 3.121 + new Vector3f(0, 0, 0), 3.122 + new Vector3f(0, 0, -10), 3.123 + new Vector3f(-2, 0, -14), 3.124 + new Vector3f(-6, 0, -20), 3.125 + new Vector3f(0, 0, -26), 3.126 + new Vector3f(6, 0, -20), 3.127 + new Vector3f(0, 0, -14), 3.128 + new Vector3f(-6, 0, -20), 3.129 + new Vector3f(0, 0, -26), 3.130 + new Vector3f(6, 0, -20), 3.131 + // loop 2 3.132 + new Vector3f(5, 0, -5), 3.133 + new 
Vector3f(7, 0, 1.5f), 3.134 + new Vector3f(14, 0, 2), 3.135 + new Vector3f(20, 0, 6), 3.136 + new Vector3f(26, 0, 0), 3.137 + new Vector3f(20, 0, -6), 3.138 + new Vector3f(14, 0, 0), 3.139 + new Vector3f(20, 0, 6), 3.140 + new Vector3f(26, 0, 0), 3.141 + new Vector3f(20, 0, -6), 3.142 + new Vector3f(14, 0, 0), 3.143 + // loop 3 3.144 + new Vector3f(8, 0, 7.5f), 3.145 + new Vector3f(7, 0, 10.5f), 3.146 + new Vector3f(6, 0, 20), 3.147 + new Vector3f(0, 0, 26), 3.148 + new Vector3f(-6, 0, 20), 3.149 + new Vector3f(0, 0, 14), 3.150 + new Vector3f(6, 0, 20), 3.151 + new Vector3f(0, 0, 26), 3.152 + new Vector3f(-6, 0, 20), 3.153 + new Vector3f(0, 0, 14), 3.154 + // begin ellipse 3.155 + new Vector3f(16, 5, 20), 3.156 + new Vector3f(0, 0, 26), 3.157 + new Vector3f(-16, -10, 20), 3.158 + new Vector3f(0, 0, 14), 3.159 + new Vector3f(16, 20, 20), 3.160 + new Vector3f(0, 0, 26), 3.161 + new Vector3f(-10, -25, 10), 3.162 + new Vector3f(-10, 0, 0), 3.163 + // come at me! 3.164 + new Vector3f(-28.00242f, 48.005623f, -34.648228f), 3.165 + new Vector3f(0, 0 , -20), 3.166 + }; 3.167 + 3.168 + private void createScene() { 3.169 + Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md"); 3.170 + bell = new Geometry( "sound-emitter" , new Sphere(15,15,1)); 3.171 + mat.setColor("Color", ColorRGBA.Blue); 3.172 + bell.setMaterial(mat); 3.173 + rootNode.attachChild(bell); 3.174 + 3.175 + ear1 = makeEar(rootNode, new Vector3f(0, 0 ,-20)); 3.176 + ear2 = makeEar(rootNode, new Vector3f(0, 0 ,20)); 3.177 + ear3 = makeEar(rootNode, new Vector3f(20, 0 ,0)); 3.178 + 3.179 + MotionPath track = new MotionPath(); 3.180 + 3.181 + for (Vector3f v : path){ 3.182 + track.addWayPoint(v); 3.183 + } 3.184 + track.setCurveTension(0.80f); 3.185 + 3.186 + motionControl = new MotionTrack(bell,track); 3.187 + 3.188 + // for now, use reflection to change the timer... 
3.189 + // motionControl.setTimer(new IsoTimer(60)); 3.190 + try { 3.191 + Field timerField; 3.192 + timerField = AbstractCinematicEvent.class.getDeclaredField("timer"); 3.193 + timerField.setAccessible(true); 3.194 + try {timerField.set(motionControl, new IsoTimer(60));} 3.195 + catch (IllegalArgumentException e) {e.printStackTrace();} 3.196 + catch (IllegalAccessException e) {e.printStackTrace();} 3.197 + } 3.198 + catch (SecurityException e) {e.printStackTrace();} 3.199 + catch (NoSuchFieldException e) {e.printStackTrace();} 3.200 + 3.201 + motionControl.setDirectionType(MotionTrack.Direction.PathAndRotation); 3.202 + motionControl.setRotation(new Quaternion().fromAngleNormalAxis(-FastMath.HALF_PI, Vector3f.UNIT_Y)); 3.203 + motionControl.setInitialDuration(20f); 3.204 + motionControl.setSpeed(1f); 3.205 + 3.206 + track.enableDebugShape(assetManager, rootNode); 3.207 + positionCamera(); 3.208 + } 3.209 + 3.210 + 3.211 + private void positionCamera(){ 3.212 + this.cam.setLocation(new Vector3f(-28.00242f, 48.005623f, -34.648228f)); 3.213 + this.cam.setRotation(new Quaternion(0.3359635f, 0.34280345f, -0.13281013f, 0.8671653f)); 3.214 + } 3.215 + 3.216 + private void initAudio() { 3.217 + org.lwjgl.input.Mouse.setGrabbed(false); 3.218 + music = new AudioNode(assetManager, "Sound/Effects/Beep.ogg", false); 3.219 + 3.220 + rootNode.attachChild(music); 3.221 + audioRenderer.playSource(music); 3.222 + music.setPositional(true); 3.223 + music.setVolume(1f); 3.224 + music.setReverbEnabled(false); 3.225 + music.setDirectional(false); 3.226 + music.setMaxDistance(200.0f); 3.227 + music.setRefDistance(1f); 3.228 + //music.setRolloffFactor(1f); 3.229 + music.setLooping(false); 3.230 + audioRenderer.pauseSource(music); 3.231 + } 3.232 + 3.233 + public class Dancer implements SoundProcessor { 3.234 + Geometry entity; 3.235 + float scale = 2; 3.236 + public Dancer(Geometry entity){ 3.237 + this.entity = entity; 3.238 } 3.239 3.240 - 3.241 - private Geometry bell; 3.242 - private 
Geometry ear1; 3.243 - private Geometry ear2; 3.244 - private Geometry ear3; 3.245 - private AudioNode music; 3.246 - private MotionTrack motionControl; 3.247 + /** 3.248 + * this method is irrelevant since there is no state to cleanup. 3.249 + */ 3.250 + public void cleanup() {} 3.251 + 3.252 + 3.253 + /** 3.254 + * Respond to sound! This is the brain of an AI entity that 3.255 + * hears it's surroundings and reacts to them. 3.256 + */ 3.257 + public void process(ByteBuffer audioSamples, int numSamples, AudioFormat format) { 3.258 + audioSamples.clear(); 3.259 + byte[] data = new byte[numSamples]; 3.260 + float[] out = new float[numSamples]; 3.261 + audioSamples.get(data); 3.262 + FloatSampleTools.byte2floatInterleaved(data, 0, out, 0, 3.263 + numSamples/format.getFrameSize(), format); 3.264 + 3.265 + float max = Float.NEGATIVE_INFINITY; 3.266 + for (float f : out){if (f > max) max = f;} 3.267 + audioSamples.clear(); 3.268 + 3.269 + if (max > 0.1){entity.getMaterial().setColor("Color", ColorRGBA.Green);} 3.270 + else {entity.getMaterial().setColor("Color", ColorRGBA.Gray);} 3.271 + } 3.272 + } 3.273 + 3.274 + private void prepareEar(Geometry ear, int n){ 3.275 + if (this.audioRenderer instanceof MultiListener){ 3.276 + MultiListener rf = (MultiListener)this.audioRenderer; 3.277 + 3.278 + Listener auxListener = new Listener(); 3.279 + auxListener.setLocation(ear.getLocalTranslation()); 3.280 + 3.281 + rf.addListener(auxListener); 3.282 + WaveFileWriter aux = null; 3.283 + 3.284 + try {aux = new WaveFileWriter(new File("/home/r/tmp/ear"+n+".wav"));} 3.285 + catch (FileNotFoundException e) {e.printStackTrace();} 3.286 + 3.287 + rf.registerSoundProcessor(auxListener, 3.288 + new CompositeSoundProcessor(new Dancer(ear), aux)); 3.289 + } 3.290 + } 3.291 + 3.292 + 3.293 + public void simpleInitApp() { 3.294 + this.setTimer(new IsoTimer(60)); 3.295 + initAudio(); 3.296 3.297 - private Geometry makeEar(Node root, Vector3f position){ 3.298 - Material mat = new 
Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md"); 3.299 - Geometry ear = new Geometry("ear", new Box(1.0f, 1.0f, 1.0f)); 3.300 - ear.setLocalTranslation(position); 3.301 - mat.setColor("Color", ColorRGBA.Green); 3.302 - ear.setMaterial(mat); 3.303 - root.attachChild(ear); 3.304 - return ear; 3.305 - } 3.306 + createScene(); 3.307 3.308 - private Vector3f[] path = new Vector3f[]{ 3.309 - // loop 1 3.310 - new Vector3f(0, 0, 0), 3.311 - new Vector3f(0, 0, -10), 3.312 - new Vector3f(-2, 0, -14), 3.313 - new Vector3f(-6, 0, -20), 3.314 - new Vector3f(0, 0, -26), 3.315 - new Vector3f(6, 0, -20), 3.316 - new Vector3f(0, 0, -14), 3.317 - new Vector3f(-6, 0, -20), 3.318 - new Vector3f(0, 0, -26), 3.319 - new Vector3f(6, 0, -20), 3.320 - // loop 2 3.321 - new Vector3f(5, 0, -5), 3.322 - new Vector3f(7, 0, 1.5f), 3.323 - new Vector3f(14, 0, 2), 3.324 - new Vector3f(20, 0, 6), 3.325 - new Vector3f(26, 0, 0), 3.326 - new Vector3f(20, 0, -6), 3.327 - new Vector3f(14, 0, 0), 3.328 - new Vector3f(20, 0, 6), 3.329 - new Vector3f(26, 0, 0), 3.330 - new Vector3f(20, 0, -6), 3.331 - new Vector3f(14, 0, 0), 3.332 - // loop 3 3.333 - new Vector3f(8, 0, 7.5f), 3.334 - new Vector3f(7, 0, 10.5f), 3.335 - new Vector3f(6, 0, 20), 3.336 - new Vector3f(0, 0, 26), 3.337 - new Vector3f(-6, 0, 20), 3.338 - new Vector3f(0, 0, 14), 3.339 - new Vector3f(6, 0, 20), 3.340 - new Vector3f(0, 0, 26), 3.341 - new Vector3f(-6, 0, 20), 3.342 - new Vector3f(0, 0, 14), 3.343 - // begin ellipse 3.344 - new Vector3f(16, 5, 20), 3.345 - new Vector3f(0, 0, 26), 3.346 - new Vector3f(-16, -10, 20), 3.347 - new Vector3f(0, 0, 14), 3.348 - new Vector3f(16, 20, 20), 3.349 - new Vector3f(0, 0, 26), 3.350 - new Vector3f(-10, -25, 10), 3.351 - new Vector3f(-10, 0, 0), 3.352 - // come at me! 
3.353 - new Vector3f(-28.00242f, 48.005623f, -34.648228f), 3.354 - new Vector3f(0, 0 , -20), 3.355 - }; 3.356 + prepareEar(ear1, 1); 3.357 + prepareEar(ear2, 1); 3.358 + prepareEar(ear3, 1); 3.359 3.360 - private void createScene() { 3.361 - Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md"); 3.362 - bell = new Geometry( "sound-emitter" , new Sphere(15,15,1)); 3.363 - mat.setColor("Color", ColorRGBA.Blue); 3.364 - bell.setMaterial(mat); 3.365 - rootNode.attachChild(bell); 3.366 + motionControl.play(); 3.367 + } 3.368 3.369 - ear1 = makeEar(rootNode, new Vector3f(0, 0 ,-20)); 3.370 - ear2 = makeEar(rootNode, new Vector3f(0, 0 ,20)); 3.371 - ear3 = makeEar(rootNode, new Vector3f(20, 0 ,0)); 3.372 - 3.373 - MotionPath track = new MotionPath(); 3.374 - 3.375 - for (Vector3f v : path){ 3.376 - track.addWayPoint(v); 3.377 - } 3.378 - track.setCurveTension(0.80f); 3.379 - 3.380 - motionControl = new MotionTrack(bell,track); 3.381 - 3.382 - // for now, use reflection to change the timer... 
3.383 -        // motionControl.setTimer(new IsoTimer(60));
3.384 -        try {
3.385 -            Field timerField;
3.386 -            timerField = AbstractCinematicEvent.class.getDeclaredField("timer");
3.387 -            timerField.setAccessible(true);
3.388 -            try {timerField.set(motionControl, new IsoTimer(60));}
3.389 -            catch (IllegalArgumentException e) {e.printStackTrace();}
3.390 -            catch (IllegalAccessException e) {e.printStackTrace();}
3.391 -        }
3.392 -        catch (SecurityException e) {e.printStackTrace();}
3.393 -        catch (NoSuchFieldException e) {e.printStackTrace();}
3.394 -
3.395 -        motionControl.setDirectionType(MotionTrack.Direction.PathAndRotation);
3.396 -        motionControl.setRotation(new Quaternion().fromAngleNormalAxis(-FastMath.HALF_PI, Vector3f.UNIT_Y));
3.397 -        motionControl.setInitialDuration(20f);
3.398 -        motionControl.setSpeed(1f);
3.399 -
3.400 -        track.enableDebugShape(assetManager, rootNode);
3.401 -        positionCamera();
3.402 +    public void simpleUpdate(float tpf) {
3.403 +        if (music.getStatus() != AudioNode.Status.Playing){
3.404 +            music.play();
3.405         }
3.406 -
3.407 -
3.408 -    private void positionCamera(){
3.409 -        this.cam.setLocation(new Vector3f(-28.00242f, 48.005623f, -34.648228f));
3.410 -        this.cam.setRotation(new Quaternion(0.3359635f, 0.34280345f, -0.13281013f, 0.8671653f));
3.411 -    }
3.412 -
3.413 -    private void initAudio() {
3.414 -        org.lwjgl.input.Mouse.setGrabbed(false);
3.415 -        music = new AudioNode(assetManager, "Sound/Effects/Beep.ogg", false);
3.416 -
3.417 -        rootNode.attachChild(music);
3.418 -        audioRenderer.playSource(music);
3.419 -        music.setPositional(true);
3.420 -        music.setVolume(1f);
3.421 -        music.setReverbEnabled(false);
3.422 -        music.setDirectional(false);
3.423 -        music.setMaxDistance(200.0f);
3.424 -        music.setRefDistance(1f);
3.425 -        //music.setRolloffFactor(1f);
3.426 -        music.setLooping(false);
3.427 -        audioRenderer.pauseSource(music);
3.428 -    }
3.429 -
3.430 -    public class Dancer implements SoundProcessor {
3.431 -        Geometry entity;
3.432 -        float scale = 2;
3.433 -
3.434 -        public Dancer(Geometry entity){
3.435 -            this.entity = entity;
3.436 -        }
3.437 -
3.438 -        /**
3.439 -         * this method is irrelevant since there is no state to cleanup.
3.440 -         */
3.441 -        public void cleanup() {}
3.442 -
3.443 -
3.444 -        /**
3.445 -         * Respond to sound!  This is the brain of an AI entity that
3.446 -         * hears it's surroundings and reacts to them.
3.447 -         */
3.448 -        public void process(ByteBuffer audioSamples, int numSamples, AudioFormat format) {
3.449 -            audioSamples.clear();
3.450 -            byte[] data = new byte[numSamples];
3.451 -            float[] out = new float[numSamples];
3.452 -            audioSamples.get(data);
3.453 -            FloatSampleTools.byte2floatInterleaved(data, 0, out, 0,
3.454 -                numSamples/format.getFrameSize(), format);
3.455 -
3.456 -            float max = Float.NEGATIVE_INFINITY;
3.457 -            for (float f : out){if (f > max) max = f;}
3.458 -            audioSamples.clear();
3.459 -
3.460 -            if (max > 0.1){entity.getMaterial().setColor("Color", ColorRGBA.Green);}
3.461 -            else {entity.getMaterial().setColor("Color", ColorRGBA.Gray);}
3.462 -        }
3.463 -    }
3.464 -
3.465 -    private void prepareEar(Geometry ear, int n){
3.466 -        if (this.audioRenderer instanceof MultiListener){
3.467 -            MultiListener rf = (MultiListener)this.audioRenderer;
3.468 -
3.469 -            Listener auxListener = new Listener();
3.470 -            auxListener.setLocation(ear.getLocalTranslation());
3.471 -
3.472 -            rf.addListener(auxListener);
3.473 -            WaveFileWriter aux = null;
3.474 -
3.475 -            try {aux = new WaveFileWriter(new File("/home/r/tmp/ear"+n+".wav"));}
3.476 -            catch (FileNotFoundException e) {e.printStackTrace();}
3.477 -
3.478 -            rf.registerSoundProcessor(auxListener,
3.479 -                new CompositeSoundProcessor(new Dancer(ear), aux));
3.480 -        }
3.481 -    }
3.482 -
3.483 -
3.484 -    public void simpleInitApp() {
3.485 -        this.setTimer(new IsoTimer(60));
3.486 -        initAudio();
3.487 -
3.488 -        createScene();
3.489 -
3.490 -        prepareEar(ear1, 1);
3.491 -        prepareEar(ear2, 1);
3.492 -        prepareEar(ear3, 1);
3.493 -
3.494 -        motionControl.play();
3.495 -    }
3.496 -
3.497 -    public void simpleUpdate(float tpf) {
3.498 -        if (music.getStatus() != AudioNode.Status.Playing){
3.499 -            music.play();
3.500 -        }
3.501 -        Vector3f loc = cam.getLocation();
3.502 -        Quaternion rot = cam.getRotation();
3.503 -        listener.setLocation(loc);
3.504 -        listener.setRotation(rot);
3.505 -        music.setLocalTranslation(bell.getLocalTranslation());
3.506 -    }
3.507 +        Vector3f loc = cam.getLocation();
3.508 +        Quaternion rot = cam.getRotation();
3.509 +        listener.setLocation(loc);
3.510 +        listener.setRotation(rot);
3.511 +        music.setLocalTranslation(bell.getLocalTranslation());
3.512 +    }
3.513  }
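The Dancer.process method in the Advanced.java hunk above reacts to loudness by decoding raw samples to floats, scanning for the peak value, and comparing it against a 0.1 threshold. Here is a self-contained sketch of that peak-detection step, assuming 16-bit little-endian PCM and doing the byte-to-float conversion by hand (the real code delegates decoding to FloatSampleTools from the tritonus library; the PeakDetector class and its method names are illustrative, not part of jmeCapture):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch of the peak-detection idea behind Dancer.process: decode PCM
// bytes to normalized floats, find the loudest sample, and compare it
// against a threshold. Assumes 16-bit little-endian PCM (hypothetical
// simplification; the library handles arbitrary AudioFormats).
public class PeakDetector {
    static final float THRESHOLD = 0.1f;  // same cutoff Dancer uses

    // Returns the largest normalized sample value in the buffer.
    public static float peak(ByteBuffer audioSamples, int numBytes) {
        audioSamples.order(ByteOrder.LITTLE_ENDIAN);
        float max = Float.NEGATIVE_INFINITY;
        for (int i = 0; i + 1 < numBytes; i += 2) {
            // scale a signed 16-bit sample into roughly [-1, 1)
            float f = audioSamples.getShort(i) / 32768f;
            if (f > max) max = f;
        }
        return max;
    }

    // True when the sound is loud enough to react to
    // (the point where Dancer turns its geometry green).
    public static boolean isLoud(ByteBuffer audioSamples, int numBytes) {
        return peak(audioSamples, numBytes) > THRESHOLD;
    }
}
```

The same scan-for-maximum loop appears in Dancer.process; only the decoding step differs.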
4.1 --- a/src/com/aurellem/capture/examples/Basic.java	Sat Dec 03 19:18:38 2011 -0600
4.2 +++ b/src/com/aurellem/capture/examples/Basic.java	Sat Dec 03 19:25:27 2011 -0600
4.3 @@ -11,30 +11,29 @@
4.4 
4.5 
4.6  /**
4.7 - *
4.8 - * Demonstrates how to use basic Audio/Video capture with a jMonkeyEngine
4.9 - * application. You can use these techniques to make high quality cutscenes
4.10 - * or demo videos, even on very slow laptops.
4.11 + * Demonstrates how to use basic Audio/Video capture with a
4.12 + * jMonkeyEngine application. You can use these techniques to make
4.13 + * high quality cutscenes or demo videos, even on very slow laptops.
4.14  *
4.15  * @author Robert McIntyre
4.16  */
4.17 
4.18  public class Basic {
4.19 
4.20 -        public static void main(String[] ignore) throws IOException{
4.21 -                File video = File.createTempFile("JME-water-video", ".avi");
4.22 -                File audio = File.createTempFile("JME-water-audio", ".wav");
4.23 +    public static void main(String[] ignore) throws IOException{
4.24 +        File video = File.createTempFile("JME-water-video", ".avi");
4.25 +        File audio = File.createTempFile("JME-water-audio", ".wav");
4.26 
4.27 -        SimpleApplication app = new TestPostWater();
4.28 -        app.setTimer(new IsoTimer(60));
4.29 -        app.setShowSettings(false);
4.30 +        SimpleApplication app = new TestPostWater();
4.31 +        app.setTimer(new IsoTimer(60));
4.32 +        app.setShowSettings(false);
4.33 
4.34 -        Capture.captureVideo(app, video);
4.35 -        Capture.captureAudio(app, audio);
4.36 +        Capture.captureVideo(app, video);
4.37 +        Capture.captureAudio(app, audio);
4.38 
4.39 -        app.start();
4.40 +        app.start();
4.41 
4.42 -        System.out.println(video.getCanonicalPath());
4.43 -        System.out.println(audio.getCanonicalPath());
4.44 -        }
4.45 +        System.out.println(video.getCanonicalPath());
4.46 +        System.out.println(audio.getCanonicalPath());
4.47 +    }
4.48  }
5.1 --- a/src/com/aurellem/capture/examples/HelloAudioRecording.java	Sat Dec 03 19:18:38 2011 -0600
5.2 +++ b/src/com/aurellem/capture/examples/HelloAudioRecording.java	Sat Dec 03 19:25:27 2011 -0600
5.3 @@ -9,12 +9,12 @@
5.4  import com.aurellem.capture.IsoTimer;
5.5  import com.jme3.app.Application;
5.6 
5.7 -/** Recording audio from your Application is simple. If all you
5.8 - * want to do is record Audio, then follow the following steps.
5.9 +/** Recording audio from your Application is simple. If all you want
5.10 + * to do is record Audio, then follow the following steps.
5.11  *
5.12  * 1.) Set the Application's timer to an IsoTimer. The framerate is
5.13 - * irrelevant for sound, but must evenly divide 44,100Hz, which is the
5.14 - * frequency at which sound will be recorded. For example
5.15 + * irrelevant for sound, but must evenly divide 44,100Hz, which is
5.16 + * the frequency at which sound will be recorded. For example
5.17  * IsoTimer(60) is ok, but IsoTimer(61) is not.
5.18  *
5.19  * 2.) Call Capture.captureAudio(yourApplication, target-file) before
5.20 @@ -27,12 +27,12 @@
5.21  */
5.22  public class HelloAudioRecording {
5.23 
5.24 -        public static void main(String[] ignore) throws IOException{
5.25 -                Application app = new HelloAudio();
5.26 -                File audio = File.createTempFile("JME-simple-audio", ".wav");
5.27 -                app.setTimer(new IsoTimer(60));
5.28 -                Capture.captureAudio(app, audio);
5.29 -                app.start();
5.30 -                System.out.println(audio.getCanonicalPath());
5.31 -        }
5.32 +    public static void main(String[] ignore) throws IOException{
5.33 +        Application app = new HelloAudio();
5.34 +        File audio = File.createTempFile("JME-simple-audio", ".wav");
5.35 +        app.setTimer(new IsoTimer(60));
5.36 +        Capture.captureAudio(app, audio);
5.37 +        app.start();
5.38 +        System.out.println(audio.getCanonicalPath());
5.39 +    }
5.40  }
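The javadoc in the HelloAudioRecording diff above says the IsoTimer framerate must evenly divide 44,100 Hz (IsoTimer(60) is ok, IsoTimer(61) is not). The rule amounts to a simple modulus check; FramerateCheck below is a hypothetical helper written for illustration, not part of the capture library:

```java
// Sketch of the framerate rule from HelloAudioRecording's javadoc:
// a framerate works only when each video frame spans a whole number
// of audio samples at the 44,100 Hz recording rate.
public class FramerateCheck {
    static final int SAMPLE_RATE = 44100;  // Hz, the capture audio rate

    // True when fps evenly divides the sample rate.
    public static boolean validFramerate(int fps) {
        return fps > 0 && SAMPLE_RATE % fps == 0;
    }

    // Audio samples written alongside each video frame.
    public static int samplesPerFrame(int fps) {
        return SAMPLE_RATE / fps;
    }
}
```

At 60 fps each frame carries exactly 735 samples; at 61 fps the division leaves a remainder, so audio and video would drift apart.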
6.1 --- a/src/com/aurellem/capture/examples/HelloVideoRecording.java	Sat Dec 03 19:18:38 2011 -0600
6.2 +++ b/src/com/aurellem/capture/examples/HelloVideoRecording.java	Sat Dec 03 19:25:27 2011 -0600
6.3 @@ -9,8 +9,8 @@
6.4  import com.aurellem.capture.IsoTimer;
6.5  import com.jme3.app.Application;
6.6 
6.7 -/** Recording Video from your Application is simple. If all you
6.8 - * want to do is record Video, then follow the following steps.
6.9 +/** Recording Video from your Application is simple. If all you want
6.10 + * to do is record Video, then follow the following steps.
6.11  *
6.12  * 1.) Set the Application's timer to an IsoTimer. The framerate of
6.13  * the IsoTimer will determine the framerate of the resulting video.
6.14 @@ -27,12 +27,12 @@
6.15  */
6.16  public class HelloVideoRecording {
6.17 
6.18 -        public static void main(String[] ignore) throws IOException {
6.19 -                Application app = new HelloLoop();
6.20 -                File video = File.createTempFile("JME-simple-video", ".avi");
6.21 -                app.setTimer(new IsoTimer(60));
6.22 -                Capture.captureVideo(app, video);
6.23 -                app.start();
6.24 -                System.out.println(video.getCanonicalPath());
6.25 -        }
6.26 +    public static void main(String[] ignore) throws IOException {
6.27 +        Application app = new HelloLoop();
6.28 +        File video = File.createTempFile("JME-simple-video", ".avi");
6.29 +        app.setTimer(new IsoTimer(60));
6.30 +        Capture.captureVideo(app, video);
6.31 +        app.start();
6.32 +        System.out.println(video.getCanonicalPath());
6.33 +    }
6.34  }
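Every example in this changeset begins with app.setTimer(new IsoTimer(60)). The reason is that IsoTimer is a fixed-step timer: each frame advances game time by exactly 1/framerate seconds regardless of how long the frame took to render, which is what keeps the recorded video and audio in sync even on slow machines. The following is a minimal sketch of that idea, assuming only the fixed-step behavior described in the docs, not the library's actual IsoTimer source:

```java
// Minimal fixed-step timer sketch: time is counted in frames, so a
// slow render still produces the same recorded timeline as a fast one.
public class FixedStepTimer {
    private final float framerate;  // video frames per second
    private int ticks = 0;          // frames rendered so far

    public FixedStepTimer(float framerate) {
        this.framerate = framerate;
    }

    // Called once per rendered frame; advances time by one step.
    public void update() {
        ticks++;
    }

    // Always exactly one frame's worth of time, never wall-clock.
    public float getTimePerFrame() {
        return 1.0f / framerate;
    }

    // Total elapsed game time after the frames rendered so far.
    public float getTimeInSeconds() {
        return ticks / framerate;
    }
}
```

After 60 updates at 60 fps this timer reports exactly one second of elapsed time, no matter how long those 60 frames actually took.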