#+title: Capture Live Video Feeds from JMonkeyEngine
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Capture video from a JMonkeyEngine3 Application with Xuggle, and use gstreamer to compress the video to upload to YouTube.
#+keywords: JME3, video, Xuggle, JMonkeyEngine, youtube, capture video, Java
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org


* The Problem
So you've made your cool new JMonkeyEngine3 game and you want to
create a demo video to show off your hard work. Screen capturing is
the most straightforward way to do this, but it can slow down your
game and produce low-quality video as a result. A better way is to
record a video feed directly from the game while it is running.

In this post, I'll explain how you can alter your JMonkeyEngine3 game
to output video while it is running. The main trick is to alter the
pace of JMonkeyEngine3's in-game time: we allow the engine as much
time as it needs to compute complicated in-game events and to encode
video frames. As a result, the game appears to speed up and slow down
as the computational demands shift, but the end result is perfectly
smooth video output at a constant framerate.


* Game-time vs. User-time vs. Video-time

A standard JME3 application that extends =SimpleApplication= or
=Application= tries as hard as it can to keep in sync with
/user-time/. If a ball is rolling at 1 game-mile per game-hour in the
game, and you wait for one user-hour as measured by the clock on your
wall, then the ball should have traveled exactly one game-mile. In
order to keep in sync with the real world, the game throttles its
physics engine and graphics display. If the computations involved in
running the game are too intense, then the game will first skip
frames, then sacrifice physics accuracy. If there are particularly
demanding computations, then you may only get 1 fps, and the ball may
tunnel through the floor or obstacles due to inaccurate physics
simulation, but after the end of one user-hour, that ball will have
traveled one game-mile.

When we're recording video, we don't care whether game-time syncs
with user-time; what matters is whether the time in the recorded
video (video-time) syncs with user-time. To continue the analogy, if
we recorded the ball rolling at 1 game-mile per game-hour and watched
the video later, we would want to see 30 fps video of the ball
rolling at 1 video-mile per /user-hour/. It doesn't matter how much
user-time it took to simulate that hour of game-time to make the
high-quality recording.
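Put another way: suppose the finished video is meant to play back at
$fps$ frames per second, and suppose that simulating, rendering, and
encoding one frame takes a total of $t_{frame}$ user-seconds (both
symbols here are just bookkeeping for this argument, not anything in
the JME3 API). Then recording $T$ seconds of video always costs
$T \times fps \times t_{frame}$ user-seconds, no matter how large or
small $t_{frame}$ is; the only thing that changes is how fast the
game appears to run while it is being recorded. For example, on a
fast machine with $fps = 60$ and $t_{frame} = \frac{2}{1000}$, one
hour of video costs $3600 \times 60 \times \frac{2}{1000} = 432$
user-seconds, or 7.2 minutes.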
* COMMENT Two examples to clarify the point:
** Recording from a Simple Simulation

*** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour, and a really good computer. Normally,
JME3 will throttle the physics engine and graphics display to sync
game-time with user-time. If it takes one-thousandth of a second of
user-time to simulate one-sixtieth of a second of game-time, and
another one-thousandth of a second to draw to the screen, then JME3
will just sit around for the remainder of
$\frac{1}{60} - \frac{2}{1000}$ user-seconds, then calculate the next
frame in $\frac{2}{1000}$ user-seconds, then wait, and so on. For
every second of user-time that passes, one second of game-time
passes, and the game will run at 60 frames per user-second.


*** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Assume that encoding a single frame takes 0 seconds of
user-time to complete.

Now, JME3 takes advantage of all available resources. It still takes
one-thousandth of a second to calculate a physics tick, and another
one-thousandth to render to the screen. Then it takes 0 seconds to
write the video frame to disk and encode the video. In only one
second of user-time, JME3 will complete 500
physics-tick/render/encode-video cycles, and
$\frac{500}{60}=8\frac{1}{3}$ seconds of game-time will have passed.
Game-time appears to dilate $8\frac{1}{3}\times$ with respect to
user-time, and in only 7.2 minutes of user-time, one hour of video
will have been recorded. The game itself will run at 500 fps. When
someone watches the video, they will see 60 frames per user-second,
and $\frac{1}{60}$ video-seconds will pass each frame. It will take
exactly one hour of user-time (and one hour of video-time) for the
ball in the video to travel one video-mile.

** Recording from a Complex Simulation

*** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour, accompanied by multiple explosions
involving thousands of nodes, particle effects, and complicated
shadow shaders to create realistic shadows. You also have a slow
laptop. Normally, JME3 must sacrifice rendering and physics
simulation to try to keep up. If it takes $\frac{1}{120}$ of a
user-second to calculate $\frac{1}{60}$ game-seconds, and an
additional $\frac{1}{60}$ of a user-second to render to the screen,
then JME3 has its work cut out for it. In order to render to the
screen, it will first step the game forward by up to four physics
ticks before rendering. If it still isn't fast enough, then it will
decrease the accuracy of the physics engine until game-time and
user-time are synced or a certain threshold is reached, at which
point the game visibly slows down. In this case, JME3 will
continuously repeat a cycle of two physics ticks and one screen
render. For every user-second that passes, one game-second will
pass, but the game will run at 30 fps instead of 60 fps as before.

*** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Once again, assume video encoding takes $\frac{1}{60}$ of
a user-second.

Now, JME3 will spend $\frac{1}{120}$ of a user-second to step the
physics $\frac{1}{60}$ game-seconds, $\frac{1}{60}$ to draw to the
screen, and an additional $\frac{1}{60}$ to encode the video and
write the frame to disk. This is a total of $\frac{1}{24}$
user-seconds for each $\frac{1}{60}$ game-seconds. It will take
$(\frac{60}{24} = 2.5)$ user-hours to record one game-hour, and
game-time will appear to flow two-fifths as fast as user-time while
the game is running. However, just as in the first example, when all
is said and done we will have an hour-long video at 60 fps.


* COMMENT proposed names for the new timer
# METRONOME
# IsoTimer
# EvenTimer
# PulseTimer
# FixedTimer
# RigidTimer
# FixedTempo
# RegularTimer
# MetronomeTimer
# ConstantTimer
# SteadyTimer


* =IsoTimer= records time like a metronome

The easiest way to achieve this special timing is to create a new
timer that always reports the same framerate to JME3 every time it is
called.

=./src/com/aurellem/capture/IsoTimer.java=
#+include ../../jmeCapture/src/com/aurellem/capture/IsoTimer.java src java

If an Application uses this =IsoTimer= instead of the normal one, we
can be sure that every call to =simpleUpdate=, for example,
corresponds to exactly $(\frac{1}{fps})$ seconds of game-time.
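In case you are skimming and don't want to open the included source,
the whole idea fits in a few lines. The sketch below is only an
illustration of the technique, written against the method names of
JME3's abstract =com.jme3.system.Timer= class; the real
implementation is the =IsoTimer.java= included above and may differ
in its details.

#+begin_src java
import com.jme3.system.Timer;

/** A timer that ignores the wall clock and advances game-time by a
 *  fixed 1/framerate every frame (sketch only). */
public class IsoTimerSketch extends Timer {
  private final float framerate; // constant frames per game-second
  private long ticks = 0;        // frames elapsed so far

  public IsoTimerSketch(float framerate) { this.framerate = framerate; }

  public void update() { ticks++; }   // JME3 calls this once per frame
  public void reset()  { ticks = 0; }

  public float getTimePerFrame() { return 1.0f / framerate; }
  public float getFrameRate()    { return framerate; }

  // getTime() is measured in units of getResolution() per second, so
  // getTime() / getResolution() is the elapsed game-time in seconds.
  public long getTime()       { return ticks; }
  public long getResolution() { return (long) framerate; }
}
#+end_src

No matter how long a frame really took to compute, JME3 is told that
exactly $\frac{1}{framerate}$ seconds went by.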
* Encoding to Video

Now that the issue of time is solved, we just need a function that
writes each frame to a video. We can put this function somewhere
where it will be called exactly once per frame.

The basic functions that a =VideoRecorder= should support are
recording, starting, stopping, and possibly a final step where it
finalizes the recording (such as writing headers for a video file).

An appropriate interface describing this behaviour could look like
this:

=./src/com/aurellem/capture/video/VideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/VideoRecorder.java src java


JME3 already provides exactly the class we need: the =SceneProcessor=
class can be attached to any viewport, and the methods defined
therein will be called at the appropriate points in the rendering
process.

However, it is also important to properly close the video stream and
write headers and such. Even though =SceneProcessor= has a
=.cleanup()= method, it is only called when the =SceneProcessor= is
removed from the =RenderManager=, not when the game is shutting down
because the user pressed ESC, for example. To obtain reliable
shutdown behaviour, we also have to implement =AppState=, which
provides a =.cleanup()= method that /is/ called on shutdown.

Here is an =AbstractVideoRecorder= class that takes care of the
details of setup and teardown.

=./src/com/aurellem/capture/video/AbstractVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/AbstractVideoRecorder.java src java

If you want to generate video from Java, a great option is [[http://www.xuggle.com/][Xuggle]]. It
takes care of everything related to video encoding and decoding and
runs on Windows, Linux, and Mac. Out of all the video frameworks for
Java, I personally like this one the best.
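To give a flavor of how Xuggle is typically used, here is a tiny,
self-contained sketch that writes a few blank frames to a file with
Xuggle's =IMediaWriter=. It is only an illustration of the library,
not the recorder we actually use (that class follows below), and the
exact calls should be checked against the Xuggle documentation.

#+begin_src java
import java.awt.image.BufferedImage;
import java.util.concurrent.TimeUnit;

import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;

public class XuggleSketch {
  public static void main(String[] args) {
    int width = 640, height = 480, fps = 30;
    IMediaWriter writer = ToolFactory.makeWriter("sketch.flv");
    writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_FLV1, width, height);

    for (int frame = 0; frame < fps * 5; frame++) { // five seconds of video
      // IMediaWriter expects TYPE_3BYTE_BGR images.
      BufferedImage image =
        new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
      // ... draw the frame into `image` here ...
      writer.encodeVideo(0, image, (long) (frame * 1000.0 / fps),
                         TimeUnit.MILLISECONDS);
    }
    writer.close();
  }
}
#+end_src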
Here is a =VideoRecorder= that uses [[http://www.xuggle.com/][Xuggle]] to write each frame to a
video file.

=./src/com/aurellem/capture/video/XuggleVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/XuggleVideoRecorder.java src java

With this, we are able to record video!

However, it can be hard to properly install Xuggle. For those of you
who would rather not use Xuggle, here is an alternate class that uses
[[http://www.randelshofer.ch/blog/2008/08/writing-avi-videos-in-pure-java/][Werner Randelshofer's]] excellent pure Java AVI file writer.

=./src/com/aurellem/capture/video/AVIVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/AVIVideoRecorder.java src java

This =AVIVideoRecorder= is more limited than the
=XuggleVideoRecorder=, but requires fewer external dependencies.

Finally, for those of you who prefer to create the final video from a
sequence of images, there is the =FileVideoRecorder=, which records
each frame to a folder as a sequentially numbered image file. Note
that you have to remember the fps at which you recorded the video, as
this information is lost when saving each frame to a file.

=./src/com/aurellem/capture/video/FileVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/FileVideoRecorder.java src java


* /Really/ Simple Video Recording

The most common case for recording a video is probably to just
capture whatever is on your screen exactly as you see it. In this
case, the following method will do.

#+begin_src java
public static void captureVideo(final Application app,
                                final File file) throws IOException {
  // Choose a backend recorder based on the output destination.
  final AbstractVideoRecorder videoRecorder;
  if (file.getCanonicalPath().endsWith(".avi")) {
    videoRecorder = new AVIVideoRecorder(file);
  } else if (file.isDirectory()) {
    videoRecorder = new FileVideoRecorder(file);
  } else {
    videoRecorder = new XuggleVideoRecorder(file);
  }

  Callable thunk = new Callable() {
    public Object call() {
      // Create a post view that sees everything the main camera sees.
      ViewPort viewPort =
        app.getRenderManager()
           .createPostView("aurellem record", app.getCamera());
      viewPort.setClearFlags(false, false, false);
      // Also capture whatever is attached to the GUI viewport.
      for (Spatial s : app.getGuiViewPort().getScenes()) {
        viewPort.attachScene(s);
      }
      // Attach as an AppState (for cleanup on shutdown) and as a
      // SceneProcessor (to receive each rendered frame).
      app.getStateManager().attach(videoRecorder);
      viewPort.addProcessor(videoRecorder);
      return null;
    }
  };
  app.enqueue(thunk);
}
#+end_src

This will select the appropriate backend =VideoRecorder= class
depending on the file name you specify, and instrument your
application to record video to that file. You should still set the
game's timer to an =IsoTimer= with the desired fps.

This example will record video of the ocean scene from the
jMonkeyEngine test suite.

#+begin_src java
File video = File.createTempFile("JME-water-video", ".avi");
captureVideo(app, video);
app.start();
System.out.println(video.getCanonicalPath());
#+end_src

I've added support for this under a class called
=com.aurellem.capture.Capture=. You can get it [[http://hg.bortreb.com/jmeCapture/][here]].
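For completeness, here is what the whole wiring might look like in a
stand-alone program. =MyGame= is just an illustrative stand-in for
your own =SimpleApplication= subclass; the only two additions are the
=IsoTimer= and the call to =Capture.captureVideo= before =start()=.

#+begin_src java
import java.io.File;

import com.aurellem.capture.Capture;
import com.aurellem.capture.IsoTimer;
import com.jme3.app.SimpleApplication;

public class RecordMyGame {

  /** Stand-in for your own application. */
  public static class MyGame extends SimpleApplication {
    public void simpleInitApp() {
      // Lock game-time to 30 frames per game-second.
      this.setTimer(new IsoTimer(30));
      // ... set up your scene here ...
    }
  }

  public static void main(String[] args) throws Exception {
    MyGame app = new MyGame();
    // Instrument the application before starting it.
    Capture.captureVideo(app, new File("/home/r/record.flv"));
    app.start();
  }
}
#+end_src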
* Hello Video!

I've taken [[http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/test/jme3test/helloworld/HelloLoop.java][=./jme3/src/test/jme3test/helloworld/HelloLoop.java=]] and
augmented it with video output as follows:

=./src/com/aurellem/capture/examples/HelloVideo.java=
#+include ../../src/com/aurellem/capture/examples/HelloVideo.java src java

The videos are created in the =hello-video= directory.

#+begin_src sh :results verbatim :exports both
du -h hello-video/*
#+end_src

#+results:
: 932K hello-video/hello-video-moving.flv
: 640K hello-video/hello-video-static.flv

They can be immediately uploaded to YouTube:

- [[http://www.youtube.com/watch?v=C8gxVAySaPg][hello-video-moving.flv]]
#+BEGIN_HTML
#+END_HTML
- [[http://www.youtube.com/watch?v=pHcFOtIS07Q][hello-video-static.flv]]
#+BEGIN_HTML
#+END_HTML


* Summary
It's quite easy to augment your own application to record video,
almost regardless of how complicated the actual application is. You
can also record from multiple ViewPorts, as the above example shows.

The process for adding video recording to your application is as
follows:

Assuming you want to record at 30 fps, add

#+begin_src java :exports code
this.setTimer(new IsoTimer(30));
#+end_src

somewhere in the initialization of your Application.

If you want to record from the game's main =ViewPort= to a file
called =/home/r/record.flv=, then add

#+begin_src java :exports code
Capture.captureVideo(app, new File("/home/r/record.flv"));
#+end_src

before you call =app.start()=.

* More Examples
** COMMENT Hello Physics
=HelloVideo= is boring. Let's add some video capturing to
=HelloPhysics= and create something fun!

This example is a modified version of =HelloPhysics= that creates
four simultaneous views of the same scene of cannonballs careening
into a brick wall.

=./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java=
#+include ./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java src java

Running the program outputs four videos into the =./physics-videos=
directory.

#+begin_src sh :exports both :results verbatim
ls ./physics-videos | grep -
#+end_src

#+results:
: lower-left.flv
: lower-right.flv
: upper-left.flv
: upper-right.flv

The videos are fused together with the following =gstreamer=
commands:

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
   filesrc location=./upper-right.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   jpegenc ! avimux ! filesink location=upper.flv \
   \
   filesrc location=./upper-left.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
   filesrc location=./lower-left.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   jpegenc ! avimux ! filesink location=lower.flv \
   \
   filesrc location=./lower-right.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
   filesrc location=./upper.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   videobox border-alpha=0 bottom=-480 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
   jpegenc ! avimux ! filesink location=../youtube/helloPhysics.flv \
   \
   filesrc location=./lower.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   videobox top=-480 ! mix.
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.flv
#+end_src

#+results:
: 180M physics-videos/helloPhysics.flv

That's a terribly large size! Let's compress it:

** COMMENT Compressing the HelloPhysics Video
First we'll scale the video, then we'll decrease its bitrate. The end
result will be perfect for upload to YouTube.

#+begin_src sh :results silent
cd youtube

gst-launch-0.10 \
   filesrc location=./helloPhysics.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   `: # the original size is 1280 by 960` \
   video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
   videoscale ! \
   `: # here we scale the video down` \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   `: # and here we limit the bitrate` \
   theoraenc bitrate=1024 quality=30 ! \
   oggmux ! progressreport update-freq=1 ! \
   filesink location=./helloPhysics.ogg
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 13M youtube/helloPhysics.ogg

[[http://www.youtube.com/watch?v=WIJt9aRGusc][helloPhysics.ogg]]

#+begin_html
#+end_html


** COMMENT failed attempts
Let's try the [[http://diracvideo.org/][Dirac]] video encoder.

#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
   filesrc location=./helloPhysics.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
   schroenc ! filesink location=./helloPhysics.drc > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src

#+results:
: 142

That took 142 seconds.
Let's see how it does compression-wise:

#+begin_src sh :results verbatim
du -h ./youtube/helloPhysics.drc
#+end_src

#+results:
: 22M ./physics-videos/helloPhysics.drc


#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
   filesrc location=./helloPhysics.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
   theoraenc ! oggmux ! filesink location=./helloPhysics.ogg \
   > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src

#+results:
: 123

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 59M physics-videos/helloPhysics.ogg


=*.drc= files can not be uploaded to YouTube, so I'll go for the avi
file.


** COMMENT text for videos
Video output from JMonkeyEngine3 (www.jmonkeyengine.org/) using
Xuggle (www.xuggle.com/). Everything is explained at
http://aurellem.org/cortex/capture-video.html.


Video output from JMonkeyEngine3 (www.jmonkeyengine.org/)
HelloPhysics demo application using Xuggle (www.xuggle.com/).
Everything is explained at
http://aurellem.org/cortex/capture-video.html. Here, four points of
view are simultaneously recorded and then glued together later.

JME3 Xuggle Aurellem video capture


* Sample Videos
I encoded most of the original JME3 Hello demos for your viewing
pleasure, all using the =Capture= and =IsoTimer= classes.

** HelloTerrain
[[http://youtu.be/5_4wyDFwrVQ][HelloTerrain.avi]]

#+begin_html
#+end_html

** HelloAssets
[[http://www.youtube.com/watch?v=oGg-Q6k1BM4][HelloAssets.avi]]

#+begin_html
#+end_html

** HelloEffects
[[http://www.youtube.com/watch?v=TuxlLMe53hA][HelloEffects]]

#+begin_html
#+end_html

** HelloCollision
[[http://www.youtube.com/watch?v=GPlvJkiZfFw][HelloCollision.avi]]

#+begin_html
#+end_html

** HelloAnimation
[[http://www.youtube.com/watch?v=SDCfOSPYUkg][HelloAnimation.avi]]

#+begin_html
#+end_html

** HelloNode
[[http://www.youtube.com/watch?v=pL-0fR0-ilQ][HelloNode.avi]]

#+begin_html
#+end_html

** HelloLoop
[[http://www.youtube.com/watch?v=mosZzzcdE5w][HelloLoop.avi]]

#+begin_html
#+end_html


*** COMMENT x-form the other stupid
progressreport update-freq=1

gst-launch-0.10 \
   filesrc location=./helloPhy ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
   x264enc ! avimux ! filesink location=helloPhysics.avi \


gst-launch-0.10 \
   filesrc location=./HelloAnimationStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloAnimation.avi \
   \
   filesrc location=./HelloAnimationMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.

gst-launch-0.10 \
   filesrc location=./HelloCollisionMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! \
   filesink location=../youtube/HelloCollision.avi

gst-launch-0.10 \
   filesrc location=./HelloEffectsStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloEffects.avi \
   \
   filesrc location=./HelloEffectsMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.

gst-launch-0.10 \
   filesrc location=./HelloTerrainMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! \
   filesink location=../youtube/HelloTerrain.avi


gst-launch-0.10 \
   filesrc location=./HelloAssetsStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloAssets.avi \
   \
   filesrc location=./HelloAssetsMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.


gst-launch-0.10 \
   filesrc location=./HelloNodeStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloNode.avi \
   \
   filesrc location=./HelloNodeMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.

gst-launch-0.10 \
   filesrc location=./HelloLoopStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloLoop.avi \
   \
   filesrc location=./HelloLoopMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.