#+title: Capture Live Video Feeds from JMonkeyEngine
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Capture video from a JMonkeyEngine3 Application with Xuggle, and use gstreamer to compress the video to upload to YouTube.
#+keywords: JME3, video, Xuggle, JMonkeyEngine, YouTube, capture video, Java
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org


* The Problem
So you've made your cool new JMonkeyEngine3 game and you want to
create a demo video to show off your hard work. Screen capturing is
the most straightforward way to do this, but it can slow down your
game and produce low-quality video as a result. A better way is to
record a video feed directly from the game while it is running.

In this post, I'll explain how you can alter your JMonkeyEngine3 game
to output video while it is running. The main trick is to alter the
pace of JMonkeyEngine3's in-game time: we allow the engine as much
time as it needs to compute complicated in-game events and to encode
video frames. As a result, the game appears to speed up and slow down
as the computational demands shift, but the end result is perfectly
smooth video output at a constant framerate.

* Video recording requires a steady framerate
** The built-in =Timer= rushes to keep up.
# * Game-time vs. User-time vs. Video-time

Standard JME3 applications use a =Timer= object to manage time in the
simulated world. Because most JME3 applications (e.g. games) are
supposed to happen \ldquo{}live\rdquo{}, the built-in =Timer= requires
simulated time to match real time.
This means that the application
must rush to finish all of its calculations on schedule: the more
complicated the calculations, the more the application is obligated to
rush. And if the workload becomes too much to handle on schedule,
=Timer= forces the application to cut corners: it demands fast,
approximate answers instead of careful, accurate ones. Although this
policy sometimes causes physically impossible glitches and choppy
framerates, it ensures that the user will never be kept waiting while
the computer stops to make a complicated calculation.

Now, the built-in =Timer= values speed over accuracy because real-time
applications require it. On the other hand, if your goal is to record
a glitch-free video, you need a =Timer= that will take its time to
ensure that all calculations are accurate, even if they take a long
time. In the next section, we will create a new kind of
=Timer=\mdash{}called =IsoTimer=\mdash{}which slows down to let the
computer finish all its calculations. The result is a perfectly steady
framerate and a flawless physical simulation.

# are supposed to happen \ldquo live \rdquo, this =Timer= requires the
# application to update in real-time. In order to keep up with the
# real world, JME applications cannot afford to take too much time on
# expensive computations. Whenever the workload becomes too much for
# the computer to handle on schedule, =Timer= forces the computer to
# cut corners, giving fast, approximate answers instead of careful,
# accurate ones. Although physical accuracy sometimes suffers as a
# result, this policy ensures that the user will never be kept waiting
# while the computer stops to make a complicated calculation.

# fast answers are more important than accurate ones.

# A standard JME3 application that extends =SimpleApplication= or
# =Application= tries as hard as it can to keep in sync with
# /user-time/. If a ball is rolling at 1 game-mile per game-hour in
# the game, and you wait for one user-hour as measured by the clock on
# your wall, then the ball should have traveled exactly one
# game-mile. In order to keep sync with the real world, the game
# throttles its physics engine and graphics display. If the
# computations involved in running the game are too intense, then the
# game will first skip frames, then sacrifice physics accuracy. If
# there are particularly demanding computations, then you may only get
# 1 fps, and the ball may tunnel through the floor or obstacles due to
# inaccurate physics simulation, but after the end of one user-hour,
# that ball will have traveled one game-mile.

# When we're recording video, we don't care if the game-time syncs
# with user-time, but instead whether the time in the recorded video
# (video-time) syncs with user-time. To continue the analogy, if we
# recorded the ball rolling at 1 game-mile per game-hour and watched
# the video later, we would want to see 30 fps video of the ball
# rolling at 1 video-mile per /user-hour/. It doesn't matter how much
# user-time it took to simulate that hour of game-time to make the
# high-quality recording.

** COMMENT Two examples to clarify the point:
*** Recording from a Simple Simulation

**** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour, and a really good computer.
Normally,
JME3 will throttle the physics engine and graphics display to sync the
game-time with user-time. If it takes one-thousandth of a second of
user-time to simulate one-sixtieth of a second of game-time and another
one-thousandth of a second to draw to the screen, then JME3 will just
sit around for the remainder of $\frac{1}{60} - \frac{2}{1000}$
user-seconds, then calculate the next frame in $\frac{2}{1000}$
user-seconds, then wait, and so on. For every second of user-time that
passes, one second of game-time passes, and the game will run at 60
frames per user-second.

**** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Assume that encoding a single frame takes 0 seconds of
user-time to complete.

Now, JME3 takes advantage of all available resources. It still takes
one-thousandth of a second to calculate a physics tick, and another
one-thousandth to render to the screen. Then it takes 0 seconds to
write the video frame to disk and encode the video. In only one second
of user-time, JME3 will complete 500 physics-tick/render/encode-video
cycles, and $\frac{500}{60}=8\frac{1}{3}$ seconds of game-time will
have passed. Game-time appears to dilate $8\frac{1}{3}\times$ with
respect to user-time, and in only 7.2 minutes of user-time, one hour of
video will have been recorded. The game itself will run at 500 fps.
When someone watches the video, they will see 60 frames per
user-second, and $\frac{1}{60}$ video-seconds will pass each frame. It
will take exactly one hour user-time (and one hour video-time) for the
ball in the video to travel one video-mile.
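The arithmetic in this example is easy to double-check with a few
lines of plain Java. This is only an illustrative back-of-the-envelope
calculation; the class and method names here are made up for the
sketch and are not part of JME3 or the capture library.

```java
// Sanity-check the time-dilation numbers from the example above.
// (Illustrative only; not part of JME3 or the capture code.)
public class DilationCheck {

    /** User-seconds consumed by one physics-tick/render/encode cycle. */
    public static double cycleCost(double physics, double render, double encode) {
        return physics + render + encode;
    }

    /** Factor by which game-time outpaces user-time, at 60 game-fps. */
    public static double dilation(double cycleCost) {
        double cyclesPerUserSecond = 1.0 / cycleCost;
        return cyclesPerUserSecond / 60.0;
    }

    public static void main(String[] args) {
        // physics and render each cost 1/1000 user-second; encoding is free
        double cost = cycleCost(1.0 / 1000, 1.0 / 1000, 0.0);
        System.out.println("game frames per user-second: " + (1.0 / cost));
        System.out.println("time dilation factor:        " + dilation(cost));
        System.out.println("user-minutes per video-hour: " + (60.0 / dilation(cost)));
    }
}
```

Running it confirms 500 frames per user-second, a dilation factor of
$8\frac{1}{3}$, and 7.2 user-minutes per recorded video-hour.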

*** Recording from a Complex Simulation

**** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour accompanied by multiple explosions
involving thousands of nodes, particle effects, and complicated shadow
shaders to create realistic shadows. You also have a slow
laptop. Normally, JME3 must sacrifice rendering and physics simulation
to try to keep up. If it takes $\frac{1}{120}$ of a user-second to
calculate $\frac{1}{60}$ game-seconds, and an additional
$\frac{1}{60}$ of a user-second to render to the screen, then JME3 has
its work cut out for it. In order to render to the screen, it will
first step the game forward by up to four physics ticks before
rendering to the screen. If it still isn't fast enough, then it will
decrease the accuracy of the physics engine until game-time and
user-time are synced or a certain threshold is reached, at which point
the game visibly slows down. In this case, JME3 continuously repeats a
cycle of two physics ticks and one screen render. For every
user-second that passes, one game-second will pass, but the game will
run at 30 fps instead of 60 fps like before.

**** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Once again, assume video encoding takes $\frac{1}{60}$ of
a user-second.

Now, JME3 will spend $\frac{1}{120}$ of a user-second to step the
physics tick $\frac{1}{60}$ game-seconds, $\frac{1}{60}$ to draw to
the screen, and an additional $\frac{1}{60}$ to encode the video and
write the frame to disk. This is a total of $\frac{1}{24}$
user-seconds for each $\frac{1}{60}$ game-seconds.
It will take
$\frac{60}{24} = 2.5$ user-hours to record one game-hour, and
game-time will appear to flow two-fifths as fast as user-time while
the game is running. However, just as in example one, when all is said
and done we will have an hour-long video at 60 fps.


** COMMENT proposed names for the new timer
# METRONOME
# IsoTimer
# EvenTimer
# PulseTimer
# FixedTimer
# RigidTimer
# FixedTempo
# RegularTimer
# MetronomeTimer
# ConstantTimer
# SteadyTimer


** =IsoTimer= records time like a metronome

The easiest way to achieve this special timing is to create a new
timer that always reports the same framerate to JME3 every time it is
called.

=./src/com/aurellem/capture/IsoTimer.java=
#+include ../../jmeCapture/src/com/aurellem/capture/IsoTimer.java src java

If an Application uses this =IsoTimer= instead of the normal one, we
can be sure that every call to =simpleUpdate=, for example,
corresponds to exactly $\frac{1}{fps}$ seconds of game-time.

* =VideoRecorder= manages video feeds in JMonkeyEngine.

** =AbstractVideoRecorder= provides a general framework for managing videos.

Now that the issue of time is solved, we just need a function that
writes each frame to a video. We can put this function somewhere
where it will be called exactly once per frame.

The basic functions that a =VideoRecorder= should support are starting
a recording, accepting frames, stopping, and possibly a cleanup step
where it finalizes the recording (e.g. by writing headers for a video
file).
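To fix ideas before looking at the real interface, here is a hedged
sketch of what such a recorder might look like. The names
(=FrameSink=, =CountingSink=) are hypothetical and exist only for this
illustration; the =VideoRecorder.java= file included below is the
authoritative version.

```java
import java.awt.image.BufferedImage;

// Hypothetical sketch of a minimal recorder interface: one method
// called once per rendered frame, and one to finalize the recording.
interface FrameSink {
    void record(BufferedImage frame);  // called exactly once per frame
    void finish();                     // e.g. write file headers, close streams
}

// A trivial implementation that just counts frames, to show the shape
// of the contract without any actual video encoding.
class CountingSink implements FrameSink {
    int frames = 0;
    boolean finished = false;
    public void record(BufferedImage frame) { frames++; }
    public void finish() { finished = true; }
}
```

A real implementation would encode each =BufferedImage= into a video
stream in =record= and close the container in =finish=.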

An appropriate interface describing this behavior could look like
this:

=./src/com/aurellem/capture/video/VideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/VideoRecorder.java src java


JME3 already provides exactly the class we need: the =SceneProcessor=
class can be attached to any viewport, and the methods defined therein
will be called at the appropriate points in the rendering process.

However, it is also important to properly close the video stream and
write headers and such, and even though =SceneProcessor= has a
=.cleanup()= method, it is only called when the =SceneProcessor= is
removed from the =RenderManager=, not when the game is shutting down
(when the user presses ESC, for example). To obtain reliable shutdown
behavior, we also have to implement =AppState=, which provides a
=.cleanup()= method that /is/ called on shutdown.

Here is an =AbstractVideoRecorder= class that takes care of the
details of setup and teardown.

=./src/com/aurellem/capture/video/AbstractVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/AbstractVideoRecorder.java src java


** There are many options for handling video files in Java

If you want to generate video from Java, a great option is [[http://www.xuggle.com/][Xuggle]]. It
takes care of everything related to video encoding and decoding and
runs on Windows, Linux, and Mac. Out of all the video frameworks for
Java, I personally like this one the best.

Here is a =VideoRecorder= that uses [[http://www.xuggle.com/][Xuggle]] to write each frame to a
video file.

=./src/com/aurellem/capture/video/XuggleVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/XuggleVideoRecorder.java src java

With this, we are able to record video!

However, it can be hard to properly install Xuggle. If you would
rather not use Xuggle, here is an alternate class that uses [[http://www.randelshofer.ch/blog/2008/08/writing-avi-videos-in-pure-java/][Werner
Randelshofer's]] excellent pure-Java AVI file writer.

=./src/com/aurellem/capture/video/AVIVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/AVIVideoRecorder.java src java

This =AVIVideoRecorder= is more limited than the
=XuggleVideoRecorder=, but requires fewer external dependencies.

Finally, for those of you who prefer to create the final video from a
sequence of images, there is the =FileVideoRecorder=, which records
each frame to a folder as a sequentially numbered image file. Note
that you have to remember the FPS at which you recorded the video, as
this information is lost when saving each frame to a file.

=./src/com/aurellem/capture/video/FileVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/FileVideoRecorder.java src java


* How to record videos yourself

** Include this code.

No matter how complicated your application is, it's easy to add
support for video output with just a few lines of code.
# You can also record from multiple ViewPorts as the above example shows.

And although you can use =VideoRecorder= to record advanced
split-screen videos with multiple views, in the simplest case you
want to capture a single view\mdash{}exactly what's on screen.
In
this case, the following simple =captureVideo= method will do the job:

#+begin_src java
public static void captureVideo(final Application app,
                                final File file) throws IOException {
    final AbstractVideoRecorder videoRecorder;
    // choose a recorder implementation based on the target file type
    if (file.getCanonicalPath().endsWith(".avi")){
        videoRecorder = new AVIVideoRecorder(file);}
    else if (file.isDirectory()){
        videoRecorder = new FileVideoRecorder(file);}
    else { videoRecorder = new XuggleVideoRecorder(file);}

    Callable<Object> thunk = new Callable<Object>(){
        public Object call(){
            ViewPort viewPort =
                app.getRenderManager()
                .createPostView("aurellem record", app.getCamera());
            viewPort.setClearFlags(false, false, false);
            // attach the GUI node's scenes so they appear in the recording
            for (Spatial s : app.getGuiViewPort().getScenes()){
                viewPort.attachScene(s);
            }
            app.getStateManager().attach(videoRecorder);
            viewPort.addProcessor(videoRecorder);
            return null;
        }
    };
    app.enqueue(thunk);
}
#+end_src

This method selects the appropriate =VideoRecorder= class for the file
type you specify, and instructs your application to record video to
the file.

Now that you have a =captureVideo= method, you use it like this:

- Establish an =IsoTimer= and set its framerate :: For example, if
     you want to record video with a framerate of 30 fps, include
     the following line of code somewhere in the initialization of
     your application:
#+begin_src java :exports code
this.setTimer(new IsoTimer(30));
#+end_src

- Choose the output file :: If you want to record from the game's
     main =ViewPort= to a file called =/home/r/record.flv=, then
     include the following line of code somewhere before you call
     =app.start()=:

#+begin_src java :exports code
Capture.captureVideo(app, new File("/home/r/record.flv"));
#+end_src


** Simple example

This example will record video from the ocean scene from the
JMonkeyEngine test suite.
#+begin_src java
File video = File.createTempFile("JME-water-video", ".avi");
captureVideo(app, video);
app.start();
System.out.println(video.getCanonicalPath());
#+end_src

I've added support for this under a class called
=com.aurellem.capture.Capture=. You can get it [[http://hg.bortreb.com/jmeCapture/][here]].

** Hello Video!
example

I've taken [[http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/test/jme3test/helloworld/HelloLoop.java][=./jme3/src/test/jme3test/helloworld/HelloLoop.java=]] and
augmented it with video output as follows:

=./src/com/aurellem/capture/examples/HelloVideo.java=
#+include ../../src/com/aurellem/capture/examples/HelloVideo.java src java

The videos are created in the =hello-video= directory:

#+begin_src sh :results verbatim :exports both
du -h hello-video/*
#+end_src

#+results:
: 932K	hello-video/hello-video-moving.flv
: 640K	hello-video/hello-video-static.flv

They can be immediately uploaded to YouTube:

- [[http://www.youtube.com/watch?v=C8gxVAySaPg][hello-video-moving.flv]]
#+BEGIN_HTML
#+END_HTML
- [[http://www.youtube.com/watch?v=pHcFOtIS07Q][hello-video-static.flv]]
#+BEGIN_HTML
#+END_HTML


* COMMENT More Examples
** COMMENT Hello Physics

=HelloVideo= is boring. Let's add some video capturing to
=HelloPhysics= and create something fun!

This example is a modified version of =HelloPhysics= that creates four
simultaneous views of the same scene of cannonballs careening into a
brick wall.

=./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java=
#+include ./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java src java

Running the program outputs four videos into the =./physics-videos=
directory.

#+begin_src sh :exports both :results verbatim
ls ./physics-videos | grep -
#+end_src

#+results:
: lower-left.flv
: lower-right.flv
: upper-left.flv
: upper-right.flv

The videos are fused together with the following =gstreamer= commands:

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
  filesrc location=./upper-right.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  jpegenc ! avimux ! filesink location=upper.flv \
  \
  filesrc location=./upper-left.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
  filesrc location=./lower-left.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  jpegenc ! avimux ! filesink location=lower.flv \
  \
  filesrc location=./lower-right.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
  filesrc location=./upper.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  videobox border-alpha=0 bottom=-480 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  jpegenc ! avimux ! filesink location=../youtube/helloPhysics.flv \
  \
  filesrc location=./lower.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  videobox top=-480 ! mix.
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.flv
#+end_src

#+results:
: 180M	physics-videos/helloPhysics.flv


That's a terribly large size!
Let's compress it:

** COMMENT Compressing the HelloPhysics Video
First, we'll scale the video down; then, we'll decrease its bit-rate. The
end result will be perfect for upload to YouTube.

#+begin_src sh :results silent
cd youtube

gst-launch-0.10 \
  filesrc location=./helloPhysics.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  `: # the original size is 1280 by 960` \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  videoscale ! \
  `: # here we scale the video down` \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  `: # and here we limit the bitrate` \
  theoraenc bitrate=1024 quality=30 ! \
  oggmux ! progressreport update-freq=1 ! \
  filesink location=./helloPhysics.ogg
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 13M	youtube/helloPhysics.ogg

[[http://www.youtube.com/watch?v=WIJt9aRGusc][helloPhysics.ogg]]

#+begin_html
#+end_html


** COMMENT failed attempts
Let's try the [[http://diracvideo.org/][Dirac]] video encoder.

#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
  filesrc location=./helloPhysics.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  schroenc ! filesink location=./helloPhysics.drc > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src

#+results:
: 142

That took 142 seconds. Let's see how it does compression-wise:

#+begin_src sh :results verbatim
du -h ./youtube/helloPhysics.drc
#+end_src

#+results:
: 22M	./physics-videos/helloPhysics.drc

#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
  filesrc location=./helloPhysics.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  theoraenc ! oggmux ! filesink location=./helloPhysics.ogg \
  > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src

#+results:
: 123

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 59M	physics-videos/helloPhysics.ogg

=*.drc= files cannot be uploaded to YouTube, so I'll go with the
avi file.


** COMMENT text for videos
Video output from JMonkeyEngine3 (www.jmonkeyengine.org/) using Xuggle
(www.xuggle.com/). Everything is explained at
http://aurellem.org/cortex/capture-video.html.


Video output from JMonkeyEngine3 (www.jmonkeyengine.org/) HelloPhysics
demo application using Xuggle (www.xuggle.com/). Everything is
explained at http://aurellem.org/cortex/capture-video.html. Here,
four points of view are simultaneously recorded and then glued
together later.

JME3 Xuggle Aurellem video capture


* Showcase of recorded videos
I encoded most of the original JME3 Hello demos for your viewing
pleasure, all using the =Capture= and =IsoTimer= classes.

** HelloTerrain
[[http://youtu.be/5_4wyDFwrVQ][HelloTerrain.avi]]

#+begin_html
#+end_html

** HelloAssets
[[http://www.youtube.com/watch?v=oGg-Q6k1BM4][HelloAssets.avi]]

#+begin_html
#+end_html

** HelloEffects
[[http://www.youtube.com/watch?v=TuxlLMe53hA][HelloEffects]]

#+begin_html
#+end_html

** HelloCollision
[[http://www.youtube.com/watch?v=GPlvJkiZfFw][HelloCollision.avi]]

#+begin_html
#+end_html

** HelloAnimation
[[http://www.youtube.com/watch?v=SDCfOSPYUkg][HelloAnimation.avi]]

#+begin_html
#+end_html

** HelloNode
[[http://www.youtube.com/watch?v=pL-0fR0-ilQ][HelloNode.avi]]

#+begin_html
#+end_html

** HelloLoop
[[http://www.youtube.com/watch?v=mosZzzcdE5w][HelloLoop.avi]]

#+begin_html
#+end_html


*** COMMENT x-form the other stupid
progressreport update-freq=1

gst-launch-0.10 \
  filesrc location=./helloPhy ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  x264enc ! avimux ! filesink location=helloPhysics.avi


gst-launch-0.10 \
  filesrc location=./HelloAnimationStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloAnimation.avi \
  \
  filesrc location=./HelloAnimationMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.

gst-launch-0.10 \
  filesrc location=./HelloCollisionMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! \
  filesink location=../youtube/HelloCollision.avi

gst-launch-0.10 \
  filesrc location=./HelloEffectsStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloEffects.avi \
  \
  filesrc location=./HelloEffectsMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.

gst-launch-0.10 \
  filesrc location=./HelloTerrainMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! \
  filesink location=../youtube/HelloTerrain.avi


gst-launch-0.10 \
  filesrc location=./HelloAssetsStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloAssets.avi \
  \
  filesrc location=./HelloAssetsMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.


gst-launch-0.10 \
  filesrc location=./HelloNodeStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloNode.avi \
  \
  filesrc location=./HelloNodeMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.

gst-launch-0.10 \
  filesrc location=./HelloLoopStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloLoop.avi \
  \
  filesrc location=./HelloLoopMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.