#+title: Capture Live Video Feeds from JMonkeyEngine
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Capture video from a JMonkeyEngine3 application with Xuggle, and use GStreamer to compress the video for upload to YouTube.
#+keywords: JME3, video, Xuggle, JMonkeyEngine, YouTube, capture video, Java
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org


* The Problem
So you've made your cool new JMonkeyEngine3 game and you want to
create a demo video to show off your hard work. Screen capturing is
the most straightforward way to do this, but it can slow down your
game and produce low-quality video as a result. A better way is to
record a video feed directly from the game while it is running.

In this post, I'll explain how you can alter your JMonkeyEngine3 game
to output video while it is running. The main trick is to alter the
pace of JMonkeyEngine3's in-game time: we allow the engine as much
time as it needs to compute complicated in-game events and to encode
video frames. As a result, the game appears to speed up and slow down
as the computational demands shift, but the end result is perfectly
smooth video output at a constant framerate.


* Video recording requires a steady framerate
** The built-in =Timer= rushes to keep up.
# * Game-time vs. User-time vs. Video-time

Standard JME3 applications use a =Timer= object to manage time in the
simulated world. Because most JME3 applications (e.g. games) are
supposed to happen \ldquo{}live\rdquo{}, the built-in =Timer= requires
simulated time to match real time. This means that the application
must rush to finish all of its calculations on schedule: the more
complicated the calculations, the more the application is obligated to
rush. And if the workload becomes too much to handle on schedule,
=Timer= forces the application to cut corners: it demands fast,
approximate answers instead of careful, accurate ones. Although this
policy sometimes causes physically impossible glitches and choppy
framerates, it ensures that the user will never be kept waiting while
the computer stops to make a complicated calculation.

Now, the built-in =Timer= values speed over accuracy because real-time
applications require it. On the other hand, if your goal is to record
a glitch-free video, you need a =Timer= that will take its time to
ensure that all calculations are accurate, even if they take a long
time. In the next section, we will create a new kind of
=Timer=\mdash{}called =IsoTimer=\mdash{}which slows down to let the
computer finish all its calculations. The result is a perfectly steady
framerate and a flawless physical simulation.


# A standard JME3 application that extends =SimpleApplication= or
# =Application= tries as hard as it can to keep in sync with
# /user-time/. If a ball is rolling at 1 game-mile per game-hour in
# the game, and you wait for one user-hour as measured by the clock on
# your wall, then the ball should have traveled exactly one
# game-mile. In order to keep sync with the real world, the game
# throttles its physics engine and graphics display. If the
# computations involved in running the game are too intense, then the
# game will first skip frames, then sacrifice physics accuracy. If
# there are particularly demanding computations, then you may only get
# 1 fps, and the ball may tunnel through the floor or obstacles due to
# inaccurate physics simulation, but after the end of one user-hour,
# that ball will have traveled one game-mile.

# When we're recording video, we don't care if the game-time syncs
# with user-time, but instead whether the time in the recorded video
# (video-time) syncs with user-time. To continue the analogy, if we
# recorded the ball rolling at 1 game-mile per game-hour and watched
# the video later, we would want to see 30 fps video of the ball
# rolling at 1 video-mile per /user-hour/. It doesn't matter how much
# user-time it took to simulate that hour of game-time to make the
# high-quality recording.
** COMMENT Two examples to clarify the point:
*** Recording from a Simple Simulation

**** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour, and a really good computer. Normally,
JME3 will throttle the physics engine and graphics display to sync the
game-time with user-time. If it takes one-thousandth of a second
user-time to simulate one-sixtieth of a second game-time and another
one-thousandth of a second to draw to the screen, then JME3 will just
sit around for the remainder of $\frac{1}{60} - \frac{2}{1000}$
user-seconds, then calculate the next frame in $\frac{2}{1000}$
user-seconds, then wait, and so on. For every second of user time that
passes, one second of game-time passes, and the game will run at 60
frames per user-second.


**** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Assume that encoding a single frame takes 0 seconds
user-time to complete.

Now, JME3 takes advantage of all available resources. It still takes
one-thousandth of a second to calculate a physics tick, and another
one-thousandth to render to the screen. Then it takes 0 seconds to
write the video frame to disk and encode the video. In only one second
of user time, JME3 will complete 500 physics-tick/render/encode-video
cycles, and $\frac{500}{60}=8\frac{1}{3}$ seconds of game-time will
have passed. Game-time appears to dilate $8\frac{1}{3}\times$ with
respect to user-time, and in only 7.2 minutes user-time, one hour of
video will have been recorded. The game itself will run at 500 fps.
When someone watches the video, they will see 60 frames per
user-second, and $\frac{1}{60}$ video-seconds will pass each frame. It
will take exactly one hour user-time (and one hour video-time) for the
ball in the video to travel one video-mile.

*** Recording from a Complex Simulation

**** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour accompanied by multiple explosions
involving thousands of nodes, particle effects, and complicated shadow
shaders to create realistic shadows. You also have a slow
laptop. Normally, JME3 must sacrifice rendering and physics simulation
to try to keep up. If it takes $\frac{1}{120}$ of a user-second to
calculate $\frac{1}{60}$ game-seconds, and an additional
$\frac{1}{60}$ of a user-second to render to screen, then JME3 has
its work cut out for it. In order to render to the screen, it will
first step the game forward by up to four physics ticks before
rendering to the screen. If it still isn't fast enough, then it will
decrease the accuracy of the physics engine until game-time and
user-time are synced or a certain threshold is reached, at which point
the game visibly slows down. In this case, JME3 continuously repeats a
cycle of two physics ticks and one screen render. For every
user-second that passes, one game-second will pass, but the game will
run at 30 fps instead of 60 fps like before.

**** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Once again, assume video encoding takes $\frac{1}{60}$ of
a user-second.

Now, JME3 will spend $\frac{1}{120}$ of a user-second to step the
physics tick $\frac{1}{60}$ game-seconds, $\frac{1}{60}$ to draw to
the screen, and an additional $\frac{1}{60}$ to encode the video and
write the frame to disk. This is a total of $\frac{1}{24}$
user-seconds for each $\frac{1}{60}$ game-seconds. It will take
$(\frac{60}{24} = 2.5)$ user-hours to record one game-hour, and
game-time will appear to flow two-fifths as fast as user-time while
the game is running. However, just as in the first example, when all
is said and done we will have an hour-long video at 60 fps.


** COMMENT proposed names for the new timer
# METRONOME
# IsoTimer
# EvenTimer
# PulseTimer
# FixedTimer
# RigidTimer
# FixedTempo
# RegularTimer
# MetronomeTimer
# ConstantTimer
# SteadyTimer


** =IsoTimer= records time like a metronome

The easiest way to achieve this special timing is to create a new
timer that always reports the same framerate to JME3 every time it is
called.


=./src/com/aurellem/capture/IsoTimer.java=
#+INCLUDE: "../../jmeCapture/src/com/aurellem/capture/IsoTimer.java" src java

If an Application uses this =IsoTimer= instead of the normal one, we
can be sure that every call to =simpleUpdate=, for example,
corresponds to exactly $(\frac{1}{fps})$ seconds of game-time.

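The real =IsoTimer= source is pulled in by the =#+INCLUDE= directive
above. For readers of this org source, here is a minimal,
self-contained sketch of the idea only (hypothetical class name; the
real class extends =com.jme3.system.Timer=): regardless of how long a
frame actually took in wall-clock time, the timer always reports the
same time-per-frame.

```java
// Sketch of the IsoTimer idea: a timer that ignores wall-clock time and
// reports a constant time-per-frame, so simulated time advances by exactly
// 1/fps seconds per rendered frame.
public class IsoTimerSketch {
    private final float fps;
    private long ticks = 0;

    public IsoTimerSketch(float fps) { this.fps = fps; }

    // Called once per rendered frame by the engine.
    public void update() { ticks++; }

    // Always the same frame duration, no matter how long the frame took.
    public float getTimePerFrame() { return 1.0f / fps; }

    // Total simulated time is simply the number of frames over the framerate.
    public float getTimeInSeconds() { return ticks / fps; }

    public static void main(String[] args) {
        IsoTimerSketch timer = new IsoTimerSketch(60.0f);
        for (int i = 0; i < 120; i++) timer.update();
        // 120 frames at 60 fps is exactly two seconds of game-time,
        // however long the loop took in user-time.
        System.out.println(timer.getTimeInSeconds());
    }
}
```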
* =VideoRecorder= manages video feeds in JMonkeyEngine.


** =AbstractVideoRecorder= provides a general framework for managing videos.

Now that the issue of time is solved, we just need a function that
writes each frame to a video. We can put this function somewhere
where it will be called exactly once per frame.

The basic functions that a =VideoRecorder= should support are
recording, starting, stopping, and possibly a cleanup step
where it finalizes the recording (e.g. by writing headers for a video
file).

An appropriate interface describing this behavior could look like
this:

=./src/com/aurellem/capture/video/VideoRecorder.java=
#+include: "../../jmeCapture/src/com/aurellem/capture/video/VideoRecorder.java" src java

JME3 already provides exactly the class we need: the =SceneProcessor=
class can be attached to any viewport, and the methods defined therein
will be called at the appropriate points in the rendering process.

However, it is also important to properly close the video stream and
write headers and such, and even though =SceneProcessor= has a
=.cleanup()= method, it is only called when the =SceneProcessor= is
removed from the =RenderManager=, not when the game is shutting down
because the user pressed ESC, for example. To obtain reliable shutdown
behavior, we also have to implement =AppState=, which provides a
=.cleanup()= method that /is/ called on shutdown.

Here is an =AbstractVideoRecorder= class that takes care of the
details of setup and teardown.

=./src/com/aurellem/capture/video/AbstractVideoRecorder.java=
#+include: ../../jmeCapture/src/com/aurellem/capture/video/AbstractVideoRecorder.java src java


** There are many options for handling video files in Java

If you want to generate video from Java, a great option is [[http://www.xuggle.com/][Xuggle]]. It
takes care of everything related to video encoding and decoding and
runs on Windows, Linux, and Mac. Out of all the video frameworks for
Java, I personally like this one the best.

Here is a =VideoRecorder= that uses [[http://www.xuggle.com/][Xuggle]] to write each frame to a
video file.

=./src/com/aurellem/capture/video/XuggleVideoRecorder.java=
#+include: ../../jmeCapture/src/com/aurellem/capture/video/XuggleVideoRecorder.java src java

With this, we are able to record video!

However, it can be hard to properly install Xuggle. If you would
rather not use Xuggle, here is an alternate class that uses [[http://www.randelshofer.ch/blog/2008/08/writing-avi-videos-in-pure-java/][Werner
Randelshofer's]] excellent pure Java AVI file writer.

=./src/com/aurellem/capture/video/AVIVideoRecorder.java=
#+include: ../../jmeCapture/src/com/aurellem/capture/video/AVIVideoRecorder.java src java

This =AVIVideoRecorder= is more limited than the
=XuggleVideoRecorder=, but has fewer external dependencies.

Finally, for those of you who prefer to create the final video from a
sequence of images, there is the =FileVideoRecorder=, which records
each frame to a folder as a sequentially numbered image file. Note
that you have to remember the FPS at which you recorded the video, as
this information is lost when saving each frame to a file.

=./src/com/aurellem/capture/video/FileVideoRecorder.java=
#+include: ../../jmeCapture/src/com/aurellem/capture/video/FileVideoRecorder.java src java
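Since a bare folder of numbered frames carries no framerate
information, one simple workaround is to write the FPS into a small
sidecar file next to the frames when you record. This is only a
sketch of the idea, with hypothetical names; the =FileVideoRecorder=
itself does nothing of the sort:

```java
// Sketch: record the framerate in a sidecar properties file alongside a
// folder of numbered frames, so it can be recovered when assembling the
// final video later. (Hypothetical helper, not part of the capture library.)
import java.io.*;
import java.util.Properties;

public class FrameRateNote {
    public static void writeFps(File dir, int fps) throws IOException {
        Properties p = new Properties();
        p.setProperty("fps", Integer.toString(fps));
        try (OutputStream out =
                 new FileOutputStream(new File(dir, "video.properties"))) {
            p.store(out, "framerate used when recording these frames");
        }
    }

    public static int readFps(File dir) throws IOException {
        Properties p = new Properties();
        try (InputStream in =
                 new FileInputStream(new File(dir, "video.properties"))) {
            p.load(in);
        }
        return Integer.parseInt(p.getProperty("fps"));
    }

    public static void main(String[] args) throws IOException {
        File dir = new File(System.getProperty("java.io.tmpdir"));
        writeFps(dir, 30);
        System.out.println(readFps(dir));  // recovers the recorded framerate
    }
}
```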

* How to record videos yourself

** Include this code.

No matter how complicated your application is, it's easy to add
support for video output with just a few lines of code.
# You can also record from multiple ViewPorts as the above example shows.

And although you can use =VideoRecorder= to record advanced
split-screen videos with multiple views, in the simplest case, you
want to capture a single view\mdash{}exactly what's on screen. In
this case, the following simple =captureVideo= method will do the job:

#+begin_src java
public static void captureVideo(final Application app,
                                final File file) throws IOException {
    final AbstractVideoRecorder videoRecorder;
    if (file.getCanonicalPath().endsWith(".avi")) {
        videoRecorder = new AVIVideoRecorder(file);
    } else if (file.isDirectory()) {
        videoRecorder = new FileVideoRecorder(file);
    } else {
        videoRecorder = new XuggleVideoRecorder(file);
    }

    Callable<Object> thunk = new Callable<Object>() {
        public Object call() {
            ViewPort viewPort =
                app.getRenderManager()
                   .createPostView("aurellem record", app.getCamera());
            viewPort.setClearFlags(false, false, false);
            // attach the GUI scenes so they appear in the recording, too
            for (Spatial s : app.getGuiViewPort().getScenes()) {
                viewPort.attachScene(s);
            }
            app.getStateManager().attach(videoRecorder);
            viewPort.addProcessor(videoRecorder);
            return null;
        }
    };
    app.enqueue(thunk);
}
#+end_src
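The file-type dispatch at the top of =captureVideo= can be tried in
isolation. This standalone sketch substitutes plain strings for the
real recorder classes but applies the same rules (=.avi= suffix,
directory, anything else):

```java
// Standalone illustration of captureVideo's recorder-selection rule,
// using strings in place of the real recorder classes.
import java.io.File;
import java.io.IOException;

public class RecorderChoice {
    public static String choose(File file) throws IOException {
        if (file.getCanonicalPath().endsWith(".avi"))
            return "AVIVideoRecorder";        // pure-Java AVI writer
        else if (file.isDirectory())
            return "FileVideoRecorder";       // numbered image frames
        else
            return "XuggleVideoRecorder";     // anything else goes to Xuggle
    }

    public static void main(String[] args) throws Exception {
        System.out.println(choose(new File("demo.avi")));
        System.out.println(choose(new File(System.getProperty("java.io.tmpdir"))));
        System.out.println(choose(new File("demo.flv")));
    }
}
```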

This method selects the appropriate =VideoRecorder= class for the file
type you specify, and instructs your application to record video to
the file.

Now that you have a =captureVideo= method, you use it like this:

- Establish an =IsoTimer= and set its framerate :: For example, if
     you want to record video with a framerate of 30 fps, include
     the following line of code somewhere in the initialization of
     your application:
#+begin_src java :exports code
this.setTimer(new IsoTimer(30));
#+end_src

- Choose the output file :: If you want to record from the game's
     main =ViewPort= to a file called =/home/r/record.flv=, then
     include the following line of code somewhere before you call
     =app.start()=:

#+begin_src java :exports code
Capture.captureVideo(app, new File("/home/r/record.flv"));
#+end_src


** Simple example


This example will record video of the ocean scene from the
JMonkeyEngine test suite.
#+begin_src java
File video = File.createTempFile("JME-water-video", ".avi");
captureVideo(app, video);
app.start();
System.out.println(video.getCanonicalPath());
#+end_src


I've added support for this under a class called
=com.aurellem.capture.Capture=. You can get it [[http://hg.bortreb.com/jmeCapture/][here]].

** Hello Video! example

I've taken [[http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/test/jme3test/helloworld/HelloLoop.java][=./jme3/src/test/jme3test/helloworld/HelloLoop.java=]] and
augmented it with video output as follows:

=./src/com/aurellem/capture/examples/HelloVideoRecording.java=
#+include: ../../jmeCapture/src/com/aurellem/capture/examples/HelloVideoRecording.java src java

The videos are created in the =hello-video= directory:

#+begin_src sh :results verbatim :exports both
du -h hello-video/*
#+end_src

#+results:
: 932K hello-video/hello-video-moving.flv
: 640K hello-video/hello-video-static.flv

And they can be immediately uploaded to YouTube:

- [[http://www.youtube.com/watch?v=C8gxVAySaPg][hello-video-moving.flv]]
#+BEGIN_HTML
<iframe width="1062" height="872"
src="http://www.youtube.com/embed/C8gxVAySaPg"
frameborder="0" allowfullscreen>
</iframe>
#+END_HTML
- [[http://www.youtube.com/watch?v=pHcFOtIS07Q][hello-video-static.flv]]
#+BEGIN_HTML
<iframe width="1062" height="872"
src="http://www.youtube.com/embed/pHcFOtIS07Q"
frameborder="0" allowfullscreen>
</iframe>
#+END_HTML


* COMMENT More Examples
** COMMENT Hello Physics

=HelloVideo= is boring. Let's add some video capturing to
=HelloPhysics= and create something fun!

This example is a modified version of =HelloPhysics= that creates four
simultaneous views of the same scene of cannonballs careening into a
brick wall.

# =./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java=
# #+include: ./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java src java

Running the program outputs four videos into the =./physics-videos=
directory.

#+begin_src sh :exports both :results verbatim
ls ./physics-videos | grep -
#+end_src

#+results:
: lower-left.flv
: lower-right.flv
: upper-left.flv
: upper-right.flv

The videos are fused together with the following =gstreamer= commands:

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
  filesrc location=./upper-right.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  jpegenc ! avimux ! filesink location=upper.flv \
  \
  filesrc location=./upper-left.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
  filesrc location=./lower-left.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  jpegenc ! avimux ! filesink location=lower.flv \
  \
  filesrc location=./lower-right.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
  filesrc location=./upper.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  videobox border-alpha=0 bottom=-480 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  jpegenc ! avimux ! filesink location=../youtube/helloPhysics.flv \
  \
  filesrc location=./lower.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  videobox top=-480 ! mix.
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.flv
#+end_src

#+results:
: 180M physics-videos/helloPhysics.flv


That's a terribly large size!
Let's compress it:

** COMMENT Compressing the HelloPhysics Video
First, we'll scale the video; then we'll decrease its bitrate. The
end result will be perfect for upload to YouTube.

#+begin_src sh :results silent
cd youtube

gst-launch-0.10 \
  filesrc location=./helloPhysics.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  `: # the original size is 1280 by 960` \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  videoscale ! \
  `: # here we scale the video down` \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  `: # and here we limit the bitrate` \
  theoraenc bitrate=1024 quality=30 ! \
  oggmux ! progressreport update-freq=1 ! \
  filesink location=./helloPhysics.ogg
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 13M youtube/helloPhysics.ogg

[[http://www.youtube.com/watch?v=WIJt9aRGusc][helloPhysics.ogg]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/WIJt9aRGusc?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html


** COMMENT failed attempts
Let's try the [[http://diracvideo.org/][Dirac]] video encoder.

#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
  filesrc location=./helloPhysics.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  schroenc ! filesink location=./helloPhysics.drc > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src


#+results:
: 142

That took 142 seconds. Let's see how it does compression-wise:

#+begin_src sh :results verbatim
du -h ./youtube/helloPhysics.drc
#+end_src

#+results:
: 22M ./physics-videos/helloPhysics.drc


#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
  filesrc location=./helloPhysics.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  theoraenc ! oggmux ! filesink location=./helloPhysics.ogg \
  > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src

#+results:
: 123

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 59M physics-videos/helloPhysics.ogg


=*.drc= files cannot be uploaded to YouTube, so I'll go with the
avi file.


** COMMENT text for videos
Video output from JMonkeyEngine3 (www.jmonkeyengine.org/) using Xuggle
(www.xuggle.com/). Everything is explained at
http://aurellem.org/cortex/capture-video.html.


Video output from JMonkeyEngine3 (www.jmonkeyengine.org/) HelloPhysics
demo application using Xuggle (www.xuggle.com/). Everything is
explained at http://aurellem.org/cortex/capture-video.html. Here,
four points of view are simultaneously recorded and then glued
together later.

JME3 Xuggle Aurellem video capture


* Showcase of recorded videos
I encoded most of the original JME3 Hello demos for your viewing
pleasure, all using the =Capture= and =IsoTimer= classes.

rlm@0
|
583 ** HelloTerrain
|
rlm@0
|
584 [[http://youtu.be/5_4wyDFwrVQ][HelloTerrain.avi]]
|
rlm@0
|
585
|
rlm@0
|
586 #+begin_html
|
rlm@573
|
587 <iframe width="1062" height="872"
|
rlm@0
|
588 src="http://www.youtube.com/embed/5_4wyDFwrVQ"
|
rlm@0
|
589 frameborder="0" allowfullscreen>
|
rlm@0
|
590 </iframe>
|
rlm@0
|
591 #+end_html
|
rlm@0
|
592
|
rlm@0
|
593 ** HelloAssets
|
rlm@0
|
594 [[http://www.youtube.com/watch?v=oGg-Q6k1BM4][HelloAssets.avi]]
|
rlm@0
|
595
|
rlm@0
|
596 #+begin_html
|
rlm@573
|
597 <iframe width="1062" height="872"
|
rlm@0
|
598 src="http://www.youtube.com/embed/oGg-Q6k1BM4?hl=en&fs=1"
|
rlm@0
|
599 frameborder="0" allowfullscreen>
|
rlm@0
|
600 </iframe>
|
rlm@0
|
601 #+end_html
|
rlm@0
|
602
|
rlm@0
|
603 ** HelloEffects
|
rlm@0
|
604 [[http://www.youtube.com/watch?v=TuxlLMe53hA][HelloEffects]]
|
rlm@0
|
605
|
rlm@0
|
606 #+begin_html
|
rlm@573
|
607 <iframe width="1062" height="872"
|
rlm@0
|
608 src="http://www.youtube.com/embed/TuxlLMe53hA?hl=en&fs=1"
|
rlm@0
|
609 frameborder="0" allowfullscreen>
|
rlm@0
|
610 </iframe>
|
rlm@0
|
611 #+end_html
|
rlm@0
|
612
|
rlm@0
|
613 ** HelloCollision
|
rlm@0
|
614 [[http://www.youtube.com/watch?v=GPlvJkiZfFw][HelloCollision.avi]]
|
rlm@0
|
615
|
rlm@0
|
616 #+begin_html
|
rlm@573
|
617 <iframe width="1062" height="872"
|
rlm@0
|
618 src="http://www.youtube.com/embed/GPlvJkiZfFw?hl=en&fs=1"
|
rlm@0
|
619 frameborder="0" allowfullscreen>
|
rlm@0
|
620 </iframe>
|
rlm@0
|
621 #+end_html
|
rlm@0
|
622
|
rlm@0
|
** HelloAnimation
[[http://www.youtube.com/watch?v=SDCfOSPYUkg][HelloAnimation.avi]]

#+begin_html
<iframe width="1062" height="872"
src="http://www.youtube.com/embed/SDCfOSPYUkg?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

** HelloLoop
[[http://www.youtube.com/watch?v=mosZzzcdE5w][HelloLoop.avi]]

#+begin_html
<iframe width="1062" height="872"
src="http://www.youtube.com/embed/mosZzzcdE5w?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

*** COMMENT gstreamer commands used to transcode the videos

#+begin_src sh
gst-launch-0.10 \
   filesrc location=./helloPhy ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
   x264enc ! avimux ! filesink location=helloPhysics.avi

gst-launch-0.10 \
   filesrc location=./HelloAnimationStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloAnimation.avi \
   \
   filesrc location=./HelloAnimationMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.

gst-launch-0.10 \
   filesrc location=./HelloCollisionMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! \
   filesink location=../youtube/HelloCollision.avi

gst-launch-0.10 \
   filesrc location=./HelloEffectsStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloEffects.avi \
   \
   filesrc location=./HelloEffectsMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.

gst-launch-0.10 \
   filesrc location=./HelloTerrainMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! \
   filesink location=../youtube/HelloTerrain.avi

gst-launch-0.10 \
   filesrc location=./HelloAssetsStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloAssets.avi \
   \
   filesrc location=./HelloAssetsMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.

gst-launch-0.10 \
   filesrc location=./HelloNodeStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloNode.avi \
   \
   filesrc location=./HelloNodeMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.

gst-launch-0.10 \
   filesrc location=./HelloLoopStatic.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox border-alpha=0 left=-640 ! \
   videomixer name=mix ! ffmpegcolorspace ! videorate ! \
   video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
   x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
   filesink location=../youtube/HelloLoop.avi \
   \
   filesrc location=./HelloLoopMotion.flv ! decodebin ! \
   videoscale ! ffmpegcolorspace ! \
   video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
   videobox right=-640 ! mix.
#+end_src
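
Each of the two-input pipelines above follows the same side-by-side
pattern: =videobox= pads a 640x480 clip out to the full 1280-pixel
mixed width (a negative =left= or =right= value adds that many border
pixels on that side), and =videomixer= overlays the padded streams.
As a quick sanity check of that geometry, here is a small shell sketch
(my own illustration, not one of the commands above) that computes the
=videobox= values for placing =n= equal-width clips in a row:

#+begin_src sh
# For n clips of width w mixed side by side, clip i needs
# i*w border pixels on its left and (n-1-i)*w on its right,
# expressed as negative videobox left=/right= values.
n=2
w=640
total=$(( n * w ))
i=0
while [ $i -lt $n ]; do
    left=$(( 0 - i * w ))
    right=$(( 0 - (total - (i + 1) * w) ))
    echo "clip $i: left=$left right=$right"
    i=$(( i + 1 ))
done
#+end_src

For two 640-pixel clips this prints =left=0 right=-640= for the first
clip and =left=-640 right=0= for the second, matching the
=videobox right=-640= / =videobox left=-640= pairs used in every
two-input pipeline above.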