#+title: Capture Live Video Feeds from JMonkeyEngine
#+author: Robert McIntyre
#+email: rlm@mit.edu
#+description: Capture video from a JMonkeyEngine3 Application with Xuggle, and use gstreamer to compress the video to upload to YouTube.
#+keywords: JME3, video, Xuggle, JMonkeyEngine, YouTube, capture video, Java
#+SETUPFILE: ../../aurellem/org/setup.org
#+INCLUDE: ../../aurellem/org/level-0.org

* The Problem

So you've made your cool new JMonkeyEngine3 game and you want to
create a demo video to show off your hard work. Screen capturing is
the most straightforward way to do this, but it can slow down your
game and produce low-quality video as a result. A better way is to
record a video feed directly from the game while it is running.

In this post, I'll explain how you can alter your JMonkeyEngine3 game
to output video while it is running. The main trick is to alter the
pace of JMonkeyEngine3's in-game time: we allow the engine as much
time as it needs to compute complicated in-game events and to encode
video frames. As a result, the game appears to speed up and slow down
as the computational demands shift, but the end result is perfectly
smooth video output at a constant framerate.

* Video recording requires a steady framerate
** The built-in =Timer= rushes to keep up.
#* Game-time vs. User-time vs. Video-time

Standard JME3 applications use a =Timer= object to manage time in the
simulated world. Because most JME3 applications (e.g. games) are
supposed to happen \ldquo{}live\rdquo{}, the built-in =Timer= requires
simulated time to match real time. This means that the application
must rush to finish all of its calculations on schedule: the more
complicated the calculations, the more the application is obligated
to rush. And if the workload becomes too much to handle on schedule,
=Timer= forces the application to cut corners: it demands fast,
approximate answers instead of careful, accurate ones. Although this
policy sometimes causes physically impossible glitches and choppy
framerates, it ensures that the user will never be kept waiting while
the computer stops to make a complicated calculation.

Now, the built-in =Timer= values speed over accuracy because
real-time applications require it. On the other hand, if your goal is
to record a glitch-free video, you need a =Timer= that will take its
time to ensure that all calculations are accurate, even if they take
a long time. In the next section, we will create a new kind of
=Timer=\mdash{}called =IsoTimer=\mdash{}which slows down to let the
computer finish all its calculations. The result is a perfectly
steady framerate and a flawless physical simulation.

# are supposed to happen \ldquo live \rdquo, this =Timer= requires the
# application to update in real-time. In order to keep up with the
# real world, JME applications cannot afford to take too much time on
# expensive computations. Whenever the workload becomes too much for
# the computer to handle on schedule, =Timer= forces the computer to
# cut corners, giving fast, approximate answers instead of careful,
# accurate ones. Although physical accuracy sometimes suffers as a
# result, this policy ensures that the user will never be kept waiting
# while the computer stops to make a complicated calculation.

#fast answers are more important than accurate ones.

# A standard JME3 application that extends =SimpleApplication= or
# =Application= tries as hard as it can to keep in sync with
# /user-time/. If a ball is rolling at 1 game-mile per game-hour in
# the game, and you wait for one user-hour as measured by the clock on
# your wall, then the ball should have traveled exactly one
# game-mile. In order to keep sync with the real world, the game
# throttles its physics engine and graphics display. If the
# computations involved in running the game are too intense, then the
# game will first skip frames, then sacrifice physics accuracy. If
# there are particularly demanding computations, then you may only get
# 1 fps, and the ball may tunnel through the floor or obstacles due to
# inaccurate physics simulation, but after the end of one user-hour,
# that ball will have traveled one game-mile.

# When we're recording video, we don't care if the game-time syncs
# with user-time, but instead whether the time in the recorded video
# (video-time) syncs with user-time. To continue the analogy, if we
# recorded the ball rolling at 1 game-mile per game-hour and watched
# the video later, we would want to see 30 fps video of the ball
# rolling at 1 video-mile per /user-hour/. It doesn't matter how much
# user-time it took to simulate that hour of game-time to make the
# high-quality recording.
** COMMENT Two examples to clarify the point:
*** Recording from a Simple Simulation

**** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour, and a really good computer. Normally,
JME3 will throttle the physics engine and graphics display to sync
the game-time with user-time. If it takes one-thousandth of a second
of user-time to simulate one-sixtieth of a second of game-time and
another one-thousandth of a second to draw to the screen, then JME3
will just sit around for the remainder of $\frac{1}{60} -
\frac{2}{1000}$ user-seconds, then calculate the next frame in
$\frac{2}{1000}$ user-seconds, then wait, and so on. For every second
of user-time that passes, one second of game-time passes, and the
game will run at 60 frames per user-second.


**** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Assume that encoding a single frame takes 0 seconds of
user-time to complete.

Now, JME3 takes advantage of all available resources. It still takes
one-thousandth of a second to calculate a physics tick, and another
one-thousandth to render to the screen. Then it takes 0 seconds to
write the video frame to disk and encode the video. In only one
second of user-time, JME3 will complete 500
physics-tick/render/encode-video cycles, and
$\frac{500}{60}=8\frac{1}{3}$ seconds of game-time will have
passed. Game-time appears to dilate $8\frac{1}{3}\times$ with respect
to user-time, and in only 7.2 minutes of user-time, one hour of video
will have been recorded. The game itself will run at 500 fps. When
someone watches the video, they will see 60 frames per user-second,
and $\frac{1}{60}$ video-seconds will pass each frame. It will take
exactly one hour of user-time (and one hour of video-time) for the
ball in the video to travel one video-mile.
*** Recording from a Complex Simulation

**** Without a Special Timer
You have a simulation of a ball rolling on an infinite empty plane at
one game-mile per game-hour accompanied by multiple explosions
involving thousands of nodes, particle effects, and complicated
shadow shaders to create realistic shadows. You also have a slow
laptop. Normally, JME3 must sacrifice rendering and physics
simulation to try to keep up. If it takes $\frac{1}{120}$ of a
user-second to calculate $\frac{1}{60}$ game-seconds, and an
additional $\frac{1}{60}$ of a user-second to render to the screen,
then JME3 has its work cut out for it. In order to render to the
screen, it will first step the game forward by up to four physics
ticks before rendering. If it still isn't fast enough, then it will
decrease the accuracy of the physics engine until game-time and
user-time are synced or a certain threshold is reached, at which
point the game visibly slows down. In this case, JME3 continuously
repeats a cycle of two physics ticks and one screen render. For every
user-second that passes, one game-second will pass, but the game will
run at 30 fps instead of 60 fps like before.

**** With a Special Timer
Then, you change the game's timer so that user-time will be synced to
video-time. Once again, assume video encoding takes $\frac{1}{60}$ of
a user-second.

Now, JME3 will spend $\frac{1}{120}$ of a user-second to step the
physics tick $\frac{1}{60}$ game-seconds, $\frac{1}{60}$ to draw to
the screen, and an additional $\frac{1}{60}$ to encode the video and
write the frame to disk. This is a total of $\frac{1}{24}$
user-seconds for each $\frac{1}{60}$ game-seconds. It will take
$(\frac{60}{24} = 2.5)$ user-hours to record one game-hour, and
game-time will appear to flow two-fifths as fast as user-time while
the game is running. However, just as in example one, when all is
said and done we will have an hour-long video at 60 fps.


** COMMENT proposed names for the new timer
# METRONOME
# IsoTimer
# EvenTimer
# PulseTimer
# FixedTimer
# RigidTimer
# FixedTempo
# RegularTimer
# MetronomeTimer
# ConstantTimer
# SteadyTimer


** =IsoTimer= records time like a metronome

The easiest way to achieve this special timing is to create a new
timer that always reports the same framerate to JME3 every time it is
called.

=./src/com/aurellem/capture/IsoTimer.java=
#+include ../../jmeCapture/src/com/aurellem/capture/IsoTimer.java src java

If an Application uses this =IsoTimer= instead of the normal one, we
can be sure that every call to =simpleUpdate=, for example,
corresponds to exactly $\frac{1}{fps}$ seconds of game-time.
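The timing policy itself can be sketched independently of JME3. The
class below is a simplified illustration only, not the actual
jmeCapture source (the real =IsoTimer= is a JME3 =Timer=): its clock
advances by exactly one fixed step per update, no matter how much
real time each update actually took.

#+begin_src java
// A simplified, standalone illustration of a fixed-step timer.
// The real IsoTimer plugs into JME3's Timer machinery; this sketch
// only demonstrates the timing policy.
class FixedStepTimer {
    private final float framerate; // frames per game-second
    private long ticks = 0;        // updates seen so far

    FixedStepTimer(float framerate) {
        this.framerate = framerate;
    }

    // Called once per frame: time advances by exactly 1/framerate,
    // regardless of how long the frame took to compute.
    void update() {
        ticks++;
    }

    float getTimePerFrame() {
        return 1.0f / framerate;
    }

    float getTimeInSeconds() {
        return ticks / framerate;
    }

    public static void main(String[] args) {
        FixedStepTimer timer = new FixedStepTimer(60.0f);
        for (int i = 0; i < 120; i++) {
            timer.update(); // each update is 1/60 game-seconds
        }
        System.out.println(timer.getTimeInSeconds()); // prints 2.0
    }
}
#+end_src

Because the reported time depends only on the number of updates,
every frame handed to the video encoder represents the same slice of
game-time, which is exactly what a constant-framerate video needs.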

* =VideoRecorder= manages video feeds in JMonkeyEngine.

** =AbstractVideoRecorder= provides a general framework for managing videos.

Now that the issue of time is solved, we just need a function that
writes each frame to a video. We can put this function somewhere
where it will be called exactly once per frame.

The basic functions that a =VideoRecorder= should support are
recording, starting, stopping, and possibly a cleanup step where it
finalizes the recording (e.g. by writing headers for a video file).

An appropriate interface describing this behavior could look like
this:

=./src/com/aurellem/capture/video/VideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/VideoRecorder.java src java
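For readers without the included file at hand, here is a rough sketch
of the shape such an interface might take. The method names below are
illustrative assumptions, not the actual jmeCapture API:

#+begin_src java
// Sketch of a frame-recording interface; names are assumptions,
// not the actual jmeCapture VideoRecorder interface.
import java.awt.image.BufferedImage;

interface FrameRecorder {
    // called once for every rendered frame
    void record(BufferedImage frame);

    // called when recording ends: flush buffers, write any trailing
    // headers, and close the output file
    void finish();
}
#+end_src

Each concrete recorder (AVI, Xuggle, image sequence) then only has to
decide what "record one frame" and "finalize the file" mean for its
particular container format.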

JME3 already provides exactly the class we need: the =SceneProcessor=
class can be attached to any viewport, and the methods defined
therein will be called at the appropriate points in the rendering
process.

However, it is also important to properly close the video stream and
write headers and such, and even though =SceneProcessor= has a
=.cleanup()= method, it is only called when the =SceneProcessor= is
removed from the =RenderManager=, not when the game is shutting down
because the user pressed ESC, for example. To obtain reliable
shutdown behavior, we also have to implement =AppState=, which
provides a =.cleanup()= method that /is/ called on shutdown.

Here is an =AbstractVideoRecorder= class that takes care of the
details of setup and teardown.

=./src/com/aurellem/capture/video/AbstractVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/AbstractVideoRecorder.java src java


** There are many options for handling video files in Java

If you want to generate video from Java, a great option is [[http://www.xuggle.com/][Xuggle]]. It
takes care of everything related to video encoding and decoding and
runs on Windows, Linux, and Mac. Out of all the video frameworks for
Java, I personally like this one the best.

Here is a =VideoRecorder= that uses [[http://www.xuggle.com/][Xuggle]] to write each frame to a
video file.

=./src/com/aurellem/capture/video/XuggleVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/XuggleVideoRecorder.java src java

With this, we are able to record video!

However, it can be hard to properly install Xuggle. If you would
rather not use Xuggle, here is an alternate class that uses [[http://www.randelshofer.ch/blog/2008/08/writing-avi-videos-in-pure-java/][Werner
Randelshofer's]] excellent pure-Java AVI file writer.

=./src/com/aurellem/capture/video/AVIVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/AVIVideoRecorder.java src java

This =AVIVideoRecorder= is more limited than the
=XuggleVideoRecorder=, but has fewer external dependencies.

Finally, for those of you who prefer to create the final video from a
sequence of images, there is the =FileVideoRecorder=, which records
each frame to a folder as a sequentially numbered image file. Note
that you have to remember the FPS at which you recorded the video, as
this information is lost when saving each frame to a file.

=./src/com/aurellem/capture/video/FileVideoRecorder.java=
#+include ../../jmeCapture/src/com/aurellem/capture/video/FileVideoRecorder.java src java
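To illustrate the kind of naming scheme involved (the exact file
names produced by =FileVideoRecorder= are an assumption here, not
taken from its source): zero-padded sequential indices keep the
frames in order when sorted lexicographically, while the framerate
itself must be recorded separately.

#+begin_src java
// Illustration of sequentially numbered frame files; the exact
// naming convention of FileVideoRecorder is an assumption here.
class FrameNames {
    // zero-padded so lexicographic order matches frame order
    static String frameFile(int index) {
        return String.format("frame-%06d.png", index);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 3; i++) {
            System.out.println(frameFile(i));
        }
        // prints:
        // frame-000000.png
        // frame-000001.png
        // frame-000002.png
    }
}
#+end_src

An external tool can then reassemble the numbered frames into a video
at whatever FPS you noted down when recording.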



* How to record videos yourself

** Include this code.

No matter how complicated your application is, it's easy to add
support for video output with just a few lines of code.
# You can also record from multiple ViewPorts as the above example shows.

And although you can use =VideoRecorder= to record advanced
split-screen videos with multiple views, in the simplest case, you
want to capture a single view\mdash{}exactly what's on screen. In
this case, the following simple =captureVideo= method will do the
job:

#+begin_src java
public static void captureVideo(final Application app,
                                final File file) throws IOException {
    final AbstractVideoRecorder videoRecorder;
    if (file.getCanonicalPath().endsWith(".avi")) {
        videoRecorder = new AVIVideoRecorder(file);
    } else if (file.isDirectory()) {
        videoRecorder = new FileVideoRecorder(file);
    } else {
        videoRecorder = new XuggleVideoRecorder(file);
    }

    Callable<Object> thunk = new Callable<Object>() {
        public Object call() {
            ViewPort viewPort =
                app.getRenderManager()
                   .createPostView("aurellem record", app.getCamera());
            viewPort.setClearFlags(false, false, false);
            // attach the GUI scenes so they are recorded as well
            for (Spatial s : app.getGuiViewPort().getScenes()) {
                viewPort.attachScene(s);
            }
            app.getStateManager().attach(videoRecorder);
            viewPort.addProcessor(videoRecorder);
            return null;
        }
    };
    app.enqueue(thunk);
}
#+end_src
rlm@42
|
305
|
rlm@280
|
306 This method selects the appropriate =VideoRecorder= class for the file
|
rlm@280
|
307 type you specify, and instructs your application to record video to
|
rlm@280
|
308 the file.
|
ocsenave@275
|
309
|
ocsenave@275
|
310 Now that you have a =captureVideo= method, you use it like this:
|
ocsenave@275
|
311
|
ocsenave@275
|
312 - Establish an =Isotimer= and set its framerate :: For example, if
|
ocsenave@275
|
313 you want to record video with a framerate of 30 fps, include
|
rlm@306
|
314 the following line of code somewhere in the initialization of
|
ocsenave@275
|
315 your application:
|
ocsenave@275
|
316 #+begin_src java :exports code
|
ocsenave@275
|
317 this.setTimer(new IsoTimer(30));
|
ocsenave@275
|
318 #+end_src
|
ocsenave@275
|
319
|
ocsenave@275
|
320 - Choose the output file :: If you want to record from the game's
|
ocsenave@275
|
321 main =ViewPort= to a file called =/home/r/record.flv=, then
|
rlm@280
|
322 include the following line of code somewhere before you call
|
rlm@280
|
323 =app.start()=;
|
ocsenave@275
|
324
|
ocsenave@275
|
325 #+begin_src java :exports code
|
ocsenave@275
|
326 Capture.captureVideo(app, new File("/home/r/record.flv"));
|
ocsenave@275
|
327 #+end_src
|


** Simple example

This example will record video of the ocean scene from the
JMonkeyEngine test suite.

#+begin_src java
File video = File.createTempFile("JME-water-video", ".avi");
captureVideo(app, video);
app.start();
System.out.println(video.getCanonicalPath());
#+end_src

I've added support for this under a class called
=com.aurellem.capture.Capture=. You can get it [[http://hg.bortreb.com/jmeCapture/][here]].

** Hello Video! example

I've taken [[http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/test/jme3test/helloworld/HelloLoop.java][=./jme3/src/test/jme3test/helloworld/HelloLoop.java=]] and
augmented it with video output as follows:

=./src/com/aurellem/capture/examples/HelloVideo.java=
#+include ../../src/com/aurellem/capture/examples/HelloVideo.java src java

The videos are created in the =hello-video= directory:

#+begin_src sh :results verbatim :exports both
du -h hello-video/*
#+end_src

#+results:
: 932K hello-video/hello-video-moving.flv
: 640K hello-video/hello-video-static.flv

And they can be immediately uploaded to YouTube:

- [[http://www.youtube.com/watch?v=C8gxVAySaPg][hello-video-moving.flv]]
#+BEGIN_HTML
<iframe width="425" height="349"
        src="http://www.youtube.com/embed/C8gxVAySaPg"
        frameborder="0" allowfullscreen>
</iframe>
#+END_HTML
- [[http://www.youtube.com/watch?v=pHcFOtIS07Q][hello-video-static.flv]]
#+BEGIN_HTML
<iframe width="425" height="349"
        src="http://www.youtube.com/embed/pHcFOtIS07Q"
        frameborder="0" allowfullscreen>
</iframe>
#+END_HTML


* COMMENT More Examples
** COMMENT Hello Physics

=HelloVideo= is boring. Let's add some video capturing to
=HelloPhysics= and create something fun!

This example is a modified version of =HelloPhysics= that creates
four simultaneous views of the same scene of cannonballs careening
into a brick wall.

=./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java=
#+include ./jme3/src/test/jme3test/helloworld/HelloPhysicsWithVideo.java src java

Running the program outputs four videos into the =./physics-videos=
directory.

#+begin_src sh :exports both :results verbatim
ls ./physics-videos | grep -
#+end_src

#+results:
: lower-left.flv
: lower-right.flv
: upper-left.flv
: upper-right.flv

The videos are fused together with the following =gstreamer= commands:

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
    filesrc location=./upper-right.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
    videobox border-alpha=0 left=-640 ! \
    videomixer name=mix ! ffmpegcolorspace ! videorate ! \
    video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
    jpegenc ! avimux ! filesink location=upper.flv \
    \
    filesrc location=./upper-left.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
    videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
    filesrc location=./lower-left.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
    videobox border-alpha=0 left=-640 ! \
    videomixer name=mix ! ffmpegcolorspace ! videorate ! \
    video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
    jpegenc ! avimux ! filesink location=lower.flv \
    \
    filesrc location=./lower-right.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
    videobox right=-640 ! mix.
#+end_src

#+begin_src sh :results silent
cd physics-videos

gst-launch-0.10 \
    filesrc location=./upper.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
    videobox border-alpha=0 bottom=-480 ! \
    videomixer name=mix ! ffmpegcolorspace ! videorate ! \
    video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
    jpegenc ! avimux ! filesink location=../youtube/helloPhysics.flv \
    \
    filesrc location=./lower.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
    videobox top=-480 ! mix.
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.flv
#+end_src

#+results:
: 180M physics-videos/helloPhysics.flv


That's a terribly large file size! Let's compress it:

** COMMENT Compressing the HelloPhysics Video
First we'll scale the video down; then we'll decrease its bit-rate.
The end result will be perfect for upload to YouTube.

#+begin_src sh :results silent
cd youtube

gst-launch-0.10 \
    filesrc location=./helloPhysics.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    `: # the original size is 1280 by 960` \
    video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
    videoscale ! \
    `: # here we scale the video down` \
    video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
    `: # and here we limit the bitrate` \
    theoraenc bitrate=1024 quality=30 ! \
    oggmux ! progressreport update-freq=1 ! \
    filesink location=./helloPhysics.ogg
#+end_src

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 13M youtube/helloPhysics.ogg

[[http://www.youtube.com/watch?v=WIJt9aRGusc][helloPhysics.ogg]]

#+begin_html
<iframe width="425" height="349"
        src="http://www.youtube.com/embed/WIJt9aRGusc?hl=en&fs=1"
        frameborder="0" allowfullscreen>
</iframe>
#+end_html


** COMMENT failed attempts
Let's try the [[http://diracvideo.org/][Dirac]] video encoder.

#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
    filesrc location=./helloPhysics.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
    schroenc ! filesink location=./helloPhysics.drc > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src

#+results:
: 142

That took 142 seconds. Let's see how it does compression-wise:

#+begin_src sh :results verbatim
du -h ./youtube/helloPhysics.drc
#+end_src

#+results:
: 22M ./physics-videos/helloPhysics.drc

#+begin_src sh :results verbatim
cd youtube
START=$(date +%s)
gst-launch-0.10 \
    filesrc location=./helloPhysics.flv ! decodebin ! \
    videoscale ! ffmpegcolorspace ! \
    video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
    theoraenc ! oggmux ! filesink location=./helloPhysics.ogg \
    > /dev/null
echo `expr $(( $(date +%s) - $START))`
#+end_src

#+results:
: 123

#+begin_src sh :results verbatim
du -h youtube/helloPhysics.ogg
#+end_src

#+results:
: 59M physics-videos/helloPhysics.ogg


=*.drc= files cannot be uploaded to YouTube, so I'll go for the
avi file.

** COMMENT text for videos
Video output from JMonkeyEngine3 (www.jmonkeyengine.org/) using Xuggle
(www.xuggle.com/). Everything is explained at
http://aurellem.org/cortex/capture-video.html.


Video output from JMonkeyEngine3 (www.jmonkeyengine.org/) HelloPhysics
demo application using Xuggle (www.xuggle.com/). Everything is
explained at http://aurellem.org/cortex/capture-video.html. Here,
four points of view are simultaneously recorded and then glued
together later.

JME3 Xuggle Aurellem video capture

ocsenave@275
|
585 * Showcase of recorded videos
|
rlm@0
|
586 I encoded most of the original JME3 Hello demos for your viewing
|
rlm@42
|
587 pleasure, all using the =Capture= and =IsoTimer= classes.
|
rlm@0
|
588
|
rlm@0
|
** HelloTerrain
[[http://youtu.be/5_4wyDFwrVQ][HelloTerrain.avi]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/5_4wyDFwrVQ"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

** HelloAssets
[[http://www.youtube.com/watch?v=oGg-Q6k1BM4][HelloAssets.avi]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/oGg-Q6k1BM4?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

** HelloEffects
[[http://www.youtube.com/watch?v=TuxlLMe53hA][HelloEffects.avi]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/TuxlLMe53hA?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

** HelloCollision
[[http://www.youtube.com/watch?v=GPlvJkiZfFw][HelloCollision.avi]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/GPlvJkiZfFw?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

** HelloAnimation
[[http://www.youtube.com/watch?v=SDCfOSPYUkg][HelloAnimation.avi]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/SDCfOSPYUkg?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

** HelloNode
[[http://www.youtube.com/watch?v=pL-0fR0-ilQ][HelloNode.avi]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/pL-0fR0-ilQ?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html

** HelloLoop
[[http://www.youtube.com/watch?v=mosZzzcdE5w][HelloLoop.avi]]

#+begin_html
<iframe width="425" height="349"
src="http://www.youtube.com/embed/mosZzzcdE5w?hl=en&fs=1"
frameborder="0" allowfullscreen>
</iframe>
#+end_html



*** COMMENT gstreamer commands used to transcode the videos
These are the gst-launch-0.10 pipelines used to scale, combine, and
transcode the raw captures for YouTube. The =progressreport
update-freq=1= element prints encoding progress once per second.

#+begin_src sh
gst-launch-0.10 \
  filesrc location=./helloPhy ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=1280, height=960, framerate=25/1 ! \
  x264enc ! avimux ! filesink location=helloPhysics.avi

gst-launch-0.10 \
  filesrc location=./HelloAnimationStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloAnimation.avi \
  filesrc location=./HelloAnimationMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.

gst-launch-0.10 \
  filesrc location=./HelloCollisionMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! \
  filesink location=../youtube/HelloCollision.avi

gst-launch-0.10 \
  filesrc location=./HelloEffectsStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloEffects.avi \
  filesrc location=./HelloEffectsMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.

gst-launch-0.10 \
  filesrc location=./HelloTerrainMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=800, height=600, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! \
  filesink location=../youtube/HelloTerrain.avi

gst-launch-0.10 \
  filesrc location=./HelloAssetsStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloAssets.avi \
  filesrc location=./HelloAssetsMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.

gst-launch-0.10 \
  filesrc location=./HelloNodeStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloNode.avi \
  filesrc location=./HelloNodeMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.

gst-launch-0.10 \
  filesrc location=./HelloLoopStatic.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox border-alpha=0 left=-640 ! \
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \
  filesink location=../youtube/HelloLoop.avi \
  filesrc location=./HelloLoopMotion.flv ! decodebin ! \
  videoscale ! ffmpegcolorspace ! \
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \
  videobox right=-640 ! mix.
#+end_src
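The side-by-side (=Static= + =Motion=) pipelines above are identical
except for the demo name, so rather than maintaining each copy by hand
they could be generated by a small shell function. This is just a
sketch of that idea; the =side_by_side= helper is mine, not part of
the original scripts, and it assumes the same
=./<Name>{Static,Motion}.flv= file layout used above.

```sh
#!/bin/sh
# Print the gst-launch-0.10 command that places <Name>Static.flv and
# <Name>Motion.flv side by side in a 1280x480 frame and encodes the
# result with x264 into ../youtube/<Name>.avi.
side_by_side() {
    name="$1"
    cat <<EOF
gst-launch-0.10 \\
  filesrc location=./${name}Static.flv ! decodebin ! \\
  videoscale ! ffmpegcolorspace ! \\
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \\
  videobox border-alpha=0 left=-640 ! \\
  videomixer name=mix ! ffmpegcolorspace ! videorate ! \\
  video/x-raw-yuv, width=1280, height=480, framerate=25/1 ! \\
  x264enc bitrate=1024 ! avimux ! progressreport update-freq=1 ! \\
  filesink location=../youtube/${name}.avi \\
  filesrc location=./${name}Motion.flv ! decodebin ! \\
  videoscale ! ffmpegcolorspace ! \\
  video/x-raw-yuv, width=640, height=480, framerate=25/1 ! \\
  videobox right=-640 ! mix.
EOF
}

# Emit the command for each demo that has Static/Motion halves.
for demo in HelloAnimation HelloEffects HelloAssets HelloNode HelloLoop; do
    side_by_side "$demo"
done
```

Piping each generated command to =sh= (or redirecting it into a
script) would then run the actual encode.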