FFmpeg
[https://ffmpeg.org/ FFmpeg] (Fast Forward MPEG) is a library for encoding and decoding multimedia.
You can interact with FFmpeg using its command-line interface or its [https://ffmpeg.org/doxygen/trunk/index.html C API].
Note that many tasks that only involve decoding or encoding can be done by calling the CLI application and piping data to its stdin or from its stdout.
==CLI==
* Linux: [https://johnvansickle.com/ffmpeg/ https://johnvansickle.com/ffmpeg/]
* Windows: [https://ffmpeg.zeranoe.com/builds/ https://ffmpeg.zeranoe.com/builds/]
If you need NVENC support, you can build FFmpeg with https://github.com/markus-perl/ffmpeg-build-script.
Basic usage is as follows:
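For example, a plain transcode from one container to another (the file names are placeholders; FFmpeg picks default codecs from the output extension):
<pre>
ffmpeg -i input.mp4 output.mkv
</pre>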
If you want a better-quality GIF, you can use the following filter_complex:
<pre>
[0]split=2[v1][v2];[v1]palettegen=stats_mode=full[palette];[v2][palette]paletteuse=dither=sierra2_4a
</pre>
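As a full command (the input and output names are placeholders):
<pre>
ffmpeg -i input.mp4 -filter_complex "[0]split=2[v1][v2];[v1]palettegen=stats_mode=full[palette];[v2][palette]paletteuse=dither=sierra2_4a" output.gif
</pre>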
Here is another script from [https://superuser.com/questions/556029/how-do-i-convert-a-video-to-gif-using-ffmpeg-with-reasonable-quality https://superuser.com/questions/556029/how-do-i-convert-a-video-to-gif-using-ffmpeg-with-reasonable-quality]
===Pipe to stdout===
Below is an example of piping only the video stream to stdout:
<pre>
ffmpeg -i video.webm -pix_fmt rgb24 -f rawvideo -
</pre>
In Python, you can read it as follows:
<syntaxhighlight lang="python">
import subprocess
import numpy as np

video_width = 1920
video_height = 1080
# The command above, with "-" writing raw RGB frames to stdout.
ffmpeg_command = ["ffmpeg", "-i", "video.webm",
                  "-pix_fmt", "rgb24", "-f", "rawvideo", "-"]
ffmpeg_process = subprocess.Popen(ffmpeg_command,
                                  stdout=subprocess.PIPE,
                                  stderr=subprocess.PIPE)
# One frame is width * height * 3 bytes (rgb24).
raw_image = ffmpeg_process.stdout.read(
    video_width * video_height * 3)
image = (np.frombuffer(raw_image, dtype=np.uint8)
         .reshape(video_height, video_width, 3))
</syntaxhighlight>
==Filters==
Filters are part of the CLI.<br>
[https://ffmpeg.org/ffmpeg-filters.html https://ffmpeg.org/ffmpeg-filters.html]
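Simple filtergraphs are applied with <code>-vf</code>; for example, scaling to 1280 pixels wide with the height chosen automatically (file names are placeholders):
<pre>
ffmpeg -i input.mp4 -vf "scale=1280:-2" output.mp4
</pre>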
===Crop===
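<code>crop=w:h:x:y</code> keeps a w×h region whose top-left corner is at (x, y). A minimal sketch (the values here are illustrative, not from the original article):
<pre>
ffmpeg -i input.mp4 -vf "crop=640:360:0:0" output.mp4
</pre>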
===Stack and Unstack===
To stack, see [https://ffmpeg.org/ffmpeg-all.html#hstack <code>hstack</code>], [https://ffmpeg.org/ffmpeg-all.html#vstack <code>vstack</code>].
To unstack, see <code>crop</code>.
<pre>
ffmpeg -i $1 -i $2 -i $3 -filter_complex "[0]split[t1][t2];[t1][t2]vstack" output.mkv -y
</pre>
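To unstack a vertically stacked video back into its halves, <code>crop</code> can be used; a minimal sketch (not from the original article, file names are placeholders):
<pre>
ffmpeg -i stacked.mkv -filter_complex "[0]split[a][b];[a]crop=iw:ih/2:0:0[top];[b]crop=iw:ih/2:0:ih/2[bottom]" -map "[top]" top.mkv -map "[bottom]" bottom.mkv -y
</pre>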
===Replace transparency===
[https://superuser.com/questions/1341674/ffmpeg-convert-transparency-to-a-certain-color Reference]<br>
Add a solid background (white in this example) behind transparent content.<br>
<pre>
ffmpeg -i in.mov -filter_complex "[0]format=pix_fmts=yuva420p,split=2[bg][fg];[bg]drawbox=c=white@1:replace=1:t=fill[bg];[bg][fg]overlay=format=auto" -c:a copy new.mov
</pre>
===Draw Text===
https://stackoverflow.com/questions/15364861/frame-number-overlay-with-ffmpeg
The example below overlays the current frame number on the video:
<pre>
ffmpeg -i input -vf "drawtext=fontfile=Arial.ttf: text='%{frame_num}': start_number=1: x=(w-tw)/2: y=h-(2*lh): fontcolor=black: fontsize=20: box=1: boxcolor=white: boxborderw=5" -c:a copy output
</pre>
==C API==
A Doxygen reference manual for the C API is available at [https://ffmpeg.org/doxygen/trunk/index.html].<br>
Note that FFmpeg is licensed under the LGPL, with some optional components under the GPL.<br>
If you only need to do encoding and decoding, you can simply pipe the inputs and outputs of the FFmpeg CLI to your program [https://batchloaf.wordpress.com/2017/02/12/a-simple-way-to-read-and-write-audio-and-video-files-in-c-using-ffmpeg-part-2-video/].<br>
===Getting Started===
The best way to get started is to look at the [https://ffmpeg.org/doxygen/trunk/examples.html official examples].
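For instance, a minimal sketch (error handling mostly omitted) that opens a file and prints its stream information, in the spirit of the official demuxing examples:
<syntaxhighlight lang="c">
#include <libavformat/avformat.h>
#include <stdio.h>

int main(void)
{
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, "input.mp4", NULL, NULL) < 0) {
        fprintf(stderr, "could not open input\n");
        return 1;
    }
    avformat_find_stream_info(fmt, NULL);    // probe streams (codecs, resolution, ...)
    av_dump_format(fmt, 0, "input.mp4", 0);  // print a summary to stderr
    avformat_close_input(&fmt);
    return 0;
}
</syntaxhighlight>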
====Structs====
* [https://www.ffmpeg.org/doxygen/trunk/structAVInputFormat.html <code>AVInputFormat</code>]/[https://www.ffmpeg.org/doxygen/trunk/structAVOutputFormat.html <code>AVOutputFormat</code>] Represents a container type.
===Muxing to memory===
You can specify a custom <code>AVIOContext</code> and attach it to your <code>AVFormatContext->pb</code> to mux directly to memory or to implement your own buffering.
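A minimal sketch of this (not from the original article; the callback, buffer handling, and the choice of the MPEG-TS muxer are illustrative assumptions):
<syntaxhighlight lang="c">
#include <libavformat/avformat.h>
#include <libavutil/mem.h>

// Called by the muxer whenever it has bytes to output; append them to
// your own in-memory buffer (passed via `opaque`) here.
// Note: in FFmpeg 7+ the buffer parameter is `const uint8_t *`.
static int write_packet(void *opaque, uint8_t *buf, int buf_size)
{
    /* ... copy buf[0..buf_size) into your growable buffer ... */
    return buf_size;
}

static AVFormatContext *open_in_memory_muxer(void *opaque)
{
    const int io_buffer_size = 4096;
    unsigned char *io_buffer = av_malloc(io_buffer_size);
    AVIOContext *avio = avio_alloc_context(
        io_buffer, io_buffer_size, 1 /* write_flag */, opaque,
        NULL /* read_packet */, write_packet, NULL /* seek */);

    AVFormatContext *fmt = NULL;
    // MPEG-TS is chosen here because it can be written without seeking.
    avformat_alloc_output_context2(&fmt, NULL, "mpegts", NULL);
    fmt->pb = avio;                      // attach the custom AVIOContext
    fmt->flags |= AVFMT_FLAG_CUSTOM_IO;  // we own fmt->pb, not libavformat
    return fmt;
}
</syntaxhighlight>
Because <code>AVFMT_FLAG_CUSTOM_IO</code> is set, libavformat will not close <code>fmt->pb</code> for you; free the <code>AVIOContext</code> and its buffer yourself when done.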
===NVENC===
You can try [https://github.com/PyAV-Org/PyAV pyav], which contains bindings for the library; however, I haven't tried it.
If you just need to call the CLI, you can use [https://github.com/kkroening/ffmpeg-python ffmpeg-python] to help build calls.
==JavaScript API==
To use FFmpeg in a browser, see [https://ffmpegwasm.netlify.app/ ffmpegwasm].
This is used in https://davidl.me/apps/media/index.html.
==My Preferences==
My preferences for encoding video:
===AV1===
Prefer AV1 for encoding video on modern devices.
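A sketch of an AV1 encode using SVT-AV1 (the settings are illustrative rather than from the original notes; <code>-crf</code> for <code>libsvtav1</code> needs a reasonably recent FFmpeg, older builds use <code>-qp</code>):
<syntaxhighlight lang="bash">
ffmpeg -i $1 -c:v libsvtav1 -crf 30 -preset 6 -c:a libopus -b:a 128K $2
</syntaxhighlight>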
===H265/HEVC===
H265/HEVC is now a good tradeoff between size, quality, and compatibility.
It has been supported on devices since Android 5.0 (2014).
<syntaxhighlight lang="bash">
ffmpeg -i $1 -c:v libx265 -crf 23 -preset slow -pix_fmt yuv444p10le -c:a libopus -b:a 128K $2
</syntaxhighlight>
;Notes
* The pixel format <code>yuv444p10le</code> is 10-bit color without chroma subsampling. If your source is lower (e.g. 8-bit 4:2:0), you can use <code>yuv420p</code> instead for 8-bit color and 4:2:0 chroma subsampling.
===H264===
Use H264 if you need compatibility with very old or low-end devices.
<syntaxhighlight lang="bash">
ffmpeg -i $1 -c:v libx264 -crf 28 -preset medium -pix_fmt yuv420p -c:a libfdk_aac -b:a 128K $2
</syntaxhighlight>
===Opus===
For streaming:
<syntaxhighlight lang="bash">
ffmpeg -i input.wav -c:a libopus -b:a 96k output.opus
</syntaxhighlight>
See https://wiki.xiph.org/Opus_Recommended_Settings |