HOWTO Convert video files

From LinuxReviews

Converting video files between common file containers and codecs can easily be done from the command-line on Linux systems. Once you know the basic commands, starting a conversion job is quicker than using a graphical tool.

Note: This article is about video conversion and manipulation using the command-line in a terminal. Take a look at avidemux and VLC if you want a graphical program which lets you quickly convert from one format to another. See kdenlive and Pitivi if you are looking for a full-featured timeline video editor.

When to convert

There are many good reasons to convert video files you made yourself - or video files forwarded to you which you plan to distribute. Good reasons to convert a video file include:

  • Playability across devices
  • Size

This HOWTO will introduce you to some of the basic commands for the most common Linux command-line program used to convert video files from one format to another.

Sometimes file-size matters

Size is important, especially when publishing video on the web.

  • A video-file reduced by 100 MB will save you 100 GB of bandwidth if 1000 people download it;
  • The reduction of 100 MB per file will save you 1 GB of hard-drive space if you host 10 such video-files on a web-server.

This adds up. You will likely want files to be as small as possible without reducing the video quality.
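The numbers above can be checked with plain shell arithmetic; the figures below are the same made-up examples as in the list:

```shell
# Made-up example figures from the list above.
saved_per_file_mb=100
downloads=1000
files_hosted=10

# Bandwidth saved when 1000 people download one reduced file.
bandwidth_saved_gb=$(( saved_per_file_mb * downloads / 1000 ))
echo "Bandwidth saved: ${bandwidth_saved_gb} GB"   # 100 GB

# Disk space saved when hosting 10 reduced files.
disk_saved_gb=$(( saved_per_file_mb * files_hosted / 1000 ))
echo "Disk space saved: ${disk_saved_gb} GB"       # 1 GB
```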


  • If you are distributing your video on the web to a general audience then proprietary H.264 (MPEG/AVC) video with AAC audio is the best choice for maximum playability.
  • If your audience is primarily Linux-focused then VP9 video with Opus audio in a .webm container is the best choice.
    • iToddlers can't play WebM and Apple appears to be hell-bent on keeping it that way.

You will see your shareholder value decrease if you only provide videos in the WebM format on a website targeting a broad audience.

The Basic Tools for ALL Your Video Conversion Needs

There is one basic GNU/Linux tool you absolutely must learn and that is: ffmpeg

Three other tools exist: mencoder, ffmpeg2theora and transcode. They are wildly outdated and not worth your time. mencoder has, for the last decade, barely been updated beyond ensuring it still compiles. transcode has not even got that much attention. ffmpeg2theora is no longer relevant since VP9 superseded the Theora video format.

You need to enable the RPM Fusion repository to get a fully featured version of ffmpeg on Fedora due to imaginary property laws in the US.

ffmpeg for beginners

The most basic use of ffmpeg is simply:

Shell command(s):
ffmpeg -i YourAwesomeMovie-Input.dv YourAwesomeMovie-Output.webm

That's it. ffmpeg -i input.ext output.ext is all you need to remember for the most basic use-case. ffmpeg detects the input format automatically and will look at the output file extension and try to intelligently decide what audio and video codecs to use.

You can specify the audio and video codecs used by input files, but there is rarely a need; the built-in input detection works in 99% of all cases.

  • An output filename with the .mp4 extension will make ffmpeg assume that you want a file with H.264/AVC video and AAC 2.0 audio. These happen to be the correct choices for HTML5 web video files which can be played on all devices.
  • An output file with the .webm extension will use VP9 video and Opus audio. This is a great choice for videos posted on websites that are not visited by too many Apple-users.
  • An output filename signalling the .mkv container will get H.264 video and Vorbis audio. This is likely NOT what you want in a .mkv file.
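A small wrapper can make the extension-based defaults explicit. This is a hypothetical helper sketch; the function name and the codec picks (following the list above, swapping Vorbis for Opus in the .mkv case) are this example's assumptions:

```shell
# Pick explicit codec flags from the output file extension instead of
# relying on ffmpeg's guessing. Hypothetical helper function.
codec_flags_for() {
    case "${1##*.}" in
        mp4)  printf '%s\n' "-c:v libx264 -c:a aac" ;;        # H.264/AVC + AAC
        webm) printf '%s\n' "-c:v libvpx-vp9 -c:a libopus" ;; # VP9 + Opus
        mkv)  printf '%s\n' "-c:v libx264 -c:a libopus" ;;    # avoid the Vorbis default
        *)    printf '\n' ;;                                  # let ffmpeg decide
    esac
}

codec_flags_for video.webm   # prints: -c:v libvpx-vp9 -c:a libopus
```

You could then run ffmpeg -i input.mkv $(codec_flags_for output.webm) output.webm.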

Specifying audio and video format

Add -c: followed by a for audio or v for video to set the audio and video codecs (-c:v for video and -c:a for audio).

If you want VP8 video with vorbis audio in a .webm container you use -c:v libvpx -c:a libvorbis like this:

Shell command(s):
ffmpeg -i wjsn-save-you-save-me.mp4 -c:v libvpx -c:a libvorbis wjsn-is-the-best-we-love-them.webm
Note: ffmpeg supports a large variety of encoders depending on how it was compiled. You can get a very large list of all the encoders your version supports by running ffmpeg -encoders. You probably want to pipe it to a pager like less: ffmpeg -encoders | less

Moving on to more advanced encoding

The secret ffmpeg manual page is very long. It contains lots of detailed information about the numerous options available when using this tool. Whole books have been, and are being, written about the advanced options listed in the manual page.

Things you may want to tune when encoding include:

  • -r to change the frame rate
  • -b:v to set the video bitrate
  • -b:a to set the audio bitrate

These options can have a huge impact on the quality and the file-size of the resulting file.

A general rule is: You can have low file-size OR high picture quality, but you can NOT have both. You will have to compromise.
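You can estimate the resulting file-size up front: size in kilobytes is roughly (video kbit/s + audio kbit/s) × seconds ÷ 8. A plain-shell sketch of that arithmetic:

```shell
# Estimate output size in MB from the bitrates and the duration.
# size_MB ≈ (v_kbps + a_kbps) * seconds / 8 / 1000
estimate_size_mb() {
    v_kbps=$1; a_kbps=$2; seconds=$3
    echo $(( (v_kbps + a_kbps) * seconds / 8 / 1000 ))
}

# A 10-minute clip at 4000k video + 128k audio:
estimate_size_mb 4000 128 600    # prints 309 (MB)
```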

This may give you an acceptable quality video file on lower resolutions like 720p:

Shell command(s):
ffmpeg -i YourAwesomeMovie.dv -r 25 -b:a 128k -b:v 4000k YourAwesomeMovie.mp4

..and this will absolutely NOT, because 256k is not enough bandwidth for acceptable quality even at tiny resolutions:

Shell command(s):
ffmpeg -i YourAwesomeMovie.dv -r 25 -b:a 64k -b:v 256k YourAwesomeMovie.mp4

"strict experimental"

Some versions of ffmpeg will refuse to use certain codecs unless you specify -strict experimental

Try adding it if ffmpeg refuses to do something when your options should work.

Encoding options for H264 / MPEG-AVC video

libx264 as in -c:v libx264 is the codec for H264/MPEG-AVC video.

Output quality can be adjusted in many ways. We recommend sticking with the Constant Rate Factor encoding option set by -crf. The -crf values worth considering are 20 to 30, where 20 produces very high quality and 30 produces acceptable quality. You can use lower values for even better quality or higher values for worse; the allowed range is 0-51. 20-30 is what you want to use.

You should also add a -preset option; the choices are ultrafast, superfast, veryfast, faster, fast, medium (default), slow, slower and veryslow. The presets trade encoding speed for compression efficiency. Thus, if you use a fixed quality setting like -crf you will get the exact same quality with superfast and veryslow but the file size will be bigger when using superfast. The medium default is fine.

Do note that -preset will change quality if you are using fixed constant bitrate encoding.

An example of encoding H264 with a Constant Rate Factor of 21 would be:

Shell command(s):
ffmpeg -i inputfile.mkv -c:v libx264 -preset slow -crf 21 output.mp4
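If you are unsure which CRF value suits your material, encode a short sample at several values and compare the results. A dry-run sketch that merely prints one command per candidate value (the filenames are hypothetical):

```shell
# Print (not run) one x264 command per candidate CRF value so a
# short sample can be encoded at each and compared for size/quality.
crf_sweep() {
    for crf in 20 23 26 30; do
        echo "ffmpeg -i $1 -c:v libx264 -preset medium -crf $crf ${1%.*}-crf${crf}.mp4"
    done
}

crf_sweep sample.mkv
```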

You might want to make a simple script for encoding a format you use frequently such as:

File: $HOME/bin/

#!/bin/sh
# Encode the argument to H.264/AAC; the output gets a .hq.mp4 suffix.
nice -n 19 ffmpeg -i "$1" \
  -strict experimental \
  -c:v libx264 -preset slow -crf 21 \
  -c:a aac -b:a 196k \
  -ac 2 \
  "${1%.*}.hq.mp4"
exit 0

With that you could just run

Shell command(s): inputfile.mkv

and get an H.264 file named inputfile.hq.mp4.

You can obviously throw in more options such as -vf scale if you, for example, convert video to 720p regularly:

File: $HOME/bin/

#!/bin/sh
# Encode and downscale to at most 1280 pixels wide; the .720p.mp4
# output suffix is this example's choice. The -2 in the scale filter
# keeps the height even, which libx264 requires.
nice -n 19 ffmpeg -i "$1" \
  -strict experimental \
  -c:v libx264 -preset slow -crf 21 \
  -c:a aac -b:a 196k \
  -vf scale="'if(gt(iw,1280),1280,iw)':-2" \
  -ac 2 \
  "${1%.*}.720p.mp4"

See H.264 Video Encoding Guide for more options.

Encoding options for VP9

ffmpeg needs the libvpx-vp9 codec, specified with -c:v libvpx-vp9, to produce VP9 video. It defaults to average bitrate encoding. That mode is not ideal for visual quality but may be preferred when streaming to bandwidth-restricted devices. You likely do not want to use it.

For libvpx-vp9, Constant Quality or Constrained Quality are the best choices. The difference is that Constrained Quality allows you to set a ceiling on how high the variable bitrate required for the constant quality you selected can go. These modes are invoked by the switch -crf with a value between 0-63 together with -b:v and a bitrate cap. The cap must be specified as 0 if you want no cap.

CRF values for VP9 differ from those for x264. Lower values mean better quality, higher mean worse. The CRF value required for good visual quality varies with resolution when using libvpx-vp9: 30 is a good choice for 1080p, 33 is a good choice for 720p and 16 works fine at 4K.
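Those per-resolution recommendations can be put in a tiny helper; this is a hypothetical function that just returns the values from the paragraph above based on the video height:

```shell
# Suggest a libvpx-vp9 CRF value from the video height, using the
# rough per-resolution guidance above. Hypothetical helper.
vp9_crf_for_height() {
    if   [ "$1" -ge 2160 ]; then echo 16   # 4K
    elif [ "$1" -ge 1080 ]; then echo 30   # 1080p
    else                         echo 33   # 720p and below
    fi
}

vp9_crf_for_height 1080   # prints 30
```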

An example of Constant Quality VP9 encoding would be:

Shell command(s):
ffmpeg -i WJSN-LaLaLa.mp4 -c:v libvpx-vp9 -crf 30 -b:v 0 -c:a opus WJSN-LaLaLa.webm

And Constrained Quality mode is achieved by setting -b:v to a bitrate value instead of 0:

Shell command(s):
ffmpeg -i WJSN-LaLaLa.mp4 -c:v libvpx-vp9 -crf 30 -b:v 9000k -c:a opus WJSN-LaLaLa.webm

libvpx-vp9 is essentially single-threaded when using the default options. You will probably want to use all of these options in addition to -crf when encoding:

  • -threads 4
  • -row-mt 1
  • -tile-columns 6
  • -frame-parallel 1
  • -auto-alt-ref 1
  • -lag-in-frames 25

What do all those options do? Briefly: -row-mt and -tile-columns let the encoder work on several parts of each frame in parallel, -frame-parallel makes the output decodable in parallel, and -auto-alt-ref with -lag-in-frames lets the encoder look ahead and use alternate reference frames. The sum of them a) produces slightly better quality and b) makes ffmpeg encode using more than one core. The -threads option limits how many.

You may want to script this in a file such as:

File: $HOME/bin/

#!/bin/sh
# Encode the argument to VP9/Opus in a WebM container; outputf is
# derived from the input filename.
outputf="${1%.*}.webm"
nice -n 19 ffmpeg -i "$1" \
  -c:v libvpx-vp9 -crf 30 -b:v 9000k \
  -threads 4 -row-mt 1 \
  -tile-columns 6 -frame-parallel 1 \
  -auto-alt-ref 1 -lag-in-frames 25 \
  -ac 2 -c:a libopus -b:a 256k \
  -strict experimental \
  -f webm "${outputf}"

Quick note on Opus audio: Some claim that Opus is so good that -b:a 64k is fine for audio. The percentage of a video file used to store the audio is negligible and there is no good reason to sacrifice audio bitrate. None. 256k is fine.

Resizing a video to lower resolutions

The -vf ("video filter") option allows you to invoke scale, and the scale filter can be used in many ways. This is an efficient way to use it to get a 720p video - which has a width of 1280:

-vf scale="'if(gt(iw,1280),1280,iw)':-1"

This makes ffmpeg decide the correct height based on the aspect ratio and the width. If the encoder complains about the height not being divisible by 2, use -2 instead of -1; it rounds the height to an even number.

You can use this in a full command like:

Shell command(s):
ffmpeg -i 4kvideofile.mkv -vf scale="'if(gt(iw,1280),1280,iw)':-1" -c:v libvpx-vp9 -c:a opus downscaledto720p.webm
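What the scale expression computes can be checked by hand: the output height is the input height scaled by 1280/iw. A plain-shell sketch of the same arithmetic, rounded down to an even number as scale with -2 would do:

```shell
# Compute the height a 1280-wide downscale produces, preserving the
# aspect ratio and rounding down to an even number (codecs want even).
height_at_1280() {
    iw=$1; ih=$2
    h=$(( ih * 1280 / iw ))
    echo $(( h - h % 2 ))    # force even, like scale=...:-2
}

height_at_1280 1920 1080   # prints 720
height_at_1280 3840 2160   # prints 720
```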

Resizing a video to higher resolutions

Doing so would simply be foolish as you gain nothing in terms of quality while the file-size would be increased dramatically.

HOWTO Make a VideoCD/VCD

This information is left here because some are still interested in making VCDs for some reason. If you are a zoomer wondering what a VCD is: It was a disc-shaped storage medium used to distribute movies before the Internet.

ffmpeg can make VCD mpeg video files using -target where the target can be "vcd", "svcd", "dvd", "dv", "pal-vcd", "ntsc-svcd". These switches will set the output format options (bitrate, codecs, buffer sizes) automatically.

The plain vcd target makes a PAL VCD:

Shell command(s):
ffmpeg -i myfile.avi -target vcd /tmp/vcd.mpg

An NTSC VCD:

Shell command(s):
ffmpeg -i myfile.avi -target ntsc-vcd /tmp/vcd.mpg

Converting a file for the VCD format using B-frames for MPEG-2:

Shell command(s):
ffmpeg -i myfile.avi -target ntsc-vcd -bf 2 /home/user/Video/vcd.mpg

Special use-cases

Making a .webm to post on that image-board

That image-board will not accept webm files with VP9 videos.

  • Container must be .webm
  • Video must be VP8
  • There can be no audio (except for specific boards where it is allowed)
  • File-size must be below 3M

With these restrictions a command for converting cute-kpop-queen-dancing.mp4 could be

Shell command(s):
ffmpeg -i cute-kpop-queen-dancing.mp4 -map 0:0 -c:v libvpx -b:v 1.4M -threads 4 cutekpopqueen.webm

The important options here are:

  • -map 0:0 specifies that only the video track (0:0) from the input should be included.
  • -c:v libvpx specifies VP8 video.
  • -b:v 1.4M specifies a bitrate of 1.4M. This is low, but file-size restrictions do apply. You may want to convert a few times, starting with -b:v 1M or -b:v 2M, and adjust the bitrate up or down depending on how close the result is to the required limit (3M on "that" imageboard).
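You can also take some of the guesswork out of the starting bitrate: for a hard size limit, the bitrate is roughly the limit in bits divided by the duration. A plain-shell sketch (the 3M limit and the clip length are example figures):

```shell
# Rough starting video bitrate (kbit/s) to hit a size limit:
# kbps = limit_MB * 8 * 1000 / seconds (decimal MB, video-only file).
bitrate_for_limit() {
    limit_mb=$1; seconds=$2
    echo $(( limit_mb * 8 * 1000 / seconds ))
}

# A 15-second clip under the 3M limit:
bitrate_for_limit 3 15   # prints 1600, so try -b:v 1600k and adjust
```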

HOWTO make animated .gif files

The outdated video-player mplayer combined with the ImageMagick image manipulation toolbox lets you convert short video files to .gif files.

  1. Make sure your video file is short. This is highly important as you do want to make sure your animated .gif ends up being reasonably small.
  2. Convert your video file to a very small resolution. Again, a 20 MB .mp4 file is alright, a 20 MB .gif file is NOT.

First, use mplayer's -vo option to produce a series of images, either png or jpg.

Shell command(s):
mplayer -vo png myvideoclip.mp4


Shell command(s):
mplayer -vo jpeg myvideoclip.mp4

You should now have a folder full of incriminating images. You can use them all or thin them out (keeping only every fourth image, for example).
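Thinning the frames can be scripted. A sketch that keeps every fourth frame and deletes the rest, demonstrated on a scratch directory with dummy files so it is safe to run as-is:

```shell
# Keep every fourth frame, delete the rest. Demonstrated on a
# scratch directory filled with dummy frames.
tmpdir=$(mktemp -d)
for i in 1 2 3 4 5 6 7 8; do touch "$tmpdir/frame$i.png"; done

n=0
for f in "$tmpdir"/*.png; do
    n=$(( n + 1 ))
    [ $(( n % 4 )) -ne 1 ] && rm "$f"   # keep frames 1, 5, ...
done

ls "$tmpdir"   # frame1.png and frame5.png remain
```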

It is now time to use the awesome power of imagemagick's convert tool (secret convert manual page) to merge all the png (or jpg) images into one big animated .gif file:

Shell command(s):
convert *.png animation.gif

Beware that convert has a secret -delay option which can be used to set the time between images in the animated .gif file; the unit is 1/100 of a second, so convert -delay 20 *.png animation.gif gives 0.2 seconds per frame.

Corner-case errors

Too many packets buffered for output stream 0:1.
[libopus @ 0x562117d06100] 1 frames left in the queue on closing

This can happen when trying to convert some files created with OBS, typically when the file has sparse audio or video frames. Adding this parameter to the ffmpeg command will solve it:

-max_muxing_queue_size 400

While it is somewhat unclear why this seemingly randomly happens, the -max_muxing_queue_size 400 option appears to solve it every time.

