HOWTO enable 10-bit color on Linux

Modern monitors and AMD graphics cards support 10-bit color. It's not enabled by default on GNU/Linux distributions. Enabling it is just a matter of having one line with "DefaultDepth 30" in a configuration file in /etc/X11/xorg.conf.d/.

However, there is one reason you may not want to do this:

Chromium can't do 10-bit color

Every single piece of software you're using will probably work just fine in a 10-bit X environment, except for Chromium (or Chrome). That browser, for whatever reason, simply does not work with 10-bit color. Here is a screenshot with Firefox on the left and Chromium on the right, both showing www.gnu.org:

[Screenshot: Chromium-can-not-do-10bit-color-on-linux.png, showing Firefox rendering the page correctly while Chromium does not]

You may or may not notice that there is a small problem with using Chromium version 73 (and all earlier versions) in X with 10-bit color depth. If you never use Chromium or Chrome, then 10-bit is a viable option: Chromium appears to be all alone in its failure to handle 10-bit.

Example configuration file

Creating a /etc/X11/xorg.conf.d/20-amdgpu.conf with the following content will give you 10-bit color:

Section "Screen"
  Identifier    "Default Screen"
  Monitor        "Configured Monitor"
  Device         "Configured Video Device"
  #               24 for 8-bit or 30 for 10-bit
  DefaultDepth    30
EndSection

And that's it. DefaultDepth 30 is the crucial line.
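
You can check whether the new depth is actually in effect after restarting X. A minimal sanity check, assuming the standard xdpyinfo and xwininfo utilities are installed (the exact spacing of their output may differ slightly):

  # Ask the X server what depth the root window is using.
  $ xdpyinfo | grep -i "depth of root"
  depth of root window:    30 planes

  # Or query the root window directly.
  $ xwininfo -root | grep Depth
  Depth: 30

If either command still reports 24 planes, the configuration file was not picked up and you are still running at 8 bits per channel.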