HOWTO enable 10-bit color on Linux

From LinuxReviews

Modern monitors and AMD graphics cards support 10-bit color. It's not enabled by default on GNU/Linux distributions. Enabling it is just a matter of having one line with "DefaultDepth 30" in a configuration file in /etc/X11/xorg.conf.d/.

However, there is one reason you may not want to do this:

Chromium can't into 10-bit color

Every single piece of software you're using will probably work just fine in a 10-bit X environment except for the Chromium web browser (and Chrome). That browser, for whatever reason, can't into 10-bit. It simply does not understand that it is running in a 10-bit environment. Here is a screenshot with Firefox on the left and Chromium on the right:


You may or may not notice that there is a small problem with using Chromium in X at 10-bit color depth. Chromium has had this problem forever, and version 78 still has it. If you never use Chromium or Chrome then 10-bit is a viable option: it appears to be all alone in its failure to get into 10-bit.

Example configuration file

Creating a /etc/X11/xorg.conf.d/20-amdgpu.conf with the following content will give you 10-bit color:

Section "Screen"
  Identifier    "Default Screen"
  Monitor        "Configured Monitor"
  Device         "Configured Video Device"
  #               24 for 8-bit or 30 for 10-bit
  DefaultDepth    30
EndSection

And that's it. DefaultDepth 30 is the crucial line.
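
As a sketch, the file can be written from a shell. The snippet below writes to a temporary file purely for illustration; on a real system you would write the same content to /etc/X11/xorg.conf.d/20-amdgpu.conf as root and then restart the X server:

```shell
# Illustration only: write the snippet to a scratch file.
# On a real system, write it (as root) to /etc/X11/xorg.conf.d/20-amdgpu.conf
# and restart X for it to take effect.
conf="$(mktemp)"
cat > "$conf" <<'EOF'
Section "Screen"
  Identifier    "Default Screen"
  Monitor        "Configured Monitor"
  Device         "Configured Video Device"
  #               24 for 8-bit or 30 for 10-bit
  DefaultDepth    30
EndSection
EOF
cat "$conf"
```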

Can You Into 10-bit?

The log file /var/log/Xorg.0.log contains all kinds of incriminating information about your computer. The first clue to look for when you are trying to get into 10-bit is the EDID section logged for each monitor. It will look something like this when a monitor is connected to an AMD card capable of 10-bit (that is, anything but the most ancient AMD GPUs):

[    23.998] (II) AMDGPU(0): EDID for output DisplayPort-2
[    23.998] (II) AMDGPU(0): Manufacturer: AUS  Model: 28b1  Serial#: 108105
[    23.998] (II) AMDGPU(0): Year: 2017  Week: 29
[    23.998] (II) AMDGPU(0): EDID Version: 1.4
[    23.998] (II) AMDGPU(0): Digital Display Input
[    23.998] (II) AMDGPU(0): 10 bits per channel
[    23.998] (II) AMDGPU(0): Digital interface is DisplayPort
[    23.998] (II) AMDGPU(0): Max Image Size [cm]: horiz.: 62  vert.: 34
[    23.998] (II) AMDGPU(0): Gamma: 2.20

The 10 bits per channel line shows that the monitor can accept a 10-bit signal. Note that accepting 10-bit input is not the same as displaying it: there are TN panels that accept 10-bit input even though they are limited to 6-bit actual output. You can send them 10-bit color and the display will look fine (for TN panels, anyway), but the color won't actually be 10-bit. You can usually tell whether a monitor has true 10-bit support by its price tag.

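A quick way to find that line is to grep the log for it. A minimal sketch, using the sample log lines from above written to a temporary file so it is self-contained (on a real system you would grep /var/log/Xorg.0.log directly):

```shell
# Illustration: use sample lines instead of the real log.
# On a real system: log=/var/log/Xorg.0.log
log="$(mktemp)"
cat > "$log" <<'EOF'
[    23.998] (II) AMDGPU(0): EDID Version: 1.4
[    23.998] (II) AMDGPU(0): Digital Display Input
[    23.998] (II) AMDGPU(0): 10 bits per channel
EOF
# Show the per-channel depth the monitor advertises in its EDID
grep 'bits per channel' "$log"
```
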
The second line you want to look for is one which says Pixel depth =. It will either say

(II) AMDGPU(0): Pixel depth = 24 bits stored in 4 bytes (32 bpp pixmaps)

if you are using 8-bit OR, if you are into 10-bit:

(II) AMDGPU(0): Pixel depth = 30 bits stored in 4 bytes (32 bpp pixmaps)

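The same grep approach works for the Pixel depth line. A sketch using a sample line written to a temporary file (again, point it at /var/log/Xorg.0.log on a real system):

```shell
# Illustration: use a sample line instead of the real log.
# On a real system: log=/var/log/Xorg.0.log
log="$(mktemp)"
cat > "$log" <<'EOF'
[    24.065] (II) AMDGPU(0): Pixel depth = 30 bits stored in 4 bytes (32 bpp pixmaps)
EOF
# Extract just the depth number: 24 means 8-bit, 30 means 10-bit
depth="$(grep -o 'Pixel depth = [0-9]*' "$log" | grep -o '[0-9]*$')"
echo "$depth"
```
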
There may be programs beyond Chromium/Chrome that have issues when you are using a 10-bit display. Please let us know if you run into any.
