AMD At The X.Org Developers Conference 2020: No-No Baking Your Graphics Card
Linux kernel developer Rajneesh Bhardwaj from AMD's Radeon Open Compute team warned that you should absolutely not bake your graphics card at the X.Org Developers Conference this week. He explained the dangers and stressed "Don't try this at home!". He went on to detail the many runtime power management advancements the amdgpu kernel driver for AMD graphics cards has seen over the past year.
AMD Linux kernel developer Rajneesh Bhardwaj's area of expertise is the AMD Kernel Fusion Driver (AMDKFD), exposed as the /dev/kfd device, which you can use to do OpenCL 2.0 and other GPU compute using the Radeon Open Compute framework. ROCm packages are available for Ubuntu, RHEL and Fedora. This was not the area he chose to talk about at this year's X.Org Developers Conference.
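As a quick sanity check, the presence of the /dev/kfd device node is a reasonable first indicator of whether the AMDKFD compute interface is available. The sketch below is a minimal, hedged example; it only checks for the node and does not verify that ROCm itself is installed or working:

```shell
#!/bin/sh
# Minimal check for the AMDKFD compute interface.
# Note: /dev/kfd existing does not guarantee a working ROCm stack,
# only that the amdgpu/amdkfd kernel side appears to be loaded.
if [ -e /dev/kfd ]; then
    echo "/dev/kfd present: GPU compute via ROCm should be possible"
else
    echo "/dev/kfd not found: amdgpu/amdkfd may not be loaded"
fi
```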
Rajneesh Bhardwaj kicked off his "No-No Baking Your Graphics Card" presentation at this year's X.Org Developers Conference by saying:
"It sounds unusual but I'm pretty sure that for this community, specially the graphics people, this must be something they have either done in the past or must have definitely heard about."
"Sometimes people do bake their graphics card. In their oven. In their kitchen."
Sources familiar with this year's X.Org Developers Conference told us that several key people attending the conference admitted to having baked their graphics cards.
Mr Bhardwaj went on to say that the GPU baking trick only works about 5% of the time and warned that
"If you're not an "expert", DON'T TRY THIS AT HOME"
The rest of his presentation was all about how advances in the amdgpu kernel driver's power management capabilities make AMD graphics cards run cooler and use less power.
Linux Kernel Advancements In AMD GPU And APU Power Management
The amdgpu kernel driver has seen a lot of improvements in its runtime power management capabilities over the past year. Hybrid and dedicated graphics card setups will now go into a low power mode when they are idle for longer periods of time. And AMD-powered laptops have gained the ability to turn both the GPU chip and the bus off to save even more power when the GPU is unused. The trade-off is slightly higher latency when the GPU is needed. This is especially useful if a dedicated laptop GPU is paired with a CPU with integrated graphics.
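You can observe this runtime power management behavior through the kernel's standard runtime PM sysfs interface. The sketch below is an illustrative example, not part of the presentation; the card index (card0) is an assumption and may differ on your system, and the files will simply be absent if the driver does not support runtime PM:

```shell
#!/bin/sh
# Inspect runtime power management state for a GPU via standard sysfs files.
# Assumption: the GPU is card0; substitute card1 etc. as needed.
card=/sys/class/drm/card0/device
if [ -r "$card/power/runtime_status" ]; then
    # "active" means the GPU is powered; "suspended" means it has been
    # put into a low power state by runtime PM.
    echo "runtime PM status:  $(cat "$card/power/runtime_status")"
    # "auto" lets the kernel suspend the device when idle; "on" keeps it awake.
    echo "runtime PM control: $(cat "$card/power/control")"
else
    echo "no runtime PM info found at $card"
fi
```

On a laptop with a dedicated AMD GPU and runtime PM working, the status typically reads "suspended" while the GPU sits idle and flips to "active" when an application starts using it.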
You can watch Rajneesh Bhardwaj's presentation at the X.Org Developers Conference below if you want to know how amdgpu power management works under the hood.