
Diablo 4 at 8K is a beastly good time... but 16K breaks it

Aug 25, 2023

The mighty Nvidia RTX 4090 wins out again

PC built by Stormforce Gaming

Motherboard: Asus PRIME Z790-P LGA 1700
Processor: Intel Core i9-13900K, 24 Cores / 32 Threads
CPU Cooler: Corsair iCUE H100i 240mm ELITE CAPELLIX Liquid CPU Cooler
GPU: Nvidia GeForce RTX 4090
Storage: 1.0TB Seagate FireCuda 530 M.2 NVMe SSD
Case: Corsair iCUE 5000X RGB
RAM: Corsair Vengeance 32GB DDR5 4800MHz
Screen: LG 55NANO966PA

After years of hype, Diablo 4 is here, and the iconic RPG certainly delivers. As a sequel to one of the most popular PC games of all time, Diablo 4 feels most at home on a PC, despite also being available for the PS5 and Xbox Series X, so that's given me a great excuse to fire up our 8K gaming PC and test it out at silly resolutions.

As usual, I'll be using the machine supplied by Stormforce Gaming, which comes with a 13th gen 24-core Intel Core i9, Nvidia RTX 4090 GPU, and 32GB of DDR5 RAM.

In the past, I've used this rig to test out how graphically intensive games have fared at 8K - such as Cyberpunk 2077 and Resident Evil 4. With Diablo 4 being an isometric point-and-slash game, you may think it's not going to give our rig much of a workout.

That's not the case, however, as Diablo 4 is a fast and frantic game with excellent visuals, and when the hordes of hell are swarming you, the screen can fill with impressive graphical effects.

Because of this, Diablo 4 comes with a host of graphical options on PC, as well as support for Nvidia's excellent DLSS (Deep Learning Super Sampling) tech. This uses the AI capabilities of an Nvidia GPU to cleverly upscale images to hit higher resolutions without the performance cost - and with minimal sacrifices to image quality.

I was glad to see a DLSS option, as this tech has in the past allowed games to hit the golden 60fps (frames per second) benchmark at 8K.

However, before I resorted to DLSS, I wanted to see how it ran at native 8K, with all settings set to max. I hoped to put the game through the same hell it was going to put me through.

The results were pretty damn impressive, with Diablo 4 running at an average of 50.8fps at 8K with all graphical settings set to max. I recorded the frame rate during a series of battles in both closed spaces and in the open world, and the game felt reasonably smooth and solid while looking great.

However, that wasn't the 60fps I consider to be the ideal frame rate for 8K gaming, and I recorded drops to 34fps at points.

Such big drops in frame rate result in a rather unsatisfying gaming experience (as the game essentially runs at half the speed for a moment or two). In fact, I'd argue that it's better to run at a lower frame rate that's more consistent, than a higher frame rate with bigger drops.
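
To put those drops in perspective, it helps to think in frame times rather than frame rates. Here's the back-of-the-envelope arithmetic (a quick Python sketch, using only the figures above):

```python
# Convert the measured frame rates into per-frame render times.
# Figures are taken straight from the benchmarks above.
for label, fps in [("60fps target", 60), ("8K average", 50.8), ("8K minimum", 34)]:
    print(f"{label:>12}: {1000 / fps:.1f} ms per frame")
```

A dip from roughly 20ms to nearly 30ms per frame is exactly the kind of momentary stutter the eye picks up on, even when the average looks healthy.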

It shows that while the RTX 4090 remains an absolute beast, Diablo 4's busy combat, environmental effects (such as snow and lighting), and other graphical touches do stress the mighty GPU at a resolution of 7,680 × 4,320.

In a bid to smooth out the frame rate, I kept all graphical settings at max but enabled DLSS on its 'Quality' setting. This minimizes the amount of upscaling involved, so image quality is prioritized, but the performance gains are smaller.
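
As a rough guide to what 'Quality' means here: Nvidia's DLSS presets typically render internally at a fixed fraction of the output resolution per axis before upscaling. The exact factors can be tuned per game, so treat this sketch as an illustration of the internal resolutions involved at 8K, not a statement about Diablo 4's exact numbers:

```python
# Approximate internal render resolutions for DLSS presets at an 8K output.
# Per-axis scale factors are Nvidia's commonly documented defaults;
# individual games may tune them, so this is an illustration only.
OUTPUT_W, OUTPUT_H = 7680, 4320

DLSS_PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

for preset, scale in DLSS_PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{preset:>17}: {w} x {h} -> upscaled to {OUTPUT_W} x {OUTPUT_H}")
```

Even in Quality mode, the GPU is rendering around 5,120 × 2,880 - noticeably more pixels than native 4K - before the upscale, which is why the gains are real but not limitless.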

With that turned on, the average frame rate jumped to 59.5fps, pretty much hitting the 60fps goal. The minimum frame rate rose to 45.9fps - still a drop, but a less severe and less noticeable one, which made for a much smoother gaming experience.

In a bid to see if I could get a rock-solid 60fps at 8K, I turned on frame generation, a feature exclusive to DLSS 3, which is only supported by Nvidia's current RTX 4000 series of GPUs. This tech inserts AI-generated frames between 'real' frames to help boost frame rates.

However, this actually dropped the average frame rate to 55.4fps, which seems odd - but this isn't the first time frame generation has apparently had a negative impact at 8K. I put this down to the tech being new, the extra work of generating frames at 8K, and the fact that the game itself is brand-new (I played it before the global launch). Updates to both the game and GPU drivers could iron this out.

With DLSS enabled, the RTX 4090 is pretty capable of playing Diablo 4 at 8K. The question is - even if you have the kit, should you?

The answer, as is often the case, is no. Upping the game to 8K gave no competitive advantage - rather than showing more of the map, for example, most assets were just scaled larger, so you were effectively getting the same experience as playing at 4K - but with worse performance.

A few things changed, like the mouse cursor getting smaller, but this didn't help - in fact, it made the game harder. You can, at least, enlarge the mouse cursor using the game settings.

But even with DLSS and the world's most powerful gaming GPU, Diablo 4 is always going to max out at 60fps at 8K due to the limits of HDMI 2.1. Dropping the resolution of the 8K TV and the game to 4K allows Diablo 4 to be played at a 120Hz refresh rate (theoretically giving you a max of 120fps).
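
That 60Hz ceiling is ultimately a bandwidth limit. A rough, uncompressed calculation - ignoring blanking overhead and the Display Stream Compression that HDMI 2.1 leans on for 8K at 60Hz in practice - shows why 8K is pinned to 60Hz while 4K has headroom for 120Hz:

```python
# Rough uncompressed video bandwidth for each display mode.
# Ignores blanking intervals and Display Stream Compression (DSC),
# both of which matter in practice - this is a ballpark sketch only.
BITS_PER_PIXEL = 30  # 10-bit RGB / 4:4:4

def bandwidth_gbps(width, height, refresh_hz):
    return width * height * refresh_hz * BITS_PER_PIXEL / 1e9

print(f"8K @ 60Hz : {bandwidth_gbps(7680, 4320, 60):.1f} Gbps")
print(f"4K @ 120Hz: {bandwidth_gbps(3840, 2160, 120):.1f} Gbps")
print("HDMI 2.1 FRL ceiling: 48 Gbps")
```

Raw 8K60 already blows past HDMI 2.1's 48Gbps link rate (hence the compression), while 4K120 fits with room to spare - so the 4K route buys you double the refresh rate for 'free'.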

With no real image-quality improvement to show for it, 8K needlessly limits the performance of the game. You're much better off playing at 4K and squeezing every possible frame out of it.

During my testing, I played around with resolution scaling to see if there was a way to make use of the extra pixels in my 8K TV.

However, rather stupidly, I turned the resolution scale up to 200% at 8K, which essentially runs the game at 16K.

This proved too much for our PC, which promptly crashed the game. Not that surprising, to be honest, and with games still not making full use of 8K, gaming at 16K is even more of a pipe dream at the moment.
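
For a sense of just how much was being asked of the GPU, here's the raw pixel math (taking the game's 200% scale as a per-axis multiplier, which is what the 8K-to-16K equivalence above implies):

```python
# Pixel counts for each resolution, relative to 4K.
# Assumes the 200% render scale applies per axis (8K x2 = 16K), as above.
resolutions = {
    "4K": (3840, 2160),
    "8K": (7680, 4320),
    "16K": (15360, 8640),
}
base = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>3}: {pixels / 1e6:6.1f} million pixels ({pixels // base}x 4K)")
```

At roughly 132.7 million pixels per frame - sixteen times 4K - it's no wonder even an RTX 4090 tapped out.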


