Two monitor-themed Should You Bothers in a row? Normally my desire for editorial heterogeneity wouldn't allow for this, but while ultrawide displays have been around for donkey's years, 2024 looks to be bringing a truly new take on gaming displays: the dual-mode monitor.
Exact specifications vary even among the very few models currently available. But generally, a dual-mode monitor can run at an esports-friendly 1080p resolution with a refresh rate upwards of 300Hz, or at a much sharper 4K resolution with the refresh rate dialled down, and can be switched between these two modes with the press of a button (or, more likely, a toggle in the OSD settings). The idea is to simultaneously satisfy both the well-groomed, high-end-PC-owning gamer and the hyper-competitive FPS gremlin they'll probably become after consuming enough post-dinner Red Bull.
If you're wondering “why not just get a regular 4K monitor and lower the in-game resolution when you want higher frame rates, you big rich nerd”, current hardware and cable limitations mean that native 4K much beyond 240Hz is largely theoretical. That's a lot of pixels to pump out that quickly, you see. That's why dual-mode monitors swap the actual output resolution down to an easier-to-drive 1080p, unlocking the really high refresh rates that standard 4K displays can't provide.
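To put rough numbers on that pixel-pumping problem, here's a back-of-the-envelope sketch in Python. It assumes 10-bit colour and ignores blanking intervals and encoding overhead, so real cable requirements run higher still; the point is simply that raw 4K/240Hz pixel data already outstrips the 48Gbps ceiling of an HDMI 2.1 link (hence the need for compression tricks), while 1080p at 320Hz is comparatively trivial.

```python
# Back-of-the-envelope video bandwidth check. Figures are raw pixel data
# only (10-bit colour per channel, no blanking intervals or encoding
# overhead), so real-world cable requirements are somewhat higher.

BITS_PER_PIXEL = 30  # 10 bits per colour channel, three channels

def raw_gbps(width: int, height: int, refresh_hz: int) -> float:
    """Approximate uncompressed video data rate in gigabits per second."""
    return width * height * refresh_hz * BITS_PER_PIXEL / 1e9

if __name__ == "__main__":
    print(f"4K @ 240Hz:    {raw_gbps(3840, 2160, 240):.1f} Gbps")  # 59.7
    print(f"4K @ 160Hz:    {raw_gbps(3840, 2160, 160):.1f} Gbps")  # 39.8
    print(f"1080p @ 320Hz: {raw_gbps(1920, 1080, 320):.1f} Gbps")  # 19.9
```

Even before overheads, 4K/240Hz demands more than HDMI 2.1's 48Gbps, while the dual-mode monitor's two actual operating points sit comfortably under it.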
First thoughts? I can definitely see the logic, and although this is clearly a niche concept that requires a fancy gaming rig to take advantage of in the first place, flexibility is a worthwhile pursuit. As for money, the dual-mode monitors we've seen so far don't look any more expensive than a traditional 4K model with similar specs. Granted, the first ones out of the gate were the expensive OLED-panelled Asus ROG Swift PG32UCDP and LG UltraGear 32GS95UE-B, but the one I've been testing is the LCD/IPS-equipped ROG Strix XG27UGC, as is the more recent Alienware AW2725QF, priced at £530. Both come with G-Sync and FreeSync compatibility to fend off screen tearing, and although my XG27UGC's refresh rates are much lower than the OLED pair's, they're still pretty fast: 160Hz in 4K mode and 320Hz at 1080p.
I suppose we should pause at this point and consider the question we're always asked about gaming monitors with this kind of crazy refresh potential: does it make a difference? Various, sometimes conflicting, studies have been conducted on what the human eye can perceive; I've seen claims that it tops out at 30fps, 60fps, 225fps, and even 400fps. The problem with all of these is that it's not actually how our eyeballs work: they're analogue, rather than perceiving frame by frame.
The lack of compelling scientific evidence means I'm forced to settle this particular point with opinion and anecdote, which isn't really the Should You Bother style, so apologies for that; I'll blow a big Horn of Subjectivity if I need to do it again. But as someone who has spent years staring at these big light rectangles, my take is that you can very easily perceive 160Hz as smoother than 60Hz, and by some margin. Jumps to the likes of 240Hz and 320Hz aren't as big a deal, because the gaps between frames become so small that diminishing returns kick in, but in my HOOOOONK experience you can at least tell that they look smoother still.
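Those diminishing returns fall straight out of the arithmetic: frame time is the reciprocal of refresh rate, so each successive jump shaves off less waiting between frames than the last. A quick sketch:

```python
# Frame time shrinks hyperbolically with refresh rate, which is why each
# successive refresh-rate jump buys less perceptible smoothness.

def frame_time_ms(hz: int) -> float:
    """Time between frames, in milliseconds, at a given refresh rate."""
    return 1000 / hz

for lo, hi in [(60, 160), (160, 240), (240, 320)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo}Hz -> {hi}Hz: each frame arrives {saved:.2f}ms sooner")
```

Going from 60Hz to 160Hz saves over 10ms per frame; going from 240Hz to 320Hz saves barely 1ms, which is roughly why the first jump is obvious and the later ones take a trained eye.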
Another pragmatic point in favour of dual-mode monitors is that they permit very high refresh rates only when it's appropriate, i.e. when the resolution is low enough that games can actually run fast enough to take advantage. The closest (I suppose we'll call them “single-mode” from now on) alternatives are 4K/240Hz monitors, which also combine a high-end resolution with a high-end refresh rate, but are something of a computational white elephant. No graphics card in the world can run modern 4K games at 240fps with any regularity, and while you can always drop the resolution to 1080p yourself, you're still paying for that 4K/240Hz capability, however unattainable. These monitors, like the Gigabyte FO32U2P and Asus' own ROG Strix PG32UCDM, cost twice as much as the XG27UGC, whose specs are still very high but stop short of the realm of fantasy.
Incidentally, the XG27UGC is a very capable screen in its own right. It may be 'only' LCD, but it covers 99.5% of the sRGB colour gamut, records a respectable 1041:1 contrast ratio, and can support basic HDR with a peak brightness of 434cd/m2. I also didn't notice the slightest ghosting or screen tearing in either 4K or 1080p mode.
The latter mode also doesn't suffer much from the usual drawback of running 4K monitors at 1080p: the potential for blurry-looking detail, a result of there no longer being a 1:1 match between the size of the pixels in the onscreen image and the size of the actual pixels on the panel. Put the XG27UGC side by side with a regular 27in/1080p display and there may be a slight loss of definition in text, but in games it's a non-issue, with no noticeable blurring or scaling artefacts, especially when that ridiculous refresh rate keeps motion looking clean.
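There's a reason 4K panels get off relatively lightly here: 3840×2160 is exactly double 1920×1080 on both axes, so each 1080p image pixel can map onto a tidy 2×2 block of panel pixels. Whether a given monitor's scaler actually does this integer mapping rather than interpolating varies by model, so treat the following as the arithmetic case for 4K, not a guarantee about any specific screen:

```python
# A 1080p image scales cleanly onto a 4K panel because the per-axis ratio
# is a whole number (2x2 panel pixels per image pixel). On a 1440p panel
# the ratio is fractional, so the scaler must interpolate, causing blur.

def scale_factor(panel: tuple, image: tuple) -> tuple:
    """Per-axis ratio of panel resolution to displayed image resolution."""
    return (panel[0] / image[0], panel[1] / image[1])

print(scale_factor((3840, 2160), (1920, 1080)))  # (2.0, 2.0) - integer
print(scale_factor((2560, 1440), (1920, 1080)))  # ~(1.33, 1.33) - fractional
```

This is also why 1080p famously looks mushier on 1440p monitors than on 4K ones: there's no whole-number mapping available at all.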
So far, then, it's hard to find much fault with how dual-mode monitors work. But while they're generally a more logical proposition than their 4K/240Hz rivals, I still have trouble figuring out who they're for.
Who exactly are these 4K fidelity fans who only occasionally transform into 300Hz+ esports enthusiasts? Within the general PC audience, there's a small number of gamers who'll want to spend money on the maximum quality that a 4K monitor brings, and another small number who'll run Counter-Strike 2 at 1080p and potato settings if it grants the slightest advantage in headshot probability. Dual-mode monitors assume an overlap between these similarly niche but diametrically opposed interests, and I'm not saying that part of the Venn diagram doesn't exist… but statistically, you're unlikely to live in it.
Dual mode is also arguably less useful to a general audience, in the sense that if you're mainly interested in making games look good (which you should be, if you're prepared to drop five hundred quid on a monitor) then a 4K game running at 100fps will almost always look better than a 1080p game at 300fps. That's largely down to the aforementioned diminishing returns, which can still be worth clawing against in a seriously competitive setting, but much less so when you're just playing Lies of P or something. Dropping down would mean a big loss of sharpness in exchange for a less-than-huge gain in smoothness.
And yet… I doubt my own doubts. Who am I to suggest that this 4K-Jekyll, 1080p-Hyde breed of PC player isn't worth catering to, or worse, doesn't exist at all? If anything, it makes sense that someone who mainlines PC shooters/MOBAs/Rocket League enough to justify a 320Hz monitor to themselves is big on gaming in general, and it's not hard to see how that attitude might extend to appreciating the detail and richness of slower-paced fare in 4K.
Verdict: Should you bother with dual-mode monitors? This still depends on exactly what you'll be playing and whether your PC has the guts for 4K. But you know what, I think it's a situational yes. Even if they're conceptually a niche on a niche, PC gaming is a broad enough church that it's clear someone out there will make good use of them.
It helps a lot that dual-mode monitors don't appear to carry any inherent disadvantages compared to a standard 4K monitor of similar speed. As the XG27UGC and Alienware AW2725QF show, they're no more expensive than their 4K siblings, at least as a rule, and are on par with many dedicated 1080p/360Hz displays. I might not be a frequent visitor to the XG27UGC's faster mode, but unlike with 4K/240Hz monitors, there's little sense of having paid extra for features I won't use. Oh, that's right: HOOOOONK.