Pixel Realities: The New and Upcoming PC Monitor Thread

U

U-99

Guest
2023 is a fascinating time for monitor technology, with many different technologies maturing and/or dropping into mortal price ranges!

OLED is the new kid on the block and impressing everyone who sees it with perfect HDR blacks, blistering-fast response times, and rich colors. In the past year we've seen desktop-friendly TVs (like the LG C2) and gaming-targeted OLED panels (like the Alienware AW3423DWF) break $1,000. There is even a 1440P 240Hz model from LG! But OLED still has teething pains when used as a desktop monitor: Brightness is middling, text rendering is off, burn-in is a constant concern, and fast response times do not negate sample-and-hold blur.

FALD backlights have gone from the realm of $2,500+ flagship devices to $550 units, and mini-LED has pushed the tech into more granularity. It still lacks the per-pixel precision of OLED, but it's a compelling option for people who want to do gaming and productivity on the same monitor without burn-in worries. Indeed, you can get 4K/mini-LED/144Hz for as little as $800 now.

Black Frame Insertion is a feature that I fell in love with back in 2017. I've always been sensitive to motion blur, and every LCD or LED monitor that I've owned has looked mushy in games. It's something I constantly notice. ULMB on the PG27Q changed everything; scenes became crisp enough to reach out and touch! Sadly, the tech just wasn't mature enough for realistic use back then, and...in 2023, little has changed. Companies rarely provide strobing, and rarely tune it to reduce crosstalk when they do; the feature is just box-checking. LG has steadily gone backwards with their OLEDs, from offering 120Hz BFI to 60Hz BFI to yanking the feature out of consumer 2023 models. OLED BFI seems like a no-brainer due to its ease of execution, but companies have yet to embrace the feature.
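For intuition, sample-and-hold blur scales with how long each frame stays lit while your eye tracks motion, which is why strobing helps so much even at high refresh rates. A rough back-of-the-envelope sketch (the 960 px/s tracking speed is just an illustrative number):

```python
# Perceived smear ~= eye-tracking speed x how long each frame stays lit.
# A strobe/BFI backlight cuts the lit fraction (duty cycle) of each frame.

def blur_px(speed_px_per_s, refresh_hz, duty_cycle=1.0):
    persistence_s = duty_cycle / refresh_hz  # time the frame is visible
    return speed_px_per_s * persistence_s

speed = 960  # object tracked across the screen at 960 px/s
print(blur_px(speed, 120))        # full persistence at 120 Hz -> 8.0 px of smear
print(blur_px(speed, 120, 0.25))  # 25% duty-cycle strobe -> 2.0 px
```

This is also why "fast response times do not negate sample-and-hold blur": pixel transitions can be instant and the frame still sits on screen for the full refresh interval.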

Personally, I'm tempted by the 34" Dell OLED, the Cooler Master GP27Q (fast + mini-LED + 1440p), and the INNOCN 27M2U (27" HDR1000 4K 144Hz), but none quite check all the boxes for me, especially at my preferred 34" 1440p. The industry is trying to push 4K at everyone when even high-end GPUs struggle in today's games at 3440x1440. Though part of me wonders if 27" or 32" 4K could work if I just ran games at 1/4 resolution (1080p) and kept the higher res for desktop usage. But 4K requires scaling, which isn't always a home run.
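The 1/4-resolution idea works out cleanly in principle because each 1080p pixel maps to an exact 2x2 block of 4K pixels, so nearest-neighbor (integer) scaling, where the GPU driver supports it, involves no blending at all. A toy sketch of that mapping:

```python
# Toy nearest-neighbor upscale: each source pixel becomes an exact
# factor x factor block, so 1920x1080 -> 3840x2160 with factor=2
# introduces zero interpolation blur.

def integer_upscale(img, factor):
    """Duplicate each pixel of a 2D grid into a factor x factor block."""
    return [[px for px in row for _ in range(factor)]
            for row in img for _ in range(factor)]

frame = [[1, 2],
         [3, 4]]  # a tiny stand-in for a 1080p frame
print(integer_upscale(frame, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The catch is that drivers historically default to bilinear scaling, which does blend, so whether this looks crisp in practice depends on the GPU's scaling options.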
 
Last edited by a moderator:

BO(V)BZ

Ars Tribunus Militum
2,082
As an LG C1 owner, I am fully on board with OLED, maybe willing to go with FALD if it's got at least 1k zones.

I saw your other post about BFI insertion and my C1 does support it, but not if you have a PC connected. I gave up pretty fast trying to figure out how it knew a PC was connected, but maybe there's some way to trick it. There's some blur, but at 120hz with perfect blacks, I can live with that until it's time to replace it with a new screen in a half decade or so. Maybe by that time microLED will be the same price, so we'll all get the best of every world. At least conceptually, microLED should be capable of absolutely stupid refresh rates if they decide they want to, so it's at least in the realm of possibility to do things like variable refresh rate BFI at 300hz+.
 
U

U-99

Guest
As an LG C1 owner, I am fully on board with OLED, maybe willing to go with FALD if it's got at least 1k zones.

I saw your other post about BFI insertion and my C1 does support it, but not if you have a PC connected. I gave up pretty fast trying to figure out how it knew a PC was connected, but maybe there's some way to trick it. There's some blur, but at 120hz with perfect blacks, I can live with that until it's time to replace it with a new screen in a half decade or so. Maybe by that time microLED will be the same price, so we'll all get the best of every world. At least conceptually, microLED should be capable of absolutely stupid refresh rates if they decide they want to, so it's at least in the realm of possibility to do things like variable refresh rate BFI at 300hz+.
Yeah I feel like we have the tech to make BFI possible with today's brighter, faster monitors. But no one seems to care.

Blur Busters has a certification and tuning program that is supposed to help manufacturers tune their strobing, but it only seems to have been applied to one model, a 1080p unit back in 2020. Oftentimes it's difficult to even tell whether a monitor provides BFI, let alone whether it's a quality implementation. For example, the Rtings review of the Dell S3422DWG says "The backlight timing seems to be pretty good, so there's little crosstalk," but a guy on reddit told me that there was a bunch of crosstalk for him.
 

Nauls

Ars Praetorian
524
Subscriptor
Consumer MicroLED still feels a long way off to me. Especially when it comes to monitors, which always lag behind TVs in panel tech, and affordable consumer MicroLED TVs are probably still at least 2-3 years out. Maybe I'm mistaken? Admittedly I haven't paid a lot of attention to that space, other than noting Samsung's offerings, which are still aimed at the "maybe I'll buy a yacht next year" consumer.

Personally, I'd be perfectly happy with a healthy selection of MiniLED monitors with a suitable amount of dimming zones that get close enough to OLED to not care anymore. After buying a MiniLED TV I'd gladly trade a very minimal amount of blooming for a cheaper price tag.
 
U

U-99

Guest
Consumer MicroLED still feels a long way off to me. Especially when it comes to monitors, which always lag behind TVs in panel tech, and affordable consumer MicroLED TVs are probably still at least 2-3 years out. Maybe I'm mistaken? Admittedly I haven't paid a lot of attention to that space, other than noting Samsung's offerings, which are still aimed at the "maybe I'll buy a yacht next year" consumer.

Personally, I'd be perfectly happy with a healthy selection of MiniLED monitors with a suitable amount of dimming zones that get close enough to OLED to not care anymore. After buying a MiniLED TV I'd gladly trade a very minimal amount of blooming for a cheaper price tag.
There's a 27" 1440p microled gaming monitor from Cooler Master for $550, but the firmware is currently fucked.

INNOCN has a $500ish 4K 60Hz miniled and an $800 HDR1000 4K 160Hz miniled.
 

Nauls

Ars Praetorian
524
Subscriptor
There's a 27" 1440p microled gaming monitor from Cooler Master for $550, but the firmware is currently fucked.

INNOCN has a $500ish 4K 60Hz miniled and an $800 HDR1000 4K 160Hz miniled.
Thanks, I noticed the INNOCN from your earlier link; the 32" 4K $1K model looks interesting. Still out of my price range at the moment. I've got a 32" 2K that I'll eventually upgrade to 32" 4K, or a 40-something inch 4K. Haven't researched INNOCN - is it a solid brand?
 
U

U-99

Guest
The 32" panel has slower response times for high refresh rate gaming but should otherwise be comparable.

Unsure on the vendor. If I got an INNOCN, I would do the following:
  1. Order from Amazon.
  2. Return using their no-questions-asked policy if the monitor is flawed at all when it arrives.
  3. Repeat 1 & 2 until you get a good one.
  4. Add the dirt cheap 4 year Asurion warranty from Amazon, as it too is quite generous.
  5. Bask in a safe monitor.
 

Semi On

Senator
89,415
Subscriptor++
Consumer MicroLED still feels a long way off to me. Especially when it comes to monitors, which always lag behind TVs in panel tech, and affordable consumer MicroLED TVs are probably still at least 2-3 years out.

My OLED is only a few months old. I expect it to manage 3 years. If not, and MicroLED is still off on the horizon, I'm sure MiniLED will have improved quite a bit in that time.
 

Exordium01

Ars Praefectus
3,977
Subscriptor
There's a 27" 1440p microled gaming monitor from Cooler Master for $550, but the firmware is currently fucked.

INNOCEN has a $500ish 4K 60Hz miniled and an $800 HDR1000 4K 160Hz miniled.
That’s a miniLED backlight. MicroLED is an inorganic emissive display tech that should fix the issues with OLED (by replacing it with devices that can handle higher current densities and not burn out) but is probably a decade out for large displays and possibly longer depending on what technology wins out. You can’t exactly pick-and-place when you have 25 million subpixels.
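The 25 million figure is roughly right for an RGB-stripe 4K panel, which is why assembling one emitter at a time isn't viable:

```python
# 4K UHD, three subpixels (emitters) per pixel
subpixels = 3840 * 2160 * 3
print(f"{subpixels:,}")  # 24,883,200
```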
 

Exordium01

Ars Praefectus
3,977
Subscriptor
My OLED is only a few months old. I expect it to manage 3 years. If not, and MicroLED is still off on the horizon, I'm sure MiniLED will have improved quite a bit in that time.
I’m really liking my LG C1, I have mixed thoughts on my Samsung Odyssey G7, and there is a fair amount of backlight bloom on my iPad Pro. The Samsung has convinced me that VA LCD technology is a dead end, and the iPad that IPS displays need a second black-and-white LCD layer to handle backlight bloom. But if you add a second LCD, you’re going to need a light cannon of a backlight to hit acceptable brightnesses.
 

kibbler

Ars Tribunus Militum
2,242
I’m really liking my LG C1, I have mixed thoughts on my Samsung Odyssey G7, and there is a fair amount of backlight bloom on my iPad Pro. The Samsung has convinced me that VA LCD technology is a dead end, and the iPad that IPS displays need a second black-and-white LCD layer to handle backlight bloom. But if you add a second LCD, you’re going to need a light cannon of a backlight to hit acceptable brightnesses.
Out of curiosity what are your quibbles with the Samsung?

I have a Neo G8 and love it despite a few annoyances. I'm hoping it lives long (and prospers) considering Samsung's reputation.
 

Exordium01

Ars Praefectus
3,977
Subscriptor
Out of curiosity what are your quibbles with the Samsung?

I have a Neo G8 and love it despite a few annoyances. I'm hoping it lives long (and prospers) considering Samsung's reputation.
Some images causing scan lines and graphical artifacts. Flickering that was mostly resolved with a firmware update (I preordered mine), some backlight leakage around the edges and in the corners. Pretty crappy HDR performance. The curve is a touch too extreme but necessary to mitigate the crappy viewing angles. The Neo G8 fixes a lot of the backlight relevant issues.

To be clear, I don’t regret my purchase. The display market was quite different 2.5 years ago. I just think the C1 is the better display and I paid about the same amount for the two. If I were to get a new monitor today or in the near future, I’d want an ultrawide OLED display.
 
U

U-99

Guest
I ordered the S3422DWG to try out the strobing and fast MVA panel; it could be a nice upgrade for $380. My girlfriend leaves for a business trip on Saturday; it will arrive Sunday. 👹

(Skipping the "why do you need a new monitor if you already have a big one?" argument*. If I like it, I can install it and sell the X34 for $100 or whatever before she comes back. If I don't like it, it will be returned and the box gone before she touches down in JFK.) If I upgrade in 2024 or 2025, I can always sell the Dell panel to a buddy for say half price.

*You see, every monitor of a particular size is fungible regardless of panel tech, G2G, MPRT, HDR, resolution, etc.
 
Last edited by a moderator:

cerberusTI

Ars Tribunus Angusticlavius
6,449
Subscriptor++
My OLED is only a few months old. I expect it to manage 3 years. If not, and MicroLED is still off on the horizon, I'm sure MiniLED will have improved quite a bit in that time.
I just bought one of these the other day, which should arrive tomorrow:

A big part of the reason for that instead of the few hundred dollar less LG monitor using the same panel is that it comes with a three year warranty against burn in and dead pixels. Hopefully it will last longer, but that seems like a minimum acceptable lifetime for a monitor in this price range.

Also, it gives the option for a flat monitor, and I am not so sure about the bend.
 

cerberusTI

Ars Tribunus Angusticlavius
6,449
Subscriptor++
Yeah that thing is interesting. I found I kind of liked a curve when gaming and less so when doing office stuff, so it’s a good compromise if it’s actually reliable.
The reviews were a bit mixed on it, but I think for my personal display preferences it is the best current option. I will update as to how it works out in practice once I have used it for a while.

I plan to mostly use it as an office monitor (although I do play some games), and after looking at curved monitors I was not so sure I would appreciate that for daily use. The ability to try various curves seemed desirable, but if it was fixed at flat or curved I would have picked flat. I considered 2x of the LG 27 inch monitors instead.

A higher resolution would be nice, but it is already a lot of pixels at 240hz and I can see why the tradeoff must be made.

Most of the reviews which did not like it seemed to primarily not like that the full panel brightness is very low, but one of the big reasons I want an OLED is to turn the brightness down as far as possible (much lower than their max), so that is fine.
 
I wound up ordering a Cooler Master Tempest GP27U (just rolls off the tongue), and for the ~4 hours I've had it hooked up since yesterday, it's quite nice.


It was proving difficult to find monitors that featured mini-LED, local dimming, bright HDR, and G-Sync support but weren't also curved. We have a curved monitor at work for one of our workstations, and I hate using it for productivity work.

Haven't tried out the USB C/KVM yet, or the gaming yet, but YouTube looks good.

The downside is that it makes my older Samsung U28E510 look quite bad, but I doubt I will upgrade that anytime soon. It still does 4K, and is plenty big.
 
Interesting. Reddit led me to believe that the GP27X monitors were so bad that they would snipe kittens from your desk, so I'm glad that you're liking one. It's hard to tell what is a product flaw, what is edge case malfunction, and what is an overly sensitive reviewer.
I like rtings, as they buy the monitors they review, and they at least show their calibrated TV/monitor settings, which I've used to tweak the colors on what I've owned.

They're also the reason I got a Hisense TV after my Vizio died. A killer TV for that price.

As for the Cooler Master, it's oddly difficult to find. I got mine on Amazon, and Amazon does sell it time to time, but mine came through another seller. All new though, IIRC it was ~$850. There is RGB on the back of the monitor, and I can't speak to the stand, as I have a dual monitor arm desk mount.

Officially, the GP27U doesn't support VESA DisplayHDR 1000, G-Sync, or FreeSync, but they all work (if rtings said they hadn't, I would've passed it by). Windows 11 shows it can do a peak of ~1200 nits. It is nice to set a monitor to 120 Hz too; it does feel slightly smoother, even if just in Windows.

There is a firmware update that I will install this weekend; per the release notes, it should make it even better.

How Windows handles HDR though is a bit weird.
 

benwaggoner

Ars Praefectus
3,567
Subscriptor
As an LG C1 owner, I am fully on board with OLED, maybe willing to go with FALD if it's got at least 1k zones.

I saw your other post about BFI insertion and my C1 does support it, but not if you have a PC connected. I gave up pretty fast trying to figure out how it knew a PC was connected, but maybe there's some way to trick it. There's some blur, but at 120hz with perfect blacks, I can live with that until it's time to replace it with a new screen in a half decade or so. Maybe by that time microLED will be the same price, so we'll all get the best of every world. At least conceptually, microLED should be capable of absolutely stupid refresh rates if they decide they want to, so it's at least in the realm of possibility to do things like variable refresh rate BFI at 300hz+.
I've been using an LG C1 as my primary desktop monitor, for both Mac and Windows. I've got a pretty good BenQ as the other monitor, which used to be my primary display, but I found myself mainly using the C1, thanks to its better image quality, black levels, and larger size, so much that I switched.

The key is to change picture mode to Filmmaker Mode, which turns off all the "enhancement" stuff that makes an out of the box TV a horrible monitor, like the extra sharpness that makes text ring. If you're using an Nvidia GPU you also need to turn off the auto overscan feature in the TV. For some bizarre reason Nvidia has long defaulted to setting the overscan bit to ON.

I get a solid 4K 120p 10-bit with VRR using a DisplayPort 1.4 to HDMI 2.1 adaptor (my GPU is an a6000, which has 4xDP but no HDMI).
 

benwaggoner

Ars Praefectus
3,567
Subscriptor
Consumer MicroLED still feels a long way off to me. Especially when it comes to monitors, which always lag behind TVs in panel tech, and affordable consumer MicroLED TVs are probably still at least 2-3 years out. Maybe I'm mistaken? Admittedly I haven't paid a lot of attention to that space, other than noting Samsung's offerings, which are still aimed at the "maybe I'll buy a yacht next year" consumer.

Personally, I'd be perfectly happy with a healthy selection of MiniLED monitors with a suitable amount of dimming zones that get close enough to OLED to not care anymore. After buying a MiniLED TV I'd gladly trade a very minimal amount of blooming for a cheaper price tag.
I've been evaluating Sony's 2023 "New QD OLED" and microLED TVs, and I have to say they are amazing, and that panel technology would make for next-level computer monitors. Viewing angles are superlative, contrast is excellent (although better on OLED), and peak brightness is great (although better on microLED). Sharpness of text is the best I've ever seen, thanks to the better pixel structure.

And, importantly, they both have good watts/m^2, with no LCD filtering losses and each pixel only drawing as much power as its brightness requires. Much better than edge-lit, and a lot better than WRGB OLED or zone-lit dual-layer LCD.

The required dot pitch is bigger than other technologies, so we're a while away from being able to do a 27" 4K display with them.
 

BO(V)BZ

Ars Tribunus Militum
2,082
I've been using an LG C1 as my primary desktop monitor, for both Mac and Windows. I've got a pretty good BenQ as the other monitor, which used to be my primary display, but I found myself mainly using the C1, thanks to its better image quality, black levels, and larger size, so much that I switched.

The key is to change picture mode to Filmmaker Mode, which turns off all the "enhancement" stuff that makes an out of the box TV a horrible monitor, like the extra sharpness that makes text ring. If you're using an Nvidia GPU you also need to turn off the auto overscan feature in the TV. For some bizarre reason Nvidia has long defaulted to setting the overscan bit to ON.

I get a solid 4K 120p 10-bit with VRR using a DisplayPort 1.4 to HDMI 2.1 adaptor (my GPU is an a6000, which has 4xDP but no HDMI).
I can't remember the tweaks I did for mine, but I'm pretty sure I just cribbed them from rtings.com. I never had any issue with overscan with my Nvidia GPUs, so maybe that's fixed?
 
Finally ran some games (Control, HZD, God of War), and it's been a blast.

HZD and GoW look great in HDR, and it's nice to no longer have screen tearing in Control, although to me it looks a bit washed out...I don't see any HDR options in that game.

However, now that I have two Displayport monitors (vs one DP, one HDMI before), how the OS (Windows 11) treats a monitor like it's been disconnected when powering a DP monitor off is quite jarring. App windows bouncing between screens, HZD has a janky handling of HDR...I thought multi-monitor setups would be better in 2023.
 

Doomlord_uk

Ars Legatus Legionis
24,892
Subscriptor++
Although I'm beyond happy with my Alienware AW3821DW, that's pretty damn cool, and a good price. My only gripe would be that it's 1440P rather than 1600P like the AW. But 240Hz? Dayum. Because a 240Hz version of my AW is pretty much my dream monitor right now, if I had money to burn and the thing existed. I'm assuming AW is going to come up with a 240Hz version one day... (maybe HDR1000 or more too).
Yeah that thing is interesting. I found I kind of liked a curve when gaming and less so when doing office stuff, so it’s a good compromise if it’s actually reliable.
Just to bang on my favourite Alienware drum, but the AW3821DW curve is fairly minimal, and I think is 'just right' for 2D stuff, whilst still being nice for gaming and, well, just looking good on your desktop without some absurd low-radius look like those crazy Samsungs.
 
Although I'm beyond happy with my Alienware AW3821DW, that's pretty damn cool, and a good price. My only gripe would be that it's 1440P rather than 1600P like the AW. But 240Hz? Dayum. Because a 240Hz version of my AW is pretty much my dream monitor right now, if I had money to burn and the thing existed. I'm assuming AW is going to come up with a 240Hz version one day... (maybe HDR1000 or more too).
So that Corsair is completely different panel tech (OLED) with different strengths and weaknesses. It looks like the AW3821DW is a more conventional LCD with token local dimming (edge-lit).
 

Semi On

Senator
89,415
Subscriptor++
Oh shit, I actually have a reason to upgrade now? At work my 30" powering off messes up all the windows, shoving them into the 2x vertical 24".

This actually works pretty well in W11 now. I have a work Surface Pro X I usually keep docked to my OLED. When I remove it from the dock to sit in the living room or whatever, everything switches to full screen, then when I plug back into the monitor, it all usually goes back to being snapped into the various positions on the big screen.
 

cerberusTI

Ars Tribunus Angusticlavius
6,449
Subscriptor++
So that Corsair is completely different panel tech (OLED) with different strengths and weaknesses. It looks like the AW3821DW is a more conventional LED with token local dimming (edge lit).
I have had the Corsair Xeneon Flex for about a week, and so here is the quick review.

The optional curve is interesting. I mostly went with this one as it had a flat option, and I am glad I did so. After trying it a few ways, even a tiny curve is not so great when programming in an IDE, and while I was sort of getting used to it after a couple of days I think I can safely say it is very unlikely I will ever buy a curved monitor for work, regardless of size, unless it can correct perspective internally.

For some games the curve is nice, and it does increase immersion a bit. Only first person games do well with a curve though, and I had to look through my library a bit to even find some. My wife installed Cyberpunk 2077 from her library and played for a while, that was probably the best of the lot, and it does look quite a bit better than on a flat screen, especially at full curve (which is 800R). It requires a fair amount of force to bend the monitor. I may bend it for the weekend or maybe even nightly if playing something which can use the curve, but I would not bend it when just switching applications.

The matte screen produces noticeable color differences between same color pixels, but causes less blur than others I have used. It is highly effective at stopping incoming light from reflecting noticeably. I would rate it well for monitor use, even if you run the display somewhat dark in a somewhat bright room.

The speed and accuracy with which a pixel makes a color transition is impressive. I already had a 240hz monitor which claimed a 1ms response time (an early one), but it is even easier to read scrolling text or track a fast mouse on this one (and with the screen area this provides, tracking a fast mouse is desirable). It is noticeably faster at this, taking well under a frame to complete the process. Motion clarity is excellent overall, better than any other monitor I have owned. It beats any LCD or CRT here on speed alone, without the visual artifacts and eye strain involved in significant pixel overdrive, poorly tuned phosphor persistence, black frame insertion, or other such mitigations.

The brightness can be high, but not for the entire screen at once. It has a maximum for the entire monitor which is fairly low, and the changes can be fairly distracting. While in games it works fine and shows bright scenes as they should be, on the desktop I enabled the brightness stabilizer to keep windows from changing brightness with size and the amount of text on the screen, which reduces the maximum to 150 nits. I have it set at half this (about 75 nits) and can reduce this to more like 60 and still read everything, due to the great OLED black levels. This is fine for me, but many seem to prefer their monitors well above this.

The resolution is fairly low, at about the ppi of a 27 inch 1080p monitor. Individual pixels as well as the grid between pixels are visible at the brightness I am using. This is the same as almost any monitor at this ppi, but it is no better than others here. It is less visible at higher brightness.
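That ppi comparison checks out. Assuming the Flex's 45" 3440x1440 panel, a quick pixel-density calculation puts it almost exactly at 27" 1080p territory:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3440, 1440, 45), 1))  # ~82.9 (45" 3440x1440 ultrawide)
print(round(ppi(1920, 1080, 27), 1))  # ~81.6 (27" 1080p)
```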

Color reproduction is overall a high average, but it is an average of extreme highs and lows. Mostly it looks good and reproduces colors well, especially in HDR, but it is very bad at yellow and orange (easily worse than a 6 bit+FRC monitor as a comparison when showing images primarily in these colors). It is so bad at displaying detail in these colors that I changed IDE colors away from these, as it shows yellow text as one red line next to one green line vertically, and only horizontal parts are yellow.

This likely results from the odd subpixel arrangement of RWBG on LG OLED panels (that is a guess, but an educated one). The red and green are in the same logical pixel, but they are too far apart to blend colors when at monitor distance, so it really only blends with the next pixel on each side.

This leads to a variety of easily noticeable visual issues, as the display is not capable of displaying fine detail in these colors. Anything yellow on black has a red border on the left, and a green border on the right. Items like some of the Ars user icons are off center, as the yellow area in the middle shifts left, eating a green pixel from the border with the red subpixel of the first yellow pixel, and adding an extra green pixel on the other side. Solid yellow interior regions are fine, but anything with a pattern in these colors has the wrong colors (it blends colors from subpixels across two different pixels where this is not intended), so most images do not do well either. It is a little bit surprising they shipped it as a monitor without grouping subpixels better; maybe they will fix that in a firmware update or similar (physically it has a red and green right next to each other, but they are not considered the same pixel logically).
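A toy model makes the fringing concrete. Assuming (as above, a guess) an R-W-B-G emitter order within each logical pixel, a one-pixel-wide yellow vertical line lights only the R and G emitters, which sit at opposite ends of the lit span, so its left edge is pure red and its right edge pure green:

```python
# Toy model of a one-pixel yellow vertical line on an assumed R,W,B,G layout.
SUB = ["R", "W", "B", "G"]  # assumed subpixel order within one logical pixel

def subpixel_row(pixels):
    """Expand one row of logical pixel colors into which emitters light up."""
    out = []
    for color in pixels:
        for s in SUB:
            # yellow lights only the red and green emitters; black lights none
            out.append(s if (color == "yellow" and s in "RG") else ".")
    return "".join(out)

print(subpixel_row(["black", "yellow", "black"]))  # '....R..G....'
```

The lit red and green emitters are two dark subpixels apart, which is why the eye sees a red fringe on one side and a green fringe on the other instead of a solid yellow line.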

This subpixel arrangement also means ClearType must be set to grayscale-only in Windows. It creates a rainbow around text without otherwise helping.

Having an actual white subpixel makes up for the bad reproduction of some colors quite a bit. I set it for monochrome text hinting only, and as most text is monochrome it is very sharp. This makes an IDE or other text look quite a bit better than on other monitors, so long as yellow and orange are not colors in use. It is probably the best at monochrome text I have seen in any display.

Also, pixel orbit on a one minute timer is distracting. I would prefer to be able to set it to do this daily or only after several hours.

Overall, I would give it a 7.5 out of 10, where 5 is the average monitor. If they fix the inability to show detail in yellow or orange it would be an 8.5 or 9, with only the resolution as a downside.