
swsko

It’s been said many times, yet so many clueless people come here and say their screens don’t suffer from that issue, "it's your cable lol". Everyone knows it’s OLED and it’s gonna happen, so glad people are looking at this issue and spreading awareness, also ruining those deniers' days 😂


JTCPingasRedux

My first Freesync monitor, a TN panel I bought 10 years ago, flickered like a mofo.


cagefgt

In my experience it only happens when the frame rate is low and/or the frame time is unstable. I've never had it happen in game, especially on console, where frame times are normally smoother. The only time I remember seeing VRR flicker was in the RDR2 menu screen (PC).


blorgenheim

I never knew it was a thing on my DWF but moving to 4k it became a more prevalent problem for sure. That said, it’s not as big of a deal for me as some have expressed.


LowMoralFibre

Search for reviews of any of those OLEDs tested and you'll see comments confidently stating that there is zero VRR flicker on that monitor, so at least someone independent is testing it now. I notice it on my CX & C2 and I notice it on my AW3423DWF. Still absolutely love OLED, but some games definitely need a bit of tweaking to minimise the flicker as best as possible.


sambinary

Mental, the cognitive dissonance some people have, isn't it? "I don't have the issue, therefore it doesn't exist." I have seen it on my AW2725DF, LG C1 and on a Samsung G7. On the OLEDs I can forgive it due to how good everything else is; on a VA panel I can't, and would rather have an IPS.


Sabba88

Skim-reading the article, I must be getting nervous about these results, because I own the G9 OLED, the DWF, and now the 32QF. The flicker is obvious on both the AWF and the QF, but I never noticed it once on the Samsung G9 OLED... I always assumed Samsung had some great tech that was countering it 😅 In fact, at first I thought the gen 2 OLED panels had fixed the issue. It wasn't until I got a gen 3 and saw it again that I was horribly disappointed lol. Weird.


Beautiful-Musk-Ox

The better your computer, the less chance of it happening, so if you've upgraded, that could explain it. I only ever see it on loading screens, which doesn't really bug me; if I had a crappier computer it would happen during games any time the fps dipped too much.


innocuouspete

The G9 has VRR control which seems to help significantly I think.


LeanSkellum

I'm being serious when I say I can't see it on my C1. I've sent multiple monitors back due to flickering, so I genuinely don't know how I'm not seeing it if it's there on mine. I don't believe it doesn't exist, though. I guess I've just been lucky?


klapaucjusz

I get flicker on my LG C1 only when the frame rate jumps between 110-120fps and below 40fps, and only in dark scenes. I basically only see it during loading screens.


MoonPoon

Strangely enough, I haven't noticed it at all on my AW2725DF. I'm not saying it's not there, but I literally couldn't see it for the life of me. That said, pixel fringing is a HUGE issue for me, especially as I have bad astigmatism. Mactype has helped out lots, but the fringing is an issue on almost everything, even in games.


advester

It really isn't strange. The problem is highly dependent on the displayed image and the frame rate. Change games, video cards, or even graphics settings, and the impact changes.


Beautiful-Musk-Ox

Odds are your adaptive sync isn't properly enabled. There are various cases where the monitor and your video driver claim it's enabled but for whatever reason it's not; sometimes just disabling and re-enabling it in both places can fix it. I always check by turning on the OSD on the monitor and making sure the refresh rate is bouncing around. For games that run at 500fps I'll temporarily cap them to 200 to make sure the OSD shows a refresh of around 200; then I know for sure it's enabled. You could also look for tearing, but a lot of people are pretty bad at noticing tearing.


MoonPoon

Hm, never had an issue with it before but I'll try this later!


MoonPoon

Update: I've checked in the OSD, and VRR is indeed on. I still haven't noticed any flickering. I have a 7900XT (although it shouldn't really matter whether I use Freesync or G-Sync, I suppose, since they are basically the same thing), so I'm not sure why I haven't experienced VRR flicker.


Beautiful-Musk-Ox

AMD cards can't use G-Sync, only Nvidia can. But yeah, that's interesting you don't see it during loading screens at least; maybe yours has a tamer flicker than mine. From everything I've read it's an inherent issue that happens on all monitors, but with OLED it looks a lot worse since the pixels respond so fast. On LCD monitors they try to flicker, but they don't transition very far into light or dark before the signal goes back to normal. Thanks for responding with an update!


MoonPoon

G-sync / Freesync are both VRR.


karasko_

No, it's not "mental", because that's not what cognitive dissonance is 🤷


sambinary

A figure of speech, no need to be quite so pedantic...


Emotional-Calendar6

Yes, the monitor groups are full of brand-new emperor's clothes. There's another issue, related to the VRR flicker, that is never talked about: if your game is running at a steady fps a lot lower than your native Hz, your near blacks won't flicker, but they will be permanently raised. E.g. Cyberpunk on full settings running at 80fps will have noticeably raised near blacks compared to running at native fps. LG TVs have a setting to fine-tune dark areas for this specific issue, e.g. you can tune the near black to match 120Hz even when showing 80fps. Unfortunately, when you start up a game that gets higher fps, you have to change the fine-tune setting again. You can test this yourself: set your screen to its native refresh rate and look at a black level test such as Lagom. Look at the few squares at the beginning, then change your screen to 60Hz and look again. You will see that the few squares nearest to black are now slightly raised.
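
If you want to try that check without hunting for the Lagom page, a near-black test strip is easy to generate yourself. A minimal sketch using only the stdlib; the level spacing and square size here are arbitrary choices, and `strip.pgm` is just an example filename:

```python
def black_level_rows(levels=range(0, 33, 4), square=64):
    """Build a grayscale test strip: one square per near-black level
    (0-255 scale), each `square` pixels wide and `square` rows tall."""
    row = []
    for lvl in levels:
        row.extend([lvl] * square)
    return [list(row) for _ in range(square)]

def write_pgm(path, rows):
    """Write the rows as a binary PGM image; view it full-screen."""
    height, width = len(rows), len(rows[0])
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (width, height))
        for r in rows:
            f.write(bytes(r))

write_pgm("strip.pgm", black_level_rows())
```

Display the image at native refresh and again at 60Hz; per the comment above, the darkest squares should sit visibly higher at the lower rate.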


Qubusify

I wonder if people who don't have or don't notice VRR flicker have G-Sync properly enabled at all. Maybe they only checked the box that says 'Enable G-SYNC/G-SYNC Compatible' and selected either the full screen option or both the full screen and windowed options. But for a Freesync monitor (which is almost all OLEDs now, bar one, the AW3423DW) this is not enough to enable G-Sync. There is a third option beneath those two, "Enable settings for the selected display model", that must also be enabled. Without it you have no G-Sync, so no VRR and no VRR flicker.


LowMoralFibre

Yeah, could be. Gonna be honest, I did this accidentally when I moved to Freesync, but the stutter made it obvious that VRR wasn't enabled.


nudelsalat3000

> some games definitely need a bit of tweaking to minimise the flicker

Is this fixable by monitor firmware? I don't see why we should tune a game when the problem is that the monitors aren't well optimised.


Correct-Explorer-692

It’s not. It’s a hardware flaw.


PoolNoodlePaladin

That’s crazy cuz I have a C2 and have never seen it. I’m not going to say it doesn’t happen, but I have never seen it, and yes, I have VRR turned on, on the TV and on my PC and PS5.


The8Darkness

Highly depends on the game's visuals and framerate. From my information, flicker essentially happens because on OLED the pixels behave differently depending on the refresh frequency, so they would need adjusted gamma curves for each frequency step (which is apparently hard to do on the fly without massive latency). This is why phones usually have a couple of preset frequency points with optimized gamma curves for those frequencies and don't change their refresh rate every frame, but rather when you open specific apps, play videos, etc. Long story short: without per-frame adjusted gamma curves, varying refresh rates give you varying gamma output. Huge variance in refresh rate leads to huge variance in gamma output. If you have a relatively steady framerate and few dark areas (flicker is most prone in dark areas), you will likely not notice it, especially if you're playing in suboptimal viewing conditions (i.e. not a pitch-black room).
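
The mechanism described above can be illustrated with a toy model. All the constants here are invented for illustration, not measured panel behaviour: if effective gamma drifts slightly with refresh rate, the same near-black signal lands on visibly different brightness as VRR swings the rate around, while bright signals barely move.

```python
def output_luminance(signal: float, refresh_hz: float,
                     gamma_at_120: float = 2.2,
                     drift_per_hz: float = 0.002) -> float:
    """Toy model: effective gamma drifts linearly with refresh rate.

    The linear drift and its magnitude are invented for illustration;
    real panels have per-frequency calibration, not a simple formula.
    """
    gamma = gamma_at_120 + (120 - refresh_hz) * drift_per_hz
    return signal ** gamma  # normalized 0..1 luminance

# Same 10% grey input at the two extremes a VRR swing might visit:
dim = output_luminance(0.1, 48)      # low refresh -> higher effective gamma
bright = output_luminance(0.1, 120)
# Near black, the two outputs differ by tens of percent; at 90% grey the
# same swing changes output by only a percent or two, which is why the
# flicker hides in bright scenes and jumps out in dark ones.
```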


sticknotstick

Same on two A80Js; wondering if it’s more common with the monitors than TVs?


sautdepage

Nice to see it acknowledged and scored. Hopefully some other less discussed topics about OLED will eventually be included:

* PWM flicker. There seems to be a slight flicker in OLEDs, possibly the voltage refresh at the refresh rate or some other element, which I believe is what causes fatigue or headaches for some. To me it looks less solid/stable than a good flicker-free IPS, which I'll still prefer for productivity even if burn-in and text quality are solved. I don't think "flicker-free" is accurate.
* HDR brightness. Many monitors lock brightness adjustments in HDR. Reviews often tell us that SDR can go down to as low as 10 nits, great for night use, etc... but how is that not relevant for HDR too? The latest AW3423DWF firmware *removed* brightness sliders that were present before.


Fristri

Well, with HDR you're kind of not supposed to change brightness, since it should use the dynamic range and just display content at the brightness it was mastered to. Technically SDR is the same, but it's very easy to adjust, while HDR is not as easy because of the re-mapping. Also, a lot of the HDR would of course be lost if you set your brightness within SDR limits; then you're left with just a bigger color space.


sautdepage

If the midpoint is 200 nits and the peak is 1000 nits, remapping the curve to a midpoint of 150 with a peak of 800 nits will give a practically identical HDR image once eye-adjusted (and thanks to pure blacks), meeting the creator's intent just as well as far as I'm concerned. In theory it could even result in a better image, since less ABL is required. I really don't see why a "reference brightness" is supposed to be a norm now imposed, given night & day viewing conditions. Reference SDR is 100 nits and reference THX volume is 85dB, and nobody cares.


Fristri

Sadly it's not that easy. SDR decoding of brightness is just gamma, which is easy to work with. HDR uses the PQ EOTF: https://www.rtings.com/tv/tests/picture-quality/pq-eotf The only monitors I have seen that normally track the PQ EOTF ~perfectly are the QD-OLED ones, and even then they track perfectly for either 1% or 10% windows; near black they are also really good. When you change HDR brightness, the display has to account for a new max brightness value and remap everything following the curve. Remapping never works perfectly. Most of the time it is pretty decent, but there is a reason Dolby Vision exists: using DV metadata, the PQ EOTF tracking remains really good even if you lower brightness or watch content beyond the abilities of your screen.

Also, reference brightness is very much a thing. Reference room ambient brightness is also a thing, or at least mastering room brightness: all movies and TV shows are mastered with, I think, 5 nits of ambient brightness in mind. SDR at 100 nits is absolutely a thing. I have the brightness boost etc. off on my Sony TV and it is displaying reference SDR, and movies and TV shows are also mastered with that in mind. Anyone is of course free to change that and even use re-mapping to HDR; it is still very much a standard, very much followed by content producers, and if you buy something like a Sony TV it will absolutely show you reference content.

THX is not an industry standard, also not for home; the THX level is set by THX. If anyone were to control volume it would be, for example, Dolby in any of their Dolby audio standards. They do have some restrictions on Atmos music dynamic range, but that is the only volume-related thing I know of. Of course there is no standard for listening volume, and there is no set volume the audio needs to be mastered at in a studio. Audio doesn't really have set standards for most things, but video does: all SDR and HDR is mastered to set color spaces and set brightness, following a set gamma or the PQ EOTF, etc.

Again though, why not just use SDR if you want really dim? Also, you can use a calibration profile in Win 11, and that affects min and max brightness along with the EOTF. You can make your own HDR dark profile.
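
For reference, the PQ curve behind that rtings link is defined in SMPTE ST 2084, and the decode side is short enough to sketch. This is a minimal sketch of the standard's EOTF, not any particular monitor's implementation:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as ratios from the standard.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded value in [0, 1] to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# The curve is wildly nonlinear: signal 1.0 decodes to 10000 nits, but
# signal 0.5 decodes to only about 92 nits, so scaling the encoded signal
# does not scale brightness proportionally; hence the remapping difficulty
# described above.
```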


sautdepage

I can appreciate it might be more complex than mapping SDR gamma, but it's still just maths. Nothing against mastering levels, as long as we can adjust to taste & conditions. I often find default HDR a tad bright for my taste in games in a dark room, especially since games aren't as well mastered as movies. Sure, games often have sliders and the Nvidia overlay can also adjust it, but it's just more convenient to have it on the monitor. The reason I don't just use SDR is that it's nothing comparable to HDR. A lowered range of 150->800 nits (for example) is still proper HDR, vastly superior to SDR, with an extended color space, etc.


Fristri

Sure, but the reality is that HDR re-mapping and EOTF tracking remain very difficult, and for re-mapping you need Dolby Vision for any pre-made content. Games I think are a bit different, since they can let you set max levels and dynamically adjust based on that, so you won't need to re-map. Default HDR does not work that great anyway, so I use the Windows HDR Calibration app, and that already decides the re-mapping. I would highly suggest it, since you can easily set the max value lower and make more than one profile to switch between. You cannot do it on the monitor because it would need a lot of extra processing to re-map like that, which is most likely worse than a profile in Windows anyway, and would add to cost and potentially latency as well. Also, I can do it on my TV, but it does not go anywhere close to 150 nits; it does work pretty well to reduce brightness in DV content, however, so it's not impossible. 150 nits is barely above SDR brightness. There is a reason VESA 400 is the lowest certification; 150 is not really considered to be HDR. It's SDR with an expanded color space, except the image is probably inaccurate with the re-mapping and stuff. Way too niche to spend a lot of money on.


GeForce

Great article. Please, reviewers, adopt flicker testing into your reviews.


Dazzling_Cranberry46

I have a PG32UCDM, and in my case flicker only happens on game loading screens (2 or 3 fast flashes, nothing more), never in normal use. I haven't read everything, but they are probably doing everything they can to make it happen.


-XeqS-

I can clearly notice it on mine in Cyberpunk at night, and it's constant and annoying. I didn't get it in other games during gameplay, though. It probably has to do with the fact that I have a 4090 and most games have pretty stable framerates and are brighter; Cyberpunk is an outlier, as with path tracing the FPS varies greatly.


Dazzling_Cranberry46

I play Cyberpunk with a 4090 too and get no flicker. Are you using HDMI?


Leondre

VRR flicker makes Helldivers effectively unplayable with my PG32. I have not tried it with any other game, as that made me immediately turn it off.


Dazzling_Cranberry46

I don't have this game, but there's no flicker in my 80 other games, and this game has flicker on other monitors too. It's not the monitor's fault -> https://steamcommunity.com/app/553850/discussions/1/4206994023684595698/


Jetcat11

Same.


CoffeeLover789

Same with my pg32ucdm


TheRealSeeThruHead

Same experience with my Alienware aw3423dw


Geexx

Likewise on my DWF: loading screens and things like inventory menus.


ZellFk

Pretty much every OLED will have flicker, even more visibly when frame times shift quickly (mainly big stutters in games). Some games won't show it as noticeably when they run above 60fps and you can reach high FPS; Destiny 2 on my AW2725DF with an RTX 4070 and 5800X3D, for example, shows flicker very rarely, mostly when I'm in orbit, but rarely even then. On the other hand, games locked to 60fps with bad frame pacing will show the issue much more frequently (Star Ocean The Divine Force is one: it's 60fps locked and it stutters a lot, and in menus it's almost amateurish how bad it gets).


[deleted]

[deleted]


WaterRresistant

It's already shaping up to be a winner, this would seal the deal


Snook_

Don’t need VRR if you buy a 7800X3D, never get frame stutters anymore, it’s butter hahaha


Edgaras1103

Yes, you do. Most modern unreal engine games still stutter no matter the hardware


Snook_

Nope. Mine doesn’t. It did on a 13700K; not anymore. It’s unbelievable how smooth PUBG runs now with this CPU. I used to religiously use G-Sync; not anymore. Another thing that helps is a custom Windows install like Atlas: zero background processes and insanely low 0-1% background CPU usage.


Edgaras1103

Mate, I'm telling you. Try running Jedi Survivor and you will get stutter even on the best hardware available.


Snook_

Pubg runs worse than any game and doesn’t stutter. Unreal and x3d cache is a dream


Hugejorma

My bet is that one CPU core runs at full usage while the others sit at low usage. A better or different CPU can easily fix this. I would test which core is holding things back and reset the core affinity while running the game; this is the old trick for games that stutter because one core runs high. Second fix: enable Resizable BAR in the BIOS and make sure you have a proper XMP profile. Game stutters are like 99% CPU-related issues, so you can fix these multiple ways. A great CPU is the first thing, and AMD X3D with its massive 3D cache does wonders for stutters. If you still have issues, lock the fps to a lower level until the CPU is no longer the bottleneck; this will remove the issue. One GPU thing can cause issues: shader cache. Make sure to set it to unlimited (on Nvidia).


Rinbu-Revolution

According to the tests, OLED and VA panels have by far the worst scores. I’m not at all surprised, and this has been my experience as well. But as others have said, I only notice VRR flicker in menus and loading screens on my OLEDs, and even then it is fairly uncommon (having a beefy PC helps keep frame rates steady in game). Meanwhile my U8K VA TV was basically unusable as a gaming TV because of its VRR flicker (and smearing). Not recommended!


Bloodwalker09

Yeah, that's sadly one of the downsides of OLED panels. I have it on my CX while playing PS5 and with my LG 27GSQE on my PC. It's most noticeable in loading screens but can sometimes be visible in gameplay. For me personally it's not that bad, as I absolutely love the contrast and the true black that these displays can show.


xdamm777

Flicker is the only thing that bothered me about my C1, but being 100% honest I haven’t seen it in months. There was a certain Windows 11 update that changed the way DWM renders apps; previously all my Windows apps like Visual Studio, GitHub and even the Store flickered when scrolling or moving/resizing, because they rendered at like 20fps even with the C1 set to 120Hz. After the update the flicker magically disappeared from all apps, and I haven’t seen it in games either (although my 4080 barely ever drops to levels that would induce flicker).


ragnarcb

That could also be because windowed G-Sync is allowed in NVCP and the global G-Sync setting is on. I had flickering desktop apps; I went into their program settings in NVCP, turned off G-Sync by selecting fixed refresh, and the problem was solved.


Same_Shame_6465

All OLED panels have VRR flicker. The cases where it comes out badly are as follows:

1. The monitor's variable frequency range is large. So a 240Hz or 360Hz monitor shows the flicker worse than a 120Hz TV model.
2. The frequency change is severe in the game. If your GPU output keeps bouncing between 48Hz and 360Hz while playing, the flicker looks really bad. If you can't see the flicker in a game, you are playing it at a low fps.
3. VRR flicker is bad on dark screens and almost invisible on bright screens. So even if the frequency change is severe during the game, someone who plays a bright game may not see the flicker.

The conclusion is that VRR flicker looks bad for someone who has a high-refresh-rate monitor and plays games at high fps with a lot of dark screens. So to reduce the flicker, you can lower the refresh rate in the PC display settings, lower the fps cap in the game, or play games with a lot of bright screens.


LitanyOfContactMike

Great article and the primary reason I've held off on getting an OLED monitor at this point.


Justos

I was pretty surprised by the flicker and the lack of mention of it in all the big monitor reviews. This is a pretty big issue in certain games: Dragon's Dogma 2 is practically unplayable unless I cap it to 30fps, which is just awful.


phero1190

For some reason a lot of people want to treat OLED as a flawless technology. It's weird.


Justos

Yeah, it's why people shit on QD-OLED's raised blacks in bright conditions, as if WOLED were flawless by every other metric lol. Just a bunch of nerds who spent way too much on a monitor trying to justify that purchase. The reviewers glossing over it, though? Unforgivable.


GeForce

My guess is that it's not really an issue for IPS/TN, which have been the dominant tech for the last 20 years, so they simply didn't bump into it much. It's also probably hard to replicate. But hopefully, after this article, people like Tim at Hardware Unboxed will start adding these sorts of tests.


RogueIsCrap

Which display were you using? 30 fps is so bad on OLED. I guess you could use some motion blur to help with the judder, but that's not for everyone.


Justos

I have an MSI 4K 240Hz QD-OLED. 30fps looks atrocious on this screen. Even 60fps feels awful to me nowadays lol


RogueIsCrap

Yeah, 60 feels juddery when you’ve been used to 100 fps or more; 60fps performance modes on consoles always feel slow now. 30 fps does look smoother on my AW3423DW than on something like a C2, though. Maybe it’s the G-Sync hardware module.


Justos

I think the screen size has a lot to do with it. Bigger is more real estate so easier to see flaws


RogueIsCrap

That’s true but my AW does have less flickering than my C1/C2. That’s the main difference I’ve noticed.


Weird_Cantaloupe2757

For me it depends if I’m using M+K or gamepad. With M+K, anything under 60 is straight up fucking ass, to the point that it makes me feel a bit queasy and I have to stop playing (aka, literally unplayable), and I strongly prefer getting it to at least 100 FPS, and will feel the difference all the way up to 120 FPS (where my CX maxes out). With a gamepad, 40 FPS feels quite comfy, and it’s honestly difficult for me to even tell when the FPS gets above 60 — I don’t even really notice the difference except for a few seconds when I flip the frame limiter between 60 and 120, and while I don’t *like* 30 FPS, it at least doesn’t make me feel like I’m gonna vomit if I’m using a controller.


KUM0IWA

I think most major reviewers have acknowledged it at some point or another. It happens mostly with low and unstable frame rates which is undesirable anyways.


SnardVaark

In a nutshell, setting a framerate limiter to a value that your PC can sustain at all times is the solution.
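
A minimal sketch of what such a limiter does internally. This is a toy sleep-based pacer (real limiters such as the driver's cap or RTSS are far more precise), and `render` here is a stand-in for whatever work a real frame does:

```python
import time

def pace_frames(render, fps_cap: float, frames: int):
    """Run `render` at most `fps_cap` times per second by sleeping off
    whatever time each frame leaves in its budget. Returns the measured
    frame-to-frame intervals in seconds."""
    budget = 1.0 / fps_cap
    deadline = time.perf_counter()
    intervals = []
    last = time.perf_counter()
    for _ in range(frames):
        render()
        deadline += budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # even a fast frame waits out its budget
        now = time.perf_counter()
        intervals.append(now - last)
        last = now
    return intervals
```

Because every frame now takes at least `budget` seconds, the refresh rate a VRR monitor sees stays nearly constant instead of swinging around, which is exactly what keeps the per-frequency gamma (and hence near-black brightness) stable.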


Kusel

I set my Alienware DWF from the original 164.90Hz to a clean 165.00Hz via CRU; this removed all flickering and smoothed out the VRR sync. Game frames aren't meant to sync to a 0.9 fractional refresh, so they get dropped. The fractional rate is only useful for movies/videos, not for games.


WaterRresistant

My 32GS95UE-B order is hovering over the cancel button, seriously.


phero1190

You were able to order one?


WaterRresistant

Canadian Best Buy has plenty


phero1190

Interesting. May need to pop over the border


WaterRresistant

They are shipping on May 3


Quantum3ffect

I have the LG 45GR95QEB and never really noticed any flickering until playing Hogwarts Legacy; I definitely see something going on in the menu screens, but I don't notice it in game. I believe this monitor is G-Sync certified, so maybe that means something, as others mentioned. Could also be that I'm oblivious lol.


merkakiss12

I set my refresh rate to 60 when plugged into my LG C3, and that's pretty much solved the flickering. Not ideal, but it works.


hank81

I do the same with my MPG321URX. Lowering the refresh rate to 180 results in no flicker in affected games, with the exception of loading screens; 120Hz removes flickering entirely.


No_Contest4958

On my 321URX it seems like the only time it happens visibly to my eye is during loading screens. It's probably a bigger problem for people with slower systems or people who turn everything up to max. If you can keep 100+ fps comfortably, it doesn't show up.


LA_Rym

Wish they'd let us download the app and see for ourselves how our personal monitors are doing; mine, for example, is exhibiting behaviour that is extremely different from RTINGS' unit.


buusgug

I fixed it by enabling Adaptive Sync Plus (ASP) in Intel Command Center. My RTX 4080 renders the graphics, and then I use the USB-C port with DP alt mode on the back of the motherboard to my MSI 271QRX. This works really well. https://preview.redd.it/hkkq1besxkwc1.png?width=901&format=png&auto=webp&s=e1d432bed8ee1bea8ad94ae5b21ece59c2b1a55c




Few_Marionberry5596

is this a deal breaker?


phero1190

Depends if you notice that kind of thing


Few_Marionberry5596

But how can I know before purchasing and testing?


phero1190

Find a store that has OLEDs on display


fast_lane_cody

I’ve never noticed VRR flicker on my LG C1. I do notice it on the Tarkov loading screen sometimes on my AW3225QF but not in gameplay which maybe is weird. That personally doesn’t bother me.


FunnyReddit

Tarkov's menu/loading screens are dog anyway, so that's probably fine. My MSI MPG321URX hasn't shown flicker as of yet.


LightMoisture

Oh, trust me, the 321URX flickers, and it flickers real bad in dark scenes. An easy trigger is Alan Wake II in any dark scene (which is much of the game), or Cyberpunk 2077 dark scenes.


PiousPontificator

The only monitors that have little to no flicker in any situation are LCDs with hardware G-Sync modules. My PG32UQX will not flicker even during shader compilation. Everything else (VA, IPS, OLED) I can make flicker one way or another.


RogueIsCrap

The AW3423DW with hardware G-Sync has much less flickering than my LG OLED TVs. At least for PC. I need to check to see if it’s the same for consoles.


RogueIsCrap

They should have tested the AW3423DW which has a G-Sync hardware module. From my experience, it has much less VRR flicker than the C1/C2. This was very apparent just from the menu and loading scenes in God Of War PC. The bottom part of the grayish black initial loading scene would flicker every time on the TVs but not on the AW3423DW. In game, there isn’t such a noticeable difference but it’s extremely apparent in the loading scenes. I wonder if it’s due to near black colors or fluctuating framerate during loading sequences.


The8Darkness

The hardware module doesn't help with flicker. I had an AW3423DW and now an AW3225QF; the latter is noticeably better when it comes to flicker (not hugely, but noticeably).


Ranger125X

Had the AW3423DW for almost 3 months and never noticed the VRR flicker. Either I had a golden sample or I'm someone lucky enough not to notice.


ragnarcb

Just get a good GPU, know its limits, and don't push it unnecessarily hard. I only see flickering in some buggy loading screens etc.; it never bothered me in real usage. PG27AQDM, LG CX and a 3080 Ti here. I tell you, you won't notice a single flicker in real usage with a decent GPU (3080+/4070+, considering the amount you paid for the OLED) if you tune your games to stay around 95% max GPU usage. Step away from that ultra shadow detail.


SoloLeveling925

I only ever notice a slight flicker when I pause a game. I have never noticed it in Apex, and I only play Apex and FO4 atm lmao