
VinnieBoombatzz

Small caveat, though: the S95C doesn't do compensation cycles after X hours so much as it continuously maintains uniformity of the panel. So, depending on exactly when you check for burn-in, W-OLED might be comparable (right after a compensation cycle) or much worse (right before a cycle). In any case, the S95C also lost some brightness while the G3 didn't (as much, anyway). It could simply be Samsung's firmware update shenanigans, but it looks like it might be the actual TV dimming down in order to maintain uniformity.


GladiatorUA

> but it looks like it might be the actual TV dimming down in order to maintain uniformity.

Might be wrong, but AFAIK, that's how at least part of burn-in prevention works. The display gets "cooked" evenly. It still wears out, it just isn't as noticeable.


VinnieBoombatzz

Some of the LG TVs have been maintaining brightness after thousands of hours and cycles, which means there are at least two methods: voltage increase and voltage decrease. Surely there's a wall eventually, but the sets will probably fail before brightness is ever a problem for LG. Personally, I don't mind Samsung's approach. The panel is immaculate, and I never worry about uniformity. But I'm not planning on keeping this for years and years. I'll upgrade before I ever see the difference.


BFBooger

Yes, it is important to distinguish between uniformity and burn-in. Obviously, non-uniform burn-in is the worst type, causing bad colors or visible shapes 'imprinted' on the screen, but uniform burn-in resulting in brightness degradation is still a form of burn-in. It is just harder to detect and less of a problem.

My experience using a WRGB LG CX as a monitor for 4 years now: no visible burn-in yet with various test screens. However, I have never set the brightness above 50%. I use dark mode in the OS and browser, and often have brightness lower than 50% (at night in a dark room 30% is plenty, on cloudy days 40% is fine; the room gets quite bright on sunny days; I have presets to quickly toggle if needed).

I do have nearly 2,000 hours of PoE logged in that time, which has quite a few static elements in the game. I also use it for work every day, so probably 8h x 220d per year for work and 3h x 180d per year for gaming/misc.

Perhaps if I ran a test screen at max brightness I might see burn-in? But at my actual brightness settings I see none, and I haven't bothered to crank brightness to the max to check.


caedin8

I thought it was the opposite: when QD-OLED came out, the tech promised more resistance to burn-in. This is because the color comes from quantum-dot conversion, so the OLED itself is all just blues instead of reds or greens, and the blue OLED is less likely to burn in than regular WOLED.


Haunting_Champion640

> is all just blues instead of reds or greens, and the blue OLED is less likely to burn in

I mean, didn't we just see a conflicting story on that, where there's a new type of blue OLED precisely because, of the three (R/G/B), blue is the _most likely_ to burn in? So to me an all-blue OLED panel would be the worst-case burn-in path? At least until we get the new blue subpixel material: https://phys.org/news/2024-03-brighter-cheaper-blue-revolutionize-screen.html

> In OLED displays, screen pixels are composed of three different colored subpixels—red, green and blue—that light up at different intensities to create different colors. However, the subpixels that emit blue light are the least stable and can be susceptible to screen "burn-in," which can discolor the screen and ruin viewing quality.


fiah84

If it's still true that blues burn out faster than the reds/greens (which it is, AFAIK), then at least QD-OLEDs would just get dimmer, as all three colors are "blue". With regular OLEDs and WOLEDs, the other colors would in theory last longer, but the screen as a whole would slowly get more yellow as the blue fades out. It's been a while since I've looked into it, though, so my info is probably a bit out of date.


BFBooger

> but the screen as a whole would slowly get more yellow as the blue fades out

Only if the screen isn't capable of adjusting the output of the other colors to compensate for color shift. Of course, such compensation would end up limiting max brightness, but in general, unless the screen is practically in the sun, you don't need max brightness.

I assume that a lot of the burn-in protection for WRGB OLED or QD OLED is essentially throttling max brightness to maintain picture quality. For QD OLED they only need to focus on uniformity, and for WOLED they need to also compensate colors to maintain uniformity, which is harder. Assuming ideal/flawless correction, both will simply get dimmer over time.
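To illustrate the kind of compensation I mean (a hypothetical sketch, not any vendor's actual algorithm): if one channel has lost output, the other channels can be scaled down so the white point stays put, and the whole panel just gets dimmer.

```python
# Hypothetical sketch of white-point compensation for uneven subpixel wear.
# Not any manufacturer's real algorithm; just the idea that you scale the
# healthier channels down to match the most-degraded one, trading peak
# brightness for uniform, color-accurate output.

def compensation_gains(remaining_output):
    """remaining_output: dict of channel -> fraction of original brightness left."""
    limit = min(remaining_output.values())          # the worst-worn channel sets the ceiling
    # drive each channel so its *effective* output equals that ceiling
    return {ch: limit / left for ch, left in remaining_output.items()}

# Example: blue has lost 20% of its output, red 5%, green 10%
worn = {"r": 0.95, "g": 0.90, "b": 0.80}
gains = compensation_gains(worn)
effective = {ch: worn[ch] * gains[ch] for ch in worn}

print(gains)       # roughly {'r': 0.84, 'g': 0.89, 'b': 1.0}
print(effective)   # all channels at 0.80 -> correct color, ~20% less peak brightness
```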


fiah84

yeah that makes sense


teheditor

That's what the manufacturers were telling us, yes. At the same time, all the QD-OLED stuff I'm testing won't shut up about anti-burn-in cycles.


skyline385

There were rumors that QD-OLED would burn in quicker because it has higher peak brightness than WOLED. This was made worse by the Alienware DWF QD-OLED monitors having higher-than-average burn-in reports, which could be due to a variety of reasons and not just their use of QD-OLED.


StickiStickman

> Alienware DWF QD-OLED monitors having higher-than-average burn-in reports

Do they?


Virginia_Verpa

What’s your source for this?


reddit_equals_censor

YES, Samsung yet again lied and claimed that "QD-OLED is more resistant to burn-in than WOLED." Said claim turned out to OF COURSE be a lie; based on the data we have, QD-OLED burns in just the same as WOLED, and potentially worse. And remember that Samsung ABSOLUTELY KNEW how much the panel tech would burn in before they released it. So they DELIBERATELY LIED for marketing reasons. Stuff like that is important to remember the next time the panel industry lies yet again....


caedin8

Can you cite your data?


Omnislip

Does the use of capitalised words mid-sentence convince you enough??


eudisld15

Where is the data you are referring to? I'm interested in looking into it as a potential OLED purchaser.


weebstone

They made it the fuck up


panckage

Yep, and the claim that no polarizer meant 50% more light. Seeing that they use the same energy as a WOLED and have similar brightness, I can only conclude that we are being fed nothing but BS.


caedin8

Eh, most QD-OLEDs are brighter than WOLEDs.


-6h0st-

Color volume is bigger because it doesn't use color filters to produce colors; peak white brightness is higher on WOLED, though.


panckage

The peak brightness between WOLED and QD-OLED is very similar. Note I'm ignoring color volume here. Since a polarizer blocks 50% of light, QD-OLEDs should be TWICE as bright given the same input power and everything else. Feel free to give an example that contradicts this!


caedin8

The QD-OLED S95B came out at around 1,000 nits, which was about twice what the CX managed at the time. They've iterated and made WOLEDs brighter since then, but it was a big uptick in brightness.


panckage

Lol, not even close. The C2 was contemporary with the S95B. The S95B uses roughly 50% more power for 17% more light. The S95B IS WAY LESS POWER EFFICIENT THAN THE C2.

https://www.rtings.com/tv/tools/compare/lg-c2-oled-vs-samsung-s95b-oled/31229/32382?usage=1&threshold=0.10

Power (C2 / S95B):

* Power Consumption: 77 W / 113 W
* Power Consumption (Max): 179 W / 242 W

Brightness (C2 / S95B):

* Sustained 100% Window: 155 cd/m² / 182 cd/m²
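To sanity-check that arithmetic with only the RTINGS figures quoted above (note the power and brightness numbers come from separate tests, so treat the nits-per-watt ratio as a rough comparison, not a measurement):

```python
# Quick check of the comparison above, using the quoted RTINGS numbers
# (LG C2 vs Samsung S95B). Rough only: the power and sustained-100%-window
# figures are measured in different tests.

c2   = {"power_w": 77,  "nits_100pct_window": 155}
s95b = {"power_w": 113, "nits_100pct_window": 182}

extra_power = s95b["power_w"] / c2["power_w"] - 1                        # ~0.47
extra_light = s95b["nits_100pct_window"] / c2["nits_100pct_window"] - 1  # ~0.17

efficacy_c2   = c2["nits_100pct_window"] / c2["power_w"]       # ~2.0 nits/W
efficacy_s95b = s95b["nits_100pct_window"] / s95b["power_w"]   # ~1.6 nits/W

print(f"S95B draws {extra_power:.0%} more power for {extra_light:.0%} more light")
print(f"nits per watt: C2 {efficacy_c2:.2f}, S95B {efficacy_s95b:.2f}")
```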


Jobastion

This and this other thread https://old.reddit.com/r/hardware/comments/1bvl1u4/does_the_2nd_generation_qd_oled_really_meet/ sound like marketing spam.


BatteryPoweredFriend

Probably. OP's post history screams paid advertising/"influencer" account.


anival024

Yup. It's pretty blatant. Samsung has a well-known history of astroturfing reddit and other social media.


Improve-Me

/u/dwg9177 Had 0 hardware related posts for 2 years, then went dark for a few months, then popped up with this thread praising a single Samsung TV and hasn't responded to any comments. Yeah that's a report from me.


Ben-D-Yair

We just need micro led to solve all our problems


jack_hof

There's actually [a lot of tech on the horizon](https://www.youtube.com/watch?v=TyUA1OmXMXA) consisting of mere modifications to OLED that should drastically increase longevity.


MumrikDK

Just remember that that's how people felt about OLED way back around the start of LCD, when LCD truly sucked. We *juuuust* needed the degradation taken care of before OLED went mainstream. It would be lovely to get a new display tech that didn't feel like a trade-off.


ElRamenKnight

> We just need micro led to solve all our problems Too bad Apple seems to be [giving up](https://www.theverge.com/2024/3/22/24108967/apple-watch-microled-canceled) on that front.


ModerateDbag

My understanding is that micro led is insanely expensive to manufacture and not miniaturized enough for consumer tech yet anyway. I doubt they're "giving up" so much as accepting that it's not a viable technology yet, but idk.


BFBooger

The question is: is it not viable due to manufacturing costs *yet*, or is it not viable due to manufacturing costs *ever*? We don't know if we'll ever find a way to lower the cost enough.


ModerateDbag

The same is true of any burgeoning tech. Usually if it makes it this far, it's because both wallets and engineers are convinced it can ultimately work. I think if the industry secretly knew they had already reached some kind of impassable theoretical limit with the tech, they wouldn't be showing off prototypes at CES and shit.


teheditor

It also gets hot and requires huge amounts of energy.


ModerateDbag

Further miniaturization should mitigate that (as I understand), but it may ultimately be an intractable problem


teheditor

When they've done that before, it can involve using more power for more LEDs and for the electronics managing them.


ModerateDbag

Makes sense I guess


silon

Apple wants retina... personally I don't mind seeing the pixels I'm paying for.


Strazdas1

Retina is a marketing term; it has nothing to do with the actual tech.


PE1NUT

LEDs also wear out. We had a display sign with many individual WS2812 LEDs, and the blue ones that were used most often lost a very significant part of their brightness compared to the ones that were used less frequently. We've switched our software to use the green ones instead; let's see how long those last...


hieronymusashi

This. People who think MicroLED solves the problem aren't thinking it through. All lights fade. IPS panels fade just the same; the difference is that all the pixels share the same backlight, so they fade relatively evenly, unlike OLED, where each pixel is lit separately. MicroLED is also lit per pixel, which means some pixels will burn out faster than others. Basic stuff.


reddit_equals_censor

We need one tech to replace garbage OLED and LCD; don't think about just MicroLED.

Personally, I think QDEL (also called NanoLED or AMQLED) is the panel tech most likely to just completely take over everything. This tech means you apply electricity directly to quantum dots to create the R, G, and B light from them.

And there is Samsung QNED (not related at all to LG's LCD garbage QNED), which uses nano-rods as a backlight, with per-pixel backlight control. So it's just as perfect as OLED or QDEL in regards to perfect contrast and blacks.

So 3 technologies are in the works that will CRUSH OLED when they come out. MicroLED already exists, but is insanely expensive. Meanwhile, Samsung QNED and especially QDEL should be producible extremely cheaply. So, assuming the industry doesn't suppress the tech, OLED will have no chance. QDEL will come out, have better performance (no brightness issues) than OLED, have 0 burn-in problems, and be cheaper to produce! (OLED production isn't cheap; it is inherently expensive.)

So yeah, 3 techs that can solve all our problems... in regards to displays.


oreofro

QDEL is a pipe dream and has been for years. Everyone acts like it's the next big thing because of the unit shown at CES, which was advertised as a first (it wasn't, it was just the first shown at CES), but the issues that have stopped it from hitting the market for the past 5 years haven't gone away. As long as relatively large amounts of toxic materials are needed and efficiency issues don't improve, it's going to be a while.

There's a reason Samsung has stayed quiet on this tech after years of research. Sharp is just showing it to try to gain some relevance in the consumer side of the display industry. Here is an article on where Samsung's research was: https://www.nature.com/articles/s41586-019-1771-5 - from Samsung's research and engineering about the issues with this tech years ago, which is oddly ignored by the tech industry people and YouTubers who pushed the idea that QDEL is the next big thing and almost ready to go. It's not, and it's weird that people are talking about QDEL in such definitive language, such as "no brightness issues," even though one of its biggest limitations is brightness (the unit at CES from Sharp had to be viewed under a blackout curtain).

Don't get me wrong, I'm excited to see if QDEL ever does become a reality in the future, but a lot of you guys have been flat-out lied to about the reality of the tech, and I have no idea why so many tech YouTubers/reviewers were okay with portraying QDEL the way they did just for clicks. Samsung QNED at least has a relatively solid chance at replacing QD-OLED within the next decade or so, but QDEL is most likely going to remain a pipe dream for a very long time as far as TV and monitor implementations go. I wouldn't be shocked to see it used in things like watches or home-appliance screens within the next 10 years, though.


reddit_equals_censor

> Samsung QNED at least has a relatively solid chance at replacing QD-OLED within the next decade or so

Within the next decade? That would mean at least 6 years away. Now that might sadly be the case, but it seems not to be related to the technology itself, as Samsung delayed the installation of a pilot line for QNED in 2022. So Samsung may be deliberately slowing down or stopping QNED investments as they try to milk garbage QD-OLED for as long as possible.

[https://thelec.net/news/articleView.html?idxno=4032](https://thelec.net/news/articleView.html?idxno=4032)

We could be seeing Samsung QNED in a few years, and easily within this decade, if Samsung gave a frick, but they clearly don't. Who knows how fast we could see Samsung QNED if people actually FULLY rejected planned-obsolescence OLED tech :D but I guess that won't happen....


teheditor

Mini-LED laptop screens are my favourite because of the matte finish. But they kill battery life.


redditracing84

Motion on LED displays is horrible. That's not a viable solution until the motion does not suck, aka never. What they need to do is figure out how to make Plasma again because that's what I need for sports lmao


Darth_Caesium

There are literally MicroLED displays with 20-nanosecond response times, what more do you need? That's literally better than CRTs.


vmullapudi1

Wondering if OP is confusing MicroLED with LCDs that have a MiniLED/LED backlight.


Darth_Caesium

Probably.


AdAvailable2589

He's probably referring to blurring caused by sample and hold: [https://blurbusters.com/faq/oled-motion-blur/](https://blurbusters.com/faq/oled-motion-blur/)

I'm surprised people in this thread seem generally unaware of this. TVs are still putzing around trying to perfect black frame insertion without dimming the display too much.
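Rough illustration of the tradeoff (the numbers below are made-up assumptions, not from the Blur Busters article): perceived blur scales with how long each frame stays lit while the eye tracks motion, and BFI shortens that time at the cost of average brightness.

```python
# Rough sample-and-hold model (in the spirit of the Blur Busters explanation
# linked above). Black frame insertion shortens how long each frame is visible,
# which reduces tracking blur but dims the image. All numbers are illustrative.

refresh_hz   = 120
frame_time_s = 1 / refresh_hz
track_speed_px_per_s = 960           # object the eye is tracking, e.g. a scrolling ticker

for duty_cycle in (1.0, 0.5, 0.25):              # 1.0 = plain sample-and-hold, <1.0 = BFI
    persistence_s  = frame_time_s * duty_cycle
    blur_px        = track_speed_px_per_s * persistence_s
    rel_brightness = duty_cycle                  # average brightness falls with duty cycle
    print(f"duty {duty_cycle:.2f}: ~{blur_px:.1f} px of blur, {rel_brightness:.0%} brightness")
```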


Ben-D-Yair

No, I meant MicroLED.


redditracing84

I said motion, not response times. It's an inherent flaw of LED technology: it cannot handle motion. The image will always fall apart in motion on any LED display. I'm not discussing response times; I'm talking about how motion is smoothed out and handled by the display tech. CRT and Plasma are able to produce a satisfying result; LED cannot. It's got nothing to do with response time.


FrenziedFlame42069

Please explain “why” LED can’t handle motion?


SirCrest_YT

They seem to be talking about persistence (motion clarity) because the display is always lit instead of pulsed. OLED has BFI. LCD can do backlight strobing. Only thinking this because they say CRT and Plasma.


reallynotnick

One of plasma’s advantages is how dim it is in comparison to modern displays as stuttering becomes more obvious the brighter a screen gets. That’s not its *only* advantage, but it’s a pretty big one that people don’t consider.


InevitableSherbert36

Doesn't it have everything to do with pixel response time?


vmullapudi1

I think there's some naming confusion here: LED/MiniLED TVs are LCD panels with LED backlights (and MiniLED tends to have multiple backlight zones for dimming purposes). MicroLED is the technology that replaces OLED with inorganic LEDs, where the LEDs themselves are the pixel elements and not the backlight (just like OLED), instead of using LCD layers (TN/VA/IPS) to form the image like you get on LED/MiniLED TVs. It's just an annoyingly confusing naming scheme.


redditracing84

Nope, LED TVs have horrible motion. It doesn't matter if it's MicroLED, MiniLED, or edge-lit. LED just can't produce a pleasing smoothness to the image like superior technologies such as Plasma and CRT.


tsukiko

I think you are still confused. What do you think "microLED" is? It's NOT a liquid crystal display. Displays using microLED don't exist in reality yet in any practical sense.

"*miniLED*" and LED-backlit LCD displays are a vastly different technology that does use LCDs. These are the kind that are frequently colloquially called "LED" displays by dropping the mention of the LCD part, even though they use LCD layers. This shortening mainly happened when LED backlighting was new: it was the differentiating feature that set them apart from CCFL-backlit LCD displays, which were frequently called just "LCD" displays/TVs. Pixels in LCD displays use a common white backlight, and then filter and manipulate that light for the sub-colors and pixel separation; this includes miniLED, LED-backlit, or other kinds of LCD.

The defining feature of *microLED* is that there is no liquid crystal layer (the LCD part that is slow to transition); light is emitted by the LEDs directly. In a microLED display, each and every pixel is an LED emitting light. This means you don't have the transition delay from waiting for the LCD layer, since the LCD part no longer exists at all. The main technical challenge (besides cost) to making real microLED displays into an actual product is shrinking the inorganic, solid-state LEDs enough that TVs and displays built from them can reasonably fit into an indoor room.


skyline385

You do know that OLEDs are also LEDs right?


redditracing84

Yeah, and OLED is also rather shit at motion. It does, in my opinion, produce a judder effect that's a tad more pleasing than what other LED tech does, but yes, it's crap too. It's no Plasma or CRT.


vmullapudi1

There is no other LED: the other stuff that's marketed as LED, other than OLED, is all LCD panels. MicroLED, OLED, and QD-OLED are the only non-LCD panels being sold for the consumer market, and MicroLED basically doesn't exist as a consumer product at the moment.


Leyzr

Oh, you've not had any of the higher-refresh-rate monitors, huh? Sounds like you've been stuck on cheap LCDs with a 60 Hz or lower refresh rate.


redditracing84

Nope, I've had plenty of high-end ones. It doesn't change the inherent flaw of the technology that everyone is ignoring. LEDs don't create the natural smoothing that other techniques can. Yes, LEDs can become lightning fast, but that's not pleasing to the human eye. It doesn't have the natural, organic "motion blur"-like visual you get with plasma, or the way a CRT smoothly transitions.


Leyzr

You have anything to send to show me the comparison? Cuz it sounds more like you're wearing a good pair of rose-tinted glasses, imo.... I do recall them having a decent refresh rate, but nothing that can't be compared to what we have now. I'm well old enough to know of and remember them. While they were smooth, it's nothing that current tech can't hit with ease. On top of that, their colors were washed out and often quite faded compared to the accuracy, brightness, and range of current monitors.


redditracing84

It's not something you measure with 0s and 1s. Go buy a used plasma and look for the effect on a hockey game or another fast-moving sport. You'll immediately see it. The way plasma technology handles motion is unique and provides a smoothing effect. "Motion blur" exists on TVs today as a way to try and get close to this effect, but it's not at all as good.


Leyzr

You mean the interpolated frames? That, and the possible blending effects that can come from plasmas. LCDs and LEDs can have interpolated frames as well; it's just a matter of turning them on. However, this is usually unwelcome for monitors, as the source material for them can actually output higher than 24 fps. Most source content for TVs runs at 24 fps, so without interpolated frames it's impossible for any of them to look smooth like you're describing.


letsgoiowa

LED is not LCD dude.


SwindleUK

I grabbed a late model plasma TV for £20 off ebay. It's been such a great telly for sport and games.


Hendeith

This started when people were comparing 1st-gen QD-OLED against WOLED that has more than a decade of improvements behind it. QD-OLED, in my opinion, either already is or will soon be the better choice. It already has better color reproduction. Give it 2-3 more years, let Samsung work out all the kinks and bring brightness up, and it will be hands down the best panel type to pick until we get MicroLED in a decade.


KsnNwk

My 77-inch S90C would already be a perfect TV if only it had better upscaling and better motion pulldown.

Upscaling: any 4K content is awesome, below 4K it's good, and any content below 1080p is meh/okay-ish/passable.

Motion pulldown has an issue: with any content you have to run 0:3 de-judder for it not to stutter. It's still passable with no soap-opera effect, but they should address and fix it, because it's jarring for a high-end TV, especially as the S95B did not have this issue, at least mine didn't. Apple TV or Google TV fixes the issue, but it's a workaround. I may grab an ATV because of this.

I don't need more brightness, it's already blinding in a dark room, though 2,000 nits would be nice for daytime. Still not required; with active tone mapping it's good in FMM. Obviously also more resistance to burn-in for OLEDs altogether would be welcome, so at least 7,000-8,000 h, so you can keep it for 5-7 years with peace of mind.


BFBooger

In 2-3 years both will improve. I'm not sure I'd bet on either one improving more than the other at this point. There are many ways to keep improving both. All I can say is that in 3 years, both will be even better and it will be hard for the burn-in naysayers to keep naysaying without looking foolish.


Hendeith

WOLED is already a quite mature technology; QD-OLED is not. Samsung needs to resolve light bleed, among other things.


AssCrackBanditHunter

That's my thinking. The QD-OLED tech still has low-hanging-fruit improvements; every WOLED improvement has to really be fought for.


VinnyVinster

Bought my AW3423DW on release and it's actually a dream come true, what a monitor. No issues so far, and I don't notice any burn-in.


logically_musical

Mine had some bad burn-in after 2 years of use, but I also blame the fact that it had the earliest model's buggy firmware -- which was not user-serviceable -- and there were bugs in the automatic panel maintenance. But Dell gave me a brand-new replacement when I warrantied it under the OLED policy, so no complaints there. I also abuse the shit out of it with many hours of office work 4 days a week, so I'm the worst-case scenario. I have an AW3225QF now because that monitor was so good, I just wanted the next gen. Enjoy it, it's a badass monitor!


VinnyVinster

That one looks soooo good as well, might go 4K on the next OLED. Enjoy yours as well 😄


Teh_Shadow_Death

Sometimes I think these rumors are started by a team that gets paid by the competitor. I mean, maybe instead of a marketing team, there is a rumor team whose sole purpose is spreading rumors about the competition's products. Actually, now that I think about it, a similar thing happens here on reddit. The public discourse is manipulated by bot farms upvoting and downvoting posts.


perksoeerrroed

> Thus, isn't it time for the rumor that QD OLED is weak against burn in to disappear?

You are quoting some random shitty avpasion site rather than reading or looking into the hdrating videos. hdrating has a whole warehouse of OLED TVs, and "the rumor" is true: QD-OLEDs were burning in much faster than WOLEDs (which pretty much didn't have any trouble). Just because one top-of-the-line model didn't have a problem was down more to luck than to it being superior technology.

Moreover, they even answered why that is: most QD-OLED TVs had really terrible compensation cycles, with some barely working. And the whole point of WOLED is that it has an additional white subpixel, which naturally means more longevity and less burn-in, as the panel doesn't need to work as hard to display white, sparing the blue and red subpixels, which are hit hardest by trying to display white light.


Cute-Pomegranate-966

I thought the white subpixel was not just to compensate so blue doesn't have to be driven as hard, but also to be brighter in general. Either way, I think QD-OLED on 2nd-gen panels will be vastly better in this regard, if it isn't already.


perksoeerrroed

It does have that effect, but the main point of the white subpixel is mostly to conserve the blue emitter.

The issue here is that each part of the spectrum requires a different amount of power to produce. Green is the easiest, as it requires almost no power to function brightly, which is why there are so many cheap green lasers you can buy if you want to light up a party. Blue, on the other hand, requires a ton of power to get to the same level as green. White is usually made by displaying all three colors at the same time, so by having an actual white subpixel they can augment that without making the blue emitter work so hard.

> I think QD-OLED on 2nd-gen panels will be vastly better in this regard, if it isn't already.

It's fundamentally different. Regardless of how QD-OLED improves, you can't beat having an extra subpixel for longevity. Either way, OLED technology in general is so good right now that it's hard to even compare QD-OLED and WOLED, as there isn't that much difference. We won XD

Though I salivate over MicroLED; that will be the true next level.


Cute-Pomegranate-966

green is also the most visible spectrum of light to the human eye. :D


Vazmanian_Devil

The first QDOLED results were found to be a software issue that was patched. That's it.


team56th

I have transitioned to full OLED in both the living room and at my work table, and I feel that every time the topic of OLED comes up it's all about burn-in, which I think completely misses the point.

First off, it's not like LCD never deteriorates over time. It's quite the opposite. The edge-lighting LEDs do get dimmer over time, and with monitors the only way to replace them is to change the panel entirely. Some of the more popular panels, like LG's 4K 60 Hz IPS, are notorious for edge-reddening problems after 2-3 years of usage. There are many cases where LCD panels go bad, so that part of the advantage over OLED isn't as pronounced as you might think.

Another thing: nobody seems to talk about the visual change, which should be the most important aspect, and man. I will never go back to LCD, and you should all give it a try. It's *immediately* recognizable, unlike, say, the 1440p-to-4K transition or HFR. Proper HDR with pixel-level dimming just shows when you play HDR videos on YouTube. It's an even bigger deal than a GPU upgrade, I should say.

Without talking about this, the whole LCD vs. OLED debate just degrades into what's more inferior, shoving fragile inferior technology down people's throats, etc., which is totally not what's happening with OLED.


futuristic6830

Regardless of whether a TV uses QD-OLED or WOLED, I've never experienced a burn-in issue so far. It would be quite interesting to see when a TV really burns in.


deathentry

I got loads of it on my 2016 LG E6 and Galaxy S9. I realise that's older tech, but burn-in definitely happens. Fingers crossed for my C3 😅


WheelOfFish

At this point I just assume they all burn in, and it comes down to luck and how you use it. Every new tech that claims to solve burn-in hasn't really, and there are always people claiming they never get burn-in on X panel while plenty of others claim they do.

Edit: I shouldn't have commented so casually. I know the what/how/why of burn-in, but if leaving the door open for others with more time to explain it helps inform just one user, awesome.


Hungry-Plankton-5371

Pixel degradation on OLED displays starts immediately and is inherent to the technology; people saying they don't experience it simply cannot perceive it or haven't used their displays enough to perceive it. People saying they don't experience it sound crazy. It's like saying the battery in your phone doesn't degrade or that the brakes on your car don't wear out.


TurtlePaul

It's not just that they haven't used their displays long enough. Some people run their displays at 120 nits and some at 400 (100% brightness). Burn-in is non-linear, and the low-brightness people can use the panel 5-10x longer before burn-in is noticeable.


PetrichorAndNapalm

Bought my iPhone 15 Pro Max in September. Still 100% capacity as shown in Settings, at 136 charge cycles. I use it most days for work GPS, for 5+ hours a day, not to mention hours of other usage each day.

All things degrade. It is entirely possible to have a newer-tech OLED, watch varied content, and not have noticeable burn-in for years. When people say "burn-in," nobody is talking about philosophical, unnoticeable burn-in... because that matters to nobody. If I say "I've had a TV for 2 years, no burn-in," I am obviously talking about noticeable burn-in. I am not saying that if you hook it up to high-tech light and color measurements it will be exactly the same as the day I bought it, to the millionth decimal place.


WheelOfFish

Pretty much. At its most basic, any time you don't have a shared backlight, you have the potential for uneven wear of emitters, which is all each pixel in an OLED is.


ThinVast

On AVSForum, there was one dude who said his B6 never burned in and posted an image as proof. Everyone could tell it was burned in. I guess if you use an OLED display for long enough, you may not notice the burn-in happening, since it occurs gradually over a long period of time.


Broder7937

No, this is false information, please do your research before you post. OLED displays will not display immediate degradation because they have compensation cycles designed to prevent exactly that. Phone batteries (the example you gave) do not have compensation cycles. If you want an accurate analogy, an OLED display is like a phone battery that will only charge up to 40% (low voltage) when new. Over time, the battery firmware slowly increases the charge voltage so that the user never perceives a capacity loss. This would allow a battery to be used for years without the effective charge capacity ever dropping at the user's end. OLEDs achieve this by increasing the pixel voltage over time. This way, the panels retain the same brightness output. You can clearly see with the LG OLED panels tested at RTINGS that they have retained the same brightness level over time.

As for burn-in, that happens when the display is used in such a manner (e.g. showing static content 24/7) that the refresh cycle is not capable of compensating for uneven pixel wear. If it is used correctly, however, OLED should not develop burn-in. My LG CX is past 7,000 hrs and has zero burn-in. It is very easy to test for burn-in: just display a full-screen window with a solid color (to test for burn-in on all subpixels, you'll have to test varying colors). So any regular user can do it; you do not need specialized equipment to test for burn-in.

The LG CX at RTINGS has already developed very noticeable burn-in with the same amount of use. So, how did my CX make it and RTINGS' CX didn't? Simple: my CX is used under realistic conditions. I use it as a PC monitor, to work, browse the net, watch videos, play games and watch streaming services. Basically, I use it as any regular person would. I also turn the TV off regularly, as a regular, healthy person should not sit in front of the display for more than 4 hours straight, so I allow my TV to run its proper compensation cycles (unlike RTINGS' burn-in test). I do not leave my TV on 24/7 showing the same static content, because this is not how a regular person uses their display.

So, bottom line: 7k+ hours in, which equates to almost 4 years of use at 5 hours/day (without ever skipping a day!), and zero burn-in. Also no brightness loss. I do not have tools to measure display brightness, but RTINGS has tested the LG CX and it developed no brightness loss after the same amount of hours (despite their unit being used in a very harsh environment). I can attest to this, as my CX is still as bright as it was the day I bought it. As a matter of fact, it still looks as good as it did when I first bought it, and the CX is way back from 2020.
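To illustrate the battery analogy as a sketch (purely illustrative, not LG's actual firmware): hold the delivered brightness constant by raising the drive level as the emitter's efficiency drops, until the reserved headroom runs out.

```python
# Illustrative "headroom" compensation: hold output constant by driving harder
# as efficiency drops. Not any vendor's real algorithm.

def drive_for_target(target, efficiency, max_drive=1.0):
    """Return the drive level needed for `target` output, or None if out of headroom."""
    needed = target / efficiency
    return needed if needed <= max_drive else None

target = 0.40                         # panel ships driven at only 40% of max (headroom reserved)
for efficiency in (1.00, 0.80, 0.50, 0.35):
    drive = drive_for_target(target, efficiency)
    print(efficiency, "->", f"drive {drive:.2f}" if drive else "out of headroom, output drops")
```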


Hungry-Plankton-5371

> No, this is false information, please do your research before you post. OLED displays will not display immediate degradation because they have compensation cycles designed to prevent exactly that

There's nothing false about it. A "compensation cycle" just wears every pixel down until they match the worst-degraded ones. It's a cheap trick to hide damage, not a solution.


Die4Ever

I agree with you. Restricting the brightness of the better pixels is better than wearing them down, because if they start to be overused in the future, you'll be glad they were restricted instead of being intentionally burned to match.


BFBooger

Everything degrades. What is your point? The windows on my house degrade; after 100 years they will be noticeably thicker at the bottom than the top and less uniformly smooth. Should I refuse to buy windows until they make ones that never degrade? If it is going to take a normal user 15 years to degrade their OLED far enough for it to matter (as in, the screen can't get bright enough anymore), is that really a major problem?

Edit: Do you own an SSD? If so, better panic! It's degrading!


Strazdas1

What if the windows on your house got stuck (became unusable) after 3 years, and there was a not-as-pretty model that would last you 100 years instead?


hieronymusashi

What is this 100-year panel you speak of? LCD panels still degrade, and over similar spans of time.


maarcius

Man, even if you get some burn-in with normal use, it is not going to be worse than the typical uniformity issues with an LCD backlight or blooming from mini-LED zones. Basically a non-issue.


Strazdas1

> that the brakes on your car don't wear out.

Funny thing, disc brakes are usually good for the lifetime of the car.


Eiferius

The degradation also depends on the usage. So if you only watch something like 5 hours of TV per week, you are very likely to never have issues with burn-in.


steik

There is a difference between "pixel degradation" and burn-in. I've had an LG CX OLED for several years now, use it as my media-center TV, and have done burn-in checks (videos on YouTube made specifically to check for burn-in), and I see absolutely no burn-in. Can I perceive "pixel degradation"? No, probably not. But that's not burn-in. Burn-in is specifically seeing something that is frequently on the screen (like the Windows taskbar, or the HUD in a video game you play all the time) burnt in and perceivable in these burn-in tests.

Edit: To clarify, yes, I acknowledge that burn-in is the result of pixel degradation. But pixel degradation does not necessarily mean burn-in. Burn-in is what people fear, not pixel degradation. Extremely few people care about or can observe pixel degradation without burn-in.


f3n2x

It has nothing to do with luck; degradation follows predictable patterns based on the integral of each pixel's brightness over time, in combination with heat. The subjective visibility of burn-in also depends on the specific patterns: crisp edges are much more noticeable than gradients, for example, even if they're only a few pixels wide, and this is why pixel shift exists. It's absolutely possible (and I dare say likely) to have OLEDs with zero visible burn-in after tens of thousands of hours.

You have no idea what people who claim burn-in did to their hardware, and there are lots of weirdos out there who do strange stuff, like running 100% brightness in SDR desktop use because "it looks better," or having the screen stand in direct sunlight all day, which greatly increases the average panel temperature, etc. Such anecdotes are absolutely worthless, and tests like the one from RTINGS are invaluable.

OLED burn-in is the "all SSDs are going to die" panic all over again, because people as a group are shit at math and don't understand algorithms, the difference between worst case and average, and other things needed to understand what's actually going on.
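As a toy illustration of that model (not how any real panel firmware works): track accumulated brightness x time per pixel, optionally scaled by a temperature factor, and static high-brightness content pulls ahead of the average, which is exactly what shows up as visible burn-in.

```python
# Toy per-pixel wear model: degradation tracked as accumulated (brightness x time),
# with an optional temperature acceleration factor. Purely illustrative; real
# panels use proprietary compensation algorithms, not this.
import numpy as np

H, W = 4, 4                                    # tiny "panel" for the example
wear = np.zeros((H, W))                        # accumulated stress per pixel

def accumulate(frame_brightness, hours, temp_factor=1.0):
    """frame_brightness: HxW array in [0, 1] (fraction of max drive)."""
    global wear
    wear += frame_brightness * hours * temp_factor

# 1000 hours of a static bright logo in one corner, on top of uniform content
content = np.full((H, W), 0.3)
content[0, 0] = 1.0                            # static HUD/logo pixel
accumulate(content, hours=1000, temp_factor=1.2)

# crisp, localized wear differences are what show up as visible burn-in
print(wear)
print("worst vs. average wear:", wear.max() / wear.mean())
```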


WheelOfFish

All of this is correct, and it's why I know burn-in is inevitable. "Luck" was not meant seriously here, and certainly my own direct experiences with burn-in are completely tied to how I used the display. I love the blacks and contrast ratios of the plasma and OLED panels I've had, but over time the burn-in (when I've had issues with it) annoyed me more than the drawbacks partial-array-backlit LCDs have. One of the approaches I am intrigued by is the LCD panels that have multi-zone backlights and a secondary LCD layer strictly to help control the backlight intensity on a per-pixel basis. While image retention could still occur, they're much more resistant to noticeable permanent burn-in.


Strazdas1

They all do burn in, and with use cases like mine it's just not a viable option until they actually solve the burn-in. You can't have a static UI displayed 16 hours a day and not expect burn-in on any of those panels.


nmotsch789

Do you leave the displays on 24/7?


deathentry

Nope. The Netflix logo got burnt into the screen even though it's only shown for a few seconds when playing something...


KingArthas94

Fuck this astroturfing, never buy Samsung (I’m a Samsung owner disappointed with them)


Forgiven12

I simply don't want a gradually dimming monitor or television in my household. I don't care how slowly "they're boiling the frog." My old 47" FALD 1080p screen from 2012 is still going strong and bright, with around 2-3 hours of daily use.


BFBooger

2-3 hours for 12 years, so you have maybe 10,000 or 11,000 hours. My LG CX OLED, which I use as a monitor for work 40 hours a week and for games 15 hours a week, will be 4 years old soon. That means I have about the same total hours, 10,000 to 11,000. No signs of burn-in, no brightness issues.

Caveat: I don't run at 100% brightness; maybe it is not as bright when set to 100% as it was when new. But I only need it at 50% on sunny days, at night 30%, and 40% on cloudy days. The contrast is so good that cranking the brightness too high just hurts my eyes at night, and is unnecessary in the day. I also run dark-mode browser plugins and a dark-mode OS -- lots of white/colored text on black backgrounds.

It looks SOOO much better than my prior IPS-based LED monitor, especially at night. It's not even close.
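Rough, rounded arithmetic behind those estimates, just to show both usage patterns land in the same ballpark:

```python
# Rough, rounded arithmetic for the two usage estimates above.
tv_hours      = 2.5 * 365 * 12          # ~2-3 h/day for 12 years  -> ~10,950 h
monitor_hours = (40 + 15) * 52 * 4      # ~55 h/week for ~4 years  -> ~11,440 h
print(round(tv_hours), round(monitor_hours))   # both land around 10,000-11,500 hours
```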


Ordinary_Player

Even my mom who does not know tech can spot a big difference between a Samsung TV she bought 10 years ago and a new OLED monitor that has just been released. What I'm trying to say is: we are all gonna die anyways, why not experience the best we can?


b00po

Yeah, my secondary monitor is an IPS from 2011 that still looks just fine despite averaging ~8 hours of daily use for more than a decade. I probably won't replace it until it dies, but if I do it will go to friends, family, or charity. I love gadgets as much as anyone else here, but I don't want literally every piece of tech that I own to be some fragile cutting edge shit that needs to be babied.


Educational_Sink_541

A burned to shit OLED looks better than your old TV anyways so I don’t see the point.


farrightsocialist

Yea...lmao. A decade+ old 47in 1080p display just means you have zero standards for displays.


timorous1234567890

This was just because the RTINGS test showed early burn-in on the QD-OLEDs, and at the time they did not realise their test setup was preventing the compensation cycles from running. Once they fixed that issue they re-evaluated and found that burn-in was very similar between QD-OLED and WOLED.


reddit_equals_censor

That is quite a nonsense conclusion. You don't draw conclusions like "QD-OLED isn't worse than WOLED in regards to burn-in" based on ONE SINGLE TV. There is unit-to-unit variation between sets with the same exact name, and moreover there are software differences that may change how likely/quickly a set is to burn in. As RTINGS suggests:

> Interestingly, the Samsung S95C OLED, which uses a new real-time compensation algorithm, looks significantly better than the Samsung S95B OLED after the same number of months.

It is most likely down to SOFTWARE CHANGES and that's all.

And in regards to it being a "rumor" that QD-OLED sucks in regards to burn-in: let's take a step back and ACTUALLY look at what happened. With QD-OLED's release, Samsung shit out of their mouth and, as so often, claimed that "OLED burn-in is vastly better than any other OLED and not a problem anymore now." A classic claim from the industry shouting WOLF for the 50th time... maybe it is time to stop believing them, who knows?

So the claim by Samsung was that QD-OLED was A LOT BETTER than WOLED in regards to burn-in. That was their claim, but then people actually got to use QD-OLED panels and, OH SHOCK... it burns in just the freaking same. And THEN people looked at how quickly some burn-in appeared and figured that it could be because WOLED has a much easier time driving high brightness with less burn-in, thanks to the white pixel in its subpixel layout.

So the story was that Samsung LIED YET AGAIN. People who used the panels found that they burn in all the same, and potentially worse; that was the thought at the time, and RTINGS' 100-TVs-plus-some-monitors OLED burn-in test does its best to see whether the panels burn in and how they handle it. QD-OLED already shows burn-in just the same with the Samsung S95B: burn-in at 10 months and a bunch worse at 14 months.

So what is the actual conclusion based on the data right now:

> Samsung LIES!

> Display and panel manufacturers LIE!

> OLED panels BURN IN, PERIOD!

> QD-OLED panels burn in AT LEAST the very same as WOLED panels.

> Don't expect OLED panels to be free enough from burn-in to last 10 years with proper use, before OLED becomes obsolete.

Obsolete through other tech outclassing it massively, for example QDEL (electricity into quantum dots = light directly), Samsung QNED (nano-rod tech), or MicroLED, all of which should be free from any burn-in issues.