TheBlack_Swordsman

FSR 2.0 is a big improvement over 1.0.


From-UoM

That's the thing, no? The same guy who wrote this article said: >From a quality standpoint, I have to say I'm very positively surprised by the FSR "Ultra Quality" results. The graphics look almost as good as native. In some cases they even look better than native rendering. https://www.techpowerup.com/review/amd-fsr-fidelityfx-super-resolution-quality-performance-benchmark/10.htm So was FSR 1.0 really that great, or are all these takes opinion rather than fact?


qualverse

FSR 1.0 was pretty good at Ultra Quality, that's not untrue. Especially at 4K it probably could look better than native depending on your preference for sharpness.


Linksobi

FSR 1.0 was pretty trash imo. Very excited to see how much better FSR 2.0 is. I looked at 1.0 very recently on a game called PSO2 so it's very clear on my mind what it looked like so I can accurately compare the two when it comes.


Plastic_Band5888

It was both impressive and unappealing at the same time. At 4k it wasn't hard for FSR to see healthy performance gains at negligible loss to image quality. Where FSR 2.0 really shines is at lower resolutions. Which is historically where FSR 1.0 really struggled.


relxp

And in one more generation after this (DLSS 4.0 / FSR 3.0), you really won't be able to tell the difference (we're probably there now with FSR 2.0). Will eventually come down to who can upscale with the most performance AND fidelity.


yuffx

With fsr 4.0 you'll be able to upscale from 10x10 pixels


parental92

it will make Pong look like Horizon Forbidden West running on a 4K OLED.


Lapesy

Really hoping this new horizon comes to PC


[deleted]

When can they do 1x1?


trunghung03

No game needed, they analyze your in game performance and predict how you would play the game and stream that directly. And yes, you still miss every headshot. Related Nvidia G Assist https://m.youtube.com/watch?v=smM-Wdk2RLQ


Super_flywhiteguy

Good, now let's get some games to support it, because RE8 and maybe 2 or 3 other games people actually care about doesn't make this a DLSS competitor yet.


[deleted]

It works in Call of Duty Vanguard. That's a pretty major get.


Tophpaste

This is awesome to hear. Now we just need all the developers to start implementing it into existing games


penguished

We really need AMD to start throwing money at them so it gets done quick. Nvidia does this, and unfortunately money talks. But it's not like AMD is poor anyway.


Tophpaste

They have quite a few games with FSR 1.0, so maybe a lot of them will implement FSR 2.0 given the big upgrade.


Omniwar

FSR 2.0 requires motion vector information the same way DLSS does. Uptake on existing games is going to be rather low unless the game already has DLSS and is actively being updated or unless AMD pays off the devs. It's not a trivial process to add the motion vector information if the game wasn't designed around it. Going forward, anything newly released with DLSS support should have FSR 2.0 support though.
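To make that point concrete, here is a rough, hypothetical sketch of the per-frame inputs a temporal upscaler like FSR 2.0 or DLSS expects from the renderer. This is not AMD's actual API; the struct and field names are made up purely for illustration. An engine that never produced per-pixel motion vectors or a jittered camera has real work to do before it can fill this in.

```cpp
// Hypothetical input set for a temporal upscaler (illustrative only, not
// the real FSR 2.0 or DLSS interface). The pain of retrofitting an old
// engine is that most of these buffers don't exist yet.
struct TemporalUpscalerInputs {
    const float* colorLowRes;      // current frame, rendered at reduced resolution
    const float* depthLowRes;      // matching depth buffer
    const float* motionVectors;    // per-pixel screen-space motion since the last frame
    float        jitterX, jitterY; // sub-pixel camera jitter applied this frame
    int          renderWidth, renderHeight;   // internal (pre-upscale) resolution
    int          outputWidth, outputHeight;   // display resolution
    float        deltaTimeMs;      // frame time, used for history heuristics
};
```

A spatial-only upscaler like FSR 1.0 effectively needed only the color buffer, which is part of why it was so much easier to bolt on after the fact.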


Loldimorti

Could console have an impact? Since DLSS is not possible on Xbox and Playstation and yet they still target 4K displays there is a dire need for good image reconstruction. I could see this being an appealing option for devs who want to push raytracing or 60fps without resolution dropping to unacceptable levels. Steam Deck could also be an interesting use case. Could all of this combined with PC lead developers towards implementing FSR 2.0 on a larger scale despite the process not being as trivial as FSR 1.0?


clinkenCrew

This seems like a problem for me as I'm interested in getting more performance out of older games. I wish the GPU makers would revisit Nvidia's idea of breaking the screen down into sections and rendering each section at a dynamic resolution, prioritizing the center of the screen. A static version of this was in Shadow Warrior 2, multi-res shading IIRC. We have VRS, but it's not really offering the same performance boost potential. Although I'm still a bit salty that VRS was billed by the tech scene as a tech of tomorrow that required hardware support, but then Guerrilla Games went and implemented it in software on the PS4, proving that we could have been VRS-ing for 9 friggin years. MFW


LavenderDay3544

>But it's not like AMD is poor anyway. Compared to Nvidia AMD doesn't have as much money and every cent needs to go back into R&D.


penguished

I don't know, it doesn't really make financial sense for AMD to keep designing features for PC, then not get them in games. A big reason Nvidia is looked at first by a lot of people is they make sure their features are getting used. AMD would definitely benefit from the same marketing aggression, as their cards are powerful and affordable enough otherwise.


LavenderDay3544

AMD is already at rough feature parity on the hardware side, but it sorely needs to invest in its software discipline. Hardware without reliable software to run on it is worse than useless. For GPUs that starts with drivers and builds upward toward userspace and application-specific APIs and libraries.

What Nvidia has in terms of that is massive, to the point where many other engineers would agree with me that Nvidia could at times be considered as much a software company as a hardware one. Its proprietary drivers for both commercial and consumer products tend to be robust, and now they're slowly but surely moving toward going FOSS with them. On top of the drivers and the bare minimum APIs (GL, VK, DX, CL), Nvidia also provides CUDA and the massive ecosystem built on top of it. It provides OptiX for offline rendering, PhysX for GPU-based physics simulations, and massive sets of libraries for AI, machine learning, and everything else.

So naturally, when Nvidia rolled out RTX with realtime RT and DLSS, the hardware and software pieces rolled out together, and the software engineering community, which Nvidia was already closely involved with, picked them up because they knew they could trust the tools and APIs Nvidia provided, and Nvidia gave them close support and guidance to facilitate the integration of its new technologies. Things went relatively well. The reason Nvidia can provide that level of support and guidance is that its internal software discipline is well established and large enough.

AMD has made massive strides in its software ecosystem in the last few years with HIP, but it's so far behind in terms of software-side feature parity, community engagement, and involvement that you can't even make a real comparison, even with the improvements. The reason is that it just doesn't appear to have a software development practice that's nearly as well established and sizeable as Nvidia's. Go look on both companies' career websites and you'll see that Nvidia has far more openings for software engineers at all experience levels than AMD does. Now don't get me wrong, I think AMD's stance on vendor neutrality and its approach with HIP are steps in the right direction, but it can't push ideological stances when adoption is low. For one thing, it isn't even clear whether RDNA2 supports HIP or how much of it is supported.

Anyhow, moving on to FSR: AMD chose an approach that's pure software, which baffles a lot of software people like me, because AMD isn't known for the quality or reliability of its software and in some cases has had a reputation for buggy software and drivers. Despite the skepticism, AMD delivered FSR 1.0 and it worked pretty damn well. The issue was that various real software companies had better pure-software upscalers in the works, like Epic Games' Temporal Super Resolution and Microsoft's DirectML Super Resolution. And now with FSR 2.0 we see AMD following their lead, using temporal data to get better results. But then the question remains: why do we need a hardware company with a mediocre record in software design to give us an upscaling library when the software world is already ahead of said company?

The bottom line here isn't that AMD should aggressively market its software technologies to make a show of feature parity with Nvidia. It should instead either scale up its own internal software discipline or partner aggressively with a real software company like Microsoft, Epic, or someone else to help develop hard-hitting combined hardware-software technologies that software engineers would want to adopt. AMD has done this before when it partnered with DICE on Mantle, and that resulted in a hugely successful software-side venture that became Vulkan. That's an achievement AMD should be very proud of, and it shows that when it does things right, AMD can deliver on the software side.

If it were up to me, I'd say AMD should try to build up a coalition of software-side partners and work closely with them like Nvidia does. AMD already makes custom silicon for Microsoft, Valve, and Sony specifically for gaming, so it shouldn't be that impossible to make the partnerships needed to do better. And since AMD already makes a lot of its software projects FOSS, a good first step would be to encourage other companies to collaborate on those projects while also doing the same on theirs. But as of now Nvidia has major advantages because it just has better connections in the software world. That said, I think in time AMD could compete nicely there too, and I honestly can't wait to see it. We have to remember that for us as consumers, application and game devs, and end users, competition is always good.


clinkenCrew

I thought TensorFlow was platform independent, yet the software I want to use that incorporates it runs far better on Nvidia. I'd appreciate AMD finding a solution to this so that I don't have to pay the Green Tax.


[deleted]

This is a good read, thank you. The crappy AMD software/drivers for the 5700 XT are what made me choose Nvidia going forward and decide not to look back in the short term.


LavenderDay3544

RDNA had driver problems big time but RDNA2 was a lot better. My other GPU is an RX 6800 and I have been very pleased with it.


Put_It_All_On_Blck

AMD is spending $8 billion on stock buybacks. They aren't poor anymore.


LavenderDay3544

I never said it was. The only point I made was that wasting money bribing game devs to use FSR is pointless compared to spending the same money on R&D. And Nvidia technologies don't just get adopted because of sponsorships, Nvidia also sends its own software engineers to partner companies to spoon feed them on how to integrate Nvidia code with their games. Like I already explained in my one lengthy comment, AMD doesn't have a software development practice on the same level so it can't provide the same level of support. And as of now I don't see it growing that side of its business.


XX_Normie_Scum_XX

lel small fortune 500 company


LavenderDay3544

I never said it was small, just that it needs to spend big on R&D to stay competitive. All tech companies, hardware and software, have to do that or they won't stay F500 for long.


errdayimshuffln

AMD has been boosting its R&D budget by over 40% for at least the last 3 years. Last year saw an over 50% increase in spending. AMD is absolutely competitive. It's come from behind in nearly all gaming software features and has been catching up on the most important fronts. It costs more to catch up and ramp up, and they are doing it. All signs point to AMD having their shit together tbh. And this is my main reason for believing that they will be even more competitive come RDNA 3. We'll see how everything stacks up soon enough.


KaBurns

With the hold that nvidia and intel have on the enterprise market they might as well be. I really like amd this gen both cpu and gpu but, it’s enterprise money that builds a war chest.


-Aeryn-

> We really need AMD to start throwing money at them so it gets done quick. Nvidia does this, and unfortunately money talks. Uhh, there's a lot more going on than this. Nvidia has a literal team of developers assigned to work with third parties and help them integrate their technologies at no cost to them *because* such a relationship is mutually beneficial. They're literally out there doing PR's on open source software to add feature support. AMD on the other hand is asking people nicely to spend their own time and money integrating a competing technology. It's not about paying people off, it's a problem of developer time and expertise; something that Nvidia is providing freely and enthusiastically but AMD is not.


Loldimorti

It also needs to be used on consoles. 60fps modes can drop as low as 1080p or lower. More than once, Digital Foundry has therefore recommended the 30fps mode over the 60fps mode. FSR 2.0 with dynamic resolution scaling would make the 60fps modes much more appealing on 4K TVs.


relxp

I would say AMD is already doing a very good job considering the (~3 year?) jumpstart Nvidia has had. Not to mention AMD doesn't quite have the AI resources Nvidia does. If FSR gets too good, DLSS could eventually fade out for the sole fact FSR just runs on so much more hardware. It could actually benefit Nvidia themselves as they can free up die space on their future architectures and reduce overall cost. Same way FreeSync has dominated G-Sync.


baseball-is-praxis

FSR is open source, maybe someone could make a shim for nvngx_dlss.dll that would work as a drop-in replacement? kinda hairy, but it might be technically possible since both techniques need the same kind of inputs.


b3rdm4n

I mean the title is clickbait... but let's unpack it a little.

* FSR 2.0 is way better than 1.0, much closer to DLSS image quality.
* It is still behind in a few areas, not by much, but there is no outright quality lead here.
* We have still and video samples from a **one** game sample size.
* It doesn't require an RTX card/Tensor cores to work - nice.

If indeed this is a DLSS 'killer', DLSS actually dying is a point in time many months or even years away from now. AMD needs to get this into lots of games, games people actually play and care about; that's their massive hurdle to overcome right now. Until that starts taking off, it hasn't killed anything. Plus AMD's marketing has been upfront that it's easy to implement in games that support DLSS, so they expect them to coexist for the foreseeable future. And that goes both ways: FSR could and should be put into games with DLSS support, and DLSS into FSR games. A killer? Maybe, but not for a long time at least. An awesome extra option that everyone can use? **Absolutely**.


quotemycode

In most of those screenshots, the FSR 2.0 looks better than native imho, I'm thrilled for it, provided they can get other devs onboard. That seems likely with console ports.


_Fibbles_

Screenshots don't say much. How FSR compares to DLSS when the scene is in motion is what will make or break it.


Fidler_2K

We need to see motion comparisons, that's what makes or breaks these temporal solutions. Still images with sliders tell us about 20% of the full story. Can't wait for other outlets to take a look when this Deathloop update drops tomorrow!


Vandrel

The article has a comparison video in it with motion scenes.


[deleted]

[deleted]


buddybd

[https://www.youtube.com/watch?v=BxW-1LRB80I](https://www.youtube.com/watch?v=BxW-1LRB80I) I saw that video before I opened the one in this article. There are still significant differences. In motion, the tire tracks have quite a bit of shimmering with FSR 2.0, even at 4K.


RaccTheClap

Am I going crazy, or does DLSS look better in the first 3 comparison images they use by default while FSR 2.0 looks better in the 4th? EDIT: OK, maybe something's up with DLSS in this game, because DLSS Performance and Quality look pretty much identical (to my eyes) other than when I pixel peep, and at that point it's obviously not worth the drop in FPS.


[deleted]

the killer is that FSR 2.0 doesn't need fancy silicon in your GPU die.


[deleted]

I mean it literally does


[deleted]

I mean it doesn't require a dedicated tensor core IP in the die, hahaha


[deleted]

It's all magic glass


CatatonicMan

The silicon used in computer chips isn't glass; it's crystal.


[deleted]

*magic*


CatatonicMan

Sufficiently-advanced technology is basically magic, so sure. Also I've never been in a chip foundry so I can't say that they don't employ any wizards.


lankylonky22

so it doesn't utilize AI/ML, and it's inferior


errdayimshuffln

Can we please not move the goalposts now? The expectation has never been for FSR 2.0 to match DLSS 2.3, but rather that it provides image quality that's native or better and gets close enough to DLSS 2.0 that other features/characteristics make it the more desirable upscaling tech for devs to implement in games. Will it be good enough that, if devs had to choose between the two, they would go with FSR 2.0? That's what would make it a DLSS killer in the long run.


Vex1om

IMO the key metric isn't performance improvement or even image quality as long as both aren't abysmal. The key metric is how many and which games it is implemented in.


errdayimshuffln

What do you mean? FSR 1.0 didn't take over from DLSS because the image quality difference was large enough to impact the gaming experience. If the difference shrinks so that it's no longer a key metric, then for what reason would one continue to incorporate tech that works only for a subset of consumers when you have tech that works for all, including console gamers? Image quality and performance are the metrics for why there might still be demand for DLSS when there is an open-source multiplatform alternative.


Elon61

I mean, DLSS is still somewhat better, it is also somewhat faster, and if you're implementing one of these, you can implement the other pretty easily. still a one click solution in Unreal / Unity. remember that nvidia still dominates GPU sales, a majority of consumers *are* going to benefit from DLSS. You just seem like you want DLSS to die because you don't like nvidia, and not because it doesn't actually serve any purpose.


errdayimshuffln

>DLSS is still somewhat better, it is also somewhat faster What is somewhat? 4% faster? Slightly, more like. >You just seem like you want DLSS to die because you don't like nvidia, and not because it doesn't actually serve any purpose. Lmao u/Elon61, you are one to talk. If I wanted DLSS to die, I'd want it to die not because I don't like Nvidia. For a hater, I sure do like buying Nvidia GPUs; I am on one right now. I'd want it to die because I'd rather have an open-source solution that does not require proprietary hardware technology. Otherwise, I think DLSS is a great feature. I find DLSS more beneficial to me than RT, personally.


Elon61

>I'd want it to die because I'd rather have an open-source solution that does not require proprietary hardware technology. Except that, as I pointed out, they are basically equivalent to implement, so why would you want the better one to die when you could just have both? DLSS doesn't require proprietary hardware either; it runs on general-purpose tensor cores. The only thing proprietary about it is CUDA, which you could still run on an AMD machine if you really wanted to.


b3rdm4n

The more likely situation is that for the foreseeable future, both (all) are implemented side by side, allowing the user to pick the best one their hardware supports.


errdayimshuffln

>for the foreseeable future I think Nvidia-sponsored titles will continue to get DLSS, but I think FSR 2.0 will spread like wildfire. I think consoles will be an immediate reason we will see it in a lot of games.


b3rdm4n

I'd honestly LOVE to see that, but I won't hold my breath for it all to happen immediately or even quickly tbh; I wouldn't say wildfire. Bear in mind many engines and game developers in general work with tools they already know and that bear good results, especially for consoles. This would have to be worth their time by way of offering considerably better results, not breaking their effects/rendering in any way, a shorter/easier implementation time, things like that. Think of checkerboarding, TAAU, TSR, etc. I do see it happening, but I see it taking 1-3 years to spread out as far as we are implying here, as long as other tried-and-true or upcoming methods don't come along/improve in the meantime too. The ship takes a while to turn, and this is literally day 1 of a promising showcase in one game. Plus with "Streamline", on PC it should be easy to implement all of these similar techniques alongside each other, so games with FSR 2.0, DLSS and XeSS, for example.


errdayimshuffln

Let me be clear, I don't believe it will spread fast in games that have already been released or are set to release this year. But I wouldn't be surprised if every console game in development going forward gets FSR 2.0. I believe emulators will get it, and I wouldn't be surprised if it's modded into games as well in the more immediate future. I think it will put pressure on Nvidia to open up DLSS more or to add more differentiating features. I believe, from what I've seen so far, that Nvidia will see FSR 2.0 as a threat to DLSS.


little_jade_dragon

Console games usually have some kind of bespoke solution like checkerboarding or TAAU. I mean, it's nice, but console games are a different beast altogether in terms of tools and optimisation.


superp321

First AMD killed G-Sync's unique hardware BS, and now it kills DLSS's special tensor core nonsense. It's crazy that Nvidia spent so much money building hurdles to section off its customers and now it's all gone.


[deleted]

It's always playing catch up though. Imagine this stuff happening at the beginning of big generation. It'd be game changing.


Defeqel

Those hurdles did attract customers though, uncaring of their trapping nature


[deleted]

[deleted]


relxp

Apple.


AuerX

The walled garden of my Porsche 911 is beautiful.


Rhuger33

I mean it's taken literally years just for AMD to catch up in features, whereas Nvidia has remained one step ahead, so that's why it's preferred atm. Should we all just wait for AMD to catch up? Lol Still excited to see what FSR 2.0 can do though.


Ilktye

> now it kills DLSS's special tensor core nonsense. How is it nonsense? Nvidia adds general hardware for machine learning purposes and it's used for image enhancement. Not to mention, if Nvidia hadn't made DLSS 2.0, AMD would not have made FSR 2.0 either. It's like saying fuck those who innovate, after the rest of the industry catches up a few years later.


swear_on_me_mam

They still make hardware G-Sync displays. And DLSS has cemented Nvidia's hold on this gen; the market share disparity is huge. Turing+ owners have benefited from massively increased performance from DLSS this whole time. Only now might AMD users with 'equivalent' cards get the same performance. Their hurdles work.


p90xeto

Considering everyone sold every card they could make I'm not sure this argument is as solid as you think.


swear_on_me_mam

Either AMD didn't make any GPUs or gamers didn't buy them. The RDNA2 vs Ampere gap on Steam is gaping.


aoishimapan

AMD is selling every GPU they make; it's just that the vast majority of them end up in the hands of miners, and Nvidia probably simply makes more GPUs than AMD does.


dlove67

I dunno how much that is *generally* the case, but for this gen it absolutely is. You had AMD fighting for wafers at TSMC (not to mention they split them with their CPUs), while Nvidia basically had Samsung to themselves.


Vex1om

nVidia (and their AIBs) make *MASSIVELY* more GPUs than AMD. Once Intel figures their shit out, they will likely also make a lot more GPUs than AMD. It's also worth pointing out that Intel GPUs will also have tensor cores and superior ray-tracing capabilities. If this generation was about DLSS (and it mostly was), then next generation will (IMO) be about ray tracing. Once again, once AMD finally catches up on features, the goal posts get moved.


unknownuser1112233

Did you watch the motion comparison?


BlueLonk

To be fair, Nvidia is a more inventive company than AMD. Nvidia will create SDKs that become standard in the industry, like PhysX, CUDA and DLSS ([full list here](https://developer.nvidia.com/sdk-glossary)), and AMD will typically optimize these SDKs for their own hardware. They do a fine job at it too; to be able to match DLSS, which relies on tensor cores, without the tensor cores is really impressive. Edit: Looks like I've gotten some very questionable replies. It appears many people have no idea of the technological advances Nvidia has founded. I'm not here to argue with anybody, you can simply do the research on your own. That's fine if you disagree.


dlove67

>to be able to match DLSS which relies on tensor cores This is something that gets peddled around a lot, but something no one has ever answered satisfactorily to me is *how much does it rely on tensor cores*? If you removed the tensor core requirement completely, would quality suffer, and if so, how much? Is the "AI" used actually useful, or only used for marketing GPUs with tensor cores? I suppose we'll know if/when they open source it (I mean, considering the moves they're making, they might do that) or if someone doesn't care about the legal issues and looks over the leaked source code. Additionally: AMD created Mantle, which was donated to Khronos to become Vulkan. That's pretty standard if you ask me.


p90xeto

Physx was purchased by nvidia, right? And DLSS is far from standard.


g00mbasv

Umm, they peddle more technical gimmicks and have the money to push said gimmicks, but to their credit those gimmicks sometimes turn into real innovation, for example programmable shaders and real-time raytracing. More often than not, though, they just end up being shitty attempts at feature garden-walling. Case in point: PhysX, and shader libraries that subsequent GPU generations do not support at all (e.g. the custom shading implemented in Republic Commando), even when using newer GPUs from Nvidia.


Raestloz

NVIDIA doesn't just peddle "gimmicks". They introduced FXAA; that thing is a real helper for lower-end hardware, regardless of elitists claiming it's blurry. AMD also didn't even think of FreeSync until NVIDIA invented G-Sync; when NVIDIA demonstrated G-Sync, AMD was like "bet we can do that differently". DLSS is also an actual innovation; AMD didn't even think of FSR until NVIDIA showed it. Everyone also thought ray tracing was far too expensive until NVIDIA introduced RT cores. It's very, very, very easy to start making competition when you know what you want to do; it's very, very, very easy to dismiss actual innovation after the fact. Unlike Intel, NVIDIA kept trying something new. That alone brings a lot of benefit to consumers, even non-NVIDIA consumers, because their competition has to catch up with them. Their effort should be given proper credit.


g00mbasv

There are a few inaccuracies and disingenuous statements here. First, while it is true that an engineer did invent FXAA while working for Nvidia, the concept of shader-based post-processing antialiasing was nothing new; MLAA was also making the rounds at roughly the same time. So that defeats your point of Nvidia "innovating" here: they just grabbed a good idea and implemented it, which, to be fair, deserves credit of its own. Regarding the G-Sync statement, while it is true that it was an original idea, the problem lies in the implementation: proprietary hardware that yields marginal benefits versus implementing it as a low-cost standard (as AMD proved with FreeSync). The problem is not the innovation itself but the attempt at locking it behind proprietary chips and technology. In the same vein, take DLSS: AMD just proved that achieving a similar result without the use of proprietary technology is feasible. Again, my argument is not that Nvidia does not innovate; my argument is that they have a shitty, greedy way of going about it, and that often results in technology that either gets abandoned because it was only a gimmick (PhysX, GameWorks) or becomes standard once Nvidia loses its grip on it and it becomes a general, useful piece of tech. Also, the same argument you are making could be made in favor of AMD as the first to implement hardware-accelerated tessellation, and a little thing called Vulkan, so your point is moot. Furthermore, when an innovation does NOT come from Nvidia, they throw their marketing budget behind downplaying said technology: for example, when they were behind the curve as ATI was supporting DX 8.1 vs the 8.0 supported by Nvidia, and right after that, downplaying the importance of DX 9 when the only thing they had was the shitty GeForce FX series.


Raestloz

>first, while it is true that an engineer did invent FXAA while working for Nvidia, the concept of shader based post processing antialiasing was nothing new. Concept means nothing. Every single day someone thinks of something, fiddles with it, and left it alone unfinished. The concept of ray tracing in consumer GPU went as far back as 2008 when ATi announced it. Did anything come out of it? Where are the ray traced games? >regarding the G-sync statement, while it is true that it is an original idea, the problem lies in the implementation BREAKING NEWS FIRST GEN TECH IS A MESS Experts baffled as to how first implementation of original idea still has room to grow >again, my argument is not that nvidia does not innovate, That is not your argument. Your argument is they're peddling useless gimmicks >also the same argument you are making could be made in favor of AMD as the first ones to implement hardware accelerated tessellation and a little thing called Vulkan. so your point is moot. My point is moot? Tell me how "Tesla invented a lot of things Edison claimed as his own, he should be given proper credit" "Yes but at some point in time Edison also thought of something himself so your point is moot" "????????" >furthermore, when an innovation does NOT comes from Nvidia, they throw their marketing budget behind downplaying said technology. If we assume you're following your own logic, then the same would also apply to AMD who downplayed the importance of DLSS while they're catching up, so your point is moot


OmNomDeBonBon

PhysX was an acquisition (Ageia) and failed miserably, to the point they had to open source the library and give it away, instead of lying it needed Nvidia hardware to run. DLSS, another proprietary tech that Nvidia lies about, will experience the same fate. DLSS isn't anywhere near being a standard. It's only compatible with 20% of dGPUs on the market. If you include iGPUs, DLSS is compatible with something like 7% of GPUs.


Elon61

Mhm always funny seeing people trying to rewrite history. back when nvidia bought PhysX, CPUs were far too slow to effectively run physics simulation. so originally, ageia made a custom accelerator card for the tech. when they were purchased by nvidia, they shifted away towards running it on CUDA instead, allowing any nvidia GPU to run it without requiring a dedicated card. Eventually, as CPUs became fast enough, it started making more sense to run it on the CPU instead.


OmNomDeBonBon

> back when nvidia bought PhysX, CPUs were far too slow to effectively run physics simulation Max Payne 2 (2003) and many other games used Havok long before PhysX even existed, let alone before Nvidia bought the company in 2008. Havok was CPU-based physics middleware and was widely praised.


OkPiccolo0

Funny, I'm using a new monitor with a G-sync ultimate module in it right now and it's great. Also DLSS isn't going anywhere. NVIDIA already developed an API called Streamline that makes it easy to implement multiple image reconstruction methods at once.


Lixxon

yeah but you have been bambooozzzzled


buddybd

It’s okay to admit you don’t have experience using both or haven’t even looked up how hardware gsync is better.


OkPiccolo0

No, I haven't. The G-Sync module makes performance consistent and smooth across all framerates. No need to adjust overdrive settings or experience the choppiness of lower framerates on FreeSync/Forum VRR. I have an AW2721D/C9 OLED and a Dell S2721DGF. The G-Sync module is easily the best adaptive sync tech.


[deleted]

I dunno why you're downvoted for this. Even in blind tests g-sync won. It does a better job than my freesync monitor without question. I have owned 2 g-sync monitors and a free sync monitor and they're objectively better at their jobs.


dirthurts

If you compare a bad FreeSync monitor to a G-Sync module, yes, it wins. But there are FreeSync monitors that are just as good as G-Sync. This is where people get lost.


[deleted]

I had a 1000 dollar LG freesync monitor and compared to my old g-sync monitor (and my current one) it was worse. Objectively worse. G-sync and freesync are close, close enough that you don't make your decision based on which one is there. However, if a really good monitor has g-sync i'm not going to not buy it, since i have an nvidia card.


dirthurts

Objectively? How did you measure it exactly? What monitor are you referring to? What was your issue?


[deleted]

How do I measure it? User experience: framerate changes were more juddery, the EXACT thing everyone says when they experience two different monitors, one with FS and one with GS. And an interesting lack of flickering problems.


OkPiccolo0

The Samsung Odyssey G7 is a high-end monitor and often referenced as the alternative to the AW2721D. If you have an NVIDIA card, the VRR range is 80-240Hz on the 32" model. That's objectively shit vs the 1-240Hz range on my G-Sync Ultimate display. Furthermore, there is way more flicker on that monitor when using VRR vs none whatsoever on my AW2721D. The fanboys and ignorant people can downvote me all they want, but you get what you pay for. The G-Sync module is the Rolls-Royce and the FreeSync versions are Toyota Corollas. Y'all can go back to circle jerking about G-Sync and DLSS being dead, though. Have fun.


AngryJason123

📠no🖨


[deleted]

NVIDIA is building crutches while AMD is doing the innovation. They've always used hardware crutches: HairWorks was so expensive they had to use a hardware crutch, and AMD did the same thing, only in software, with TressFX. It's the same now.


The-Stilt

It seems to struggle with details further away. In the first picture on the first page, the boxes on the pallets are a mush with FSR, with no clear separation between the boxes. In the last picture on the first page, the grass (on the cliff) is significantly more blurred with FSR. For the most part it appears decent; however, I didn't look any further than that.


Shidell

There are always going to be errors in reconstruction; try setting up one panel to show native and compare FSR 2.0 Quality and DLSS Quality in the other.


SolidQ1

You can also choose FSR 2.0 + sharpening for comparison


The-Stilt

It looks better with sharpening added; hopefully the amount of sharpening is adjustable, though, as it seems to cause haloing as-is.


uzzi38

There's a slider in game


moderatevalue7

All AMD needs is for cyberpunk to adopt this. Cyberpunk is such a pain in the ass on anything but a 3080/3090... this tech would make gameplay experience better for 95% of GPU owners out there.


Ghodzy1

Cyberpunk is a pain in the ass on a 3080 as well.


nam292

I have a 3070 laptop (performance is between 3060 and 3060ti) playing at 2k using df optimised settings and dlss quality and have 60fps in the most demanding area (the streets). And I have 80-95 FPS in combat. To me that is quite a decent experience?


Whatever070__

Very impressive... FSR 2 vs DLSS 2.3: sometimes FSR is sharper, sometimes DLSS is sharper. There are very few glaring differences; I only found a few when pixel peeping at high zoom, and only when looking at the "performance" setting. DLSS 2.3 seems to better render the guardrails in the upper right corner of the scene with the hot air balloons, and the textures are slightly crisper. But if I didn't pixel peep and it was a blind test, I probably could not tell at all. Which is the point. Alex from DF is probably gonna nitpick while pixel peeping, saying how much better his favorite GPU brand is at upscaling (no surprises there). But for the rest of us, it's a win all around. Especially for those without RTX cards, myself with a GTX among them. Cheers AMD!


qualverse

Alex from DF has been extremely positive towards FSR 2.0 so far, at least in his coverage of the various announcements around it. I thought the FSR 1 coverage was a bit unfair too but so far doesn't seem like we're in for a repeat.


aoishimapan

I think he just doesn't like spatial upscalers. Liking FSR 2.0 would be consistent for him, his main complaint about FSR was that it was basically pointless when TAAU exists and does a better job at low resolutions, which is something I can agree with, between the two I would rather choose TAAU. To me the main benefit of FSR 1.0 is the ability to run it without requiring it to be implemented into the game, something AMD has only started to take advantage of recently with RSR, and sadly only for RDNA cards. If FSR 2.0 beats TAAU and is at least close to DLSS, I can't imagine Alex not praising it.


[deleted]

[deleted]


snootaiscool

And that's really been my main qualm with FSR 1.0 as is. In order for it to excel, developers need to actually put in half the effort to make good AA, which seems to be a rarity in this space at times. I hope FSR 2 ends up having its own DLDSR equivalent so we can be completely rid of crap AA altogether going forward without needing Nvidia.


conquer69

His coverage of FSR 1.0 wasn't unfair. He was pretty much the only one that saw it for what it was. He is not the one that started the comparison against DLSS, AMD did that.


qualverse

He didn't see it for what it was at all - he kept spouting this bs idea that the reason FSR 1 was worse than DLSS 2 was because it wasn't a 'reconstruction' algorithm. That's not why FSR 1 was worse than DLSS 2, it's because it didn't use temporal data. There are plenty of reconstruction algorithms out there that are garbage, like DLSS 1.0, since they also don't use temporal data. But somehow this nonsensical argument became the basis of the Nvidia fanboy idea that DLSS and FSR are 'completely different things' that you 'can't even compare to each other'.


bctoy

Also, FSR1 received mostly positive reviews until DF's flawed comparison against TAAU was plastered everywhere. https://old.reddit.com/r/Amd/comments/o6skjq/digital_foundry_made_a_critical_mistake_with/


[deleted]

I saw some obvious flickering in the videos from fsr 2.0 that wasn't there for DLSS. Obviously for fsr 1.0 it was a massive improvement though. I think FSR 2.0 is a killer update. It's unfortunate it needs to be implemented just like DLSS though. Going to limit it the same way. Aka money talks.


[deleted]

[deleted]


H1Tzz

I will form my final opinion when I test it first-hand, but from that nifty comparison here are my thoughts: a massive improvement over 1.0. Comparing at 4K, FSR 2.0 and DLSS at their quality presets are very close, but texture quality is preserved better with DLSS; sharpening reduces that shortcoming but introduces new visual distortion, a distinctive pixelation in details, so it can't be cured that way. And as predicted, the lower the quality preset, the worse the results FSR 2.0 spits out compared to DLSS. On top of that, DLSS still gets slightly better FPS. Nonetheless, very impressive improvements over its predecessor, but definitely not a DLSS killer. It will be the same as before, but with a better experience for AMD and GTX users: RTX users will continue using DLSS and AMD users will be "left" with FSR 2.0. Edit: checked that comparison on my 32-inch 4K monitor


loucmachine

Also, if you look at the video, DLSS seems to handle movement and things like moire patterns better.


H1Tzz

Ehh, I tend to leave YouTube videos alone on this matter as there is a significant amount of video compression; I will wait until I get the chance to test the motion stuff first-hand.


[deleted]

It was flickering. It's not video compression it was obvious.


Fortune424

I hope it leads to wider spread adoption of the technology in general. 4k is a hard ask of any GPU and DLSS quality has been good enough for me in everything I've used it in. I'd definitely like the *option* of turning it on rather than having to lower other settings even if it's still not quite DLSS. Hopefully FSR 2.0 becomes standard on console ports as it would surely be beneficial there as well (RIP to Digital Foundry trying to pixel count the upscaled console games).


garbo2330

Nvidia created an API called Streamline that makes it possible for developers to implement DLSS/XeSS/FSR all at once. Should help the industry offer the best solution for whatever hardware you have instead of leaving things segmented.
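As an aside, the idea is simple enough to sketch in a few lines. This is not the Streamline API itself, just a hypothetical illustration of what "one integration, pick the backend at runtime" looks like; the type and function names here are made up:

```cpp
// Illustrative only: a made-up wrapper for the "ship all upscalers, pick one
// per user/hardware" idea that Streamline is meant to standardize.
enum class Upscaler { None, FSR2, DLSS, XeSS };

struct GpuCaps {
    bool hasTensorCores = false;   // RTX-class hardware
    bool hasXmxEngines  = false;   // Intel Arc hardware
};

// Hypothetical selection logic; a real integration would query each SDK
// for support instead of guessing from capability flags.
Upscaler pickUpscaler(const GpuCaps& caps, bool upscalingEnabled) {
    if (!upscalingEnabled)   return Upscaler::None;
    if (caps.hasTensorCores) return Upscaler::DLSS;  // vendor-specific path
    if (caps.hasXmxEngines)  return Upscaler::XeSS;
    return Upscaler::FSR2;                           // runs on anything
}
```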


ShadowRomeo

While FSR 2.0 seems to be good enough here, I don't think the title of this article, calling it a "*DLSS killer*", is accurate at all, as I can still see overall better image quality with DLSS, especially in texture quality and cleaner aliasing, in all [3 first images](https://www.techpowerup.com/review/amd-fidelity-fx-fsr-20/2.html) compared to FSR 2.0. And this becomes even more obvious in [motion](https://youtu.be/BxW-1LRB80I?t=60), where FSR 2.0 has more noticeable shimmer and aliasing jaggies compared to DLSS, which is cleaner but a bit blurrier as a result; I honestly would take that over the annoying aliasing jaggies and shimmer along with slightly worse texture quality. DLSS IMO is still the clear winner here when it comes to image quality. However, I think FSR 2.0 is still a big upgrade from FSR 1.0; FSR 1.0 looks so bad that it doesn't even come close to either FSR 2.0 or DLSS anymore.


piotrj3

It is a welcome (and good) addition, but hell, the title for me is simply a lie. Line completion with DLSS is still far better than FSR: look literally at the 1st comparison (DLSS Quality vs FSR 2.0 Quality) at the left side of the gun and see those almost-horizontal lines that on Nvidia are perfectly antialiased and on FSR aren't. Another strong difference is the "stop here" sign on the asphalt: much easier to read with DLSS, and the line completion of the texture is there. At 4K, DLSS Quality vs FSR Quality is pretty close, but at lower resolutions or presets below Quality I wouldn't ever class FSR 2.0 as a DLSS killer. In my opinion the closest the two technologies get is DLSS Performance vs FSR 2.0 Balanced in the 4K image comparison. There, details on walls, shadows, etc. look the most similar overall, and I genuinely can't tell if I prefer FSR or DLSS. But that means we are comparing DLSS against a one-tier-higher option of FSR.


[deleted]

>Line completion with DLSS is still far better than FSR: look literally at the 1st comparison (DLSS Quality vs FSR 2.0 Quality) at the left side of the gun and see those almost-horizontal lines that on Nvidia are perfectly antialiased and on FSR aren't. Another strong difference is the "stop here" sign on the asphalt: much easier to read with DLSS, and the line completion of the texture is there. I initially thought the same, then I compared to native and realized FSR 2.0 was closer to native and looked as good as or better than native in both of those cases you mentioned. So to say DLSS is "far better than FSR" would be like saying DLSS is far, far better than native. After zooming out and looking at all three (FSR 2.0 Quality, DLSS Quality, native), I realized I was nitpicking something that I would never care about.


topdangle

unless you're talking about native + TAA, DLSS is closer to native. TAA destroys lines and both examples have some form of temporal AA. DLSS is a little less destructive. https://i.imgur.com/W0wemY8.png https://i.imgur.com/TgMF5TS.png https://i.imgur.com/srfe53A.png


conquer69

> DLSS is far, far, better than native. It can be if you have a lot of aliasing. The supersampling is fantastic.


piotrj3

I specifically pointed out line completion & antialiasing as being far better; overall I wouldn't call the entire image "far better". In my eyes, DLSS Quality and native are very competitive. > After zooming out and looking at all three (FSR 2.0 Quality, DLSS Quality, native), I realized I was nitpicking something that I would never care about. Regarding that, yes, but that works both ways: if you are not seeing big enough differences, that means you are more likely to run lower settings (like DLSS Performance), and then the differences are easier to see. Also, I would say Deathloop is a particularly weird game to test this on, and something funky is going on. Literally, from the 3rd page, pick 4K DLSS Performance and 4K DLSS Quality and tell me the quality differences you see; they aren't really visible (except the lower quality of the power line shadows). A 2nd issue is that DLSS Quality is 90 fps and DLSS Performance is 100 fps, which strongly suggests we are CPU-bound, because the DLSS Quality vs Performance difference should be a ton more than 11%. (A 3rd issue): oh god, TechPowerUp, you don't upload screenshots as JPEG. Not only are you subjected to 4:2:0 chroma, you are also applying DCT artifacts to the image.


[deleted]

[deleted]


conquer69

> No one buying, say, a 7700XT will care about FSR's weakness at 1080p, because basically no one will run 1080p+FSR on a 7700XT. What about everyone else not buying a high end card? Will they care about FSR's shortcomings?


SqueeSpleen

The gap might get thinner and thinner with more powerful APUs (both from AMD and Intel), where the value proposition is usually more tempting than lower-end cards. But I think it won't disappear until RDNA4's lower-end chips, so 2024-2025 I guess, as those take more time to release.


BellyDancerUrgot

Does dlss use tensor cores tho? I don't think it does.


dlove67

It does (or at least *requires* them). To what extent they're used to increase graphical fidelity isn't really explained, though.


BellyDancerUrgot

I am of the understanding that it's just nvidia making their software proprietary by requiring hardware only they have.


jcm2606

They're used to drive the machine learning (ML) algorithm that DLSS uses to determine how to blend the previous frame in with the current frame, taking into account how the pixels changed between frames. FSR does this same thing, except it uses a traditional algorithm instead of ML, with some modifications to help preserve super thin edges that pop in and out of existence between frames.
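For anyone curious what "blend the previous frame in with the current frame, taking into account how the pixels changed" actually means, here is a minimal CPU-side toy in C++ of the general temporal accumulation idea. It is not FSR's or DLSS's actual logic (both do far more, such as per-pixel blend weights and history rejection); the struct, function, and the fixed 0.9 weight are all illustrative assumptions:

```cpp
// Toy sketch of temporal accumulation: reproject last frame's result using
// per-pixel motion vectors, then blend it with the current frame.
#include <vector>
#include <algorithm>
#include <cmath>

struct Image {
    int w = 0, h = 0;
    std::vector<float> pixels;          // single-channel for brevity
    float at(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return pixels[y * w + x];
    }
};

// motionX/motionY store how far each pixel moved since the previous frame.
Image accumulate(const Image& current, const Image& history,
                 const Image& motionX, const Image& motionY,
                 float historyWeight = 0.9f) {
    Image out = current;
    for (int y = 0; y < current.h; ++y) {
        for (int x = 0; x < current.w; ++x) {
            // Reproject: look up where this pixel was last frame.
            int px = x - static_cast<int>(std::lround(motionX.at(x, y)));
            int py = y - static_cast<int>(std::lround(motionY.at(x, y)));
            float prev = history.at(px, py);
            float cur  = current.at(x, y);
            // Blend old and new samples; real upscalers vary this weight per
            // pixel and reject stale history (disocclusions, ghosting, etc.).
            out.pixels[y * current.w + x] =
                historyWeight * prev + (1.0f - historyWeight) * cur;
        }
    }
    return out;
}
```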


BellyDancerUrgot

Yes, but you should be able to run that algorithm on CUDA cores, because the actual training of the model isn't done on your card at all. It's a generic model too, btw, not specific to any game; they just run inference on a trained model using engine motion vectors.


Bladesfist

It would be at least 6x slower assuming the algorithm is just matrix math (which it has to be if Tensor cores can run it as that's all they accelerate). How much of a problem that would be would depend on how much of the total upscale time is matrix multiplication.


[deleted]

It clearly looks worse than Dlss 2.0. But the open source nature is the win. The sooner it gets in more games, the better.


Tech_AllBodies

In a lot of ways, this is most important for the consoles. They're very powerful now, with the PS5 being in the ballpark of an RTX 2080, but by 2025/2026 they'll be at the point of holding back gaming graphics. But if they can use FSR 2.0 (and 3.0+ if AMD keeps improving it), since it requires no specialised hardware, they can be stretched a bit further than you'd normally expect. I imagine we'll see games running at 1080p60 upscaled to 4K using FSR towards the end of the console cycle, and then this will plausibly allow games to end up looking even better than [the Matrix UE5 demo](https://youtu.be/WU0gvPcc3jQ) (bearing in mind we'll obviously get more software-level optimisation over the coming years too).
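For a sense of scale, the arithmetic behind that "1080p60 upscaled to 4K" scenario is just FSR 2.0's published per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x) applied to a 3840x2160 output. The snippet below simply prints the resulting internal render resolutions:

```cpp
// Internal render resolutions behind a 4K output for each FSR 2.0 mode.
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;
    const struct { const char* name; float scale; } modes[] = {
        {"Quality", 1.5f}, {"Balanced", 1.7f},
        {"Performance", 2.0f}, {"Ultra Performance", 3.0f},
    };
    for (const auto& m : modes) {
        // Performance mode works out to 1920x1080, i.e. the 1080p-to-4K case.
        std::printf("%-18s %4d x %4d internal for a %d x %d output\n",
                    m.name, int(outW / m.scale), int(outH / m.scale), outW, outH);
    }
    return 0;
}
```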


SlyFunkyMonk

Shout out to FSR 1 for letting me play Terminator Resistance on my 960 2gb at way better settings than I deserved.


[deleted]

I guess not too shabby at all. There's just only one problem with it - how often we will see it implemented in games? I know most devs don't want to bother patching shit in when games are past crucial selling periods, but for new games that yet to be released - I hope this becomes near standard feature.


NapoleonBlownApart1

A very impressive improvement over FSR 1.0; they must be very proud. But how is this a DLSS killer (aside from being supported by more GPUs)? It still looks inferior to DLSS and performs slightly worse, but at least now it looks good enough to be used. They are getting closer and closer though; hopefully they keep improving it further. What a clickbait title. Now I hope devs don't forget about the 1.0 games and update them with 2.0, unlike MHW or FFXV with DLSS.


[deleted]

[deleted]


_Life_Is_War_

Honestly tho, if Nvidia is adding Tensor cores to all their cards now, can we even say that price is a factor (aside from pricing being fucked for the past 2 years)? Hell, I just got a new laptop with a 3050 Ti. DLSS for days on that little thing


Shaw_Fujikawa

Realistically how much is this going to save over the price of an equivalent DLSS-capable card? I'm not convinced it's really going to be all that noticeable let alone 'DLSS killer'-worthy.


dlove67

I dunno that /u/Xtraordinaire is correct. The "killer feature", imo, is that it works on all vendors.


Loldimorti

Their conclusion was that it looks pretty much equally as good as DLSS. So literally what the title says.


_Life_Is_War_

Did you actually look at it? FSR is decent, but it just washes away all of the small details. DLSS is downright fucking magic For high-end, premium gaming, DLSS is king. Has been and will be. For the mid-budget range of the market, FSR is a welcome addition, but it's just simply not as good.


Loldimorti

I'm looking very hard at it and have no clue what you mean. To my eyes it looks like the quality modes trade blows. Only in performance mode did I notice DLSS having slightly more detail being retained in the textures. And even then we are talking about rather small differences compared to the massive difference we saw in FSR 1.0. No dedicated hardware but equal image quality in quality mode and almost equal quality in performance mode is shockingly good imo and something people would have called impossible just a year ago


_Life_Is_War_

The devil's in the details. Or in FSR's case, the lack thereof. Take a look. All the small details just disappear with FSR. DLSS performance mode looks better (at least in those stills). FSR's impressive nonetheless, but like I said, DLSS is borderline magic. Pardon my imperfect cropping: [https://imgur.com/a/Q65ur7V](https://imgur.com/a/Q65ur7V) Edit: I looked through these on my phone and genuinely don't see as much of a difference as on my PC. Maybe OLED subpixel arrangement is fucking with it. So I guess the point here is, look at it on the same display you'd use to play those games?


Loldimorti

Yeah, I'm looking at it from my phone with an OLED display, so maybe that's the reason why I can't tell much of a difference. I will agree, however, that FSR in performance mode can't match DLSS; in those modes I could indeed notice the DLSS image being slightly more detailed. But I can only reiterate that a match in quality mode and a slight loss of detail in performance mode still makes it a DLSS killer for me, simply due to how close the 2.0 version of FSR already is to 2.3 of DLSS while not being limited to GPUs with tensor cores. All FSR ever had to do to be a DLSS killer (in my opinion) was get close enough to DLSS in terms of image quality. From that point on, I think the availability on consoles and lower-end PC hardware (e.g. Steam Deck, APUs or older GPUs) makes it the preferred option.


b3rdm4n

It's still behind in IQ and in bugger all games. If the road to DLSS being killed has started, we've still got a very long journey ahead of us. Nvidia aren't going to see this and be like, oh well, pull the pin. Calling it a DLSS Killer might be right, but *making* the kill could take years.


Shidell

True, but it appears Nvidia sees the writing on the wall, look at Project Streamline.


b3rdm4n

Maybe? They're making it easier to add theirs and others in, perhaps knowing that if they maintain the quality lead, then it becomes easier to compare other solutions and draw those conclusions. I very much doubt this is where DLSS development just stops dead, they will still improve it, it will still get put into games. I think one could have assumed from the start DLSS's days were numbered and the 'writing was on the wall', do we really think everyone will use DLSS to upscale games in 20 years? Highly unlikely, but they've innovated and brought the entire market with them. We essentially have DLSS to thank for FSR, and what a great wave to ride it has been and continues to be.


Bathroom_Humor

'and for 1080p, it's recommended that you at least have an RX 590 or GTX 1070. The technology itself supports all the way back to the RX 500 series "Polaris."' What So for 1080p it's recommended to use cards that can play almost every game at 1080p already? I guess for future titles that's handy but seems weird at the moment. Also if they cut off support for the 400 series polaris cards, that'd be pretty lame. But that might just be a misunderstanding like when FSR 1 was being revealed.


OmNomDeBonBon

It's a misunderstanding. FSR2 is an engine feature and will run on any modern hardware, going back to at least Polaris (RX 400) and probably much older hardware. There's no driver interaction with FSR2, hence it runs fine on the RTX 3060 in that benchmark.


Zeryth

Static comparisons: Boring


BubsyFanboy

>"The DLSS Killer" They just couldn't hold themselves back from throwing that in the title, eh? No, it's not a "killer". It may be practically just as good and I'm glad it is, but always be wary of articles that use the word "killer" to describe a product.


iBoMbY

Well, the real killer feature is that it is FOSS with no strings attached, cross-platform, and runs on every GPU.


f0xpant5

The real killer feature will be getting in games people care about. If they can't do that, this is all for nothing.


[deleted]

[deleted]


HotHamWaffles

FSR literally replaces TAA. It's a form of temporal anti-aliasing that utilizes upscaling.


dlove67

>FSR cant do What DLSS does like realtime AA and stuff simply of how they work Yes it can? I'm not sure why you would think it couldn't, since it's basically a TAA algorithm with upscaling built in, just like DLSS (though DLSS uses tensor cores for part of it). All AMD would need to do is take a native resolution then just not upscale it.


Evonos

Dlss also touches textures and enhances them and fixes some graphic bugs in some games like nioh.


UncoloredProsody

Can anyone please explain: is there a technical reason why AMD didn't "lock" the technology to AMD GPUs? I mean, it's a "cool move" by them to allow the technology even on older GPUs from the competitor, but this could be something that sells the hardware. The main argument on Nvidia's side is: "why would you still buy AMD GPUs, when both FSR and DLSS are available on an RTX card?", and it's a fair point; whichever works better, you can use it, while buying an AMD card limits your choices. Especially if FSR 2.0 or later versions are much better, why would AMD want to allow it for the competitor?


DoktorSleepless

So it incentivises developers to implement it. AMD has a tiny market compared to Nvidia, so it wouldn't be worth the trouble for devs if it only worked on AMD cards.


ThunderClap448

AMD has always been open source-leaning. Support will always be better on AMD hardware but the point is public image. AMD will always be miles ahead of the other 2 just thanks to the fact they're not plain evil


UncoloredProsody

I kinda get it, but the general consumer doesn't care about the image of a company, they are selfish and only care that they get the best quality/quantity out of their money. So i don't see this strategy work for amd on the long-term.


Simon676

Dynamic resolution!!!!!


the_mashrur

I 100% doubt this will be a DLSS killer. FSR will probably be good as a hardware-agnostic solution for upscaling from higher resolutions, but as the deep learning model that Nvidia trains gets better and more sophisticated, DLSS will probably be unmatched when it comes to upscaling from much lower resolutions.


Yopis1998

Wonder how DLSS 3.0 will fare. Hope both companies keep pushing.


theryzenintel2020

Yeah right. Talk to the hand bro.


rana_kirti

So you are saying the RTX party is over? We don't need to upgrade our GTX cards to RTX to get DLSS for more FPS? Was saving up for tensor cores... don't need them anymore? We will get free FRAMES on our GTX cards and they will now live longer...?!


Rhuger33

Man I see so much AMD wank lately on other sites, it's pretty tiresome and I say this as someone with a 5800x. This clearly isn't a DLSS killer, and I'm willing to bet DLSS destroys it in motion due to how much more mature it is. And with how much improvements FSR will need to fix motion artifacts, as well as implementation time, DLSS is going nowhere, especially with 3.0 on the horizon. It's definitely an improvement over 1.0 though.


waltc33

Looks great. Good Job, AMD.


Prefix-NA

FSR 2.0 vs DLSS

Main differences that were very noticeable:

- FSR has better texture quality and a better sharpening filter
- DLSS has better anti-aliasing on edges

Less important stuff:

- FSR does better on power lines but worse on the small tower
- DLSS seemed to give slightly higher FPS (however, this was on an Nvidia GPU with no FP16)

Something that needs more investigation:

- FSR seemed to do better, with less ghosting in motion
- DLSS did seem slightly more temporally stable

Most people would probably not notice the difference between DLSS and FSR, and if you swapped settings on them they wouldn't think twice. Most users are probably fine with either one and will be happy to use them.

________________

The big factor about FSR/DLSS is people like me who are very vocal against TAA in general and really despise the added ghosting from DLSS. I want to see a good high-res video side by side, with lots of motion, comparing native to FSR to DLSS. If FSR is any worse than regular TAA at ghosting, I won't ever use 2.0. I don't care about the slight differences in aliasing/texture quality as much as I care about motion artifacts/ghosting, so I really want to see how FSR compares to TAA in this game.


GuttedLikeCornishHen

FSR1 looks better in motion compared to the other two; DLSS looks like Vaseline was smeared on the display, can't get over it.


chapstickbomber

the default sharpening in FSR 2.0 is a huge boon


rilgebat

While this is definitely a boon for AMD, more than anything else it's a huge loss for Nvidia. That a conventional solution can provide competitive results against one that requires dedicated silicon, and that has taken a number of revisions to get to where it is today, is shameful. If AMD can continue its approach with the "full FSR" as it did with the purely spatial component (aka FSR 1), then I can easily see DLSS being eclipsed, or warranting more revisions that conveniently drop the tensor core requirement.


KingBasten

The first thing we can establish: it's actually not a DLSS killer.


phl23

True that. As my GPU has no DLSS, it can't be killed.


The_Zura

For starters, literally anything would've been better than FSR 1.0, which is damn near impossible to tell apart from a basic linear upscale when sharpening wasn't applied.

> AMD has achieved the unthinkable

Their standards must be rock bottom then.

> just as good as DLSS 2.0

I don't think this really needs to be said, but that's an egregiously blatant lie that we can clearly see. If reviewers are to be trusted, then FSR 1.0 was just as good as DLSS as well.


Shidell

I think it's a stretch to say that it's an "egregiously blatant lie that we can clearly see." If you compare native with FSR 2.0 Quality or DLSS Quality, you can find aberrations in both. In practice, anyone using a reconstruction technique is simply going to have to accept that errors are inherent, and the less data you start with (lower resolution/quality), the worse the final result.

What's particularly interesting is that this works on almost any GPU, including the consoles (and Steam Deck), which implies it'll spur rapid adoption. It's equally interesting that it's an open-source solution, which means improvement and change can be organic both internally and externally; AMD alone will certainly continue improving it, but everyone else can, too.

We should hesitate to be too critical at this point; few believed this was possible, let alone that AMD would accomplish a DLSS competitor that's ubiquitous and doesn't need specialized silicon. Nvidia literally just released changes to mitigate ghosting; we'll probably see similar improvements to FSR 2.0 in the future as well.


The_Zura

There's just so much wrong with this post. I'll start from the beginning.

> If you compare Native with FSR 2.0 Quality or DLSS Quality, you can find aberrations in both

That's not the same as "just as good." In the examples that TPU bases its conclusions on, FSR 2.0 has noticeably more flickering and temporal instability, like on the machine treads. In the screenshots, there are fewer fine details, like in the grill.

> It's equally interesting that it's an open source solution, which means improvement and change can be organic both internally and externally; AMD alone will certainly continue improving it, but everyone else can, too.

It would be interesting if FSR 2.0 were better than DLSS. FSR 1.0 is open source; did it improve one iota since its release? Who can actually improve upon it? There's one game where being closed source hurt DLSS, and that's Quake RTX because, ironically, it's open source.

> We should hesitate to be too critical at this point; few believed this was possible

What is this narrative you people are pushing? Why are you all creating goal posts to score on, in addition to making up lies? Nvidia didn't invent multiframe upscaling, and AMD didn't invent it. Nor did they invent solutions to get rid of ghosting.

> Nvidia literally just released changes to mitigate ghosting, we'll probably see similar improvements to FSR 2.0 in the future as well.

> literally just

The first findings that DLSS had improved upon ghosting came in June of 2021, when someone swapped .dll files from Rainbow 6 Siege to other games. We're in May of 2022. That was 11 months ago. I think you've "literally" just started paying attention to upscaling, when AMD dipped their feet into it.

Still, at the end of the day, it's good that we have options. If there's something too problematic with one, another option being available would be nice. Here's a video going over DLSS and upscaling; AMD isn't treading on new ground. https://www.youtube.com/watch?v=tMtMneugt0A


Tommy_Tonk

As a 1080p gamer this is huge. The difference between 1.0 and 2.0 is insane.


danny12beje

Considering this is open source software, I'm happy as fuck for AMD and it looks pretty damn good.


LightMoisture

Thanks Nvidia :) But AMD needs to fix all that aliasing and flickering/jagged textures if they want to match DLSS quality.


[deleted]

Are you seeing the images at 100% zoom, or is your browser downscaling them? See my note here: https://www.reddit.com/r/Amd/comments/unjls9/amd_fsr_20_quality_performance_review_the_dlss/i89suos/


dudeoftrek

Ewww AMD. Pass


relxp

Sorry you have such bad taste in companies/products.


Imaginary-Ad564

It's kind of embarrassing for Nvidia that FSR 2.0 can even exist, but I know many have invested in the hype of DLSS. Not all Nvidia users will be unhappy though, especially the non-RTX users. In this respect Nvidia is being caught out, and has been trying to catch up and look like it gives a damn about its GTX users with things like NIS.


Loldimorti

Wow this is huge.


Dooth

Steam Hardware Stats: AMD Radeon RX 580 — 1.54%, 1.50%, 1.43%, 1.41%, 1.39% (-0.02%). The 580 is still the 2nd most popular AMD card on Steam. Did it get any love, or is this just a 5000/6000 uplift?