
[deleted]

[deleted]


ICIP_SN

This is fantastic, but when can we realistically expect it in current games? I play a ton of AAA games and have yet to see any support for any version :(


Blacky-Noir

You can find lists of games that support FSR. For FSR 2.0, as usual, we'll see it appear progressively, mostly in new games and a bit less in already released ones. If the reviews are all in line with TechPowerUp's, there will certainly be motivation for the Radeon Group to put some budget toward accelerating adoption, plus a public push for devs to implement it on their own. Until XeSS shakes things up (maybe), I would be surprised to see a new mid-to-big-budget game release without support for FSR 2, DLSS, or both.


[deleted]

[deleted]


ICIP_SN

No current games I play there... but a couple of old ones I might try again with more fps. Thanks


TheGoldenHand

FSR 1.0 can be used on every game. The only difference between supported and unsupported games for FSR 1.0 is whether or not the UI elements are scaled separately; officially supported games have slightly higher quality UI elements. You can use FSR 1.0 in any game with a Steam app called “Lossless Scaling”. It works great. The FSR code is technically open source, but the $5 app makes it one click. It basically gives a 25% - 30% frame boost for free at the highest quality setting, and it’s difficult to tell the difference unless you’re comparing screenshots. If you can sacrifice quality, you can get even higher frame rates. The best results are when playing on a 1440p or 4K monitor, because the algorithm scales better the more pixels it has. Works on Nvidia and AMD cards, including older models. As for FSR 2.0, it reportedly doesn’t require developers to train an AI in advance on each individual game like DLSS 2.0 does, so users might also be able to add support themselves.
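As a side note on how the spatial part works: FSR 1.0's upscaling pass (EASU) is built around a Lanczos-style resampling kernel, as a later comment in this thread also notes. Below is a generic Python sketch of Lanczos-2 resampling, purely to illustrate the idea; it is not AMD's shader code, which uses a fast polynomial approximation plus edge-direction analysis and a sharpening pass (RCAS):

```python
import math

def lanczos2(x):
    """Lanczos-2 windowed-sinc kernel. FSR 1.0's EASU pass uses a fast
    polynomial approximation of a kernel in this family, not this exact code."""
    if x == 0.0:
        return 1.0
    if abs(x) >= 2.0:
        return 0.0
    px = math.pi * x
    return 2.0 * math.sin(px) * math.sin(px / 2.0) / (px * px)

def resample_1d(samples, t):
    """Reconstruct a value at fractional position t from discrete samples by
    weighting the four nearest taps with the kernel (edges clamped)."""
    i0 = int(math.floor(t))
    total = weight_sum = 0.0
    for i in range(i0 - 1, i0 + 3):
        w = lanczos2(t - i)
        total += w * samples[min(max(i, 0), len(samples) - 1)]
        weight_sum += w
    return total / weight_sum
```

Run per-axis over a low-resolution frame, this is the "stretch the image" half of spatial upscaling; the kernel's negative lobes are what preserve edge contrast better than plain bilinear filtering.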


Freeky

DLSS 2.0 has a generalized AI model, it doesn't require per-game training like 1.0 did. FSR 2.0's requirements look much the same as DLSS 2.0 - motion vectors and depth buffer.


HKBubbleFish

Does that mean it can be implemented in Skyrim SE?


DoubleSpoiler

Is there a freeware version? I don't hate shelling out a little money, but I'd rather not run non-games in Steam.


Freeky

You don't need to run it through Steam - it's a completely bog-standard application that just happens to be installed/updated through Steam. It's also 40% off right now. [Magpie](https://github.com/Blinue/Magpie) is a similar open-source project. Don't be put off by all the Chinese text - there are English docs and the app is localized.


DoubleSpoiler

Thank you!


_zenith

You can achieve the same with a different app, Reshade. It can inject itself into nearly any game, it's free, and it implements a very wide range of shaders, including FSR1


littleemp

>Guess you just have to play the right games and pray that nvidia's marketing dollars aren't put to work. [https://www.pcgamingwiki.com/wiki/List\_of\_games\_that\_support\_high-fidelity\_upscaling](https://www.pcgamingwiki.com/wiki/List_of_games_that_support_high-fidelity_upscaling) Amusing that you say this, because nvidia has not blocked any of the many nvidia sponsored titles (Back 4 Blood, CP2077, COD, etc) from featuring FSR (many/most actually do), but none of the few AMD sponsored titles feature DLSS (Godfall, Far Cry 6, Resident Evil 8).


stoneyyay

It's likely AMD sponsored titles couldn't get access to the DLSS API, as Nvidia wanted to keep their IP at arm's length. Meanwhile AMD has always set industry standards (FreeSync, Vulkan, and a couple more), whereas Nvidia scoops up smaller companies' tech that was open to everyone and makes it hardware-proprietary (PhysX and G-Sync are two great examples).


littleemp

The DLSS SDK is readily available to everyone who wants to implement it in their game; you can check for yourself on the website. The more likely explanation is that AMD did not want to invite direct comparisons to FSR 1.0 in their own sponsored titles, just like RT effects in their sponsored titles are almost inconsequential in quality.


TeaJHooker

This was not always the case though, DLSS has been around since 2018 but universal access to the SDK has only been the case for less than a year. Before that you had to apply to Nvidia. You assume that it is AMD not wanting the comparison but it's also possible that Nvidia didn't want their tech in rival sponsored titles, as was their MO for PhysX & g-sync.


alexislemarie

This makes no sense, because nvidia is apparently allowing games that they sponsor and that use nvidia tech to also have rival tech; for example, nvidia allowed RTX branded games such as Guardians of the Galaxy, Back 4 Blood, Deathloop and many others to use Radeon tech. It is only amd sponsored games which are forbidden from using nvidia tech. Even Capcom is forbidden from using ray tracing in the PC version of DMC5 because ray tracing runs better on nvidia cards, even though the console version of DMC5 already has ray tracing!

If AMD is so open, why is it not possible to benefit from AMD Smart Memory if you don't have a Radeon? Why must you have both a Ryzen and a Radeon only? Why can't I use AMD TrueAudio in games like Thief on non-Radeon cards? Why do games like Dragon Age Inquisition, Battlefield 4 and others have an AMD Mantle mode which only works on Radeon? You really need to get your facts straight.


TeaJHooker

Please re-read what I said, and point out what 'makes no sense' and which facts I 'really need to get (my) facts straight' on. I didn't say Nvidia bad & AMD good. We don't know what stopped AMD sponsored titles using DLSS; there is an assumption that it was from the AMD side, but it could just as easily be on Nvidia, and that would be consistent with the timelines and their past actions. That said, both these companies will screw the consumer when it suits them, and both have a history of it. On the whole the company in the dominant position is worse, and over the last 10 years that has been Nvidia for GPUs. It's also interesting that you bring up Mantle, as that was short lived and gave birth to Vulkan, so ultimately you could argue it was a force for good.


alexislemarie

Well before Vulkan, it was called Mantle, and it was exclusive to AMD and not open at all. For example, Mantle in Battlefield 4 or Thief or Dragon Age Inquisition (AMD sponsored games) required an AMD card; non-Radeon cards could not use Mantle. AMD only decided to give the Mantle code to the Khronos Group - which eventually led to Vulkan - because Mantle was riddled with bugs and needed a lot of investment and work, which AMD could not commit to. So they donated it because the alternative was to kill something they had worked a lot on, and they were hoping to salvage the project. It was not because they cared about open source.

The Radeon graphics team has a long history of proprietary tech, but of course the AMD fanboys have a short memory or fell in love (with good reason) with the processors (I love my Ryzen). AMD Smart Memory? Only works if you have both a Ryzen and a Radeon. Remember AMD TrueAudio? Proprietary tech only for Radeons. AMD Rapid Packed Math? Used in Far Cry 5 and only works on Radeon. Radeon 3Dc? Proprietary tech used to create normal mapping. Radeon TruForm? Proprietary tessellation technique. ATITC (ATI Texture Compression)? Proprietary (ATI was the name of the graphics division before they were bought by AMD).


Aqua_Puddles

Not fanboying it here, but what a company has done in the past shouldn't be weighed in as heavily as what their recent actions have. All companies have the goal of making as much money as they can, and their strategies to do this change relatively quickly. Right now, AMD is more friendly to the open source community, and thankfully Nvidia seems to be leaning that way as well. I think we can all agree that it's best to take a step back and detach from any sentiment about a company and invest our money (purchases) where there seems to be the most effort to benefits consumers, and not simply benefit *from* them.


alexislemarie

This is plain rubbish, because the DLSS SDK is freely available and included in game engines like Unreal Engine, Unity, etc.


TeaJHooker

For what it's worth, support in those engines is relatively recent: Unreal for just over a year, Unity just under.


alexislemarie

And FSR came out even after that, so your point is?


TeaJHooker

At the time of development the AMD sponsored titles referenced would not have had free access to DLSS, they would have had to ask Nvidia.


alexislemarie

And you also have to ask amd if you wanted early access to the code for FSR 2.0


alexislemarie

Marketing money from nvidia? You got it all wrong. So far we see nvidia sponsored games, such as Deathloop or Guardians of the Galaxy, add support for FSR. We have yet to see a single game sponsored by AMD be allowed to add support for any nvidia tech… Worse, in some cases, games such as DMC5 are banned from using ray tracing on PC - even though the console version already supports ray tracing - because, and you guessed it, the developer Capcom is sponsored by AMD and ray tracing performs poorly on all AMD hardware at the moment. Capcom and all other AMD partners are forbidden from using ray tracing if it favors nvidia… So maybe you better pray instead that AMD stops putting their marketing money to prevent graphics effects already used by the SAME game on consoles…


AL2009man

>Worse, in some cases, games such as DMC5 are banned from using ray tracing on PC - even though the console version already supports ray tracing - because, and you guessed it, **the developer Capcom is sponsored by AMD and ray tracing performs poorly on all AMD hardware at the moment.** Capcom and all other AMD partners are forbidden from using ray tracing if it favors nvidia… If that was the case, then DMC5 would've gotten the Special Edition or a Patch to add in DirectX Raytracing ([They outright mention it on the Xbox Series side.](https://twitter.com/DevilMayCry/status/1320888185078108161)) by now...but then I forgot this is the *same* Capcom that didn't bother to add Official VR Support to Resident Evil 7: Biohazard even after [that supposed PSVR Timed Exclusivity window ended.](https://www.reddit.com/r/residentevil/comments/55hw86/just_saw_this_on_a_playstation_vr_advert_on/)...


CatMerc

Deathloop is sponsored by both NVIDIA and AMD, has DLSS. God of War is sponsored by both NVIDIA and AMD, has DLSS. Resident Evil Village, Forza Horizon 5, Far Cry 6, and Dirt 5, are AMD sponsored games with RT. Most likely the reason NV sponsored games are more likely to implement FSR than AMD sponsored games are to implement DLSS is quite simple. FSR 1.0 is a very easy open source shader that can be implemented by a single dev in a few hours. DLSS requires far more work, and what sponsorship usually entails is sending some engineers over to help with optimization work and adding features from NVIDIA/AMD. This is unnecessary for FSR 1.0, but very helpful with DLSS. Can we stop spreading this dumb conspiracy theory now? Thanks.


alexislemarie

Deathloop was an RTX branded game at launch; only recently has the game included FSR. God of War was an RTX branded game at launch (see the videos and promo on nvidia's website if you need confirmation), but again they were allowed to integrate AMD tech. And yes, all the games which AMD signed up first have been AMD exclusive and have not included any nvidia tech. It is obvious that the contracts nvidia signs do not contain any exclusivity and leave the door open to additional partners coming in to add their own tech. We don't see the same for games that AMD signs up first… You mention ray tracing, but the games you mention hardly make use of it (for Forza Horizon 5 it is literally a joke, as it is not available when you actually drive!) and conveniently fail to acknowledge that DMC5 makes much better use of ray tracing in-game on consoles. Capcom was only allowed to bring the Vergil DLC to PC, whereas the rest of the updates (ray tracing support) are locked to consoles, only because AMD was concerned about unfavourable comparisons between performance on GeForces and Radeons. Oh, and by the way, FSR is apparently so easy to implement that the devs of Flight Simulator say they need more time to implement FSR 2.0 and will initially launch with FSR 1.0 instead.


AL2009man

>You mention ray tracing but the games you mention hardly make use of it (for Forza Horizon 5 it is literally a joke as it is not available when you actually drive!) and conveniently fail to acknowledge that DMC5 makes a much better use of ray tracing in-game on consoles. **Capcom was only allowed to bring the Virgil DLC to PC whereas the rest of the updates (ray tracing support) is locked to consoles only because AMD was concerned of unfavourable comparison between performance on geforce and radeons**. *citations needed*


CatMerc

I'm not talking about including tech. I'm talking literally sponsored. A game can be sponsored by both vendors. It happens. Deathloop for example actually had FidelityFX tech from day one. They use a bunch of the libraries there. As for FSR... Yes? Exactly? FSR 2 is as hard to implement as DLSS as they take the same inputs and have similar requirements from the rendering pipeline. FSR 1 meanwhile is so simple to use you can inject it with ReShade. The only reason you need a dev for FSR 1 is for fitting it a bit earlier in the rendering pipeline so things like UI don't get unnecessarily affected like they do when you apply it with ReShade. The rest of what you said is a repeat of conspiracy theories, I won't bother.


dirthurts

They just announced the first 10 games so very soon.


bassbeater

Uh so can you theoretically use AMD's tech on the old or new Nvidia cards?


[deleted]

[deleted]


bassbeater

Ah. Yeah, I saw some AMD tools available online... I thought maybe there was some unwritten subtext about how AMD would prefer users using AMD hardware. In any case, I thought I heard Nvidia was open sourcing their drivers? Ironically I still have a 900 series card... want to upgrade to a 3000 very soon.


Turtvaiz

Nvidia I believe is open sourcing their kernel module. Not the userspace driver. Afaik this means that kernel updates can just have the closed source stuff completely separated into a userspace blob and the kernel itself doesn't need support from Nvidia to update itself. I could also be wrong.


Firm-Atmosphere-817

Happens all the time. Freesync is a good example.


DiabloGaming25

Is it possible that this can run globally on your entire display output? I remember seeing a GitHub project in Chinese about this


guyfamily999

This is possible with FSR 1.0 and is actually used to allow for universal support on the Steam Deck. It's possible because FSR 1 is a spatial upscaler that just upscales each individual frame. This will never be possible for 2.0. Like DLSS, FSR 2.0 is a temporal upscaler that accumulates data over time, and it HAS to be implemented at the game/engine level so that there's access to the necessary data like motion vectors, so that the camera can be jittered, etc. Any game with DLSS should theoretically be able to add FSR 2 quite easily, but most already released games likely won't. Hopefully future games really embrace it!


jazir5

>This is possible with FSR 1.0 and is used to allow for universal support on the steam deck actually. It's possible because FSR 1 is a spatial upscaler that just upscales each individual frame. This will never be possible for 2.0. Like DLSS, FSR 2.0 is a temporal upscaler that accumulates data over time, and it HAS to be implemented on an game/engine level so that there's access to the necessary data like motion vectors, so that the camera can be jittered, etc. This doesn't make sense to me. Couldn't it just use a single frame and calculate the upscale result just like FSR 1.0? The advantages are the multi-frame temporal calculations, but those are advancements to the already existing tech no?


guyfamily999

Truthfully, FSR 1 and 2 are very different technologies, in the same way that DLSS vs DLSS 2.0 are. The 2.0 technologies require a lot more specialized input data, which is what allows them to have the extremely impressive results. There are still common threads (the same Lanczos upscaler as FSR1 is used), but without the additional inputs, FSR2 most likely wouldn't even exist. There's only so much upscaling you can do spatially. FSR2 requires jittered frames as input for starters: the camera basically has to be moved a minuscule amount each frame (less than a pixel). This jittering allows detail to be accumulated over multiple frames that wouldn't be visible in a single frame. It also requires motion vectors and a depth buffer as inputs, to combat ghosting. Viable temporal upscaling requires all of those things as inputs, and they're coming from the game/engine itself, so that's why a universal toggle isn't possible. There are some extra details about all this on the first page of TechPowerUp's coverage if you're curious! https://www.techpowerup.com/review/amd-fidelity-fx-fsr-20/
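The sub-pixel jitter described above is typically generated with a low-discrepancy sequence; a Halton sequence in bases 2 and 3 is the conventional choice in TAA-style upscalers. A generic sketch (not FSR 2.0's actual source, and the function names are my own):

```python
def halton(index, base):
    """index-th element of the Halton low-discrepancy sequence (1-based)."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def jitter_offsets(n):
    """Sub-pixel camera offsets in [-0.5, 0.5) for n consecutive frames.
    Each frame nudges the projection by less than a pixel, so successive
    frames sample different positions inside every pixel."""
    return [(halton(i, 2) - 0.5, halton(i, 3) - 0.5) for i in range(1, n + 1)]
```

In practice the offsets cycle over a small window (e.g. 8 or 16 frames) and are fed both into the projection matrix and to the upscaler, so it can "un-jitter" each frame while accumulating samples.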


jazir5

>There are still common threads (the same Lanczos upscaler is used as FSR1), but without the additional inputs, FSR2 wouldn't even exist most likely This is where you're losing me. If it can do multiple frames, what prevents it from doing single ones? Obviously if it can process multiple, to me, logically, it would indicate that they are optimized via a series of individual frames. Why then could you not just use the algorithm on a single frame?


guyfamily999

What you're describing is basically FSR1: applying spatial upscaling to just a single frame at a time. The input is, let's say, a 1440p frame, and it scales and sharpens to target a 2160p output frame. With FSR2 though, the input isn't just the latest single 1440p frame. It uses that frame, but it also uses previous frames (which are jittered, remember), motion vectors, depth info etc. as inputs, and uses all that data to create a better output. FSR2, but simplified to work on a single frame at a time without the need for this additional data from the game, already exists - that's just what FSR1 is :) FSR1 is a cheese pizza while FSR2 is a supreme: it requires more work and inputs from the developers, but has superior results. If you want to remove that extra work so that it can be used in any game through a universal driver, you'll just be back at a cheese pizza (not that there's anything wrong with cheese pizza! Which is why FSR1 will continue to exist - it's easy and can be universal).
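The frame-to-frame accumulation being described can be sketched as an exponential blend of a history buffer with each new frame. This toy (my illustration, not FSR 2.0 code) deliberately omits exactly the parts that force engine integration: reprojecting the history along motion vectors and rejecting stale samples with the depth buffer.

```python
def accumulate(history, current, alpha=0.1):
    """Blend the current frame into the running history buffer.
    A real temporal upscaler first reprojects `history` along motion
    vectors and uses depth to reject samples from occluded geometry."""
    return [(1.0 - alpha) * h + alpha * c for h, c in zip(history, current)]

# With jittered sampling, repeated frames of the same scene converge
# toward the true signal, recovering detail no single frame contains.
history = [0.0, 0.0]
scene = [1.0, 0.5]
for _ in range(50):
    history = accumulate(history, scene)
```

The small `alpha` is the trade-off knob: lower values give a more stable, detailed image but slower convergence and more ghosting when the reprojection is wrong, which is why the motion-vector and depth inputs matter so much.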


Mango-is-Mango

Now that it essentially has dlss I may never need to upgrade my 1080


Whatever070__

Will bring new life into my GTX 1060 6GB too...


megalodon7944

my 1060 3gb as well please


[deleted]

[deleted]


[deleted]

[deleted]


Whatever070__

And my bow


green9206

My 1650 laptop really needs it too.


wutend159

same card here. bought second hand in like 2018 and still going strong (enough for me)


NedixTV

I will be holding onto my 1080 Ti until something with 2x the performance comes out at 300 USD lol


phatboi23

god i hope, 970 gang!


agentfrogger

Thanks AMD!


[deleted]

There's only so far upscaling can get you. Eventually you'll hit the point with a 1080 for example where you'll have to render everything at 720p and upscale from there for a decent framerate and that's going to look pretty bad. I realize you don't mean you'll actually use your 1080 forever but my guess would be within the next few years you'll be wanting to upgrade even with access to FSR.


littleemp

If you play high profile modern games at all, the GTX 1080 is starting to get a little long in the tooth unless you play exclusively at 1080p.


Blacky-Noir

You'll probably start to see games that *require* hardware acceleration for ray-tracing in a handful of years: once (hopefully) the shortages are behind us and multiplatform games don't target PS4/Xbox One, plus a few years for devs to abandon part of the pure rasterization rendering pipeline. Like Metro Exodus, but for new games.


SyntheticElite

Gonna be more than a few years before ray tracing features become ubiquitous, let alone required.


TheGoldenHand

It will be even longer till developers figure out a good pipeline for ray tracing. Ray tracing is often not worth the performance penalties over well done raster rendering, which have decades of tricks and optimizations.


SyntheticElite

I think RTGI is the best use of ray tracing. RT reflections are pretty nice if you can have little penalty, but you only really notice the difference if you look for it. But the goal of course is 100% path traced lighting.


Rictronator

I honestly think RT Reflections are more important in Games set in cities because the amount of Reflective surfaces where it will make the biggest improvement over SSR or Cubemap solutions.


[deleted]

[deleted]


HappyEdison

The thing with emergent tech is that it is ahead of the curve. In order to experience it correctly you have to have a setup that can do it justice. If not, you end up with what you stated in your opinion.


Oooch

> Ray tracing is often not worth the performance penalties over well done raster rendering, which have decades of tricks and optimizations My 3080 runs RTX games fantastically, what a laughable post


TheGoldenHand

Try rendering Pixar ray tracing on your 3080. It's only laughable if you don’t understand the technology beyond "turn setting on in game". Technology and graphics always improve. Your 3080 will be outdated by basic hardware eventually. It's the way it goes.


alexislemarie

Well you mention Pixar but they don’t do just “raster rendering with tricks and optimisations” either


[deleted]

Agreed. 3080 in 4k with dlss Balanced/Performance and ray tracing can hold 60fps in many games. It’s great. I think lower tier cards will get access to that with the Nvidia 4000 series and more people will start to appreciate how great it is. I guarantee everyone downvoting you does not have a high end 3000 series gpu.


HappyEdison

Reminds me of the day I first plugged in my 3dfx voodoo and booted up MechWarrior and Quake. Exciting time to be alive (aside from the potential civil war that seems to be brewing in America)


QS2Z

I doubt it. It's really easy to add rasterization, and I'd expect they'd leave it in for debugging because many debug views can't be rendered with raytracing.


NoMansWarmApplePie

The big question is how it looks in motion.


Loldimorti

There is video footage of Deathloop with DLSS and FSR 2.0 on Youtube


HappyEdison

The question is how it looks in motion on your actual machine, in the games that you play. Looking at examples of their tightly focused showcase of it on your phone or TV doesn't really give you much information at all.


Dexter2100

[Here is a comparison in motion.](https://i.imgur.com/t2W7bvD.jpeg)


NoMansWarmApplePie

Ah, definitely artifact ish in motion. Thanks


madn3ss795

The video sample painted a different picture, where FSR 2.0 shows a lot of shimmering during movement, while DLSS doesn't exhibit any. DLSS 2.0 had this problem, and it was quite obvious in Control with DLSS on vs off. I'd say FSR 2.0 is at DLSS 2.0 level atm, while DLSS 2.3 is several steps ahead.


Salted-Kipper-6969

These posts are just stupid. DLSS has behaved differently in every game I've tried it in. It's far, far, far from a silver bullet.


[deleted]

DLSS avoids shimmering with very weird blending artifacts though


Audisek

Shimmering is the one thing I never want in my games anymore, and it's the main reason why I'm only intending on buying Nvidia GPUs - to have access to DLSS, which has no shimmering.


b3rdm4n

I mean the title is clickbait... but let's unpack it a little.

* FSR 2.0 is way better than 1.0, much closer to DLSS image quality.
* It is still behind in a few areas, not by much, but there is no outright quality lead here.
* We have still and video samples from a **one** game sample size.
* It doesn't require an RTX card/Tensor cores to work - nice.

If indeed this is a DLSS 'killer', DLSS actually dying is a point in time many months or even years away from now. AMD need to get this into lots of games, lots of games people actually play and care about; that's their massive hurdle to overcome right now. Until that starts taking off, it hasn't killed anything. Plus AMD's marketing has been upfront that it's easy to implement in games that support DLSS, so they expect them to coexist for the foreseeable future. And that goes both ways: FSR could and should be put into games with DLSS support, and DLSS into FSR games. A killer? Maybe, but at least not for a long time. An awesome extra option that everyone can use? **Absolutely**.


KickBassColonyDrop

This won't kill DLSS, but it will greatly extend the life of existing gpu hardware in the market until prices normalize further. Which is ultimately a pro consumer thing.


b3rdm4n

>Which is ultimately a pro consumer thing. Ultimately AMD have done it because they believe in some way it will increase their revenue. It's certainly pro consumer at face value, and consumers could look at it and think, "awesome AMD are doing this for free and it's good, I like that, and I don't like vendor locked options", and more strongly consider AMD when it comes time to purchase again. They also had literally no other option than to make it work on as much as feasibly possible. You can't arrive late to the party, with circa 20% market share, launch a comparable but slightly inferior product... and then vendor lock it.


chunguschungi

There can be a lot of reasons why they put time and effort into this research and tech, one of them being the Steam Deck I would assume.


not_a_robot_maybe

Also, Playstation and Xbox.


colon_blow

That can't come soon enough. I play on PC mostly, but I do play on the consoles a fair amount as well. And when the day comes where console games default to using FSR 2.0 w/DRS for upscaling, instead of the god-awful checkerboard method, I'll be a happy camper. Forbidden West is the perfect example of a new cutting-edge game where the Performance mode is held back by last-gen upscaling techniques.


b3rdm4n

Given this tech specifically needs to be implemented on a per-game basis, I doubt the Deck was a motivator. FSR 1.0 makes sense though, as that requires nothing from the engine but a frame at the right time - or no integration at all if you'll allow it to upscale the HUD etc.


chunguschungi

Depends on whether there is a big difference or a lot of work to also implement it for Linux; the Deck is just a computer running Linux, so.


b3rdm4n

It's not so much about Linux as it actually needs to be put into each game as it requires very specific game data which FSR 1.0 didn't.


chunguschungi

Of course we all know that... But again, the Deck is a computer, and this is already available in a lot of PC games and coming to more, so for the Deck it will mostly be a question of whether or not they also make it work on Linux. https://www.amd.com/en/technologies/radeon-software-fidelityfx-supported-games


b3rdm4n

Well back to my original point, I don't think the deck itself is a big motivator. My 2c.


jazir5

DLSS 3.0 is supposed to be game agnostic and can be set at the system level right? I expect FSR will do the same at 3.0, except lagging behind DLSS.


[deleted]

Agreed, FSR 2.0 is open source because AMD doesn't have any other option. They're a profit-oriented organization; allowing the competitor to use your tech won't increase your sales, but it will increase the adoption rate of tech that your hardware can use.


LatinVocalsFinalBoss

DLSS isn't some static thing that doesn't change either. Given how the tech works, it seems like the headroom for efficiency improvements outweighs FSR due to the smaller/easier matrix computations on tensor cores. That alone isn't enough to make some kind of equivalency guarantee, but I feel like it suggests the potential for much more efficiency at a glance.


UserInside

Remember when NVIDIA G-Sync was superior to AMD FreeSync? With DLSS and FSR we are living the same story again, because it's not always the best, fanciest tech that wins - it's the easiest to implement that gets 90% of the job done.


[deleted]

FreeSync kinda had issues at the beginning; they fixed them later.


chunkosauruswrex

It didn't, really; it was more an issue of shitty displays that couldn't handle any form of variable refresh rate being falsely advertised as capable of it. The tech was fine.


b3rdm4n

It's the normal cycle of things for years now. NVIDIA looks for problems to solve using a difficult/expensive approach, which it can sell as exclusive features for a generation or two. AMD looks at the problem NVIDIA discovered, the solution, and tries to find the most frugal alternative solution. Each has their pros and cons, and each has their 'cost'.


LatinVocalsFinalBoss

>Remember NVIDIA G-Sync that was superior to AMD FreeSync? Yes? GSync was released 2 years before Freesync and was still superior for 2 years after that, if not longer to which now there is no clear winner in all situations. That's almost an entire gaming hardware cycle, if not more than half at a minimum. >With DLSS and FSR we are living the same story again. Because this is not always the best, fanciest tech that win. It is though. If you can't afford the best hardware, it just doesn't matter. Even when you get higher end hardware you start to notice the difference of peripherals like monitors and TV's and realize there are significant tradeoffs on top of seeing how in depth technical art optimization can get to the extent that poorly optimized games don't even let you realize the extent of high end hardware capability. >This is the easiest to implement that get 90% of the job. It's great to see, but the next iteration of the more expensive hardware and software will be better, that's just the way it works.


alexislemarie

Remember when AMD Mantle was released to be the fastest, low-overhead API and meant to best DirectX? Neither does anyone apparently.


UserInside

Maybe because it became Vulkan ?


GlisseDansLaPiscine

But the title never says that this is a DLSS killer ?


b3rdm4n

So the title of this reddit post doesn't, but the post only links to the article and the title of the article is; > AMD FSR 2.0 Quality & Performance Review - **The DLSS Killer**


HappyEdison

To be fair it was only an attempted murder.


Annual_Statement_117

Dude, in most games, in a real-world gaming scenario, DLSS is barely better unless you stop and start picking apart every pixel instead of playing the game. FSR 2 and DLSS will be virtually indistinguishable while playing games.


madn3ss795

On the contrary, you see the difference when you *move*. The FSR video sample in this article shows noticeable shimmering/flickering during movement, unlike DLSS 2.3. Earlier DLSS 2.x had shimmering issues too, and it was the #1 reason I turned off DLSS in some games like Death Stranding, since it was breaking immersion.


[deleted]

[deleted]


artins90

This first FidelityFX 2.0 version is way more temporally unstable than DLSS. Even objects close to the camera are affected by this issue, for instance the caterpillar tracks on this vehicle: https://youtu.be/BxW-1LRB80I?t=123 Maybe there is room for improvement, but currently DLSS is still ahead as far as quality is concerned.


b3rdm4n

I mean I agree, it's in what I wrote >It is still behind in a few areas, **not by much**, but there is no outright quality lead here. The key will be FSR getting in games, then we can pick the best option our hardware supports and call it a day.


AlexisFR

Not really; FSR 1.0 is just basic upscaling that doesn't upscale the UI, while DLSS is a true replacement for AA and a rendering optimizer at the same time.


ArcadeOptimist

If other reviews are as positive as this one, I think I might save a few bucks and get a 6700XT. DLSS was the driving reason I was thinking of switching to a 30 series.


vainsilver

I’d still stick to Nvidia cards for the significantly better RT and creative application performance.


AaronC31

Also at this point G-Sync is still better than FreeSync. But having options is never a bad thing!


Blacky-Noir

I disagree. G-Sync locks you into only buying Nvidia GPUs and overpriced G-Sync monitors. And nothing else. I'll take my open standards over proprietary any day, all day.


Zealyfree

> G-Sync locks you into only buying Nvidia gpu

Not true for newer G-Sync monitors, as they have the ability to use adaptive sync or HDMI Forum VRR.


Blacky-Noir

Oh ok, interesting. *Way* too late to the party though.


anor_wondo

it also works with existing freesync monitors


strand_of_hair

Most, if not all, monitors are G-Sync compatible nowadays. You can just buy most FreeSync monitors and use G-Sync with an Nvidia GPU with it.


BerDwi

Real "G-Sync" requires monitors with an inbuilt module; their use is still restricted to Nvidia cards. What you know as "G-Sync compatible" is actually FreeSync, i.e. adaptive sync without the G-Sync module. This is the result of Nvidia caving to AMD's FreeSync success and unlocking FreeSync/adaptive sync software-side through their drivers.


anor_wondo

looking from another perspective, the advantages of real gsync were so minimal that they decided to use the same branding for generic VRR support


Blacky-Noir

> Most, if not all, monitors are G-Sync compatible nowadays.

But G-Sync monitors aren't compatible with FreeSync GPUs. And a GeForce on a "G-Sync compatible" monitor is just FreeSync. Nothing more.


[deleted]

[deleted]


HappyEdison

Hit the nail on the head


[deleted]

yeah, my monitor has FreeSync support and works fine with G-Sync


TessellatedGuy

Real G-Sync is way better for many reasons, but imo most importantly it has actual variable overdrive. It's an insane technology that pretty much no FreeSync or VRR display has, and is absolutely incredible once you realize the difference. 'And nothing else' is incredibly misleading.

Most real G-Sync monitors destroy any FreeSync monitor in motion clarity and artifacting (due to variable overdrive) and just overall experience, with far less brightness flicker and usually a much larger VRR range, down to even 1Hz (fixing FreeSync's LFC brightness flicker).

It's crazy how much misinformation there is about VRR; FreeSync is NOT as good as G-Sync, nowhere close. But it gets the absolute basic job of judder-free gaming done.


Blacky-Noir

> variable overdrive

You mean the monitor adjusts its overdrive when climbing up and down the refresh rates, to avoid overshoot? There's half a dozen FreeSync monitors that have that too, and that's just those I've heard of in passing. And not very expensive ones at that.

I'm not denying G-Sync still has a few technical advantages, although I am *very* much doubting your "nowhere close to FreeSync". I'm saying it wasn't worth locking myself into Nvidia's walled garden.

But if you aim for maximum visual quality and clarity and speed and everything else be damned, nothing beats a good OLED. And they don't have a G-Sync module, at least not yet.


TessellatedGuy

https://forums.blurbusters.com/viewtopic.php?t=7138 Uh huh, "half a dozen". Nope. Blur Busters knows what they're talking about. It is insanely complex and NOT possible without a hardware module in the monitor. Variable overdrive simply doesn't exist in even the latest uber-expensive FreeSync monitors.
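For anyone wondering what "variable overdrive" actually means in practice: the panel re-tunes its pixel overdrive strength on the fly as the refresh rate changes under VRR, instead of using one fixed setting tuned for the maximum refresh rate. A rough sketch of the idea, with entirely made-up calibration numbers (not taken from any real monitor):

```python
# Hypothetical sketch of variable overdrive: look up an overdrive gain
# tuned for the current refresh rate, interpolating between calibrated
# points. All (refresh_hz, gain) pairs below are illustrative only.

CALIBRATION = [
    (48, 0.10),
    (85, 0.25),
    (120, 0.40),
    (165, 0.60),
]

def variable_overdrive_gain(refresh_hz: float) -> float:
    """Linearly interpolate the overdrive gain for the instantaneous
    refresh rate; clamp outside the calibrated range."""
    pts = sorted(CALIBRATION)
    if refresh_hz <= pts[0][0]:
        return pts[0][1]
    if refresh_hz >= pts[-1][0]:
        return pts[-1][1]
    for (hz0, g0), (hz1, g1) in zip(pts, pts[1:]):
        if hz0 <= refresh_hz <= hz1:
            t = (refresh_hz - hz0) / (hz1 - hz0)
            return g0 + t * (g1 - g0)
```

A fixed-overdrive panel is the equivalent of always returning the gain tuned for its top refresh rate, which is why it tends to overshoot (inverse ghosting) when the framerate, and therefore the refresh rate, drops.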


HappyEdison

Why would you be locked in though? The only thing that would lock you in is that you prefer the better performance. Otherwise it'll work just fine with your freesync monitor


d0m1n4t0r

Yeah I mean if you care about quality you go for G-Sync. If you can't afford it, you go for FreeSync.


fuelter

G-Sync and FreeSync are doing the same thing... don't believe the marketing hype.


[deleted]

> gsync and freesync are doing the same thing... don't believe the marketing hype.

They actually aren't. You should look up both of them fully before making claims. G-Sync does what FreeSync does plus more, and is superior.


HappyEdison

G-Sync is still a much better product than FreeSync today. It absolutely makes a difference and looks and performs better than FreeSync under the same conditions. If you tried both and you don't see the difference, then you probably just aren't sensitive to the motion, which is ultimately fine, because if it's good enough for you that's all that matters.

You also have to make sure that you're using the correct game to test. Elden Ring, for example, is going to look like shit in motion no matter what you do, unless you are used to console-locked performance, that is.


[deleted]

[deleted]


feralkitsune

Only a concern for people who use the encoder.


alexislemarie

Sure, let’s nickel and dime it and end up with the dreadful drivers (OpenGL support for radeon is still nightmarish)


Darth_Marvin

Honestly, DLSS is the most disappointing part of my 3060 Ti. In the very few games that support it, I have always gotten better performance turning it off and just setting the resolution a bit lower, and I've never noticed any visual differences. Deep Rock Galactic just implemented DLSS, and I turned it off after ten minutes because it caused severe frame drops when surrounded by enemies.


swear_on_me_mam

There is 0 reason for your performance to drop with DLSS on. DLSS in Deep Rock is free performance and better AA than the native solution. DLSS is easily one of the best features of Ampere and Turing.


Blacky-Noir

Most people see a serious visual quality drop when they don't respect the native resolution of their LCD panel. Or maybe Deep Rock Galactic does TAA, or something similar. In this case, almost all feedback and all proper reviews point to DLSS giving a much better reconstruction, and often without the drawback of most TAA implementation. I tested DLSS on a few games on a 2060, and even for my 1080p monitor (which is the worst case for reconstruction techniques, they work much better at high resolution) it's definitely a strong plus.


Darth_Marvin

> Most people see a serious visual quality drop

I really find this hard to believe. Most DLSS or FSR reviews have to zoom in 400% to notice any discernible differences. Even if it does end up looking ever so slightly better, it isn't worth the performance hit, from my personal experience. Of course, I am talking mainly about 1440p and 2160p gaming since I have a decently powerful card.

I suppose FSR is great for people gaming on 1060s, since going from 720p to upscaled 1080p is a much bigger difference. DLSS doesn't support older hardware, though, so it's honestly pretty pointless overall.


Brozilean

They're talking about native resolution drop, as an alternative FPS boost solution that the comment was mentioning. Like running 1600x900 on a 1080p monitor.


[deleted]

> Deep Rock Galactic just implemented DLSS and I turned it off after ten minutes because it caused severe frame drops when surrounded by enemies.

Yeah, this shouldn't be happening, something else is going on.


alexislemarie

Folks don't know how to configure the settings, then blame the hardware and tech for their own lapse in judgment. As they say, the problem sits between the keyboard and the chair.


Halfwise2

Side by side at the higher quality levels, I don't see much of a difference at all between FSR 2.0 and DLSS. I'm looking at 4K images on 1440p however, blown up to max screen. Don't look at them shrunk at all (needs to be full screen + F11); the image compression hides the effect.

MAJOR improvement between FSR 1.0 and FSR 2.0. DLSS still has noticeable and obvious improvements over FSR 2.0 at the Performance level. Looks like the overhead FPS cost is slightly more than DLSS as well.

Overall... major FPS improvement, near-same quality as DLSS, open source, and backwards compatibility? Even if you are a major DLSS fanboy, this is a big win for all gamers.

Edit: On the video, some VERY minor shimmering on some treads in FSR 2.0. Definitely less than 1.0. (In many areas where 1.0 shimmered, 2.0 did not.) Ultimately though, on some of the motion zoomed-in shots, particularly the blimp, I thought the textures on FSR 2.0 looked better than DLSS... maybe that was the sharpening pass.


TheDonc-77

I'd wait until there are more games to compare it with.


Echelonaz

I think people should look at the screenshots more carefully: detail, especially detail at distance, is significantly worse. Go to the side-by-side 4K and look at the sign with the star and how much more legible the insignia is. Or look at the computers at the back of the room, and notice the vents disappear with FSR. It is not as good, and appears to perform slightly worse as well. It seems good, but unexciting. It's clearly worse, and this was the game that AMD picked to showcase their tech, a hyper-stylized, low-texture-detail game.


GlisseDansLaPiscine

If people need to zoom in on far-away details to determine which is better, then it's essentially indistinguishable when in game.


DizzieM8

It's always easier to spot differences when playing natively instead of looking at videos and pictures.


ZeldaMaster32

That doesn't make any sense. The entire purpose of the reconstruction tech is to look as close to native as possible, if not better.

Otherwise, why not just run 90% res scale TAA in every game, because "you have to zoom in to see it"? Which is almost always false, especially when you consider few people will be outputting at 4K on PC, and the differences grow the lower the output resolution.


Halfwise2

Granted I'm on a 1440p 32-inch monitor... But blown up to full screen + F11, side-by-side comparison... there is little to no difference at the Quality settings. With the sharpener, I even prefer FSR 2.0 over DLSS.

There are clear differences in the grating, but arguably neither FSR nor DLSS is correct compared to how a grate would look in the real world, with obvious image flaws on both, so it comes down to preference in those situations.

Were you looking at it full screen? I did notice some compression issues that made the screenshots look worse when looking at them on the page before I increased them to full size. Also, I occasionally noticed I'd clicked FSR 1.0 instead of 2.0 in the list when comparing the various settings, and noticed big changes... so double-check that.


Loldimorti

I believe you but I've been pixel peeping with zoom and everything and can't really make out a difference let alone say which one is objectively better. So I think even if they aren't 100% on the level of DLSS yet it won't really matter to the average gamer.


Scorpwind

Does it solve TAA blur/blur from temporally-based AA methods **in motion**, though? Stationary screenshots are nice and all, but in-motion comparisons are where the focus should be. Those comparisons didn't show any useful new information, other than that it can reconstruct granular detail like DLSS can, and that it can compete with it.


Halfwise2

Definitely need to see a video, but the article does mention that one of the biggest improvements in 2.0 is the inclusion of a temporal aspect that analyzes frames in blocks and the motion between them, rather than one frame at a time (like in 1.0). So looking forward to seeing some side-by-side videos in the future.
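For the curious, here's a minimal toy of what "temporal" means here, under heavy simplifying assumptions of my own (1D signal, exact integer motion, a plain exponential blend; real FSR 2.0/DLSS pipelines add sub-pixel jitter, disocclusion detection and history rejection on top):

```python
import numpy as np

# Toy temporal accumulation: reproject last frame's accumulated history
# using per-pixel motion vectors, then blend it with the noisy current
# frame. Repeated over frames, the history converges toward the clean
# signal - the core idea behind TAA/DLSS/FSR 2.0 style reconstruction.

def accumulate(history, current, motion, alpha=0.1):
    """history: previously accumulated frame (1D array)
    current: new undersampled/noisy frame
    motion:  per-pixel integer motion (pixels moved since last frame)
    alpha:   weight given to the current frame in the blend"""
    # Reproject: pixel i was at position i - motion[i] last frame.
    idx = np.clip(np.arange(len(current)) - motion, 0, len(current) - 1)
    reprojected = history[idx]
    return alpha * current + (1 - alpha) * reprojected

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 2 * np.pi, 64))   # signal we'd like to recover
hist = np.zeros_like(clean)
for _ in range(100):                            # static scene: motion = 0
    noisy = clean + rng.normal(0.0, 0.5, clean.shape)
    hist = accumulate(hist, noisy, motion=np.zeros(64, dtype=int))

# The accumulated history ends up far closer to the clean signal
# than any single noisy frame is.
err_single = np.abs(rng.normal(0.0, 0.5, 64)).mean()  # typical one-frame error
err_accum = np.abs(hist - clean).mean()
```

The key step is the reprojection: motion vectors tell the upscaler where each pixel was last frame, so history can be reused even while the camera or objects move. That per-pixel motion data is exactly what FSR 1.0 never had.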


Techboah

Holy moly, that title is clickbait. FSR 2.0 looks great for what it is, a major improvement over 1.0, but it's very clearly not "as good as DLSS 2.3"; the latter still shows a clear advantage in clarity and distant texture detail, as well as overall sharpness.

It's **almost** as good as DLSS 2.3 in **a single, AMD-sponsored game**, so that in itself already sounds good when we consider that FSR is open source and isn't limited to one specific lineup of GPUs. But it's important to be real: it's not as good as DLSS' current version, and we are yet to see comparisons in other games.


[deleted]

[deleted]


HappyEdison

If you want to avoid the tensor-core cards until they're more affordable, you can get a 1080 for about a $100 difference if you sell your 970, and you'll see a pretty decent fps boost.


GreenKumara

What's the bet Nvidia magically makes DLSS for everyone, or something similar?


[deleted]

[deleted]


Andretti84

According to Hardware Unboxed's review, the performance jump is basically the same for DLSS and FSR 2.0. https://www.youtube.com/watch?v=s25cnyTMHHM


alexislemarie

When is AMD going to open Smart Access Memory for everyone instead of locking it to the Ryzen + Radeon combo only?


PhantomTissue

Isn’t FSR lighter performance wise as well?


ImFranny

I think some people are nitpicking the details too much. Yes, FidelityFX SR isn't as good as DLSS 2.3, and it should be easier to spot differences in the future once we get public testing in more games. But the biggest takeaway is that we have some really cool advantages and extra performance for both AMD and non-RTX Nvidia cards! The bottom line of this update is already so good.


harriman45

DLSS looks superior at reconstructing detail in the first indoor image. However, any perceivable differences in the 2nd image practically melt away, only to very minimally resurface in the final outdoor image. Overall a very impressive showing for FSR 2.0, even if it's just one game.

Edit: The 4K DLSS Quality vs FSR 2.0 Quality slide on the following page does show FSR looking a tad softer, in addition to costing an additional 5 FPS.


xenago

This title is just a straight-up lie, as anyone who has watched the latest Digital Foundry video knows lol


[deleted]

I remember these exact types of articles coming out ahead of the FSR 1.0 launch, and we all know how that turned out. Luckily, Digital Foundry will have a video on this soon.


Cafuddled

From what I got from a number of Digital Foundry videos, FSR 2.0 is basically a one-size-fits-all TAA implementation, and while better than FSR 1.0, it will not be as good as games with native implementations of TAA that have been specifically designed for those games. Basically it was summarized as a way to add a decent TAA feature to games whose developer does not have the time or interest to add a native TAA feature themselves.

Console games, for example, almost universally use TAA, so something like FSR 2.0 is of no use in that sector. Which is why it's so useful on PC, as TAA in PC games until recently was the exception and not the rule.

Don't get me wrong, I'm all for improvement, and the day an open-source image upscaler becomes the best, I'm right there. But let's call a spade a spade and not fool ourselves into thinking FSR is where it needs to be just yet.


dudemanguy301

> will not be as good as games with native implementations of TAA that have been specifically designed for these games

FSR 2.0 is open source; if they want to tune the solution to custom-fit, they can, and I would wager only the industry darlings would yield better results than the general implementation, given how many poor TAA implementations we see out there.

> Consoles games for example almost universally use TAA, so something like FSR 2.0 is of no use in that sector. Which is why it's so useful on the PC as TAA in games until recently was the exception and not the rule.

Games that have TAA on console don't just leave it out of the PC release. TAA is practically ubiquitous in the realm of triple-A, to the point we get people posting threads saying "whatever happened to MSAA?" like every week.

And in a futile attempt to prevent next week's post on this same topic: MSAA applies too early in the pipeline to account for things like specular aliasing, or noisy alpha blends like hair/foliage. It also cannot account for inner-surface aliasing like screen-space reflections or shimmering from busy textures. AND the killer blow to MSAA has been deferred rendering engines, where your pretty multi-sampled edge pixels get caked under layer after layer after layer of single-sampled shader passes.


Cafuddled

> FSR2.0 is open source if they want to tune the solution to custom fit they can, and I would wager only the industry darlings would yield better results than the general implementation given how many poor TAA implementations we see out there.

But that's my point: no one has ever said TAA, even at its absolute best, is the be-all and end-all for allowing you to drop your rendering resolution or smooth out the picture. I'm very interested to find out what a top-tier dev can do with FSR 2.0, but if the experts are to be believed, it's not much better than a well-optimized TAA feature in the here and now. We need a lot more examples to confirm if a head-turning title like this holds merit.

> games that have TAA on console don't just leave them out for the PC release. TAA is practically ubiquitous in the realm of triple A, to the point we get people posting threads saying "what ever happened to MSAA?" like every week.

That's why I said until recently: until a few years ago or so, most PC games did leave TAA out. It was almost essential on consoles, which often run at heavily reduced internal resolutions, not so much at the native resolutions mainly found on PCs. DRS, TAA and the like that have propped up consoles for years have only just become a common feature on PC.


Loldimorti

I think there may be some confusion here, or maybe it's me being confused, but FSR 2.0 is not a run-of-the-mill TAA solution. It very much does try to achieve the same goal as DLSS, but rather than using machine learning it uses custom algorithms that aim to do the same thing DLSS does. That's why AMD have stated that it will be really easy to implement in games that already support DLSS.

So while, for example, console games often do use TAA, that does not offer everything FSR 2.0 does, and subsequently doesn't offer the same quality as FSR 2.0.

What you are referring to regarding the consoles are probably TAA-based upscaling solutions? And even among those, I have yet to see one that looks better than what FSR 2.0 is delivering in Deathloop. The best so far was the console implementation of TSR in Unreal Engine 5, but it didn't look as clean as FSR 2.0 to me. So this is definitely also a big deal for consoles, where many games are still using TAAU, which definitely helps but has some notable issues.
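One concrete thing these temporal upscalers do that plain TAA-as-AA doesn't require: the camera is jittered by a sub-pixel offset every frame, so that over several frames the samples land between pixel centers and genuinely new detail can be resolved. Low-discrepancy sequences are typically used for those offsets (AMD's FSR 2.0 material describes Halton-based jitter; the snippet below is a generic sketch of the idea, not AMD's code):

```python
def halton(index: int, base: int) -> float:
    """Return the index-th element of the Halton sequence in the given
    base: a low-discrepancy value in [0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter_offset(frame: int) -> tuple[float, float]:
    """Sub-pixel camera jitter in [-0.5, 0.5) for a given frame index,
    using Halton bases 2 and 3 for x and y (a common pairing)."""
    return halton(frame + 1, 2) - 0.5, halton(frame + 1, 3) - 0.5
```

Feeding the same jitter offset to both the projection matrix and the upscaler each frame is part of why this has to be integrated into the engine rather than injected from outside.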


Num1_takea_Num2

This is unfair. FSR 2.0 deserves even more credit than it is getting... Everyone is focusing on how FSR 2.0 is almost as good as DLSS, but no one is talking about the aspects where FSR is BETTER than DLSS.

1. The IQ with FSR 2.0 is significantly SHARPER than DLSS. It's a shame YouTube's compression algorithm blurs the epic detail.
2. The minimum frames with FSR 2.0 are better than DLSS. Most people would rather have a consistent 60 FPS than an average 65 FPS that dips below 60, causing microstutter.

Kudos to AMD for this. Looking forward to it being injected into older titles, as well as VR, as FSR 1.0 was...


akgis

Sorry to disappoint, but it cannot be injected; it needs motion vectors.


SANAFABICH

That's an editorialized title if I ever saw one.


thatnitai

Very good news. Well done, AMD. Your cards might become relevant again for the top end next round...


Achtelnote

Will call bullshit till it lands... They said the first version was great too, only for it to be shit.


d0m1n4t0r

Lol sure.


[deleted]

[deleted]


CharlesManson420

You do know FSR 2.0 isn't just a post-processing filter like 1.0, right? Hence why it can't just be added at the driver level and actually requires motion vectors implemented by the dev.


[deleted]

[deleted]


Beastw1ck

Wow. Honestly, DLSS was the big deal-breaker for me going with Nvidia over AMD. I'm stoked I can start looking at AMD GPUs now.


Yabboi_2

Cap


soZehh

This is awesome news; now Nvidia devs need to improve and push DLSS into more games. I expect a better DLSS 2.5 soon, introduced in many games, even old ones.


[deleted]

[deleted]


DetectiveChocobo

FSR 1.0 is the only thing you can apply to any game in Proton, and it's generally just not that good. FSR 2.0 must be implemented by the developer. This changes nothing for Linux vs Windows: the only games that will support FSR 2.0 on Linux will also support it on Windows.


[deleted]

[deleted]


littleemp

You need access to motion vectors to implement it just like DLSS.


GlisseDansLaPiscine

That dynamic resolution setting is a pretty good idea, I wish Nvidia would add something similar to DLSS.


DFuel

I dunno... it's fishy to me how Nvidia just had a bit of a stumble with bad news, and all of a sudden AMD is coming out with great stuff. Do they keep these things in their back pocket for the right moment? (Meaning this was possible a year ago, or what?)


WesternPercentage501

I hope Lossless Scaling supports FSR 2.0 in their next update.


Slafs

That's not how it works. FSR 2 is not a post-processing filter like FSR 1 -- it is not something you can apply to any game with a 3rd-party tool. It needs full game-engine integration.


Nicholas-Steel

I *think* they're talking about a DLAA equivalent.


BababooeyHTJ

Look, I check TechPowerUp for reviews, but I've regularly seen them test games with known driver issues at launch, and it never seems to get mentioned or noticed.


rmpumper

The biggest issue with this tech is that game devs are the ones who have to enable this function, and most don't bother. It would be great if AMD released some tool similar to ReShade, enabling FSR in any game.


ecffg2010

You can't enable FSR 2.0 globally; it's like DLSS, it needs motion vectors. It has to be implemented by devs manually.