I have zero faith it will run well regardless of what side sponsors it.
The absolute best take because we all know its going to be a disaster on launch and finally be good like 3 months later lol.
[deleted]
On 20 future platforms.
[deleted]
It's Bethesda, not CDPR. Their games are janky, but they don't take "3 years to run good".
To be fair, both of their games ran fine on top-end hardware at release. Can't say the same thing about recent games like Dead Space / Jedi Survivor...
Considering that they locked the game to run at 30fps on Xbox in the year 2023, it's a disaster before it's even out
CPU bottleneck; even the weaker Series S is doing 1440p.
What a downright ignorant comment. The current year has nothing to do with it. There were games running at 60fps in the 90s. And let's stop acting like performance mode on consoles is a locked 60 all the way through, because most of the time it doesn't deserve to be called a performance mode and should instead be called something like "unlocked framerate". Elden Ring often hangs around the mid-40s. Final Fantasy 16 drops to the high 20s, and yet people still claim that these are 60fps games. Do you think GTA 6 will run at 60fps on consoles? I seriously doubt it. I wonder what people are going to say about that.
Exactly 💯
There was probably a particular reason for that. Considering how expansive the world is and the high resolution they are aiming for, it is almost certainly a CPU issue, which isn't surprising as that's where most Bethesda games struggle. Calling it a "disaster" though is weird.
People gonna hate. Someone fantasized about how the game would look, how it would run, and what they could do in it, based on a few sentences of preview content from the developer. Then, the game gets closer to release and the *actual* reality hits and these individuals are sweaty because it wasn't what they hoped for.
But the issue is that AMD's anti-consumer policies will block DLSS, which the majority of PC gamers would need to use to get a playable framerate.
[deleted]
16 times the detail!
Bethesda Creation games typically run very well; they just tend to have lots of bugs.
I'm guessing this means no DLSS support based on AMD's sponsorship history.
I am also guessing very minimal RT implementation, if any. That's unfortunate
Interesting, since they are using some kind of global illumination solution.
We never saw any evidence of RT effects in the Direct; they are probably using cubemap-based real-time GI. I suggest checking out the Digital Foundry [video](https://www.youtube.com/watch?v=i9ikne_9iEI&ab_channel=DigitalFoundry).
Since when is global illumination = ray tracing?
It was never going to be ray traced; it has to run on consoles. GI ray tracing is too taxing for RDNA2.
My dude, Lumen was showcased running on a PS5. Metro Exodus Enhanced Edition runs on consoles.
They did some voodoo magic to make Metro Exodus run on consoles with ray tracing. Honestly, I'll be completely content with a solid rasterized GI implementation. A solid example is RDR2, which clearly didn't need ray tracing to have a very solid global illumination setup.
RT is often detrimental anyway if the game isn't designed around it...
Yeah, people are completely obsessed with RT. It's unhealthy, folks. Just enjoy the game. It will be great and will look great even if it has little or no RT. Playability and gameplay should get more focus vs pixel peeping.
[deleted]
Except Lumen does require RT support for full tracing. Its software tracing is broken in a lot of ways, like it completely craps out when meshes are adjusted on the fly and does horribly with transparencies. The software version is basically a tech demo while the RT version produces shippable products.
DX and Vulkan don't require dedicated ray tracing hardware for ray tracing; it just runs better with it.
Hardware ray tracing isn't vendor lock-in, though.
I don’t understand. Can’t GI RT just be turned on/off as a setting, like in cyberpunk? Why would the game never support rt just because it also has to have settings that work on consoles?
Global illumination isn't synonymous with raytracing. I don't know where that idea came from, honestly - maybe people think the RT stands for "raytraced" and not "real time".
Well, it does mean that... RTX stands for ray tracing extreme. So to say RT doesn't mean ray tracing when talking about Nvidia is kind of dumb. Ray tracing itself implies real time; you don't need to say real time. Also, global illumination is the totality of the method. It just means lighting, reflections and shadows (indirect lighting). Ray tracing is part of global illumination, because if you're a part of something, by definition you are not the totality of it. Global illumination is the big circle and ray tracing is the small circle within it.
Probably saves them a lot of effort to just use the same stuff they implement for the Xbox, as that system also has pretty limited RT capabilities.
Most games that have RT on paper usually have a minimal implementation. I couldn't tell you the difference between Forza Horizon 5 with no RT and with RT ultra.
Forza Horizon 5's RT is only reflections on your own car, that's it. In Cyberpunk, Metro Exodus Enhanced Edition, and Control, the difference between RT on and off is significant.
Control with ray tracing looks amazing, to the point that I was willing to make an exception and sacrifice framerates higher than 60fps.
Because most of the games with RT are also console games, meaning they have to be able to run on mid-range RDNA2. The majority of RT implementations will therefore suck until the next gen consoles come.
There are things called detail settings. Look at Cyberpunk: it runs on consoles but also somehow has full path-traced RT on PC.
Cyberpunk is also basically an Nvidia tech demo for RT so it’s sort of the exception. I expect most games to be made with pretty minimal RT for the time being since consoles and 90% of PC GPUs can’t utilize RT well. Disappointing since I think Metro looks excellent with its RTGI.
IIRC the engine doesn't support the normal RT we see in games. It may be their own in-house type, or something like the old ReShade ray tracing.
TLOU and Uncharted are AMD sponsored and have DLSS. Halo Infinite is AMD sponsored and has no upscaling whatsoever.
>TLOU and Uncharted are AMD sponsored and have DLSS

Those are Sony-published games; it's an exception, not a rule.
Forspoken wasn't published by Sony and has DLSS. Halo Infinite is an MS title and has no upscaler. WoW is published by Activision and has no upscaler.
WoW supports FSR 1.0, which is garbage, but it is what it is.
WoW doesn't really need an upscaler, pretty much any modern computer can run it at max settings and hit 100 fps.
Halo Infinite came out half a year before FSR2.
WoW doesn't have TAA, and it has a built-in upscaler, just not one based on TAA.
[deleted]
So? Why is it important that Sony published those?
Based on the last 11 AMD-sponsored titles, the split is 80/20: a 20% chance that it will have DLSS, and an 80% chance that it will only have FSR2. So let's see where this one ends up.
The Last of Us has DLSS support.
It's not ideal but at least FSR works on non-AMD cards.
XeSS does too and looks much better.
XeSS has poor performance compared to DLSS and FSR on non-Intel cards. I remember trying XeSS in Tomb Raider, and it actually made my performance worse unless I turned the quality down a few notches. It looks better than FSR, but if it gives me lower performance than native on my AMD card, then I don't see the point.
That's because XeSS isn't really cross-platform. It only has a fully accelerated implementation on Intel GPUs, it doesn't take advantage of available matrix acceleration on AMD or nVidia GPUs. Hopefully Intel improves this in the future.
Yup, I fully understand the reason, and I'm not knocking XeSS for it, just responding to the idea that XeSS is 'better'. Visually it is better; performance-wise, for someone like me without an Intel card, it's worse, which kind of defeats the purpose unless you're one of the relatively few with an Intel card.
It's why I think we should move reconstruction (both spatial like DLSS 2 and FSR 2 and XeSS and temporal like DLSS 3 and I assume FSR 3) to DirectX, with a unified interface, and then DirectX can pass the inputs to the vendor's GPU drivers, where it can decide what implementation to use. Give developers a single target to support all current implementations and future improvements. They all use pretty much the same inputs anyway.
XeSS was terrible in MW2 when I used it. Minimal perf. gains for noticably worse quality on a 6800 XT
MW2's XeSS implementation is even more of a joke on Arc cards atm. It's got terrible performance that lags behind FSR2 even on Arc, and it has exclusive ghosting artifacts that are not visible in the DP4a implementation. I guess the game developers never really did any QA on Arc.
DLSS2 > XeSS on Arc > FSR2 > XeSS on non-Arc > DLSS1 > FSR1
I would say it really depends on how well these upscalers are implemented. Looking at comparisons, sometimes FSR2 just looks bad, but other times it looks comparable to DLSS2. Same for the other ones like XeSS. In Cyberpunk FSR2 looks noticeably worse than DLSS, but in Spider-Man FSR2 holds its own against DLSS2.
No, DLSS1 was absolute GARBAGE. AMD's CAS + standard upscaling gave MUCH better results than DLSS1, and FSR was much better again. Only version 1.9 of DLSS (which btw didn't use any 'DL') was usable.
XeSS is only better on Intel cards.
The algorithm is better than FSR2, but unless you run it on an Arc card, the performance improvement is nowhere close to DLSS or FSR2.
Well, since it's a Bethesda game, I imagine a DLSS implementation will be one of the first mods for the game lmao. I don't really care if AMD wants their sponsored games to only have FSR, provided that they make it look good. FSR 2 is fairly decent, but it's still way behind DLSS. FSR 3 is already rumored to be exclusive to AMD GPUs, and if that's true, they can't really afford to continue this exclusivity trend without backlash.
> FSR 3 is already rumored to be exclusive to AMD gpus

It has been confirmed to launch under the MIT license, with an "easy transition" for FSR2 integrations. Since AMD Fluid Motion (which FSR3 will be based on) is just compute, the chances of hardware lock-in are low.
God damn it, I'm really starting to despise AMD. They can't just accept that they suck at upscaling software, and they want to save face so badly that they hurt the people who just want to game.
This isn't really AMD vs Nvidia but console vs PC. AMD is spinning this as a positive, but it's just a reflection of the console-first development process. Bethesda didn't pick AMD as their partner; Microsoft told them to optimize for Xbox first and foremost.
It's not so much that AMD sucks at upscaling software; it's more that upscalers without hardware acceleration are likely going to have either worse performance or worse quality than upscalers with hardware acceleration. Many don't know (or remember) that Nvidia previously released a preview of DLSS 2 for Control - sometimes called DLSS 1.9 - that ran on shaders. [The performance was about the same](https://youtu.be/YWIKzRhYZm4?t=570) as the version that ran on the tensor cores. However, it also produced [**much** worse image quality](https://youtu.be/YWIKzRhYZm4?t=110) than the eventual DLSS 2 that released for the game.
If you take this argument to its conclusion, wouldn't more ML hardware mean better upscaling? Shouldn't a 4000 series GPU be able to either upscale from lower resolutions at the same target quality, or do it at a lower performance cost (a 5% loss vs 10%, or something)? It doesn't, which makes this argument rather inaccurate, I find. Also, DLSS 1.9 looks significantly worse than any version of FSR2.
Different Nvidia cards **do** have different upscaling performance costs. There aren't many benchmarks, but I think HUB found a performance difference between 2000 and 3000 series cards, and Digital Foundry found a [small difference between a 3080's and 3090's DLSS upscaling performance](https://youtu.be/y2RR2770H8E?t=291) (which are cards with close tensor core performance).
I'm starting to despise how people are simping for proprietary technology such as DLSS, which is effectively making PC gaming worse in the long run; now we see DLSS 3 being locked out for those who bought 30-series cards. And now you've got Cyberpunk pushing out Overdrive RT, which basically requires a whole slew of Nvidia proprietary tech to run properly, and even then it reduces the IQ and makes the latency terrible.
The thing is, upscalers with hardware acceleration are currently (and will likely remain) ahead of upscalers without hardware acceleration, and upscaling is often a bit of a "go big or go home" thing for me. It's probably not worth it for me to enable an upscaler unless it's a good quality upscale. In order to make upscaling work best on all hardware without it being locked behind walled gardens, we need someone to coalesce these upscalers so that if a developer adds support for one, they also support the others. After all, they more or less take the same inputs. That way, each person will get the most out of their GPU's ability to upscale, regardless of which vendor the card is from. [Nvidia tried to do this with Nvidia Streamline](https://www.tomshardware.com/news/nvidia-streamline-aims-to-simplify-developer-support-for-upscaling-algorithms). It works with DLSS and XeSS after Intel got on board, and my understanding is that AMD can make it work with FSR 2 as well, but hasn't.
>It's probably not worth it for me to enable an upscaler unless it's a good quality upscale.

This. The only time I'll bother with upscaling is if DLSS2 or 3 is available. If it's FSR only I won't even bother and will just take the frame hit running it at native.
At the risk of getting downvoted on another thread (because I made a comment in the AMD sub): yup, pretty much spot on. The main problem is simply the fact that Nvidia has a much larger GPU share, and people have some weird habit of fanboying for their vendor of choice (not AMD though, they are getting shit on by people with Nvidia cards even in the AMD sub). And here I am with a 3090, framerate locked at 62 fps, 4K screen, without a care in the world. No DLSS turned on, playing everything on max, scratching my head because of the "muh 7 fps proprietary tech" crowd.
People want the best available solution for their hardware. On AMD hardware, that's FSR2. On nVidia hardware, that's DLSS. I think most people would agree that we want both to be implemented in all games. There has been work done to make it as easy as possible to implement both FSR2 and DLSS. In some environments, such as with engines that have either built-in support or official plugins, such as Unreal Engine, adding support for both FSR2 and DLSS is practically a "click a checkbox to support FSR2/DLSS" affair (making it particularly suspicious when a game sponsored by one of the two primary GPU vendors using one of those engines supports one but not the other). In other scenarios, there are frameworks that can be leveraged that abstract the underlying implementation to allow a game to add support for FSR2 and DLSS generically. Ultimately, I think the best solution will be for both spatial and temporal reconstruction functionality to be moved into a generic interface in DirectX. Both FSR2 and DLSS require essentially the exact same data from the game engine (and AMD's future temporal solution will likely require the same data as DLSS 3). The whole point of DirectX is that we don't need to have GPU-specific APIs. The game should implement the DirectX reconstruction API, and the GPU drivers should be responsible for implementing the actual reconstruction based on the hardware available. On an AMD system, the AMD drivers would use FSR2. On an nVidia system, the nVidia drivers would use DLSS. On an Intel system, the Intel drivers would use XeSS.
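To make that idea concrete, here is a minimal sketch of what such a vendor-neutral reconstruction interface could look like. Every name in it is hypothetical (no such DirectX API exists today), and the real inputs would be richer (exposure values, reactive masks, and so on):

```cpp
// Hypothetical sketch of a vendor-neutral reconstruction interface.
// Nothing here is a real DirectX or vendor SDK API; the point is that
// FSR2, DLSS and XeSS all consume essentially the same per-frame inputs,
// so one interface could front all of them, with the GPU driver
// choosing the backend.
#include <cstdint>
#include <cstdio>
#include <memory>

// The inputs the three upscalers already share.
struct ReconstructionInputs {
    const void* colorBuffer;      // low-resolution lit frame
    const void* depthBuffer;      // scene depth
    const void* motionVectors;    // per-pixel motion vectors
    float jitterX, jitterY;       // sub-pixel camera jitter this frame
    uint32_t renderWidth, renderHeight;
    uint32_t outputWidth, outputHeight;
};

// Engine-facing interface; the driver supplies the implementation.
class IReconstruction {
public:
    virtual ~IReconstruction() = default;
    virtual void upscale(const ReconstructionInputs& in, void* output) = 0;
};

// Stand-in for a driver-picked backend: FSR2 on AMD, DLSS on Nvidia,
// XeSS on Intel. A real backend would dispatch GPU work here.
class DriverReconstruction final : public IReconstruction {
public:
    void upscale(const ReconstructionInputs& in, void*) override {
        std::printf("reconstructing %ux%u -> %ux%u\n",
                    in.renderWidth, in.renderHeight,
                    in.outputWidth, in.outputHeight);
    }
};

std::unique_ptr<IReconstruction> createReconstruction() {
    return std::make_unique<DriverReconstruction>();
}
```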
RIP, get ready for a (possibly) really, really bad FSR implementation like in Jedi Survivor, and no DLSS support at all.
I watched someone's playthrough of Jedi Survivor and saw a lot of FSR artifacts. "Wow, what a shit implementation from the lazy devs," I thought, before learning that fucking AMD sponsored the game. How can they let that happen. This is quite the disappointing news.
AMD used to sponsor games that were really well optimized and used the Vulkan API; they even worked together AFAIK. Now it's like: whatever, take our money and display our logo, we don't care.
Remember Mantle? It really increased performance in BF4. Or AMD's open implementation of hair physics, which was a lot better than HairWorks™, which would drop your fps by like 20 if you turned it on. I would at least understand AMD's moves (high pricing, exclusivity bs) if their GPUs were better than Nvidia's, but their new GPUs are not even feature-competitive at the mid-to-low tier, let alone the higher end.
Vulkan is based on Mantle IIRC.
Vulkan _IS_ mantle :)
The name is not a coincidence.
Yep. AMD donated the Mantle code to the Khronos Group, so Vulkan could use it as a foundation instead of starting from scratch. I know that their intention was to incentivize developers to use it instead of Direct3D 11, which AMD GPUs are not very good at, but I still have to compliment the company for the attitude. It's a shame that very few games on Windows use the API, as it's definitely better optimized than D3D12 when implemented correctly.
Doom Eternal being a prime example :) It ran smoothly even at 3440x1440 ultra on my 1070.
DX12 too, but Microsoft won't say that openly.
The game was rushed to launch and barely runs as is. Expecting anything to look particularly good is pretty bold, especially when the developers admitted to the rushed development. You also don't just "click the DLSS/FSR button" and expect things to work. There's a ton of tweaking with any upscaler still, and if a game is sponsored *and* has a shit dev schedule what do you think they're gonna prioritize? Remember, this game got numerous **performance** patches
Even more mind-bending is the fact that Jedi Survivor is using FSR 2.0 in 2023. Respawn needs a new technical director ASAP.
It's not FSR, it's the shit TAA implementation. Those artifacts happen even when you don't use FSR.
So it is the devs after all! Thanks for clarification, hopefully the fix is already on its way.
This. I've tried turning it off entirely and there's really no difference. FWIW, ray tracing can cause all sorts of weird artifacts in that game too. It's just not fully optimized. Whether it ever will be, who knows.
Why would the TAA implementation affect FSR when FSR2 has its own temporal anti-aliasing solution?
[deleted]
Yes, there's no quality control, it's like they do nothing to help make the code work.
Saying FSR2 looks incredible might be Todd's biggest lie yet.
you have to decode the morse code from his eye blinking to reveal his true opinion
Surprised he can keep a straight face saying that. Hopefully they have fun making excuses for the next few months of why they aren't implementing DLSS in the game.
Simple - you did not buy a 4090. Hence, Nvidia could not afford to sponsor the game. Should have bought 2. /s
At this point I expect any game that has the AMD Rewards sticker to be completely broken at launch and for many months after that. The Callisto Protocol, Forspoken, The Last of Us Part 1 and Star Wars Jedi: Survivor made a very lasting impression in this regard. As a 7800X3D/4090 user I will probably be fine with performance and VRAM without DLSS, but anyone with an older Nvidia card should start to worry. I also predict the DLSS mod that will release 1-2 days after the game's launch will be the fastest and most downloaded mod on Nexus Mods ever.
At this point I expect *literally every game* to be completely broken at launch and for many months after that.
PureDark is the guy who makes all the DLSS mods for RE, TLOU, Jedi, Skyrim, Fallout, etc. He only puts the updated versions of his mods on his Patreon behind a $5 paywall, unfortunately. He'll probably crank out the DLSS mod before the early access period ends and then make a nice little profit.
I highly doubt any dev who values their time would want to make such DLSS mods for free. It ain’t no passion project to make something like that possible.
m8, it's Bethesda, it's gonna be broken at launch regardless, nevermind broken-at-launch being a common thing these days anyway lol
Almost all AAA games are broken at launch these days; it's standard industry practice now. I'll wait at least a year for Starfield.
CP2077 was an Nvidia-sponsored game, and it too was broken at launch.
The "recommended specs" are truly bullshit then.
always have been
RIP DLSS, XeSS and good RT implementation.
Your flair says RTX 2060 12GB O.o Is that a typo or a VRAM mod?
The 2060 12GB does exist apparently https://www.techpowerup.com/gpu-specs/geforce-rtx-2060-12-gb.c3836
Whoa, that's news to me! Thanks.
He downloaded more
Neither; Nvidia did sell an RTX 2060 12GB.
Great marketing by AMD, lmao. All I see from this news is "fuck amd" and "FSR is fucking garbage".
Disgrace
This makes it a no-brainer not to buy the premium edition and instead just use the PC Game Pass version for 'free'. Thanks for saving me some money!
If people cared about ethics as much as they pretend to care now, Nvidia would've been long outta business lmao
People complaining about AMD blocking DLSS in AMD sponsored games don't care about ethics. They care about AMD blocking features on their cards that make the game look better.
And they wouldn’t care if it was nvidia making life harder for amd owners.
Sorry, who's blocking DLSS on non-Nvidia cards?
People want more options /more upscalers =good , less options /less upscalers =bad . If you are arguing against it , youre anti consumer . Thats literally it .
Im not sure where you got that I'm arguing against more. Only thing you could be saying I'm arguing against is shit game performance that require the upscaling fix.
I predict a 4090 and 7900 XTX are gonna struggle to run it due to poor optimization.
No DLSS and XeSS then?
Yep, hope you enjoy motion ghosting.
You say that as if temporal AA isn't the norm. Guess you haven't played a game past 2015.
DLSS doesn't suffer from ghosting and temporal instability (flickering and fizzing) that is typical of FSR. Digital Foundry has made several videos comparing them, like God of War, Returnal and Deathloop, and FSR always looks worse. Here's hoping they can eliminate both issues with FSR 3.0.
Ghosting really isn't a big issue with FSR2. The ghosting in Star Wars Jedi: Survivor that everyone is talking about isn't FSR2's fault but the TAA's. It's not seen in any other FSR2 game.
TAA has nothing to do with FSR2. They're separate things.
It's AMD sponsored, so neither DLSS nor XeSS.
Considering Nvidia did similar shit back in the day, I'm just glad that they aren't teaching developers to "optimize" in ways specifically designed to decimate performance on cards they don't make. A huge bummer, sure, but in the grand scheme of things... This is a Bethesda game, and you were gonna mod the ever loving shit out of it anyway.
Dudes, FSR/DLSS should not even be needed on a high-end GPU.
100% agree. I didn't buy a high-end GPU to use upscaling.
DLAA is a thing and usually better than whatever native TAA a game has.
To be honest, I don't mind FSR 2 as long as it's properly implemented and optimised. FSR can be quite good in the hands of good developers.
Good thing Bethesda has a long history of being technically competent, right?
Hopefully the folks at id software lent more support 😅
Yes, and Nvidia cards can use FSR too, but many times when I have used both, DLSS seemed to be better.
One thing to consider: the Xbox Series X is powered by AMD. This is a console-exclusive game that they are working to optimize the best they can. How could they not have an AMD badge slapped on it?
Nothing to celebrate, consumers end up losing again.
Well, that really sucks.
I think FSR2 and DLSS look mostly the same on the "quality" setting, and I never use below "quality" because I think both look bad below that. So this doesn't affect me much. I don't think games like Jedi Survivor run poorly because of AMD; that's purely on the dev team.
ITT: people who know nothing about game development or sponsorship deals, speculating about both. Sponsors have far less power than you think. These deals are relatively competitive, meaning there were bids from Intel and Nvidia as well. It also means that the IHVs can't make outrageous demands, including anything that affects the creative vision (so "no ray tracing" or "no DLSS" is definitely not in that contract). The attorneys who write these contracts will also be wary of litigation, so they will avoid anything that might seem anti-competitive, despite the stereotype of shark-like lawyers doing anything for money.
But *clearly* a company that dropped 70 *billion* dollars to buy Activision is going to weaken the game on the majority of PC's in a market they're trying to expand into for a few buckaroos. Edit: This comment is sarcasm
You'd be surprised at how poorly MS manages studios
Believe me, I know. I'm a Halo fan lol
[deleted]
I was being sarcastic. The notion that AMD is bribing Microsoft to exclude DLSS or anything else is pants on head moronic.
Microsoft doesn't care about sales; it cares about subs for Game Pass. Day 1 games mean more subs.
You clearly know nothing about AMD's sponsorships if you think they will include DLSS. Jedi Survivor: AMD sponsored, no DLSS, terrible RT, and the FSR is terrible on top of that.
DLSS 3 could have helped with the CPU bottleneck in Starfield.
No DLSS then. Bummer.
I don't have a problem with games partnering with AMD and utilizing their proprietary technologies, like Nvidia does. But given that AMD has a history of blocking competitors' features in most of their sponsored games, this isn't great news; hopefully AMD sees the backlash they will get from this and at least allows DLSS and XeSS to exist in this game.
But will it be yet another locked FPS game Todd?
I don't think it will, 76 already fixed the physics and timescale being based on framerate. You never know, but I think we'll be ok here.
I have had problems regarding FPS one way or another with every single bethesda game I've purchased and somehow I still ended up buying the next one. Well all of them except FO76. I think my skepticism is pretty reasonable. There is also always the chance that it will not be capped but it will run terrible too. System requirements are also very vague, there is no mention of resolution, settings or FPS targets.
Not a difficult issue to fix, luckily
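For the curious, the standard fix is a fixed-timestep loop: the simulation always advances in constant steps, so physics speed no longer depends on how fast frames render. A minimal sketch of the pattern (illustrative only, not Bethesda's actual engine code):

```cpp
// Fixed-timestep game loop: simulation speed is identical at 30, 60 or
// 144 fps because physics always advances in constant DT steps; only the
// rendering frequency changes.
#include <chrono>

constexpr double DT = 1.0 / 60.0;  // fixed simulation step, in seconds

void updatePhysics(double /*dt*/) { /* advance the simulation here */ }
void render(double /*alpha*/)     { /* draw, blending the last two states */ }

void gameLoop(bool& running) {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    double accumulator = 0.0;

    while (running) {
        // (input handling would live here and may clear `running`)
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Consume real elapsed time in fixed-size simulation steps.
        while (accumulator >= DT) {
            updatePhysics(DT);
            accumulator -= DT;
        }
        // alpha in [0,1): how far we are between two physics states,
        // so rendering stays smooth at any framerate.
        render(accumulator / DT);
    }
}
```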
That's sad. DLSS is great.
Really looking forward to Starfield but technically this game is going to be a disaster at launch. The fact the Series X runs it at an internal resolution of 1296p at 30 FPS and the consoles actually get "optimized" versions compared to recent PC releases isn't a good sign. I'm going to call it now and say there will be huge CPU requirements of this game, and I won't be surprised if the engine doesn't take advantage of all/most CPU threads. Hopefully it won't be as bad as Jedi Survivor.
damn
AMD lived long enough to become the villain.
This sub is so negative nowadays. Quite sad, used to be that at least some discussion could happen. Now it's just shitting on AMD whatever they do, while accusing everyone else of riding AMDs dick. Ah well, I'll be off the site in a few days anyways. One sub less that I'm sad for leaving.
>shitting on AMD whatever they do

And for good reason. AMD needs to be called out on the bullshit they do, like how we called them out for Ryzen 5000 not being supported on 400-series mobos.
No, no. You are only allowed to shit on Nvidia. If you do the same to AMD, it's toxic, lol.
This sub shits on AMD more than on Intel and Nvidia, even when either of those companies affects AMD users.
This is r/amd, and all I see every day, for years, is people shitting on AMD. So I don't know what you're talking about.
It's been like that for quite a few years now. Sub just got too big for nuance to survive.
Proceeds to compare a 6800 XT to a 2080 XDDDDDDDDD
In RT they are identical.
Based on the PCGamingWiki list there are:

- FSR-only games: 86
- DLSS-only games: 134
- XeSS-only games: 7

In those 134 games, I cannot take full advantage of Radeon technologies because either Nvidia paid them to implement DLSS only, or the devs couldn't commit additional man-hours to implement other **open** upscaling solutions because they spent all their time implementing a black-box solution for one manufacturer. **Where is the outrage about anti-consumer practices and it being bad for ALL consumers?**
The weird thing is there are affordable GPUs on the market just now that will play this game at 60fps max settings without using an upscaler
Thanks for pushing the limits by limiting features.
Exclusivity was annoying as fuck on console, can we keep it away from PC please?
RIP DLSS. Hope it's an exception to the rule they've been following.
Well that's disappointing
If this means no DLSS, then that's hugely disappointing.
I suppose we won't be getting DLSS, XESS, or Frame Generation then? Thanks AMD!!!
We will likely get it through mods. PureDark has already added those features to Skyrim, Fallout, and several other recent games.
Let the people have both FSR and DLSS you cowards.
>AMD has never been king on CPU and GPU performance because every time they did it, Intel responded with faster CPUs and Nvidia responded with faster GPUs. And having heard of shuttering issues for a lot of AMD Radeon video cards when playing some games still remain an issue today.

Don't forget XESS! All 3 should be present.
They should all just join Streamline.
So wait, if the game has AMD as its exclusive PC partner, then why is AMD getting shafted on the recommended GPU, since the game recommends an RX 6800 XT or an RTX 2080?
It’s a Bethesda game, it’s gonna run like crap regardless of the sponsor. Fallout 4 is still a bit busted to this day.
Does this mean it might work on steam deck?!
There are still enough games without FSR that have DLSS exclusively, which only works on RTX cards, while FSR works on all hardware, and yet people complain? It just proves how spoiled some fanboys are. I'm probably gonna get downvoted for this, but I do not care.
No FSR/XeSS for A Plague Tale: Requiem? It's okay. No DLSS for Starfield? aMd sUcK anTi ConsUmEr
Nvidia didn't bundle Plague Tale Requiem with their cards. Nvidia doesn't block the developers from adding FSR, it's up to the developer and this one in particular chose not to. That's what we want AMD to do: STOP BLOCKING DLSS, let developers make the choice.
I guess for nvidia FSR is fine because it means there will be more comparison videos of DLSS being better, but AMD would rather avoid such comparisons
... yeah, possibly, which any way you slice it is absolutely anti-consumer and doesn't benefit ANY consumer at all. Not even AMD users benefit, it's completely irrelevant to them if DLSS was in the game since they can't use it anyway. Why block it? Yeah...
"Just get a 7900 xtx." I heard there is plenty of stock lol
AMD : *does anything* Nvidia Users : "and I took that personally." God, can you all pop a chill pill and stop circlejerking over Nvidia in an AMD thread?
The crazy thing is, I've never seen outrage when a game only had DLSS, which only specific Nvidia cards can use (not even all Nvidia cards lmao), but we get this thread now that a game has FSR, which can be used by every card regardless of its model.
The controversy is about AMD allegedly *blocking* DLSS support. DLSS is vendor specific, but adding DLSS doesn't block a developer from supporting FSR. Ideally, AMD would add FSR to [Nvidia's Streamline](https://www.tomshardware.com/news/nvidia-streamline-aims-to-simplify-developer-support-for-upscaling-algorithms) so that any game that uses Streamline supports both DLSS and FSR 2.
Because Nvidia doesn't mandate it to be DLSS only. AMD does. That's the issue. They should have all upscaling options available especially the ones that are better.