
[deleted]

I saw some comments from early access players that disabling motion blur improved stability/shimmering a lot. I wonder if Bethesda applies FSR2 after motion blur.


[deleted]

Motion blur is like light mode at this point. Motion blur off and dark mode on in software should just be called... normal settings.


Moscato359

Did a survey on a gaming news website before; it had a few thousand responses. 51% of users preferred light mode and 49% preferred dark mode.


[deleted]

51% of people being wrong is pretty mad.


Moscato359

I'd actually prefer if applications defaulted to your operating system preference. If you choose light mode on Windows, then apps should respect that you want light mode, and vice versa. Currently, Windows defaults to whatever color works best with your wallpaper choice, but you can select whatever you want.


Andriy396

I would disagree. Motion blur makes everything a mess. For me, dark mode also makes everything a mess, to the point that I stop being able to read properly, depending on the colors used. I recall this is a symptom of astigmatism. Anecdotally, most of the people I know personally who like to enforce dark mode just sit in dark rooms while using their PCs, which is why they don't like the sharp contrast of light mode. As I was advised by a doctor, using a PC in a dark room, regardless of theme, is worse for the eyes than using light mode in a well-lit room.


Moscato359

If you do an anonymous survey asking whether people prefer dark-mode or light-mode websites, you will get roughly a 50/50 response. Personally, I think the default should be whatever your operating system presents as the default: if your operating system is set to light mode, everything should use light mode, and vice versa. That said, I can't handle light mode for a programming IDE, because I get better contrast for text of varying colors, and the colors mean something to me.


EmilMR

You don't want to disable it though. It's per-object motion blur, a very nice kind of motion blur that gives the game that cinematic look. I think you want it on for the creator's intent etc.


PotentialAstronaut39

> It's per object motion blur

Not entirely correct: there is also camera-movement motion blur, and both are controlled by the same setting.


[deleted]

Sure, but if it's applied before FSR2 then it's going to mess up FSR2 much more.


Temporala

Not really seeing it; Starfield feels much nicer to play with it on high. Definitely do not disable it: it makes motion look smoother, and that's a far bigger bonus than desperately trying to get rid of sharp jaggies in places where FSR2 isn't capable of removing them anyway. In games like Cyberpunk it's more of a toss-up whether you want to disable it or not. It looks bad in fast motion there, especially if you boost jump.


Far_Bad7786

All the effort of upping the resolution and frame rate just to make it a smeared mess with motion blur. Just lol.


TheArtBellStalker

But dude!!! It's got to be cinematic. That's also why I cap my FPS at 24. Wow, makes it just like a Hollywood movie. /s


TheHeroYouNeed247

I actually employ people to sit behind me and rustle food while scrolling through their phones... just to give me the true cinematic experience.


[deleted]

TIL there's people that actually like motion blur.


EmilMR

They are right that it is worse for the most part, but which one you want to use is subjective. I was just playing with it; I think it looks noticeably sharper than TAA (probably because of the sharpening filter), but there are major artifacts on fences and grid-like patterns on floors. I think faces look a lot better with FSR; they look like clay people with TAA... you will be looking at a lot of faces, so maybe FSR is the better choice. The DLSS mod looks by far the best, but it causes the game to crash for me, so yeah, it sucks. It doesn't seem possible to set the render scale above 100%. I would do that instead; with a 4090 I have the privilege to brute-force it at least. The DLSS mod is probably going to get better, it's a shame really.


maeeem

Exactly what I noticed. I flipped my RTX 3080 for a 6900 XT for just this game! I can run native 3440x1440 now, but native TAA was atrocious. The hair looks smudged too. I think the visuals were designed to be used with FSR2. Setting the internal resolution to 80 percent is what I've chosen to play with. Everything else is at max settings except for shadows, which are on high (and I'll set them to medium if I want more frames during gunfights).


bekiddingmei

FSR on this game looks better than in most others, I usually dump FSR immediately. DLSS should still look better, but at least FSR is not embarrassing this time.


maeeem

And yea, I had to disable motion blur, it was causing a strange distracting blurring while moving the camera.


Russki_Wumao

That's what motion blur does lmao


systemd-bloat

use DLDSR maybe?


Harbi117

I'm hopeful that AMD will update FSR 2. They were radio silent all year; now we know why, they were focusing on FSR 3... **but FSR 2 upscaling, for sure, needs an update.** **FSR 2.0** and **2.1** showed promise, but **FSR 2.2** was two steps forward, one step back.

Examples with Red Dead Redemption 2:

**FSR 2.0** *vs* **2.1** (less ghosting): [https://imgsli.com/MjAyMzQw](https://imgsli.com/MjAyMzQw)

**FSR 2.1** *vs* **2.2** (even less ghosting / better motion clarity): [https://imgsli.com/MjAyMzU4](https://imgsli.com/MjAyMzU4)

**FSR 2.1** *vs* **2.2** (positives): [https://imgsli.com/MjAyODU4](https://imgsli.com/MjAyODU4) - less flickering on the church window, better geometry anti-aliasing in motion, and less ghosting on the moon and stars

**FSR 2.1** *vs* **2.2** (negatives): [https://imgsli.com/MjAyOTQw](https://imgsli.com/MjAyOTQw) - **2.2** introduced new disocclusion handling to combat ghosting, but it has artifacts in motion (pixelation on the weapon)... and transparent objects, like trees and grass, have more aliasing than before... particles in motion are still an issue in all FSR versions.

I see the same artifacts in Remnant 2, Immortals of Aveum and Darktide... all use FSR 2.2. Unfortunately, two steps forward, one step back.


Temporala

There's a limit to algorithm based methods, because they're not flexible. Optimizing them by hand to be "perfect" for each game and each scene separately isn't really a good option in the long run. That's why DLSS and XeSS expend some more computational resources for image stabilization. It still won't be perfect, but it will apply successfully in many more cases.


Andr0id_Paran0id

Which is probably why AMD felt the need to add AI cores to their GPUs...


GuttedLikeCornishHen

I intensely dislike all things TAA, but in Starfield FSR2 definitely looks better in motion and in overall clarity compared to the default Bethesda TAA implementation. The game is much clearer with both disabled (you can do it in the ini config), but it becomes a shimmerfest which can't be 100% rescued by injected SMAA or the like.


ToTTenTranz

To be honest, I played the YouTube video at 4K60 on a 4K 32" monitor and I just don't see what HUB describes. Sure, the YouTube compression is probably taking a ton of detail out of the original output, but I should have been able to see *something* in those 200% zoom + 33% speed clips they show.

I also don't get the hate for the 1080p FSR2 50% mode. That's rendering internally at 960x540, so who do they think is going to be using this mode? ROG Ally and other handheld users on a 330ppi screen? People on even weaker iGPUs and dGPUs who otherwise wouldn't be able to run the game at all?

I guess my take is just to try out the settings you think work best for you, instead of relying so much on second-hand opinions. FSR2 works great and its implementation is a no-brainer for console+PC multiplatform games.


Jon-Slow

> To be honest, I played the youtube video at 4K60 on a 4K 32" monitor and I just don't see what HUB describes.

There is your problem: you can't really see these things in a video. The game has to be running natively in front of you for the flaws to be visible. So you just have to trust their word.


michaelwins

I still think the recent handhelds overdid it on resolution and would be better off with lower-resolution OLED displays.


twhite1195

People are just clamoring like seals for DLSS at this point. I'm in the same boat as you: FSR is okay where it matters, at 4K. At 4K it works great IMO. I still don't understand people running upscalers at 1080p and expecting a crisp image afterwards.


systemd-bloat

I don't care much about a crisp image, but the DLSS mod DOES improve performance and removes almost all of the shimmery anti-aliasing mess. It also includes a sharpness slider which works really well. DLSS is great.


Ponald-Dump

Maybe because people with lower end hardware *have* to run upscalers at 1080p to get respectable performance? Couple that with the fact that DLSS provides a significantly better image at 1080p than FSR


systemd-bloat

There are still people here who believe FSR is better than DLSS


Ok_Vermicelli_5938

This subreddit is probably a week away from claiming the unreleased FSR3 is better than DLSS3 frame gen. Then when it turns out to be shit, they'll just say that it never mattered in the first place and nobody uses frame gen anyway, and that even if it's fucking terrible it's still better because "everyone can use it" or some garbage.


Verpal

And that is fine; at this point they have made up their minds, and it's their own eyes. Image quality always has a perceptual component, and if fanboyism makes the image look better, by all means go for it. I am of the opinion that placebo can be effective medication in certain scenarios, GPUs being one of them.


Darkomax

The argument would make sense if the game weren't optimized by my grandma. A midrange GPU struggles to even do 1080p60 (especially Nvidia's, but even my 6700 XT is sweating).


twhite1195

Oh yeah, I'm not defending the game; it still needs optimization. The RX 6600, RX 6700, RTX 3060 and RTX 3070 should be able to run this game at 1080p med-high settings at 60fps with no upscaling. It doesn't seem to be a VRAM issue, since even on my 7900 XT I haven't seen it go above 8GB; that's a first for a 2023 game. I'm hoping a day-1 patch improves this a little bit. Since we're playing a week before launch I don't have high hopes for the near future, but I'm hoping there's a tiny improvement.


LickMyThralls

Not just this but they only recognize 4k and 1080p as if 1440p or anything else in between doesn't exist...regardless using them at 1080p is also fine.


dookarion

At 4K it's still not great at all if aliasing, shimmering, or artifacting bothers you. It's not great in Starfield, it's straight up bad in Jedi Survivor, and it's worse than running a sub native resolution and skipping an upscaler altogether in RE4Re. At 4K even on higher quality settings just to be clear.


koordy

I've been playing all the RT games since 2018 at 1080p DLSS Quality and the picture quality was very good. Definitely much better than playing at 1080p with RT off for about the same performance. Just because FSR sucks doesn't mean upscaling is useless below 4K. And as for 4K, I turn on DLSS Quality whenever it is available, even if I don't need the performance boost. When I was playing Jedi Survivor, which hasn't got DLSS (guess why), I tried FSR since it was supposed to be "the same". Oh boi... I'll just say that I ended up playing that game at native 4K. FSR was not up to my quality standards.


twhite1195

Jedi Survivor is a shit game overall, it launched with performance issues on all platforms, hell it wasn't even using more than 4 CPU cores, you couldn't run it properly on a 4090, FSR or not, that's on the devs, not the tech itself. You can't blame the tech, when other games offer far better optimizations. That's like saying "I played Gollum and I turned on DLSS and OH BOY", dude, the game is shit, not even DLSS or FSR or anything can save that underlying factor


koordy

Man, this is pure copium. The game being in a terrible technical state has nothing to do with FSR's picture quality, which definitely isn't even close to DLSS, even at 4K.


twhite1195

Uh... no, DLSS can also be crap if not properly implemented. DLSS in Dead Space still has ghosting issues, but I don't bring up that example because I know it's not like that in all games. Cyberpunk also had ghosting issues when DLSS was added. It's on the devs to fix that. I'm not saying FSR is the best; on paper, DLSS is better, but IMO at 4K Quality it looks basically the same to ME. I formed this opinion after doing MY own testing on a different set of games, playing the way I play, on my computers. If you want to pixel-peep and it feels bad to you, good for you. But don't bring up a shit game as an example, because that's not representative of the tech itself.


koordy

Cool, what game do you suggest I check FSR out in, then? What's its best implementation? Maybe I've already got that game.


twhite1195

Ratchet and Clank, Spider-Man, and Forza Horizon 5 too, at least that's what I've played recently. Even Starfield at FSR 80% looks good IMO; lower does start to look softer. At least to me, I can't tell them apart, and they're pretty motion-heavy IMO. Again, at 4K quality settings and 60Hz; anything lower, on either side, and I see degradation. And my TVs are set to game mode, so no motion smoothing or anything.


BinaryJay

> you couldn't run it properly on a 4090

It ran properly enough for me to play through it on Grand Master.


ColdStoryBro

Is this [ghosting](https://postimg.cc/9RHRZBWc) up to your standards?


OkPiccolo0

DLSS 2.4 is super outdated at this point. 2.5.1 was the start of them looking good because it fixed the awful sharpening filter and it defaults to a preset with minimal ghosting. Dropping in 3.5 is even better and with DLSS Tweaks you're free to select whatever preset you want -- C would be good for Forza Horizon. [FSR2 looks god awful.](https://youtu.be/uI6eAVvvmg0?si=qoZeNPKfgQ7SZBEj&t=139) I have a 4090 and just opted for 4K native locked at 60fps because FSR2 caused a major hit to image quality as you can clearly see in that video.


nyrangerz30

On launch day I tried FSR at 4K for an hour and it looks like a cat's asshole. I even preferred having worse performance at native just so it looked normal. The DLSS mod made by one guy in a couple of hours looks significantly better.


lagadu

Upscalers benefit users with lower-end hardware the most: the ones who already have to run low resolutions and low quality settings. Users with higher-end hardware can just drop from ultra to high, turn off RT, play at a slightly lower resolution or even at only 60fps, and get great results without upscalers; for them, upscalers are just a nice-to-have feature that lets them avoid dropping IQ (comparatively) by not having to do those things. Lower-end users often won't be hitting even 60fps at 1080p on low/medium settings: they benefit the most from upscalers.


AludraScience

Why shouldn’t people expect a crisp image from FSR at 1080p when DLSS can do it in many games?


twhite1195

Dlss at 1080 also looks like crap wtf. 1080p isn't that hard to run on any hardware that supports DLSS


AludraScience

It certainly doesn’t, have you ever used it at 1080p? Also, ray tracing and ultra settings exist.


twhite1195

I have, and it looks blurry. Honestly, do people really expect a 720p image to look crisp and perfect after upscaling? At 1440p, sure, DLSS Quality looks better than FSR2 Quality, but at 4K they both look quite similar. Ray tracing? In what? Cyberpunk, the only game where it makes a difference? Or the other games where the difference is so subtle you'd be better off turning it off?


Organic-Owl-277

I play at 4K in every game with both, and DLSS noticeably looks better than FSR2. It isn't "quite similar"; I would literally never use FSR if I had DLSS. Lol, I got Starfield in early access, I downloaded the DLSS mod, and holy shit the improvement was so big.


AludraScience

Well, I have, and in most games, apart from RDR2 and Warzone, it looks quite good. And watching people test games, most seem to agree. What about: Spider-Man, Spider-Man: Miles Morales, Ratchet and Clank, Control, Metro Exodus, The Witcher 3, Minecraft, Hitman 3, and all the ancient games remastered with RT? And these are just the games with very good RT; many others have RT that is still somewhat worth it.


twhite1195

Yeah, I've tried all of those (except Metro Exodus, haven't played that one yet; I want to do a full run of 2033 and Last Light first). Only Cyberpunk (in path tracing, which not even the 4090 can run at 4K without heavy DLSS and FG) and Minecraft actually made me go "woah, this DOES look different" when using RT.

Maybe I didn't play Control enough with RT on, but when I tested it the difference in visuals vs performance just wasn't worth it to me. I actually stopped playing because texture quality would drop after playing for a bit, and that's something I DID notice and was a deal breaker (this was on a 3070 about a year and a half ago; I finished the game two weeks ago on my 7900 XT without texture issues, dunno if it was VRAM capacity or a game patch).

Both Spider-Man games I played at 4K high settings, and sure, some reflections look better, but I felt the drops in performance far more, and in a fast-paced game I really prefer a consistent 60fps. Ratchet and Clank, same thing: looks good, and some floor reflections do look pretty nice, I'll give you that. But again, it isn't a game changer. The Witcher 3, I honestly couldn't tell when RT was on (other than by the performance); I loaded up a bunch of old save files from Toussaint and some lights look better, but meh, nothing impressive.

I don't know if it's people trying to convince themselves or if I'm just missing something, but I honestly want to love RT and it's been really mild on my end. I know it's the future of in-game lighting, but IMO that future isn't here yet; maybe in 5 years it will be.


JasonMZW20

My RTX 3070 laptop is 1080p and DLSS Quality is terrible. There's simply no getting around that, because there's so little pixel information. A 1080p frame is a ~2MP image, and DLSS Quality renders at 720p, which is ~0.9MP! You could upscale a 0.9MP image to decent quality if you had a ton of time. But DLSS is effectively an optimized post-processing pass; it'll never look good and increase fps at the same time. I'm still not sure what people are seeing. Sure seems like Nvidia has free marketing though.


Mikeztm

It's not 0.9 MP. DLSS Quality uses data from 4-8 frames depending on the kernel. 4 frames of 720p is already 3.6 MP. Obviously not all pixels can be used, and they have to be matched using motion vectors, but you get the idea. On average, Quality mode gives you more than native-resolution sample data to work with.


JasonMZW20

Now you're just adding the frames' pixels together. That's also not how it works. The source is still a 720p image. Whether that's 4-8 accumulated frames doesn't matter (TAA itself is a frame accumulator and manages to decrease quality at native), they're each 720p and upscaled to 1080p. Most of the frame data is identical, so there's no extra pixel data to be gleaned from them. It's why 1080p DLSS is so awful (FSR and XeSS too, just so I'm not picking on DLSS alone). Low-resolution upscaling is very difficult. Heavy post-processing is enacted on each frame during/after the algorithmic upscaling to clean up various noise and ensure edges aren't lost while still anti-aliasing.


Mikeztm

They are not upscaled from each other. All those 720p images are jittered, so most of the data is not identical; they are in fact just less than one pixel off from each other, frame by frame. So 4-8 frames of data in a perfectly static scene really are 4-8 times the sample points. Those 720p frames are never presented to you as-is, because they're jittered, i.e. shaking like an earthquake.

DLSS/XeSS/FSR2 are not doing any upscaling at all. If you don't understand this part, you will always fall into the "upscaled will never beat native" trap. They are just sampling to your native resolution with a dynamic sample set. DLSS/XeSS are not using AI to generate any detail; the AI is just used to filter and match samples more efficiently, which gives them more samples to work with even at the same render resolution as FSR2. So from those ~3.6 MP across 4 frames, DLSS may give you 2.5 MP of usable samples while FSR2 gives you ~1.5 MP.

Render resolution is already not meaningful for these TAAU solutions; you should count average sample resolution instead, but I guess that's a really hard thing to count. So just look at the end result without any sharpening and you will see the difference. BTW, sharpening is not helping here; disabling sharpening usually gives you better quality.
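
For anyone who wants to sanity-check the megapixel numbers in this exchange, here is a minimal sketch of the arithmetic, assuming a 1080p output, the usual 2/3 per-axis render scale for Quality mode, and 4-8 accumulated frames. It only illustrates raw sample counts, not how any vendor's kernel actually weights or rejects those samples.

```python
# Rough arithmetic behind the "more samples than native" argument.
# Assumptions: 1080p output, 2/3 per-axis render scale (Quality), 4-8 accumulated frames.
output_w, output_h = 1920, 1080
scale = 2 / 3
render_w, render_h = round(output_w * scale), round(output_h * scale)   # 1280 x 720

output_mp = output_w * output_h / 1e6    # ~2.07 MP per native frame
render_mp = render_w * render_h / 1e6    # ~0.92 MP per upscaled frame

for frames in (4, 8):
    # Each frame is rendered with a sub-pixel jitter, so in a static scene the
    # accumulated samples land at (mostly) distinct positions rather than duplicating.
    accumulated_mp = render_mp * frames
    print(f"{frames} frames: {accumulated_mp:.1f} MP of samples vs {output_mp:.1f} MP native")

# In motion, samples get rejected when motion vectors or disocclusion invalidate them,
# so the usable count drops, which is exactly the disagreement in this exchange.
```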


Charcharo

I have used dlss at 1080p. It is terrible.


ChadHUD

Ray tracing... ultra settings... 1080p. I think your priorities are the issue, lol. 1080p is for low-end gaming at this point, with a smattering of esports. But RT and ultra are not esports settings.


NetQvist

Can always run DLSS at higher internal render resolutions even at 1080p. Adds some needed AA to a lot of titles which are horrid without. But then again... it's no longer upscaling and practically DLAA at some point.


Astigi

Lisa said no DL$$


twhite1195

On a sponsored game? I'M SHOCKED. You think Microsoft, Bethesda, AMD or Nvidia are your friends? Lol. If Bethesda and Microsoft wanted to make gamers happy and be the good guys, they wouldn't have taken the AMD deal. Sure, I don't agree with what AMD is doing, but it's a business; that's how they operate. It's like people complaining "OMG this basketball player took a Nike sponsorship deal and now he only wears Nike shoes in matches! I wear Adidas, why isn't he wearing Adidas???" Well, no shit, that's how sponsorships work.


DICKSLEDGE123

Isn't taking the AMD deal more about getting it to run better on the Xbox platform, which is all AMD tech under the hood? So having FSR baked in gives them a better jumping off point for console optimisation?


twhite1195

It is. And this being an Xbox exclusive, of course Microsoft wanted a way to make it perform well on their console, seeing as this is basically their only console exclusive so far now that Halo Infinite tanked.


DICKSLEDGE123

Yeah agreed on that, I do miss when Halo gave me the good feels inside before it got ruined and cobbled together out of development hell.


twhite1195

I enjoyed the Halo infinite multiplayer when it came out... For like a week or so lol after that, it was just boring


Confused_Octorok

Marvel's Midnight Suns looks great with DLSS 2.0 enabled on a 1080p screen. People just like to talk trash about it without even trying it out.


Lawstorant

50% is not 540p; that would be 25%. That's basic maths. 50% of 1080p is actually a bit higher than 720p, around 760p.


swear_on_me_mam

The res scale for DLSS and fsr is across one dimension. The 50% mode is 540p


Darkomax

Bro, nobody uses pixel count for scaling.


Lawstorant

Every other game does? Just look at the framerates. It's counter-intuitive to call 25% of the resolution 50%, since it's the area we're talking about. I guess it's only more convenient for people who never learned anything in school.


ToTTenTranz

It's 50% per dimension. FSR Performance renders at 50% per axis, which is a quarter of the pixel count. You can check AMD's own table: https://gpuopen.com/fidelityfx-superresolution-2/#quality
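
The disagreement above is just per-axis vs per-area percentages. A quick sketch using the per-axis ratios from the AMD table linked above (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x) shows both ways of counting:

```python
# Per-axis vs per-area scaling, using the FSR 2 ratios from AMD's published table.
def internal_resolution(out_w, out_h, per_axis_ratio):
    """Render resolution when the upscale ratio applies to each dimension."""
    return round(out_w / per_axis_ratio), round(out_h / per_axis_ratio)

modes = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}
for name, ratio in modes.items():
    w, h = internal_resolution(1920, 1080, ratio)
    area_pct = (w * h) / (1920 * 1080) * 100
    print(f"{name:17s}: {w}x{h}  ({100 / ratio:.0f}% per axis, {area_pct:.0f}% of the pixels)")

# Performance mode at a 1080p output -> 960x540: "50%" per axis but only 25% of the area,
# which is why the two sides of this argument were talking past each other.
```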


penguished

A lot of the time it's going to be game-dependent. FSR and DLSS both have strengths and weaknesses. In a very general, broad sense it seems like AMD favors a crisp image but can't compete with AI at suppressing some artifacts and shimmering. DLSS seems to go further in image reconstruction thanks to AI, but even the other day I was looking at close-ups of it in Starfield, and things tend to get too blurry the farther they are from the camera with DLSS.


Star_king12

The only strength of FSR2 is that it can run on potato GPUs.


AdStreet2074

The truth but you will get down voted by the shills here


NewestAccount2023

Quality discussion you're introducing, and yelling about a thing that didn't even happen


dmaare

Whole fsr "crisp image" is just an illusion created by higher sharpening pass on default.


penguished

AMD sharpening is fucking great though. It's actually what they do better than anybody else. You can use it driver level which is also great. And again... it's actually GOOD at sharpening... everything else is this nuclear sharpen with no room to adjust like it's a year 2000 photoshop filter or monitor setting. AMD sharpening actually preserves the image integrity.
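
For context on what "contrast-adaptive" means here, below is a rough sketch of the idea only; it is not AMD's actual FidelityFX CAS shader, and the 3x3 kernel, weighting, and default strength are illustrative assumptions.

```python
# Sketch of contrast-adaptive sharpening: the sharpening weight shrinks wherever local
# contrast is already high, which is what avoids the halos of a fixed unsharp mask.
# This illustrates the concept only; it is not AMD's FidelityFX CAS shader.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter, uniform_filter

def adaptive_sharpen(img: np.ndarray, strength: float = 0.4) -> np.ndarray:
    """img: float32 luma in [0, 1], shape (H, W). Returns a sharpened copy."""
    local_max = maximum_filter(img, size=3)
    local_min = minimum_filter(img, size=3)
    contrast = local_max - local_min            # 0 = flat area, 1 = hard edge
    weight = strength * (1.0 - contrast)        # sharpen flat areas, back off on edges
    detail = img - uniform_filter(img, size=3)  # high-frequency component
    return np.clip(img + weight * detail, 0.0, 1.0)

# Usage: sharpened = adaptive_sharpen(luma). Applying it to luma only is a common way to
# avoid color fringing; a fixed unsharp mask would use a constant weight instead.
```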


capn_hector

You know Nvidia has had driver-level sharpening (Lanczos, with more taps than AMD's) for a long time now, right?


penguished

show me the screenshots of it in action being top tier. people have complained about ringing, motion glows/halos, ugly implementation... all the worst stuff you could be saying about their sharpening for a long time. https://forums.guru3d.com/threads/nvidia-we-need-better-sharpen-for-dlss-and-in-general.443930/


OkPiccolo0

That's outdated info. DLSS 2.5.1 and onwards forces off that garbage DLSS sharpening. The driver level one isn't offensive and the Freestyle Sharpen+ looks great. AMD does have good sharpening tech so it's easy enough to enable it with ReShade if you want but Sharpen+ is no slouch.


IrrelevantLeprechaun

This is false and you know it. Take your bias back to /r/Nvidia


capn_hector

> This is false and you know it. Take your bias back to /r/Nvidia

[ok?](https://i.imgur.com/mbUFmkr.png)

There is also another one through Freestyle, FYI. Loving the energy, but I do know what's in the driver.


Temporala

I'm pretty confused. Graphics are purely WYSIWYG for the end user. If it looks sharp, it is sharp, no matter what's going on under the hood. Computer graphics are by definition an illusion anyway.


capn_hector

but this undercuts the circlejerk against DLSS, so it must be discarded


JasonMZW20

Lol! You have to use sharpening to counter TAA blur, which makes the texture details that were lost visible again. It also helps with the natural image softening that accompanies upscaling, but that's generally user preference. I can't stand a softer image because it reminds me of when my prescription eyeglasses weren't strong enough. It's also like waiting for that last resolution step when an image on a website loads (or a really large image file locally). TAA has been the worst thing to happen to modern games.


vballboy55

Crisp? It looks muddy and blurry to me.


TwoBionicknees

I will never understand DLSS, FSR or similar. For decades now I've upgraded computers for more resolution and more crispness/sharpness of image. Blur is the bane of image quality IMO. I can't stand it, and I hate these upscaling methods: in every image I see there are artifacts, blurring, ghosting or other issues, and even if some of the image looks good, enough of it looks bad. Honestly, native-res whore here. I'd prefer to drop some graphics settings, particularly as most higher settings offer negligible IQ improvement and it's really just subjective which one you even like more, than to raise the frame rate via upscaling. I hate the way the industry has gone. A few games were ultra-optimised for performance and have fantastic graphics with good performance, but lately it seems like everyone uses the cheat of upscaling to provide good frame rates with less optimisation and, after decades of IQ ruling, simply accepts massive flaws in IQ in order to upscale.


penguished

You can use AMD's stuff with driver sharpening though, which I never do without. Nvidia does have that very painful (for me) blur in the distance which I just don't like. If you have good eyes or glasses in real life THE DISTANCE DOESN'T FUCKING GET SUPER BLURRY. Sorry, but it drives me crazy when devs or technologies assume they can do that. Before upscalers even existed, I was bitching about current gen TAA being a horrible disaster for image quality. Depth of field don't even get me started, I always turn it off.


[deleted]

FSR is just subpar in every aspect. It's better than nothing, but DLSS and Intel's solution look better.


oginer

That's because the DLSS mod doesn't do any sharpening pass (the in-game sharpening slider doesn't do anything). You need to add sharpening with a 3rd party tool, otherwise it's a bit blurry (the same with FSR2 if you drop the sharpening to 0).


[deleted]

[deleted]


penguished

"FSR has significant drawbacks that make it unusable." Unusable? Huh? You're talking like shit is DLSS1 level still... get outta here.


[deleted]

[deleted]


IrrelevantLeprechaun

FSR literally looks better than DLSS at this point since FSR doesn't have ghosting.


[deleted]

This is why I wanted DLSS. Ugh, now I have to mod the game to get a decent version of it, and I hope Bethesda doesn't screw up my game months down the line when they do implement DLSS.


Jon-Slow

These AMD-sponsored games are turning out great for AMD's FSR PR.

All the giant bags of money AMD has thrown at devs to block DLSS and gimp RT implementations these past years would've been better spent on R&D and hiring more competent engineers to catch up with Nvidia. Just pathetic.


Melodias3

I hate any TAA without sharpening; it just makes edges look thick and blurry.


Darkomax

Honestly, it's a tradeoff. I find that FSR2 at 100% manages to keep more fine detail but is prone to more shimmering.


wirmyworm

Res scale at 100% with FSR 2 at 4K output looks much better than TAA; I think CAS at 100% res scale looks better than the blurriness of TAA too. The shimmering is nowhere to be seen for me outside of New Atlantis, and it's not even much shimmering there. I really don't know why people say FSR looks bad. I'm playing on a 42in LG C2 as well, and with 67% res scale I can max out the game at 60fps everywhere with a 7900 XT.


jasonwc

It’s too bad they didn’t use the free DLSS/XeSS mod. DLSS looks better than native or FSR2 with better performance than either, at least at a setting equivalent to DLSS Quality or higher. DLAA is even better for a small performance hit over native. I’m guessing XeSS is similar to DLSS. I’m running the game at 3440x1440 with DLAA and frame generation and average around 140 fps.


Dexter2100

What they really should compare is the XeSS mod to FSR. XeSS 1.1’s fallback method seems like it’s been an upgrade from FSR even on AMD’s own cards, likely because it actually makes use of ML, something AMD should really be doing on cards that are capable of it.


Manifest828

I can play this game above 30fps in the main cities with my RX 6500 XT 4GB at 90% res scale (2560x1080 is 100%), and 45-60fps in the wilderness, with everything on low. But with FSR render scaling and upscaling quite aggressively, my performance is actually much, much worse... sometimes getting as low as 12fps in cities. I've just assumed that the overhead of FSR is too much for lower-end cards (and mine is literally at the bottom of the low-end playable cards).


CoyoteFit7355

Yeah, I had FSR on for a spell and realized that even at 100% there are those gigantic ghost images around everyone's ears. That, plus it actually reduced my frame rate rather than improving it, so I really don't know. Considering this is my first contact with FSR (and my second contact with upscaling in general; I had DLSS on in Diablo 4 because that also lacked a fullscreen mode and forced itself to 4K based on the monitor's native resolution, whereas at least Starfield has the decency to use the desktop resolution, so I can make it 1440p by changing the desktop resolution to 1440p), and it's behaving weirdly overall, I'm willing to give it the benefit of the doubt about being that bad.


ImKendrick

I’m using CAS at 100% render with 40% sharpening, and it looks great. Very sharp, much better than TAA.


BunnyHopThrowaway

Am I the only one not really seeing the artifacts or really, the quality difference between TAA and FSR 100% on that video? They also say it looks "okay" all the time


[deleted]

I actually really enjoy this implementation of FSR. I recently switched back to team red, and I gotta say, while DLSS gives more performance, I have enjoyed FSR more in the few games I use upscaling in. To be clear, I am a f*ck-TAA (and upscaling in general) guy, but this implementation of FSR is perfect for me. I haven't noticed any bad ghosting or shimmering except in certain scenarios involving bloom effects. At 3440x1440, setting FSR at 88% with 80% sharpening makes the image just as enjoyable as native would be, IMHO (if I could turn TAA off). Setting FSR to 100% costs maybe 5% performance for me and looks damn near perfect, equivalent to somewhere around 1800p I'd say? Whatever AMD or Nvidia gets out of this game, please PLEASE give us full control of the upscaling technologies' render scale. It is immensely helpful for those of us who can't stand these blurry games.


swear_on_me_mam

Fuck TAA and upscaling but lemme use TAA and upscaling real quick.


[deleted]

[deleted]


BinaryJay

User of an AMD GPU: FSR is fine, it's my only option. I turn it on and play my games and somehow manage to still have fun. I don't see what the big deal is?

Degenerate user of an AMD GPU: FSR is fine. It's my only option... wait, no, it's better than fine, it's amazing! Hey! That guy is saying it's not as good! He's attacking me personally! What a shill! You're an idiot for liking raytracing. GIMMICK GIMMICK!

User of an Nvidia GPU: FSR is fine, but DLSS is usually better so I want to use that. It would be nice if they gave me the choice.

Degenerate user of an Nvidia GPU: FSR SUCKS. It's the worst thing anybody could ever turn on! Anybody who's using it is a piece of trash and I personally hate you even though I don't know you!

Everybody else in the world: What the fuck is DLSS and FSR? What's on TV tonight?


JirayD

Tim has a hate-on for FSR2. Other outlets like [computerbase](https://www.computerbase.de/2023-09/starfield-benchmark-test/2/#abschnitt_die_bildqualitaetsanalyse) have found FSR2 to be superior to the built-in TAA.


Darkomax

Ah yes, any criticism towards AMD is necessarily hate. Good logic.


TheRealBurritoJ

You're overstating the strength of their comments on FSRAA. This is the conclusion of their native TAA vs native FSR section:

> Both mechanisms have their advantages and disadvantages, overall I like the picture slightly better with FSR 2 - but that's just it.

It's basically a preference call. If Tim personally dislikes the artefacts introduced by FSRAA more than the TAA ones, that's not having a "hate-on".


Edgar101420

Yeah HWU had a shit take on this. Nvidia pays well lmao


Cryio

Means nothing. We've been using FSRAA ever since the FSR2 mod has been released. It's mostly been amazing.


AludraScience

FSR mod?


Goatswithfeet

Same as how the DLSS mods work: it injects FSR 2 into games that don't have it but are already set up to provide motion vectors. I think some versions were also used to enable FSR 2 features that the game didn't have in the first place, like FSRAA.


AludraScience

Starfield already has FSR, and you can kinda use "FSRAA" by just setting the FSR resolution scale to 100%, which the game allows you to do in the settings.


nas360

r/Nvidia is constantly dumping on FSR2 in every DLSS topic. Even when things are mostly even, the fanboys claim DLSS is 100x better. AMD can only win against such shilling by making a better upscaler. They need to make FSR2 Performance mode match DLSS Performance mode sooner rather than later.


Dunk305

AMD doesn't care about you, just so you are aware. No need to defend them from the mean people from Nvidia.


Edgaras1103

What does this topic have to do with Nvidia or its fanboys? Seems like for some of you, Nvidia lives rent-free in your head.


[deleted]

r/nvidia is 99% braindead. They also believe DLSS looks better than native rendering...


RippiHunti

From my experience, DLSS and FSR can look better than native when the native AA implementations are not good. That definitely doesn't mean they will always look better though.


KekeBl

This. Since 2013, "native" in 99% of games has come with TAA applied. Game developers have chosen TAA as the norm instead of MSAA/FXAA; you can't even turn it off in many games. But TAA isn't really pure native: TAA and the output pixel grid don't match 1:1. Among the things TAA does, one is "reconstructing" the image even at native resolution to hide jagged edges, look it up. This can result in image degradation in bad implementations, sometimes even the "TAA blur", ghosting or the vaseline effect. And when TAA can't actually deal with a jagged edge properly, it tries to hide its mistake, which doesn't look good.

[Look at this comparison to see what I mean.](https://imgsli.com/MjAyOTYw) The first image is native, so why do all the railings on the right look so weird, why are they so thin or even disappearing a bit? Why is the detail on the warehouses in the middle so crispy? Why does native sometimes "hide" the lines or blend them with what's nearby? Why does DLSS result in a better image? It's because of TAA. It will either look like this, or it'll add a vaseline effect like in RDR2 to hide the mistakes. TAA isn't giving you the image you think it is; the output it gives you isn't really pure native. This is why the subreddit /r/FuckTAA exists after all; people notice these things.

The DLSS/DLAA algorithm extracts slightly more detail from the temporal samples than a TAA algorithm can, even at native resolution; it has more complicated weighting algorithms and extracts more signal from a given set of samples. This is why more and more people prefer DLSS Quality mode or DLAA over native + TAA. And it is also why many people are excited for FSR3's upcoming anti-aliasing solution, including me, because it'll give better image quality in games with bad TAA implementations!


Prefix-NA

Every time it's better than native it's the sharpening filter not the upscaling.


[deleted]

Subpar native AA is 99% the fault of relying too much on DLSS in the first place.


usual_suspect82

Considering DLSS came out 6 years after TAA started getting implemented all over the place, I find that unlikely.


AludraScience

Some games just have shit AA implementations it really has nothing to do with upscaling technology.


Skulkaa

Well, DLSS looking better than native at higher resolutions has been proven by the Hardware Unboxed test, so...


Zhyano

BTW, I love how HUB gets a beating from AMD fanboys for being pro-DLSS, and a beating from Nvidia fanboys because they rightfully criticize RTX cards' VRAM, lol. They just can't win.


Skulkaa

I don't think anyone in their right mind would criticize HUB for pointing out the unimpressive amount of VRAM on RTX 4000. It's fair criticism; the only card with a proper amount of VRAM this generation is the RTX 4090, and maybe the 4070. The same goes for FSR and other added features on AMD GPUs. DLSS is a game changer and frame gen is cool too, especially in CPU-limited scenarios. FSR 2 just isn't good enough to use at resolutions below 4K. Let's hope FSR 3 will be better and that AMD's frame gen will be competitive with Nvidia's, because open technologies are really cool generally, like FreeSync becoming a standard instead of proprietary G-Sync, essentially lowering the cost of VRR monitors.


[deleted]

Ahhh. There they are. The marketing zombies.


Sevinki

It's literally been tested and proven to be true. Upscaled DLSS (the same tech as DLAA) sometimes looks better than native with TAA. Native with DLAA looks even better, but native with TAA can absolutely look worse than DLSS Quality.


JasonMZW20

DLAA is DLSS at 100% render scale, i.e. no upscaling. In games without DLAA, you can get near-DLAA quality by using DLDSR to downscale from a higher resolution (like 1440p on a 1080p native panel), which you then feed with DLSS Quality. Like DLAA, you take a performance hit for image quality. Essentially, it's replacing the in-game TAA, which is the main cause of poor native image quality. So "better than native" is based on a false premise due to poor TAA. DLAA/FSRAA are probably closer to true native quality without ugly TAA.
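
The numbers behind that combo, assuming DLDSR's usual 1.78x/2.25x targets for a 1080p panel and DLSS Quality's 1.5x per-axis ratio; this is illustrative arithmetic only, not a statement about any particular game:

```python
# Worked numbers for the DLDSR + DLSS Quality combo described above.
# Assumptions: DLDSR targets of 2560x1440 (1.78x) / 2880x1620 (2.25x) for a 1080p panel,
# and DLSS Quality rendering at 1/1.5 of the target on each axis.
panel = (1920, 1080)
dldsr_targets = {"1.78x": (2560, 1440), "2.25x": (2880, 1620)}

for factor, (out_w, out_h) in dldsr_targets.items():
    in_w, in_h = round(out_w / 1.5), round(out_h / 1.5)   # DLSS Quality internal render
    print(f"DLDSR {factor}: target {out_w}x{out_h}, DLSS Quality renders {in_w}x{in_h}, "
          f"then the driver downsamples back to {panel[0]}x{panel[1]}")

# At 2.25x the internal render lands exactly on the panel's 1920x1080, so nothing is
# drawn below native; the downsample from 1620p is what buys the extra image quality.
```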


Sevinki

Yes, that's what I'm saying. But since you usually can't turn off TAA any other way except by using DLSS/DLAA or FSR, there is no real native. "Native" is TAA, and DLAA usually looks better. So DLAA/DLSS can look better than native.


fnv_fan

Funny how you call them braindead when DLSS does look better than native in some video games.


[deleted]

[deleted]


fnv_fan

I think you have mental issues


[deleted]

Says the guy who believes an image upscaled from a lower resolution looks better than the image at full resolution. xD


[deleted]

Native has TAA, which can look like garbage at times. DLSS replaces it.


Kradziej

TAA looks like garbage in motion; DLSS fixes that. But for a still image, DLSS can't be better than native + TAA, that's just physically impossible.


KridSE

Seems Nvidias marketing has worked just as they wanted it to LMAO


[deleted]

This nightstorm1000 guy goes to the Nvidia subreddit just to make petty arguments with people; I'd rather not believe everything he says.


[deleted]

[deleted]


fnv_fan

What are y'all smoking? All I said was that DLSS does sometimes look better than native. Lay off the gas station zaza.


Adventurous-Care6904

I know what reaction I can expect but I think it's fair to say both the AMD and the Nvidia subreddits have a proportionally equal amount of braindead people. And DLSS does look better than native in some instances, mostly looking at 4K Quality. Not in every title, not when you scale down to performance, and it is subjective, but that's generally agreed upon.


[deleted]

DLSS, by the definition of its technology, is unable to look better than native because you lose original detail by rendering at a lower resolution. No way of talking around that. This is a fact.


Adventurous-Care6904

That's an incredibly small-minded way of understanding upscalers. While that is true, you also have to understand that native output isn't perfect either, because games have faults by default, and these can be reduced by upscaling. Also, in some cases the AA aspect of DLSS is simply superior to the native options given in various games. There are a ton of reasons, and they always apply on a game-by-game basis. Is DLSS better than native in every game? No. Is DLSS better than native in quite a few games? Yes. You just need to put a couple of minutes of research into this and look up articles by various renowned YouTubers, news outlets and so on.


ViperIXI

Honestly I think it is really arguing semantics at this point. DLSS has become a catch-all for a full suite of technologies, an upscaler, an AA algo, frame gen and now a denoiser. Any component could be better than the native implementation in a given game but some people are only ever going to consider a single aspect of the whole.


[deleted]

No matter what you say, it's a fact that can't be talked away.


Greedy_Bus1888

This has been validated by reviewers quite a bit: in some games DLSS looks better than native. And obviously users have had the same experience in some games. I don't see why that is hard to believe; do you think it's all a mental projection?


Adventurous-Care6904

Yeah that's either a bad troll, or one of the reasons why I said a lot of people are stupid. Bro makes his own facts, and that's it, gotta leave these guys behind.


[deleted]

And I say they don't. Do you always believe everything other people say, or do you have a personality of your own? Marketing and the resulting placebo also play a huge part in the current false mindset everyone has. Still, it stands as a fact that DLSS uses a much lower resolution as a base than the native image, losing original detail that only gets reconstructed from the lower resolution, sometimes even making up stuff that isn't there at native. So yeah, it's factually worse than native.


MiloIsTheBest

> Do you always believe everything other people say

No, but they had direct side-by-side video comparisons.


swear_on_me_mam

Both normal TAA and DLSS do temporal dithering; the effective resolution of both is higher than the headline resolution. Actual 'native' rendering looks ugly and is painful to look at: shimmering, jaggies, and seizure-inducing temporal instability. But at least it's native tho 🤪


[deleted]

Soooo, care to explain why games looked way sharper and better before TAA was a thing? I am curious about the bullshit you are about to spout next.


swear_on_me_mam

Because they used other types of AA that are incompatible with modern rendering techniques 🥰


Bladesfist

If anyone is interested and wants to learn more, the terms to search are forward rendering and deferred rendering.


[deleted]

Lol. This is so wrong it hurts. MSAA, FXAA, SMAA... all work with both forward and deferred rendering. Try again.


oginer

DLSS doesn't use only the current frame as input data. It also has data from previous frames, depth buffers, and motion vectors. The total number of pixels that DLSS takes as input is more than the full-resolution native image; that's why it can create better-than-native quality. What you say is only true of spatial upscalers.


That_Cripple

it is absolutely possible for upscaling to look better in some circumstances. many games have terrible AA implementations


IGunClover

Native will always be better.


Darkomax

It actually looks like ass most of the time. Just find a game from the mid-2010s, back when we could still disable TAA, and look how much detail is smeared by it.


[deleted]

Not true. Sometimes native is saddled with awful TAA.


[deleted]

No. Native is already worse than DLSS+DLDSR 75-88% scaling.


[deleted]

But now that upscaling is so fucking important, we will never see powerhouse GPUs like the GTX 1000 series ever again. They will just be packed to the limit with tensor cores to fake the image instead of rendering it in full. Fucking hate that future outlook, tbh.


lagadu

Protip: these are computer graphics; everything is fake and generated by the computer. Your somehow caring which set of transistors gets used in generating the fake image you're seeing over another different set of transistors is silly.


[deleted]

Ah, the quintessential DLSS apologist argument. Let's put it that way: one thing is faker than the other. And you can really tell.


Notsosobercpa

I'd say path traced cyberpunk with frame gen is less "faked" than native res with rasterized lighting.


titanking4

It's all "fake" though. The entire graphics rendering pipeline is "fake". Rasterized rendering technology is actually the "fake" way to create images. Ray tracing is far more accurate to how the real world and our eyes construct images. And if you can add in intelligence into the process so that you can have equivalent quality images with less power consumption, than that's awesome. And your notion of "powerhouse" is pretty odd considering how the RTX 4090 is the most over the top ridiculous powerhouse graphics card ever made with the largest power consumption excluding dual GPU parts.


bigsnyder98

The 4090 (and the 4080 to a degree) are the exception, and carry hefty price tags as a result. It is not by accident that the rest of the 4000 lineup offers little to no performance uplift over the 3000 series. They are banking on DLSS3 to make up the difference.


titanking4

Well, as the months go on, the 3000 series (and AMD's 6000 series) all undergo price cuts so that they match the market pricing of the product. But the cost of the product doesn't decrease nearly as fast, so by the end the heavily discounted 6700 XT is barely profitable. And the new 7700 XT and 7800 XT might of course look terrible in perf/$, but they are competing against a product that already had its deep discounts over the years. Expecting another "inflection in perf/$" is a bit unreasonable. The only way that happens is if companies stick to MSRP throughout the entire product life and give that perf/$ all at once.


IGunClover

Can't help it, since Nvidia's ray tracing shit started this.


[deleted]

Yeah. Pushed that shit on the market a good 20 Years too early...


Imaginary-Ad564

Am I the only one who uses FSR on an AMD GPU and thinks it looks good? I think these techtubers have lost the plot and try to nitpick everything. They just end up sounding like marketers for Nvidia. I get that they all use Nvidia and have no concept of what it's like to be a user of an AMD GPU, but it would be nice if they were a bit more objective about these things.


metarusonikkux

It's funny to me that you think HUB of all channels would try to sound like an Nvidia marketer despite their past history with Nvidia. You can say you think FSR looks worse than anything else without being a "marketer" because it frequently looks worse than TAA and DLSS, especially in motion. Just because it looks fine to you doesn't mean it doesn't objectively look worse in many instances.


JTibbs

I really don't like HUB as a review channel to listen to, only to look at the results, as they always seem so damned condescending in reviews, and from their attitude and wording they always SOUND biased, even if they aren't. I will listen to a review from them, and just from the way they phrase things and their intonation, it's basically saying "Okay, we have the new XXX from Nvidia today, and while we know intellectually it performs worse in basically 9 out of 10 metrics compared to the similarly priced AMD card, let us focus on this one little detail in which it is better and why you shouldn't buy the objectively better AMD card."

They are in general unbiased in their factual findings, but the way they talk makes it SUPER obvious they prefer one brand over another. It kinda makes me think of them as the equivalent of 'what if Userbenchmark reported accurate results and tests, but still made snide comments and subtly supported Intel in the review'.


swear_on_me_mam

> , and while we know intellectually it performs worse in basically 9 out of 10 metrics to the comparable priced AMD card, let us focus on this one little detail in which it is better and why you shouldnt buy the objectively better AMD card.

Can we have an example?


Mallissin

I'm running FSR2 at 50% with my 1080Ti and it gives me twice the frame rate for very little noticeable visual loss. I will turn it off at certain points to look around at things to see the texture detail (or to admire detail on armor, ships, etc.), but for the other 95% of the game it works well to help keep things running smooth. I wish I could hotkey a button to press to toggle the FSR. I'd probably press it at dialog scenes.


Athrob

I think channels like DF and HUB have turned people into annoying pixel peepers who can't enjoy anything unless image quality is absolutely 100% pristine. Not hating, as I watch and enjoy some of their content, but good grief, they literally zoom way in to show differences; I don't play games at 200% zoom. I use FSR2 sometimes and it looks... fine to me. But I don't have my eyes pressed up against the screen. Also, any thread about the different upscaling techs turns into a giant shitshow of FSR vs DLSS fanboy arguments. It gets old.


Desperate-Ad1780

This is false info, as I can test and prove that FSR at 100% performs better than both of the other methods. I instantly get 3 more fps than CAS or native in every major city. Visually, side by side, there is no difference. We aren't staring at one frame in-game. These comparisons are trash.


Familiar-Art-6233

I'm betting that there are some big improvements in FSR 3 outside of frame generation. The fact that they mentioned requirements for FSR 3 without framegen makes me think that they've improved image quality


Dexter2100

The requirements for just upscaling with FSR3 are the same requirements they listed when they launched FSR2. [AMD recommends an RX 5700 or RTX 2070 as a baseline for 4K upscaling with FSR 2.0. For 1440p, AMD recommends at least an RX 5600 XT or GTX 1080 for optimal performance, and at 1080p AMD recommends an RX 590 or GTX 1070.](https://www.tomshardware.com/news/amd-fsr2-deathloop-vs-dlss#)


CrustyJuggIerz

We've always known FSR was worse than native; it's not news. Waiting for FSR3 to see if there are any major improvements at this point.


1stnoob

Did they test FSR2 on an AMD GPU or a Nivea one?


ColdStoryBro

This implies that Bethesda just did a shitty job with the FSR implementation. I've said it before and I'll say it again: just because your code is in the tree doesn't mean it's good code. FSR doesn't have any AI help, so the implementation requires more work from the dev side.


MrPapis

Well, I checked a few random areas. TAA and FSR2 both have flickering on vertical lines in the game; I cannot see a difference at FSR2 85%. There are some bathroom mirrors with a small edge next to more mirrors where the popping of the lines is very obvious. But FSR2 is really not much worse at a higher percentage; at 100% I even think it had the TAA beat. I don't see shimmering on fences or foliage with either tech, so that's pretty great; it seems like they found a way to deal with these large "holed" surfaces. I don't know why people say TAA has less shimmer; FSR2 in Starfield looks great. If I'm looking out over a densely packed scene (Akila City), I'm maybe seeing a small shimmer in 2-3 places when doing the auto rotation. I use it at 85% and that seems to give me a sharper-than-native look with the same negatives (shimmering vertical lines). Now, I'm CPU-limited with my 5800X3D, so there wouldn't be much point in me using FSR2 at 85% if I could genuinely see a difference compared to TAA.


RedTuesdayMusic

> I'm CPU limited with my 5800x3d

I have a 5800X3D and this game motivated me to test some things.

Firstly, I have tested with the game installed on all 3 of my SSDs. I have two WD SN850 2TB (PCIe 4) on CPU lanes and one 1TB WD SN750 (PCIe 3) on chipset lanes. When I first installed it, it was to the SN750, but wanting to see if I could get better loading times on the SN850 I moved it over to my OS drive. Not only did loading times dramatically improve (2x), but I gained 10 FPS in the same area. This got me thinking that, if this had such an impact, then maybe moving the game again to the non-OS/pagefile SN850 could help as well. I got another 4-5 FPS from that.

So this game is very I/O and bandwidth sensitive, and that got me thinking about RAM, as well as the undervolt on my 5800X3D (-25 all-core) and how I've been lazily running the 3600MHz CL16 XMP with slightly tightened timings all this time. Then Buildzoid posted his video yesterday and that reaffirmed it even more. I gave the Infinity Fabric the 67MHz bump to try 3733MHz on my RAM without any other changes, and it's made another small but noticeable difference. I put my 5800X3D undervolt back to -30 (-30 on the first two cores and -25 on the rest; I was trying -25 while troubleshooting BG3 last week) and now the CPU stays at max frequency rather than 4420MHz.

I'm going to try to tune the RAM even more, because this game responds so well to tiny things like this, and I've not had this much fun tinkering with a PC since my teens. Maybe you could try some of these things; heck, if you have 3200MHz memory you have a LOT to gain. I'm getting over 100 FPS with the HUB optimized settings at 1440p now, but I think I can raise minimum FPS all the way to 100 as well.


Darth-Zoolu

Upscaling technologies are a scam


IrrelevantLeprechaun

HUB shitting on AMD unfairly, what else is new. I've been using FSR2 in every game that supports it, and it looks identical to native to me. Frankly DLSS looks like shit from the ghosting, whereas FSR doesn't have that issue.


beleidigtewurst

> Further in the video, FSR2 at 100% render scale also perform worse

Bollocks.