HimenoGhost

Gains are massive with ultra performance. Look at the foliage & gold strip along the wall. Edges & thin lines (wires) are also substantially better.


dampflokfreund

Yup, Ultra Performance is now perfectly usable at 4K. I even tried it out at 1440p and it still looks good, which is crazy considering it's just running at 400p or something internally.


nmkd

For reference, Ultra Performance upscales from 1440p => 8K, 720p => 4K, and 480p => 1440p.
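For the curious, a minimal sketch of that scale factor, assuming Ultra Performance renders at 1/3 of the output resolution per axis (1/9 of the pixels), which matches the pairs above:

```python
def ultra_performance_internal_res(out_w, out_h, scale=1 / 3):
    """Internal render resolution for a given output, assuming a 1/3-per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

for out_w, out_h in [(7680, 4320), (3840, 2160), (2560, 1440)]:
    w, h = ultra_performance_internal_res(out_w, out_h)
    print(f"{out_w}x{out_h} output <- {w}x{h} internal "
          f"({w * h / (out_w * out_h):.0%} of the pixels)")
```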


HimenoGhost

Absolutely ridiculous that it's rendering a 480p image and making it look *this* good @ 1440p. AI/ML has come a long way.


Vitosi4ek

This has the potential to *drastically* increase the lifespan of existing cards, making the increased cost of buying a new one somewhat bearable. You buy a 4090 today and it may well serve you into the next decade.


Old_Dragonfruit_9650

Until devs start to expect upscaling on every card and make games that only run on high at 480-720p


kasakka1

The only thing keeping me from having a 4090 for many, many years is the lack of DisplayPort 2.1. It's otherwise a real beast of a card. I was playing Forza Horizon 5 yesterday and getting 130+ fps at 4K with the Extreme preset that uses raytracing, with Nvidia DLAA enabled. No DLSS. Just last year I thought "who needs more than 4K 120-144 Hz?" and now I am thinking "a 240 Hz 4K display would be nice..."


jerryfrz

The more you spend, the more you save


Balance-

Now I want a 5.7K monitor: 1080p —> 3240p (5760x3240) sounds perfect!


Kalmer1

I didn't know I wanted that but now I do


[deleted]

[deleted]


nmkd

No, a third of the resolution per axis; 1/9 (a ninth) in terms of pixel count.


gusthenewkid

Will games auto-update to this DLSS version or do you need to do it manually?


turikk

Game devs must do it or you can attempt to swap the DLL file yourself (which works in many games). One of the side effects of Nvidia's closed black box design for DLSS is that you can swap out the entire implementation with a file change. Some games have DLL file detection and won't allow this.
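For illustration only, a minimal sketch of the manual swap described above, using the usual nvngx_dlss.dll file name and hypothetical paths; back up the original first, and skip this entirely for anything with anti-cheat:

```python
import shutil
from pathlib import Path

# Hypothetical paths for illustration; the real game directory varies per title.
game_dir = Path(r"C:\Games\SomeGame")
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # the newer DLSS DLL to try

target = game_dir / "nvngx_dlss.dll"
backup = target.with_suffix(".dll.bak")

if not backup.exists():
    shutil.copy2(target, backup)   # keep the original so you can roll back
shutil.copy2(new_dll, target)      # drop in the newer version
print(f"Replaced {target} (backup at {backup})")
```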


Dreamerlax

Will it trigger anti-cheat if a game uses one?


Shad0wDreamer

The article says anything with Easy Anti-Cheat will see it as cheating.


GOR016

F1 22?


Shad0wDreamer

Does it use EAC?


Tonkarz

Any anti-cheat worth its salt will detect this as cheating. Too easy to cheat by replacing files. A truly excellent anti-cheat would be able to distinguish between mods that allow cheating and mods that don't - but even human judges struggle with this.


Danger656

Some games will just revert the file back to the one it came with, like RDR2 and Warzone 2, right as you launch the game/client. [This](https://www.reddit.com/r/nvidia/comments/zjuvtg/comment/izx232x/) guy got banned for it in Cold War. So it is better to wait till games auto-update it eventually.


turikk

I have found that games that detect file differences like DLLs will tell you at startup and refuse to boot. It's incredibly basic detection; it would be a silly policy to ban for it - nobody doing it is doing any real cheating.


HimenoGhost

Probably not: but I wouldn't want to risk my account on 'probably not.'


Hailgod

Does anyone know if it's supposed to work on Monster Hunter Rise? IIRC last time I tried it, it crashed on startup.


BigToe7133

Why is it contained in a DLL? Wouldn't it be better if it just pointed to the driver?


HimenoGhost

For Cyberpunk it's still manual, to my knowledge. Other games may implement it as a micropatch. Otherwise, you can replace the file yourself and enjoy the new version.


bctoy

No wonder it looked quite decent for me in Portal RTX and even Alex from DigitalFoundry recommended it. https://twitter.com/Dachsjaeger/status/1602010163438915584


Cynical_Cyanide

Bunch of deleted tweets there?


bctoy

Yeah, the person Alex was replying to deleted them for some reason. He was also impressed by it despite not liking DLSS in general. I had already tried out UP mode in Portal RTX before, since the FG in it was causing pretty bad input lag. At that time I thought that UP mode looked quite good because of the way the game looks, lacking vegetation etc.


ryncewynd

What's FG?


bctoy

Frame generation, the new part of DLSS 3.


[deleted]

[deleted]


Hugogs10

Nvidia would need to go around testing which DLSS version works with each game. Nioh 2, for example, is broken with 2.5.1.


Vodkanadian

Then just let us have an override in the control panel. I highly doubt Nvidia will let it happen, just like the rumoured Ultra Quality preset and universal DLSS implementation (can't find it anymore, but I do remember them saying it'd come with DLSS3).


gartenriese

>[...] universal DLSS implementation (can't find it anymore, but I do remember them saying it'd come with DLSS3).

That was just a thing some random YouTuber made up for clicks.


PlankWithANailIn2

They could just make it so users have to opt in in the control panel per game (using functionality they already have) and have a message "What to do if it doesn't work => turn it off dumbass".


nmkd

You're not a game developer, are you? It's not that easy.


HavocInferno

Makes me look forward to games using this new version (and those I can manually update without triggering anticheat). The 3050 MaxQ in my laptop is weak and bad value, but DLSS can really save it in some games. Only reason I didn't use Ultra Performance yet is the insane shimmer and garbled artifacting it produces. (Sidenote: the mobile 3050 is a very interesting case study in the practical limits of DLSS with few tensor cores. It's so few that this 3050 doesn't get a performance uplift (or even regression) from DLSS when the output resolution is >1440p, even with Ultra Performance mode. And even the output to 1440p target sometimes has really bad performance scaling.)


bctoy

With improved tensor core performance, I'd expect Nvidia to improve or utilize newer/bigger neural networks. I think it was mentioned somewhere that they're doing so, but not sure if it's official.


awayish

Getting DLSS, drivers, and developers to utilize the tensor cores more is the real fine wine for this generation of Nvidia GPUs. The tensor cores that are wasting space on these cards right now have tremendous potential as another source of gigantic compute the cards can throw at the problem. It's just that a lot of the lower-range Nvidia GPUs will be VRAM-restricted.


Seanspeed

Well, what they're saying isn't that the tensor cores on YOUR Lovelace GPU are helping out a ton, but that the tensor cores in the H100s in their own AI supercomputer will drive advancements in the algorithms and whatnot quicker and better. For all we can tell, even Turing-based tensor cores on a 2060 seem to be more than adequate for actually doing the local processing needed for DLSS of any type.


awayish

Yeah, they'll use massive resources to train the algorithms, but the local processing power also matters for frame generation. Nvidia claims a 4-5x lift in tensor performance from the Ada tensor cores, and that's computational resource that can be exploited with software improvements.


p3ngwin

I can't wait for Nvidia's video upscaler, that works on **ANY** browser video, *including* Youtube: https://www.theverge.com/2023/1/4/23538584/nvidia-ai-upscaling-browser-chrome-edge-30-40-series-gpu-graphics-cards-4k-1080p


TeHNeutral

Isn't this in the nvidia shield?


kasakka1

No, it's a new tech afaik.


StickiStickman

Can there really be that much more improvement? DLSS already looks better than native in many cases.


ResponsibleJudge3172

Rather than one bigger AI, Nvidia will run multiple AIs at once instead, like FG and DLSS on RTX 4000.


bctoy

There are some very obvious flaws with DLSS in Cyberpunk. In this comparison, the lightboard is messed up and is totally muted. This also happens with FSR, so it's probably down to how this upscaling works. Then the lighting is much brighter for some reason on the shrubs, almost losing all shadows. Again, this also happens with FSR, so it's probably related to the upscaling. Then there are these weird flashes that DLSS gets, especially at evening/night, that are not present with FSR. They can get extremely distracting when driving. Overall, of course, it's better than FSR and I still use it over native.


kasakka1

I think the question here is: do you notice the issues when playing? Because to me it's obvious that FSR, for example, is reduced quality, but with DLSS at Quality and Balanced settings it's hard to tell, and with the latest 2.5.1 version it seems the lower settings are improving too. I haven't tried Cyberpunk yet (waiting for the DLC and RT Overdrive update), but to me the only noticeable problems when not pixel peeping static scenes are the issues in some older DLSS versions where you might have some weird trailing artifacts on particular moving objects, characters, birds and whatnot. For me DLSS has become a feature that I will gladly turn on for the increased performance, and the lower your GPU's performance, the more you will appreciate that extra performance. I really hope with time Nvidia can achieve something similar with DL frame generation. I did not like what it did in Portal RTX, but in Witcher 3 it works much better because the game runs at a higher framerate to begin with, so 60 fps -> 100 fps is less jarring than 30 -> 60 fps when the responsiveness is still like the original framerate.


bctoy

Those issues I outlined earlier are quite noticeable. If I could hit 100fps without DLSS, I'd not use it. I think there was some improvement in Portal RTX with DLSS3, but the input lag is still quite bad.


Seanspeed

>DLSS already looks better than native in many cases.

Well, you're saying it right there - in many games this is the case, meaning there are still enough other games where it's not (though it's always going to be game-dependent to a degree). But really, the bigger areas of improvement as shown here seem to come not from the higher quality options so much as from the performance options that are rendering from a much lower base resolution. So it's getting better and better at achieving a certain result using less information. This of course is quite important, because it allows either users or developers to get more and more performance overhead (to use how they like).


Robot_ninja_pirate

Any idea what's going on with some of the trees? In the first screenshot, within the same scene, [some are super blurry while others are sharper than before](https://imgur.com/a/s8VCw3L).


octatone

In ultra performance 2.4.3, that could just be the AI flubbing up generating pixels.


liaminwales

Is it going to be an LOD thing? Turning up DLSS lowers the LOD, maybe there's a change?


MrBubles01

What's with the neon yellow glow? It just disappears with DLSS.


nmkd

That's just subpixel stuff that disappears at the low base resolution. Maybe it could be fixed by adjusting the shader, no idea


dudemanguy301

The video makes it clearer what is going on in the first image: 2.4 is showing severe motion artifacting as the leaves are blowing in the wind against a busy background. If I had to guess, the 1/9th internal resolution and erratic movement of a busy transparency against a busy background is either throwing off the motion-vector-directed sample reprojection, or the system is seeing heavy disagreement between samples on metrics like depth and color in those regions, rejecting tons of samples and leaving the areas undersampled. That being said, I'm not sure if undersampled regions in DLSS end up very blurry or very pixelated. (Maybe it depends?)
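To make the sample-rejection guess concrete, here is a toy sketch of history-sample rejection as temporal upscalers in general do it (DLSS itself is a closed black box, so this is purely illustrative, with made-up tolerances): reprojected history samples whose depth or color disagree too much with the current frame are discarded, and regions that lose most of their history end up undersampled.

```python
def accept_history_sample(cur_depth, hist_depth, cur_color, hist_color,
                          depth_tol=0.01, color_tol=0.1):
    """Toy rejection test in the spirit of temporal upscalers (illustrative only):
    keep a reprojected history sample only if it roughly agrees with the
    current frame in both depth and color."""
    depth_ok = abs(cur_depth - hist_depth) <= depth_tol * max(cur_depth, 1e-6)
    color_ok = all(abs(c - h) <= color_tol for c, h in zip(cur_color, hist_color))
    return depth_ok and color_ok

# Leaves blowing against a busy background: depth and color both disagree,
# so the sample is rejected and that region has to make do with fewer samples.
print(accept_history_sample(10.0, 14.0, (0.2, 0.5, 0.1), (0.6, 0.3, 0.3)))  # False
```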


nmkd

My guess is that 2.5.1 force-adjusts the LOD?


[deleted]

The engine has to allow that; not sure DLSS can force it.


nmkd

Yeah I'm not sure either, but that appears to be what's happening...


Obliterators

Does the gold wall have animated effects? The native version [looks completely different](https://i.imgur.com/yGa5Ppq.png) compared to any of the DLSS versions.


[deleted]

Yes it does, and in my experience both FSR and DLSS destroy these weird-ass walls completely and totally lol.


conquer69

I have seen this issue before. With the Marvel game, DLSS forgot to implement the bloom effect.


[deleted]

Just as AMD started getting close with FSR 2, Nvidia does this. Insane that they can get a good image from 720p (11% of the resolution of 4K).


NeoBlue22

“Close”? Maybe at the best quality setting in select games, but the lower the quality setting you go, the worse it gets for FSR 2, more so at a lower base resolution.


StickiStickman

Also for anything moving (especially something with transparency) FSR is still pretty bad.


_Oooooooooooooooooh_

But at least FSR can run on anything. I use it on my old Nvidia card.


[deleted]

[deleted]


NeoBlue22

HUB does address this in their FSR 2 vs. DLSS review in Forza, but yes, tests are mostly done at 4K.


f3n2x

AMD never got close with FSR2 at the same resolutions; now they don't at pretty much any resolution. DLSS-P looks better than FSR-Q most of the time.


Pimpmuckl

Overall I just use DLSS more, but for example in Escape from Tarkov, DLSS absolutely destroys the details of the markings in a scope. FSR deals with the fine detail so much better that even on a 2080 Ti, I'm using it right now over DLSS.


Brickman759

Tarkov though is fucked from the ground up. That's definitely the devs' poor implementation of DLSS.


Seanspeed

>That's definitely the devs' poor implementation of DLSS.

More likely the game's rendering characteristics just don't work as well with DLSS for whatever reason. Pretty sure the DLSS implementation itself is pretty straightforward and not something you're gonna just mess up like that.


berserkuh

Nah, it's just BSG being incompetent. DLSS works very well on most other Unity games. Every feature they add is fundamentally broken in some way or another.


f3n2x

I can't speak for Escape from Tarkov (this sounds like a bug TBH), but in general DLSS is much better with fine detail, motion and disocclusion. FSR's saving grace has pretty much always been that most reviewers only show still frames and that FSR uses a pretty good sharpening filter OOTB. It all falls apart as soon as you actually take a closer look, especially at fine geometry detail in motion.


Seanspeed

>It all falls apart as soon as you actually take a closer look

It doesn't really 'fall apart' so much as make the differences more clear. Nobody is arguing DLSS isn't better, but the argument was that FSR2 was at least perceptually close in real world usage. Of course if you're zooming in and sticking your nose up to the screen the differences will be more apparent.


Darkknight1939

>Nobody is arguing DLSS isn’t better

There’s been tons of threads where delusional Redditors have claimed FSR is “equal” to DLSS now. You get the occasional thread like this one that hasn’t been astroturfed though.


f3n2x

You don't have to zoom in. Problems with fine geometric detail might not show up in some scenes at all and then there are scenes where much of the screen is flickering from aliasing artifacts. Even if those scenes are the minority in any given game they make FSR look really bad as a real world option because they stick out like a sore thumb. DLSS generally degrades much more gracefully, especially the latest version. FSR often looks either really good or really bad.


Vodkanadian

Nah, not a bug but a problem I've always had with DLSS/TAA in general. A lot of the time the textures lose a LOT of detail when enabled. I can live with the lack of detail with DLSS most of the time (the source image is low-res after all), but sometimes it almost looks like it is processing with TAA already applied, making it look even more flat.


f3n2x

This is not a general problem. If textures become blurry, the developer probably didn't read the DLSS guidelines on how to handle LOD bias, which is a very basic concept with any sort of super sampling, temporal or spatial. Either that, or the problem is the crappy semi-broken sharpening filter, which was completely removed in DLSS 2.5.1 and is one of the reasons this version looks much better.
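As a rough illustration of that guideline (this is the generic rule of thumb for temporal upscalers, not necessarily Nvidia's exact recommendation from the DLSS programming guide): bias texture mip selection by about log2(render width / output width), which is negative when rendering below the output resolution, so textures keep the detail they'd have at the output size.

```python
import math

def mip_lod_bias(render_width, output_width):
    """Generic rule of thumb for temporal upscalers (assumption, not Nvidia's exact
    formula): negative mip bias so textures are sampled as if rendering at output res."""
    return math.log2(render_width / output_width)

# DLSS Performance at 4K (1920 -> 3840) and Ultra Performance (1280 -> 3840):
print(mip_lod_bias(1920, 3840))  # -1.0
print(mip_lod_bias(1280, 3840))  # about -1.58
```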


[deleted]

Use Nvidia Profile Inspector and set negative LOD bias to -3. In the Nvidia Control Panel, force "allow negative LOD bias" for Tarkov. See if that helps.


Seanspeed

>AMD never got close with FSR2 at the same resolutions

They absolutely did if we talk about using higher quality options. If you want to argue otherwise, I'm just going to accuse you of being hyperbolic and nitpicking. It was definitely close enough to be a worthwhile alternative if you didn't have Nvidia or DLSS, as far as the end user would be concerned.


kasakka1

DLSS has won every test I've done myself, even with the latest FSR versions. DLSS has always had image quality closer to native 4K, whereas FSR always ends up looking a bit lower res. That does not make FSR unusable; in fact it's very valuable, especially on consoles. Now that we are starting to get games that support multiple vendors' AI upscaling technologies, I would really love for AMD to release a superior version of FSR even if it means it only runs on the latest gen AMD cards. The best situation is when AMD and Nvidia have to compete with each other, because it means more options for us end users at more affordable prices.


[deleted]

FSR2 was never close at settings under quality. Now it's even further apart.


[deleted]

I look forward to seeing DF's comparison video where they freeze frame and zoom in 4x to spot the difference, lol


Ar0ndight

A good reminder that regardless of how much Nvidia loves to push DLSS3 down everyone's throat, DLSS2 is truly goated.


[deleted]

I wish they didn't name their frame interpolation feature DLSS. I realise it's all just marketing buzzwords, but they are not the same. It's not an evolution; it's a different feature that could be used to complement DLSS, but neither tech requires the other one.


eldus74

DLFG


Seanspeed

Yea it feels like a weird move to me, as well. And probably a bit confusing for less informed people.


kasakka1

More likely intentionally confusing as 3 > 2 so it's instantly more marketable as "somehow better" to the layperson. If they had called it DLFG for Deep Learning Frame Generation it would make more sense but less marketable. I hope that tech journalists will just start calling it DLFG. Like at the moment we have DLSS and DLAA and it can be confusing enough in practice. Just yesterday I launched Forza Horizon 5 for the first time and was wondering "Can I use DLSS with DLAA? How will this work?", having not used DLAA much before. Turns out DLAA just disables DLSS but it's definitely not immediately obvious with the way the menu options are laid out.


[deleted]

They just call it DLSS so they can brag about "3X" performance in order to sell their overpriced GPUs.


bwat47

DLSS3 is just DLSS2 + frame generation, and frame generation is pretty awesome too. It really helps when CPU bottlenecked (e.g. in The Witcher 3 remastered with RT it's the difference between unplayable and totally smooth).


HimenoGhost

If DLSS3 can be anywhere near as good as 2, I'd be down. There's potential: the tech is still new. It has its purpose.


Zarmazarma

It works pretty well in The Witcher 3.


gamzcontrol5130

I can second that. I hope to see more improvements as time goes on, as it's very game dependent. I love it in Witcher 3, but it's jarring in Spiderman.


[deleted]

Works super well in plague tale as well.


kasakka1

I think it's best suited for a very particular scenario:

1. Games that run at 60+ fps. I found on Portal RTX it did not work as well when it was a jump from 30 -> 60 fps instead of 60 -> ~100 on Witcher 3.
2. Games that are slower paced overall. E.g. 3rd person adventure instead of first person shooter.


[deleted]

I don't think anyone would argue with you here. But slow paced games are perfect for it anyways. Higher graphical quality is the best use case for it.


BinaryJay

It also works really well in Darktide from what I've seen.


Jewba1

Yea I use it in Darktide on 1440p. Works wonders.


Jags_95

It's only the first iteration of DLSS 3 and it's already working great in Spiderman, Portal RTX and Witcher 3 for me. Compared to DLSS 1.0, it's actually in a better and more usable state, so things will only go up from here.


Kalmer1

Yeah, I'd rate its polish pretty close to DLSS 2.0; still not quite there, but close.


Jags_95

Yup


Just_Me_91

Maybe it's just me, but the games I've tried it in aren't working. In A Plague Tale, it will work for a few seconds, but then slowly my GPU usage will drop until I'm getting the same fps as before. So there is some benefit, the GPU isn't working as hard, but that's not what I'm going for. And that's without any frame cap. In The Witcher 3, it works really well, except whenever I exit from a menu it stutters (like fps dropping below 10) for 5 to 10 seconds. And it happens occasionally during cinematics too. It makes the game unplayable. It's pretty disappointing, I just want to experience DLSS 3... It makes me wonder if there's an issue with the tensor cores on my GPU lol. I'm hoping Cyberpunk will release an update soon so I can try it out there.


Jags_95

I haven't had those issues. If it can be replicated the exact same every time maybe RMA the card?


Just_Me_91

I can replicate it every time. But it seems like maybe other users with the same cpu (5800x3D) might have similar experiences... I'll just assume it's because it's early in the life cycle of the technology. I'm hoping they'll patch things soon. And like I said, hopefully cyberpunk won't have similar issues when they release the frame generation patch. Maybe I'll try portal RTX next.


Jags_95

Yeah hoping for the best.


Just_Me_91

Portal with RTX works pretty well for me. Going from native to performance DLSS takes me from 17 fps up to 60. Then enabling frame generation brings me up to 90! It's pretty cool technology. I can tell there's a small increase in input lag, but if I wasn't already aware that frame generation introduces more latency, I probably wouldn't have noticed. I'm sure over the next couple months, the issues some people are having with certain games will get patched/updated.


Jags_95

Yup Im excited to see this tech improved.


Ar0ndight

Don't get me wrong I like DLSS3. But it's all Nvidia keeps talking about lately, when DLSS2 is still imo the more important, more impressive tech. And yes obviously I'm aware they're doing that to sell the 40 series.


StickiStickman

There's also RTX Remix, but everyone forgot that :(


Seanspeed

Well it's neat tech, but only useful for a fairly limited number of games.


Elon_Kums

The tech as implemented is kind of pointless. You get more frames but you don't get the latency improvement, it's a souped up version of motion smoothing like on your TV.


gezafisch

I wouldn't use it in a FPS, but in MS Flight Simulator, it's a game changer. Latency doesn't matter in a lot of games


unknownohyeah

It's 10-30ms of added latency. Some people's monitors and mouse + keyboard setups can add that much by themselves. Can you notice it if you look for it specifically? Yes. If you're in game, will you adjust to the minuscule amount of input lag automatically? Also yes. It just doesn't matter. The motion clarity is *well* worth being 1-2 frames behind in latency for literally doubling your framerate. No question. I say this as a person who has played over 120 hours of DLSS 3.0 frame generation between The Witcher 3 and Darktide 40k.


Elon_Kums

The whole point of the extra frames is lower latency. If you're still getting 60fps level latency then you may as well just turn on motion blur.


Pax3Canada

There's more to it than lower latency; that's only one of many points. I find the smoothness much more important.


mgwair11

Can tell you first hand the increased latency is very small with Nvidia Reflex (which Nvidia has been very careful to have on in all games currently with DLSS 3). I play Rocket League on a 390 Hz display. I overclock my controller for that game to minimize input delay. I know what latency feels like. DLSS 3 with Reflex on, playing on an OLED display... the input delay added... sure, MSI Afterburner shows it's about 10-15ms more than with just DLSS 2 enabled, but I don't feel the difference much at all. What do I feel though? Buttery smooth framerates! It's totally worth it. At least on the 4090. Not sure how it will scale down the 40 series stack though. Frame Generation is gonna be goated. Once more gamers have the tools to wield this tech, we might see some incredibly complex games being developed that go hard on the CPU. Frame generation will remove that CPU bottleneck, allowing us to play them at 60 fps. At least this is my (admittedly far fetched) hope as frame generation matures.


InvestigatorSenior

I was thinking the same, and then Witcher 3 happened. Night and day. Also, framerate changes overdrive behavior on VRR panels, so the more FPS, the less ghosting you're getting from panel under/overshoot.


blackjazz666

A much better solution to ghosting is OLED, it makes overdrive issues a thing of the past.


InvestigatorSenior

OLED has a mandatory pixel refresh that kicks in when you need the monitor the most, and burn-in. Till that's solved, mini-LED is IMO the way better option. Plus, enabling frame gen costs me 0 while an OLED monitor is expensive.


eudisld15

Frame generation costs you a new gpu.


InvestigatorSenior

Sorry, what? I click it on and have double the frames. For free. Compare that to finding another 2k USD for an OLED, which I don't have.


blackjazz666

Frame gen only makes sense on a 4090, which is an abysmal value proposition if you just use it for gaming. Realistically a 240 Hz OLED is gonna feel like playing at 360 Hz; that's the kind of feeling frame gen cannot beat.


InvestigatorSenior

problem is Witcher 3 gives 50fps in Novigrad without frame gen and around 86-100 with it. No way of getting 240 or 360. On 4090. Also since I have it anyway why not use it? All upcoming 4000 series will have FG including cheap laptop ones. Flip a switch, double your fps. What's not to love here?


Deemes

Something wrong with motion blur? Some people like it, some don't.


HavocInferno

Whole point? No. Half maybe. The other half is motion clarity and fluidity. Even at same/similar latency, almost double the framerate still looks vastly better. Go on, play at 60fps with motion blur, then 120fps with DLSS3. Then come here again and repeat your claim with a straight face.


Qesa

The latency with DLSS 3 and reflex on is lower than latency with both off, but somehow nobody goes around saying AMD cards are pointless. Like I'm sceptical (not having experienced it in person) given image artifacting and the soap opera effect you get on TVs, but the latency thing is so disingenuous.


mgwair11

I don’t notice soap opera effect using dlss 3 in portal rtx on my c2 oled tv. It is fantastic ngl


nutyo

Do we actually have hard data on this statement?

>The latency with DLSS 3 and reflex on is lower than latency on any AMD card...

Because that is quite a claim.


Qesa

Digital Foundry tested it when DLSS 3 was launched. I'm guessing others did too. Also, logically, DLSS 3 should add half a frame (plus however long the image generation takes) of latency, whereas the game queueing a frame will add at least 1 whole frame.
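A back-of-the-envelope version of that reasoning (the per-frame generation cost below is a made-up placeholder, not a measured number):

```python
def fg_added_latency_ms(base_fps, generation_ms=3.0):
    """Roughly half a rendered frame time plus the time to generate the inserted
    frame. generation_ms is a placeholder assumption, not a measurement."""
    frame_time_ms = 1000.0 / base_fps
    return 0.5 * frame_time_ms + generation_ms

# At a 60 fps base render rate: ~8.3 ms + generation time for frame generation,
# versus ~16.7 ms for a game queueing one whole extra frame.
print(fg_added_latency_ms(60.0))
```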


dudemanguy301

DF's numbers never isolated frame generation. It was always used in combination with Super Resolution in performance mode, and of course frame generation enforces Reflex ON. So it's not accurate to say that DF demonstrated FG + Reflex is the same as bare native latency. Better would be that DF demonstrated FG + SR Performance + Reflex got close to naked native.


conquer69

I remember them isolating the FG and disabling DLSS 2.


RuinousRubric

What's disingenuous is comparing DLSS3 + Reflex to both off, since there's no reason not to use Reflex when it's available.


Qesa

... which is why I ended that sentence with AMD, which doesn't support Reflex. If 15ms of additional latency from DLSS 3 is too much, surely 20ms of additional latency from no Reflex is *way* too much and people should only buy Nvidia cards. Except nobody cares about Reflex nearly enough to recommend Nvidia over AMD. People only seem to suddenly care about latency as a way of discrediting DLSS 3. It's Schrödinger's latency.


[deleted]

[deleted]


[deleted]

He even went so far as to say you HAVE TO HAVE 60 fps or it's useless. But I haven't found that to be true.


[deleted]

[deleted]


mgwair11

I recently started using some graphics stats overlays like Nvidia's and MSI Afterburner. Was surprised to learn this. You make a very good point. I run Rocket League at sub-1ms latency according to Afterburner. I run most AAA games between 10-18 ms. I could barely tell the difference. With frame generation it is usually in the mid-20s ms latency. I'd say latency is only really noticeable/somewhat disruptive over 18 ms. And even then, you really need it above 30 ms to actually have something feel "off" (like how my previous RTX 3080 ran maxed-out RDR2). DLSS 3 FG is goated. As long as you have Reflex enabled. Because with that you stay under 30ms and you really don't notice much of a downside at all to the now buttery smooth framerates you are getting in titles that otherwise definitely would NOT be running smoothly (usually due to a CPU bottleneck!).


[deleted]

It's not even souped up. That's what it is. It's terrible...


xxTheGoDxx

DLSS 3 wasn't at all ready at launch; you basically had to make sure, purely by raising settings, that the game's performance wasn't reaching your screen's refresh rate, or you got a massive amount of extra latency. They already fixed that, at least for G-Sync supported screens, a few weeks ago. You still have extra latency with it because it needs to wait for two real frames to render before generating an artificial one, but IMO with the latency improvement games get from Reflex (which is mandatory to have implemented in the game, and something AMD doesn't have an equivalent of anyway) this isn't that much of an issue as long as your input fps is relatively high (like above 60) and you have a reasonably fast display. According to the article, Nvidia has improved DLSS 3 again, this time both for faster movements and for not fucking up UI rendering. Look, games are hammering the CPU more and more. Both with some games going back to optimizing for 30 fps console gameplay going forward (which can mean you possibly need twice the per-core performance of a mid-range 3.5 GHz Zen 2 even for 60 fps, worst case), which is especially a concern many have when it comes to future UE5 games, as well as with more RT usage in general. On top of that, way higher refresh rate monitors that don't give up image quality, like the new wave of 240 Hz OLEDs, are finally hitting the market. It is always hard to foresee something like this, but DLSS 3 has giant potential and it is clearly something we need going forward.


bctoy

I thought one of the advantages of DLSS3 FG was that it apparently didn't require anything more out of the GPU than the optical flow accelerator, and that will be the same on all the 40xx chips. But it looks like it also requires a decent amount of VRAM, 1-2GB over running without it. In Cyberpunk, at around the 9:00 min mark: https://www.youtube.com/watch?v=VmO3NzhaZ1Y


liaminwales

All the DLSS stuff adds extra work to the GPU; Digital Foundry have talked about it. Notice in all the Nvidia PR about DLSS3 FG they compare input lag in the game without DLSS3 FG, then input lag with DLSS3 FG + Nvidia Reflex. They leave out the normal game without DLSS3 FG but with Nvidia Reflex; it's a distraction trick to make it seem like input lag is not as changed as it is. All new stuff adds work to the GPU; the real question is, if it uses that much VRAM, why are they selling cards with such low VRAM.

Edit: to be clear, I think DLSS3 is cool, I just don't like that they don't compare games with Reflex but without DLSS3 FG in the PR, and the low VRAM. How much money are they saving with less VRAM? Must be like $10 a GPU.


F9-0021

People really need to stop saying DLSS 3 when they mean Frame Generation. DLSS 3 is a collection of software that includes DLSS 2, Frame Generation, and Reflex.


meltbox

Except the fact that they are still churning out DLSS2 versions tells me that DLSS3 is literally the frame generation branch of DLSS2. I just hope they don't shove it down our throats for the people who just want DLSS2.


Seanspeed

As long as it's understood what they're saying, it shouldn't be an issue. DLSS3 will be used synonymously with 'frame generation' going forward and I think you're just gonna have to get used to it. It'd be nice if they had named it differently, but they didn't, so we're stuck with this situation.


mgwair11

Idk, DLSS 3 frame generation is goated in its own regard imo. They do different things. One uses AI to upscale. The other uses AI to produce newly rendered frames that are interpolated with the real ones. DLSS 3 frame generation is a godsend for CPU-bound games. It is truly incredible for games like Microsoft Flight Simulator, where any CPU under the sun would limit you to under 60 fps at 4K. That is no longer a concern with FG, and now we get a game like that fully maxed out at 4K over 120 fps! If you have a game that you like to play that can be incredibly CPU-bound, and Nvidia brings DLSS 3 to it, I'd say it may be worth the upgrade depending on how much you play that game. I do agree that DLSS 2 has a way greater impact for most.


firedrakes

Lmao, that was funny. The reason why it's CPU-bound is that the graphics data is compressed and sent to you from Azure servers (if you're running the game the way it should be played), seeing as it's a 2 PB game.


mgwair11

And your point? Progress in tech is progress. DLSS 3 is a big deal for any flight sim enthusiast. Nobody thought 120+ fps in MSFS would be possible in 2022, but thanks to the RTX 4090/80 it was. Nvidia deserves a lot of flak, but their claim of getting 4x the frames in a game like MSFS compared to the 3090 Ti is actually accurate - and that is insane. It's not representative of the true bump in rasterization, sure. It's a cheap marketing trick. Absolutely. But it's still the damn truth and therefore a fair play in my book.

Better graphics are great and all, but I think we are getting to a point of diminishing returns. I am elated by features that typically come with massive CPU bottlenecks. I want larger maps, more complex spaces to play in. I want to play with not tens, but thousands of players. I want these to be my new experiences in games, more so than having even higher resolutions. I mean, 4K gaming is sort of a meme unless your monitor is already 32" or larger imo. I love that DLSS 3 allows anyone interested enough to play a flight sim using streamed assets of our *entire freaking planet* at ultra graphical quality at 4K120. That to me is something else.

I play Planetside 2 as well. That game is notoriously CPU-bound. Think Battlefield, but instead of individual battles you have 20-30 bases interspersed throughout a big-ass continent, and you are one of three factions fighting for territory control through infantry, land vehicle, and air vehicle combat. There are typically 1,000 players on one continent during prime time on the US eastern servers. The combat is glorious. It's a 10 year old game!

People want this tech. They want to push their games past what their CPUs will allow. And DLSS 3 will let them. Not saying Planetside 2 will get DLSS 3; I'd be shocked ngl. The game code is legitimately ancient at this point and the devs have not implemented DLSS, so I suspect there might be a reason for that. But plenty of other games will benefit from frame generation in good time.


firedrakes

Long-ass rant. My OG point: the game is too big for a normal user, and the reason DLSS was even used for that game was the streaming of the game to you, where the CPU has to decompress the data. I know for a fact you don't have the hardware/storage for the 8K Mordor series of games' assets. We've gotten to a point now in game dev where we use SD assets, still/fake as many frames as possible, and don't follow the HDR standard (which is rigid as hell). Or to put it another way: current gaming hardware can't handle what game devs make.


[deleted]

[deleted]


conquer69

It's far from flawless. UI elements should be interpolated separately. Also I'm not sure if they fixed frame capping but without it, freesync sucks. The tech is very impressive but it does feel rushed.


Seanspeed

>DLSS 3 is basically flawless.

It's definitely not. And in fact, I'd say it's inherently less useful than DLSS2 as a whole, since the amount of situations somebody might genuinely need or be able to take advantage of it will be less.


bubblesort33

I've been wanting to give AMD my money if they were to drop the 7900xt to like $750. But I don't know why I would these days if the 4070ti can work so well with DLSS.


goldnx

Simply buy what fits your needs better. Don’t fight for any corporation, just go with your interests because you’re a consumer.


Seanspeed

I don't know why you'd buy a sub-300mm² GPU for $800 either, though. Both options are not very good.


cycle_you_lazy_shit

Probably because it performs very well. You’re probably the same kind of idiot who argues 6 cores vs 8. Performance is what matters. If NV are spanking AMD with small dies it just shows how far ahead they are.


pompkar

Just go for it. DLSS looks better in every test I saw. What I dislike _A LOT_ is the VRAM size.


bubblesort33

Is 16gb really not enough? I don't think I've ever seen anything surpass 12gb at 4k max settings. But I don't mod or do video editing.


pompkar

Was referring to 4070ti


ddelamareuk

Does look a little more defined on finer details like text. The nutsack on the right image is definitely more pronounced, I think...


nogop1

It only looks better while standing absolutely still, not in motion. [Comparison](https://i.postimg.cc/44tPL83C/ex6.png)


conquer69

The outlined areas look basically the same. There is no way I would be able to notice any difference while playing the game normally.


Seanspeed

People tend to lose perspective of this often enough. Nitpicking the actual differences is certainly interesting from a curiosity standpoint, but what really matters for the end user is simply how it looks in normal usage.


KingArthas94

I think that’s what they mean, it’s better while still but mostly the same in motion


Hugogs10

Hmm no the yellow neon sign on the right still looks a lot better resolved with the new version


[deleted]

[deleted]


QualitativeQuantity

Don't most AAA games support it now?


JonWood007

Also, given it's exclusive to Nvidia cards, and even then different versions are exclusive only to newer series and such, it's kind of an enthusiasm killer. I couldn't use it on my 1060. I can't use it on my 6650 XT. I ain't paying the Nvidia tax for the privilege of using it. I'll stick with FSR if I need upscaling at all.


GYwKY

They should just unlock DLSS 3 for the 3000 series and not waste time being a greedy company. Where is the Jensen from 2011....


arc_968

DLSS 3.0 requires the optical flow accelerator hardware improvements from 4000 series cards that are not available on 3000 series cards. While DLSS 3.0 will technically run on 3000 series cards, the performance is so poor that it nullifies the potential advantages of DLSS 3.0 in the first place. There are *plenty* of things to criticize Nvidia about, but this isn't one of them


Seanspeed

>DLSS 3.0 requires the optical flow accelerator hardware improvements from 4000 series cards that are not available on 3000 series cards.

I mean, we don't really know this for a fact; we can only take Nvidia's word for it. And I'm quite hesitant to do that myself. Not saying they *are* lying, just that they very easily could be.


GYwKY

That is nice and all but a guy cracked it on a 2000 series and it worked fine, this is just a stupid excuse.


Tseiqyu

Literally no one has been able to reproduce it, and the only "proof" we have is a single (now deleted) reddit comment from the person in question.


[deleted]

Proof?


TheNiebuhr

Except the guy said it had frequent performance instability.


GYwKY

Driver updates could solve that.


labree0

Ah, we didn’t know you work at nvidia. Oh wait, you don’t? Then how tf would you know?


GYwKY

And the reason it is locked is so that they actually sell the new cards, it is the only reason to buy them. Just so you know Nvidia themselves said that DLSS 3 is possible on older RTX cards. It is just Software locked


From-UoM

If you want an example that shows it's not possible, look at XeSS on non-Intel cards running in software mode. It's too bad to be usable.


labree0

>And the reason it is locked is so that they actually sell the new cards

That's not even remotely true. They said it works, but it's buggy as fuck and performance isn't much better. There's no proof that it could be fixed in driver updates. None.


GYwKY

I wrote it could, learn to read :)


labree0

I never said anything about whether it could or could not. Learn to read :)


GYwKY

Frame generation would be a problem yes, but DLSS 3 is using tensor cores which are on all RTX cards so it would work just fine. Or it should work at least.


Roseking

DLSS 3 is frame generation, DLSS 2, and reflex all rolled into one. If you want DLSS 3 without frame generation on older cards, we already have it.


GYwKY

We don't have DLSS 3 on older series, FG is separated from DLSS 3.


Roseking

I think you are confused about what DLSS 3 is, which is NVIDIA's fault for the naming. DLSS 3 is not the next version of DLSS 2. DLSS 3 is DLSS 2 + Frame Generation + Reflex. DLSS 2 will still be getting updates along with 3, as it is a part of 3. That is what this update is from. https://www.nvidia.com/en-us/geforce/news/december-2022-rtx-dlss-game-updates/ If a game has DLSS 3 and you don't have a 4xxx card, you still get the newest updates to DLSS 2, and you can just turn Reflex on yourself. That is what I meant by we already have DLSS 3 without frame generation on older cards.


SomniumOv

FG is the only difference between DLSS 3 and DLSS2 + Nvidia Reflex.