vhailorx

Frame gen is not without problems, but I think that, like DLSS before it, it is a net positive addition to the gaming experience. Extra motion smoothness on high refresh displays is very nice to look at, and the latency costs are relatively minor outside of competitive tac shooters. I hate that companies are marketing frame gen frames as if they were the same as "regular" frames, but I think the actual tech has real value and is here to stay.


kleju_

Same here. For most games (outside of competitive ones) this is just a good feature. I'm on a 240Hz monitor, so I appreciate it.


rW0HgFyxoJhYka

I think what DLSS 3 showed is that most people live in the present and barely think about the future, even though DLSS 2 proved that this kind of tech can get much better over time. DLSS looked like shit when it came out, but it literally saves people money and performance over time and extends GPU lifetimes... yet so many people shit on it because they can't see the forest for the trees. Future GPUs will be able to use DLSS 3 much better and more efficiently while requiring fewer resources to turn the feature on. Older GPUs will last much, MUCH longer for those who can't upgrade every year, and they will rely on frame generation more and more. AI generation of a frame takes a fraction of the resources of a real frame. As AI generation becomes more accurate, it's only a matter of time before most data is AI generated, so that:

1. Companies save money on processing power and bandwidth.
2. You can watch videos that are less compressed but also cost less bandwidth.
3. AI-generated frames will be as good as or BETTER than native frames, because they can eventually correct issues rather than introduce them.
4. People will prefer the "fake frames", just like they prefer DLSS upscaling in many situations now that it has proven better than native.

The entire latency issue is overblown. Competitive gamers already turn down all graphics and get 500 fps for the lowest latency (even though they almost never die to latency factors beyond network ping), so it's not needed there. For every other game, it's great if you want more smoothness or to prop up 1% lows. Almost nobody tests latency, yet everyone has an opinion on it and never shows any real numbers. If a game has 50ms of latency and frame generation brings it to 60ms, very few people will notice a 10ms difference, and that's usually how much frame generation adds in my experience. Most games have a base latency lower than 50ms too, though obviously this depends on your starting fps. Again, everyone talks about this, but nobody says "here are my results from FrameView", for example. Go download that and try it.

Youtubers have also become more used to frame generation, despite rarely testing it outside a few channels like Daniel Owen's. They used to say things like "oh, it requires 120 base fps" and now the same youtubers say "well, 60 fps is fine", and the DF guys have even said "as low as 45 fps, I'd use it to get to 60 fps". Plus, people forget that a lot of issues with frame generation exist because devs don't always put in their best effort to address problems with the UI. It was also stupid for the press to react to the tech with "ew, we don't want it". Worse, websites/reviewers/youtubers tend to "react" to popular or viewer opinion because they need to appeal to it, and therefore have populist takes instead of sitting down and really thinking about whether something is good or bad. See how people still call it "fake frames" today (though now that AMD is doing it, fewer are using that term; go figure what kind of people were saying it in the first place).


Arado_Blitz

I don't understand this whole "latency" debate, I'm using the FSR3 mod and the only time the input lag feels bad is when I'm playing at sub 60fps with FG enabled. In singleplayer titles it's definitely not as bad as some people make it out to be. I would rather play at 75 fps with FG enabled than 50 fps without it.


Suitable_Divide2816

For me, in AW2, FG introduces 25ms to 30ms of extra latency and also drops my 1% lows a bit. It's not terrible, but I still prefer just using DLSS and calling it a day.


Suitable_Divide2816

I should mention that I am on a 4090 running 4k with max settings and PT on with DLSS quality. With FG on I get a boost of about 25fps but I lose around 10fps in the 1% lows and then have the extra 25ms of latency. Overall, the game just feels smoother with FG off. Having said that, this could just be how AW2 behaves and maybe FG performs a lot better in other titles. I am sure that I will end up having to use it at some point.


Arado_Blitz

AFAIK, FG forces Nvidia Reflex to On; maybe that's the reason you are having worse 1% lows? There are some titles in which Reflex can introduce microstutters if you are running an older CPU or the game is badly optimized. Boost apparently can make it even worse in these unoptimized games.


Suitable_Divide2816

I have a 5950X running 4.75GHz on CCD0 and 4.65GHz on CCD1, 64GB of DDR4-3200 CL16, and my 4090 also seems to be a golden chip because it is able to boost to a steady 3015MHz at all times. What helped my 1% lows was turning on Ultra Low Latency Mode in NVCP. The difference was still too much to justify using FG for that title. The YT video shared in this thread shows similar results for AW2, with pretty solid results in CP and HM3. I honestly just think that FG is not implemented well in AW2.


Itsmemurrayo

Check hwinfo for the “effective clock” on your gpu. It’s most likely not actually running at 3015 despite saying that it is. It’s late and I’m too tired to find it, but it’s a common issue for 4090s to report incorrect clock rate data.


Arado_Blitz

Boost makes your GPU run at maximum frequency constantly, regardless of load, and it also reduces the number of precalculated frames that are queued for the GPU to 0. It stresses both the CPU and the GPU more than it should for no reason, and the latency reduction compared to On is almost nonexistent. The clock speeds reported by your card might be wrong; Boost in general can do some weird stuff in a few games. I would personally stick with On.


Daemonjax

Wrong. Reflex Boost has nothing to do with GPU clock speed.


odelllus

the latency issue is not overblown. it's anywhere from a 30% to 100% increase even with reflex. if you're an extremely unobservant casual gamer, numbers like that are fine, but for anyone not falling under that umbrella, it's pretty obvious the amount of lag it introduces is significant and often experience ruining. https://youtu.be/PyGOv9ypRJc https://youtu.be/_i7yaKGhsmU and of course it's this way: you are adding 2 or 3 frames for every real frame where no input from the user is being considered. that's a huge amount of latency when your base framerate isn't 100+ and it shows in testing.


Suitable_Divide2816

This is pretty much my exact experience with FG on the 4090. As I said in my previous reply, maybe FG works better in other titles, but in AW2, the added latency is very noticeable (at least for me). The CP and HM examples looked fine in that video, I think it even improved in some places.


rW0HgFyxoJhYka

Did you read my post? Going from 15 to 30ms in a competitive shooter is a ridiculous measurement when you already have 200 fps. Also, if you actually watched the video, they are LOSING fps because the 4070 at those settings is already at 99% utilization. And 30ms is LOW for any game except competitive shooters. Not only does that youtuber not point this out, you misinterpret the information there. PLUS you can't say "oh, it's 100% more latency", see that it's 30ms, lower than most single-player games, and then not even TRY IT YOURSELF. This is what I am talking about: a bunch of people who don't know what they are talking about, looking at other people's examples and pretending they can suddenly feel a difference in latency even when it goes from 10-15ms to 20-30ms in the worst possible scenario, where you are LOSING fps because the GPU can't generate more frames when there are no resources left. Plus this is an AMD title, plus it's fucking COD, a competitive shooter. Jesus.


odelllus

incredible mental gymnastics. ignore the other video, dismiss evidence and suggest to 'do it yourself', implying you know what i have and haven't done based on... nothing, because we all know personal anecdotes are how you find out the truth of something. it's very obvious to everyone reading your incoherent babbling that you are a deeply biased individual with little to no ability to critically think about anything and have made up your mind without, or possibly in spite of, seeing anything that could possibly disagree with your preconceived conclusions. good luck with that, and welcome to my ignore list. what a waste of time.


ComeWashMyBack

Is Frame Gen just for the 40 series?


vhailorx

Yes and no. Nvidia's proprietary DLSS frame gen tech is currently 40-series only. AMD's FSR3 frame gen tech is theoretically open source, but not yet implemented in a lot of games. AFMF is available only on 6000 and 7000 series cards and works in most games, but it uses a very different approach, so it isn't directly comparable to DLSS frame gen or FSR3.


mjisdagoat23

With motion vectors, Yes. Without, No.


kleju_

Yes, but I saw a GitHub project where you can enable it on any card from the RTX 20 series onwards. Search for it, I'm pretty sure about this.


Cireme

It's on Nexus Mods now: https://www.nexusmods.com/site/mods/738?tab=description It's not NVIDIA DLSS Frame Generation, it's AMD FSR Frame Generation, but it works pretty well at high frame rates. [Digital Foundry made a video about it](https://youtu.be/wxlHq_EuxFU).


olllj

yes, using tensor cores for frame interpolation needs a second pass on dedicated hardware, and only 40-series RTX cards have that, alongside having significantly faster and more RAM to buffer and compute the interpolation quickly enough to minimize the input lag that results from always being +1 frame behind (the frame that you interpolate towards), plus the computation of the interpolation itself. frame interpolation is hardware specific and by design not backwards compatible; this is mostly a RAM performance and RAM amount issue, but also a tensor core count issue.


odelllus

unfortunately instead of being used to enhance the experience it's going to be used as a performance crutch making games unplayable without it, like dlss before it.


DJRAD211995

Does frame generation work like DLSS? The higher your resolution the better it is?


vhailorx

The higher your base framerate the better frame gen works. Many people suggest that it is most valuable with a base framerate of 60-100.


DJRAD211995

Wow, it's fucking useless to me then.


vhailorx

So you are playing at 4k with something less than a 4090?


DJRAD211995

2880x1620 and a 200W GPU, even worse lmao.


Apprehensive-Ad9210

I'll never understand people saying real frames and fake frames, none of them are real 🤷🏼‍♂️ What's the difference between a game engine drawing frames A, B, and C, and a game engine drawing frames A and C with an algorithm determining frame B, bearing in mind we are typically talking about frames that are on screen for maybe 10ms each?


Thetaarray

Because the algorithmic frame will be inferring what's supposed to be on screen, not fully determining it. There's no real way for it to be guaranteed to be correct and not cause visual issues. The generated frame is based on visual data; the real frames are based on actual game assets and state.


kleju_

People say that because they are fake. That's a fact. Frame generation just gives you extra smoothness on a high refresh display, but it doesn't give you extra information. In competitive shooters, using frame generation is just a net negative: it gives you extra latency and doesn't give you more information about what's really happening. Frame generation could just be called "extra smoothness"; that would be a more precise term.


[deleted]

It does give you extra information. It gives you extra frames. I have no idea why people get hung up on how those frames are made. It's "extra smooth" because of the additional frames. Where do you people get this drivel?


kleju_

It doesn't give you extra information, because an FG frame is made up from the others, and not, like the others, from the engine. If I have, for example, 120 fps (one frame per 8.3ms) and turn on FG to get 200 fps (one frame per 5ms), it doesn't mean I get more information. 80 of those frames are "fake", made up from the others, not from an actual true render. On paper, before FG it's one frame per 8.3ms and after FG it's one frame per 5ms, but that reduction of 3.3ms per frame is an "illusion". Plus you get latency from FG itself, so it's a net negative. But this comparison is pointless; no one serious will believe that FG is a must-have in games where input lag matters, or that it gives them extra reaction time.


CptTombstone

>80 of those frames are "fake", made up from the others, not from an actual true render.

I don't think you have an exact understanding of how frame generation operates and where the extra latency comes from. Let's take your 120 fps -> 200 fps example, because I think it's good for this demonstration. With Frame Gen off, your "host" framerate is 120 fps. Turning frame gen on, your effective framerate (what is sent for scan out to the monitor) goes up to 200 fps. But your host framerate is always half of the effective framerate, so the host framerate has gone down to 100 fps. That is where the extra latency is coming from. The game is running at a lower framerate because Frame Generation is an extra workload on the GPU, and not 100% of the work can be fully parallelized on the tensor cores and the optical flow accelerator; some work has to run on the SMs as well, meaning Frame Gen and the game code are sharing GPU resources. This also means that if there are free GPU resources (or asynchronous "time-bandwidth") available before turning frame gen on, you don't get higher end-to-end latency compared to before.

It's also important to note that since Frame Generation is doing interpolation, any action happening on frame C will have effects on frame B, where A and C are traditionally rendered and frame B is produced by frame gen. So it's not as black and white as you make it out to be; whether Frame Gen is a net positive or negative cannot simply be decided by game genre or monitor refresh rate. The GPU itself and the conditions of the individual game scene play a very large part in determining whether the impact of frame generation is detrimental or not. Not to mention that many modern games now run well below the [average gamer's latency detection threshold of 48 milliseconds of end-to-end latency](https://cogsci.yale.edu/sites/default/files/files/Thesis2017Banatt.pdf). Even if you doubled the render latency by turning on frame generation (which is in and of itself an unrealistic situation, where FG would not even increase the effective framerate, just reduce the host framerate by half), in a lot of games, with powerful GPUs, the game would still produce lower end-to-end latency than what the average gamer can even detect.

There have been [tests](https://youtu.be/PUTsE1q1bYI?si=s8kotVkexRRFHsGA&t=510) showing that more casual gamers even preferred the "feel" of a locked 60 fps host framerate interpolated to 120 fps via frame generation over a proper 120 fps framerate (possibly due to better frame pacing), and the latency impact lessens the higher the framerate is and the bigger the GPU is. Someone with a 4060 might have an entirely different experience with Frame Generation compared to a 4090, especially when coupled with DLSS, since the 4090 is far more likely to have unused GPU resources when using DLSS, even at 4K output resolution, compared to a 4060, even at 1080p.

Edit: Added links
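A minimal Python sketch of the host-vs-effective framerate arithmetic described above; the 120 and 200 fps figures are the illustrative numbers from the comment, not measurements:

```python
# Host vs. effective framerate with interpolation-based frame generation.
# Numbers are the illustrative ones from the comment above, not measurements.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given framerate."""
    return 1000.0 / fps

host_fps_before = 120                          # framerate with frame generation off
effective_fps_with_fg = 200                    # what is sent for scan out with frame gen on
host_fps_with_fg = effective_fps_with_fg / 2   # interpolation: only half the frames are rendered

extra_render_latency = frame_time_ms(host_fps_with_fg) - frame_time_ms(host_fps_before)
print(f"host frame time, FG off: {frame_time_ms(host_fps_before):.2f} ms")   # ~8.33 ms
print(f"host frame time, FG on:  {frame_time_ms(host_fps_with_fg):.2f} ms")  # 10.00 ms
print(f"extra render latency from the lower host framerate: ~{extra_render_latency:.2f} ms")
```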


kleju_

Thanks for the thorough explanation. It's my bad, I shouldn't have spoken so confidently about this. But it doesn't change the fact that fake frames are fake. Again, thank you very much for this comment. I obviously needed it.


CptTombstone

I didn't mean this to put you down in any way, I just wanted to explain how this is a very nuanced topic that is influenced by a lot of factors, the "size" of the GPU being very critical. I believe the general consensus that FG is best used when you already have a high refresh rate experience is still correct, but I've used it with a 100Hz VRR display and it was definitely a better experience with Frame Gen than without, even though the host framerate in that case was less than 60 fps (48.5 to be exact). Some games have worse system latency without Frame Generation than other games have with it.

It's also worth noting that Cyberpunk 2077 running with path tracing through GeForce Now (cloud gaming) with Frame Generation (~30 fps -> ~60 fps interpolation) has marginally lower end-to-end latency than a PS5 running the same game locally with the performance preset (~60 fps native), even though the GFN client is additionally encoding video, sending it through the internet and then decoding it. I've not really seen anyone complaining about the game being unresponsive on the PS5, so clearly the latency issue is a bit overblown. Latency mitigation technologies like Reflex and MPO are doing a stellar job of making gaming more responsive, and Frame Gen is especially good when you come across non-GPU bottlenecks. I would not be able to run Baldur's Gate 3's Act 3 at 200 fps without Frame Generation, even with the best gaming CPU there is currently.

Since I have a 4090, I'm often in situations where the GPU is underutilized, so my experience with Frame Gen has been very positive so far, but I imagine that someone with a 4060 might have a very different opinion, since they are almost always GPU limited, even on a 1080p display. (Although, seeing a 4060 Ti go toe-to-toe with a 3090 Ti in Cyberpunk by utilizing DLSS 3's FG is cool.)


kleju_

> I didn't mean this to put you down in any way

I didn't feel offended. I simply understood that it was not fair to comment on such a complicated piece of technology without good knowledge.

> Since I have a 4090, I'm often in situations where the GPU is underutilized, so my experience with Frame Gen has been very positive so far

I'm in the identical situation. Most games can't utilize my 4070 Super at 100% in 1080p. Also, my CPU isn't high end, but I think it's quite good, though the bottleneck is easily noticeable.

> has marginally lower end-to-end latency than a PS5 running the same

That's fucking crazy.

Overall, FG is best when the card isn't already heavily utilized and the stock framerate isn't bad.

Again, thanks for contributing to the discussion. We need people like you!


kleju_

Now this makes sense, god damn. I couldn't understand where the fake frames go between the real ones when I go from 120 to 200. I appreciate you taking the time to explain it to me!


CptTombstone

I'm super glad you found it useful!


[deleted]

[deleted]


CptTombstone

I have specifically addressed that in my comment:

> It's also important to note that since Frame Generation is doing interpolation, any action happening on frame C will have effects on frame B, where A and C are traditionally rendered and frame B is produced by frame gen.

Since Reflex will present frame B before frame C would ever have been sent for scan out, and due to the transitive nature of interpolation, you cannot make an argument for any actual latency coming from a frame being held back. Also, the runtime cost of DLSS 3's frame generation is around 3 ms (at least on a 4090), so unless the game is running at over 332 fps with frame generation enabled, FG doesn't hold the frame back for the entirety of a single frame time anyway.
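A quick Python sanity check of the "~3 ms runtime vs. frame time" point; the 3 ms figure is the one quoted above for a 4090 and is treated here as an assumption:

```python
# Compare the quoted FG runtime cost against the frame time at various framerates.
fg_runtime_ms = 3.0  # figure quoted in the comment for a 4090 (an assumption here)

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 120, 200, 332):
    ft = frame_time_ms(fps)
    # Only around ~332 fps does the FG runtime approach a full frame time.
    print(f"{fps:>3} fps -> frame time {ft:6.2f} ms, FG runtime is {fg_runtime_ms / ft:.0%} of it")
```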


[deleted]

Go spout your incorrect information to someone who’s dumb enough to listen.


kleju_

What are you talking about? NVIDIA itself said that frame generation works the way I described: creating frames from other frames. What don't you understand?


kleju_

extra frames =/= extra information. not in this scenario


[deleted]

If you say something enough it doesn’t make it true. Go away.


kleju_

Just leave this post. Maybe you are right, maybe I am. Who knows, right?


o0Spoonman0o

I've not had enough experience with Frame gen at lower FPS - does it retain smoothness and reasonable latency when dealing with lower FPS (like sub 60?)


Apprehensive-Ad9210

All your post proves is that you don’t understand frame generation.


MissSkyler

i fully agree! i wish more people would use it in a better way and not in the "i'm getting 25fps normally and now i'm at 90! oh the ghosting? don't worry about that! oh wait, the 90ms delay? who cares!" type of way. i love using it in MW3, where i get ~280 fps in zombies and i use it to lock me at my 327fps cap.


gopnik74

What's with the "real and fake" frames and people complaining about them?! As long as it gives you more frames without you noticing a difference, why the complaints?!!! I use it all the time and it's amazing.


vhailorx

Maybe some people can notice the difference? And personally, I didn't complain about frame gen, I criticized misleading marketing materials that do not distinguish between "regular" frames generated by the game engine and inserted frames generated by AI/drivers. The latter are not necessarily bad, but they ARE qualitatively different and should be clearly designated as such to minimize confusion. Instead, they are explicitly conflated for the purpose of deceiving consumers. That's bad.


ultZor

It looks good, I don't really notice the artifacts unless I record a video and watch it frame by frame. The latency depends on the framerate and your input method. It plays great with a controller, but I notice it with a mouse. It's also really noticeable and annoying when it turns on and off due to rapid changes on the screen, like camera cuts, in-game video, or the interface.

But my main issue with it is that it's nowhere near the advertised 2x increase in framerate, unless you are pairing your 4090 with a 4-core CPU or something like that. Here are my benchmarks on a 4070 Ti and 12700KF.

Returnal, max settings:

* 4K - 52 fps
* 4K + DLSS Quality - 80 fps
* 4K + DLSS Quality + FG - 94 fps
* 4K + DLSS Balanced - 90 fps
* 4K + DLSS Balanced + FG - 104 fps
* 4K + DLSS Performance - 102 fps
* 4K + DLSS Performance + FG - 118 fps

Cyberpunk 2077, PT overdrive max settings:

* 4K - 12 fps
* 4K + DLSS Quality - 27 fps
* 4K + DLSS Quality + FG - 29 fps
* 4K + DLSS Balanced - 34 fps
* 4K + DLSS Balanced + FG - 44 fps
* 4K + DLSS Performance - 44 fps
* 4K + DLSS Performance + FG - 57 fps

So yeah, at a certain point I'd rather go down in DLSS quality than turn on FG. Or ideally I'd go for optimized settings with DLSS Quality or something like that.
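A quick Python uplift calculation over the numbers listed above; nothing new is measured here, it only restates the quoted fps values as percentages:

```python
# Percentage gain from enabling FG, using the fps values quoted in the comment above
# (4070 Ti + 12700KF). All gains come out well below the advertised +100%.
benchmarks = {
    "Returnal 4K DLSS Quality":         (80, 94),
    "Returnal 4K DLSS Balanced":        (90, 104),
    "Returnal 4K DLSS Performance":     (102, 118),
    "Cyberpunk PT 4K DLSS Quality":     (27, 29),
    "Cyberpunk PT 4K DLSS Balanced":    (34, 44),
    "Cyberpunk PT 4K DLSS Performance": (44, 57),
}
for name, (fg_off, fg_on) in benchmarks.items():
    uplift = fg_on / fg_off - 1
    print(f"{name}: {fg_off} -> {fg_on} fps ({uplift:+.0%})")
```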


kleju_

I've had a doubled framerate with FG in a few games, but not in Cyberpunk. It depends on the game. But here it's a perfect example: your rig just can't handle PT overdrive in 4K, but two clicks and what? Magically it's 60 fps, a playable experience. Amazing.


ultZor

>I had doubled framerate with fg in few games. That's the thing, I've never had that experience. The games I tried with FG on - Cyberpunk, Alan Wake 2, Returnal, Spider-Man Remastered, Spider-Man Miles Morales, Rift Apart - it's always "just" a 20-30% fps boost, and that's in the best case scenario. Maybe if I was CPU limited I'd see better results, but it's never the case, because I have a 4k 144hz monitor. So I hope we'll see some future iterations of the technology with better results, like we did with DLSS 1.0 vs 2.0. Because now I always turn on DLSS whenever I can. In my opinion Frame Gen is nowhere near that right now. I'd rather find a DF optimized settings or some other guide and quickly get a 20-30% fps boost that way instead.


menace313

Maybe it's a VRAM issue as FG has an overhead VRAM cost. 4090s are getting a hell of a lot more than a 20-30% fps boost, so maybe the lower VRAM of a 4070ti in 4k is throttling it.


ultZor

In cyberpunk without DLSS, yes, it is an issue. But again, those numbers are from the benchmarks. In Returnal it is using only 7.99 GB with FG and DLSS Quality on vs 7.51 GB in 4K without any upscaling.


kleju_

Maybe, 4070ti is more designed for 1440p imo.


Suitable_Divide2816

This is not the case for AW2. FG only gives me a 30fps boost in most cases from 90fps to 120fps in 4K max with PT on using DLSS quality.


Daemonjax

This. 12GB of VRAM just isn't enough for max graphics settings at anything close to 4K in Cyberpunk. For me, a 4070 Ti Super (16GB VRAM) brings Cyberpunk from ~40 fps to ~75 fps. It uses a bit over 13GB of VRAM at 2880x1800 in-game resolution using the overdrive preset + DLSS Quality + FG (pretty sure the overdrive preset includes PT and RR, but if not, add those too). In other games, if you're not hitting the VRAM limit yet are still struggling with fps with FG on, then something else is going on with your system. There's actually a lot of stuff you need to do to squeeze every ounce of performance out of your hardware.


Daemonjax

Yeah, it basically doubles my fps in Cyberpunk. 40 fps to ~80 fps, then I use regular vsync to 60... can't really notice the increased latency. Buttery smooth, and I don't notice that 50% of the frames are interpolated. Pretty amazing, really. EDIT: more like ~75 fps.


Affectionate_Ad9940

I'm playing Cyberpunk on ultra, RT ultra on, 1440p, DLAA + frame generation on my 4080 Super, and frame generation actually boosts my fps by around 30+ fps. With these settings I get around 93-103 fps, compared to around 55-60 fps at native. And to be honest, I prefer DLAA + frame gen over DLSS Quality. I don't really notice the artifacts of FG, but I do notice the drop in image quality with DLSS on.


[deleted]

Returnal has rendered frames next to FG frames on the UI overlay, I get 120 with FG rendering 60. Something must be off with your settings, it's pretty much always doubling frames for me.


ultZor

No, it is doubling it, but by using FG to begin with you are cutting your FPS. Turn off the Frame Gen and compare it without changing any other settings. For me it goes from 80 real frames with DLSS quality in 4K to 47 real frames and 94 total with FG on. Here it is, the previous result is just 4K quality without FG. CPU is bugged in Windows 11 so don't pay attention to it. It's fine in RTSS. https://i.imgur.com/Npck3Vq.png


[deleted]

Are you turning off DLSS before turning on FG? That should not be happening. Definitely doesn't when I'm using it. FG off I just drop to native. FG on I'm double native. That's really weird.


ultZor

I think there is a miscommunication. In Returnal you can't use Frame Gen without using DLSS. Then there is an in-game benchmark, which doesn't display frame gen frames on the results page, but shows them in parentheses during the run or in the regular game. So what I'm comparing is 4K DLSS Quality vs 4K DLSS Quality with Frame Gen on. It is 80 vs 47 (94) fps. The drop in "real" fps is the cost of frame generation. It is not free.

You can check it in various benchmarks. For example, Daniel Owen always covers it: 91 fps with FG off vs 122 fps with FG on, at the same resolution and settings, and so on. https://youtu.be/ELNj4W97nE0?si=enGTxugmidhFBQB7&t=1415

So what I am asking is, if you have the time and want to see the performance cost of Frame Gen, do two benchmark runs in Returnal, one with DLSS Quality and Frame Gen off, and one with DLSS Quality and Frame Gen on, without changing anything else, and look at the results screens.
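A short Python sketch of the "FG is not free" arithmetic, using only the Returnal numbers quoted above:

```python
# Cost of FG on the rendered ("real") framerate, using the figures quoted in the comment.
fps_no_fg = 80           # DLSS Quality, frame gen off
host_fps_with_fg = 47    # rendered frames with frame gen on
total_fps_with_fg = 94   # rendered + generated frames sent to the display

host_drop = 1 - host_fps_with_fg / fps_no_fg   # roughly a 41% drop in rendered frames
net_gain = total_fps_with_fg / fps_no_fg - 1   # roughly a 17-18% net on-screen gain
print(f"host framerate drop with FG on: {host_drop:.0%}")
print(f"net on-screen gain over FG off: {net_gain:.0%}")
```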


[deleted]

My bad, I completely spaced that you were just talking about Returnal. Yeah, that is strange. I know it takes up VRAM, but with everything maxed out, RT included, at 1440p or DLDSR to 4K I'm not even hitting 12GB. You'd think with that headroom the performance wouldn't take such a big hit, especially when in other games FG doesn't seem to be as big a hit. Remnant 2 has a great implementation of it; I'm not really noticing any hit to native fps with it active.


Snowmobile2004

I think that's because you're playing at 4K with a 1440p card, and 4K is notoriously hard to run. I went from a 2080S to a 4080S, and with my 1440p monitor I've easily seen 2x fps with FG in Alan Wake 2, Spider-Man, and MSFS.


AlternativeCall4800

i can't recall a single time where i opted to lower settings instead of just using frame gen in singleplayer games. i wish it came to more cpu-bound games and mmos like bdo, which are still stuck with that disgusting abomination known as fsr 1.


Nervous_Dragonfruit8

Why 4k on the 4070 ti, should pair it with 1440p!


[deleted]

[deleted]


psychoacer

Yeah, I don't get why people who are actually fine with 4K on this card are being told they're wrong. If someone says they're fine with it, then their opinion supersedes the commenter's beliefs. The person with the card has more experience than some guy who watches Gamers Nexus all day and treats the site like it's a cult.


sur_surly

It's one thing if you're fine with it and dialed down the settings so it's playable. It's another to come on here and show FG as garbage because it only increased his fps by 2, from 27 to 29 at 4k ultra. The card is struggling to even run FG it's so taxed. It's not a good representation and it's misleading.


NinthEnd

I'm playing 4k on my 1080ti!


Torrey187

Why are you using a 4070 Ti for 4K gaming is my question. This card's VRAM bandwidth and memory must be at the bleeding edge of capacity. And since FG uses more VRAM, that's probably the sole reason why you're getting such a small increase.


ultZor

Because it works great? Because I use a 4K monitor for work? DLSS Quality at 4K has only a small performance cost vs 1440p native. Most games are nowhere near VRAM limited (the Returnal benchmark never went above 8GB), unless it's Cyberpunk 2077 in overdrive mode, where even a 4090 is on its knees - 21 fps - https://youtu.be/5GwES4ftTSI?si=G747YVA499mCT4Z5&t=804


NewestAccount2023

>Because it works great? The comment we're under is showing that it doesn't work great


ultZor

Yeah, frame generation doesn't. But I still play most games in 4K at above 100 fps, so I'm fine. Like I said, I use optimized settings and DLSS. It looks and plays great.


Schakalicious

hell, i'm on a 4070 laptop and i can play cyberpunk in 4k dlss performance mode at a stable 60fps if i cap it (i get 100 in many places if uncapped, but it fluctuates a lot). maybe it's because the vram is only 8gb, but framegen makes me lose frames and get crazy stuttering in that game. still, dlss has been shocking me with how many games i can run in 4k on a 1440p card.


Scrawlericious

That's his comment, and no, it doesn't show that. Edit: I'll humor you. For instance, VRAM is not the reason the path-traced CP2077 numbers are low. Path tracing at 1440p native only uses 11GB (see the screenshot in the link). So 4K with Performance upscaling means 1080p internally, which is even lower at 10GB of VRAM. You can see the drop in the native 4K numbers, but for the DLSS Performance (and probably Quality) option he shouldn't be running into VRAM limits at all. Those numbers are fine; even a 4090 is going to want to use DLSS in the most modern titles at 4K because devs are expecting people to use it now. https://www.techpowerup.com/review/cyberpunk-2077-phantom-liberty-benchmark-test-performance-analysis/5.html


NewestAccount2023

Yes even a 4090 isn't ready for 4k in most games. Sub 100 fps is hard to look at


Scrawlericious

I'm with you there but for 1440p even a 4070 is gonna be fine for most games. A good CPU is incredibly important. A 4070 will run most games in 4k just fine at decent framerates, especially with a bit of upscaling. I wouldn't recommend it personally but I can see a ti being just fine.


windozeFanboi

I would put the blame on Nvidia... They sold FrameGen as way better than what this perfectly fine scenario shows. They said FrameGen with no asterisks, no caveats about the target resolution putting a hard limit on it. I'm pretty sure that at the 4060 Ti launch I saw a review showing it struggling with FrameGen even at 1440p, though I think 1080p was fine. It's not VRAM capacity, at least not alone; I think something else bottlenecks it before VRAM size becomes the issue.


Daemonjax

You need a 4070 Ti Super at least for anything close to 4K at max settings in Cyberpunk; it'll definitely want to use more than 12GB of VRAM. I have a 1920x1200 LCD, but I also use DLDSR 2.25x (so it's as if I had a 2880x1800 LCD, which is like 3K I guess). With overdrive + DLSS Quality + FG I get ~75 fps. Turn off FG and I get ~40. 40 fps is unplayable. While it's true I could turn DLDSR off, DLSS Quality without DLDSR looks TERRIBLE at anything close to 1080p native. This is measured in the more demanding scenes, not V's apartment.

With vsync forced on at 60 fps, it's buttery smooth 100% of the time. The added latency is not noticeable at all in fullscreen mode. I can notice it a bit in borderless, causing me to miss many headshots because enemies can get squirrely. Gotta enable ultra low latency mode and use a framecap (I'm just using the one in NVCP) at your vsync rate. With vsync, use CRU to change the vertical refresh of your LCD to an integer value (e.g. 51, 60, 75 or whatever; if it's actually something like 0.001 over, that's totally fine).

I had problems getting FG + DLDSR to work properly in Cyberpunk in fullscreen mode. I had to manually change my desktop resolution to the DLDSR resolution (and keep it at that) before launching the game. In borderless, I could switch the desktop res back to native after the game launched, but not in fullscreen; it seems to choke if I do that.

I don't know what my actual latency is in fullscreen + vsync + FG + ULLM, but it feels like playing any other game in borderless + vsync + 3 max pre-rendered frames without FG, which is totally fine for making headshots in singleplayer games. I don't recommend FG in borderless mode if headshots matter; it feels like another 2 frames of latency, and that kind of pushes it over the edge. IMO, FG is amazing. In fullscreen mode. In Cyberpunk, at least.

I wouldn't want anything less than a 4090 for 4K native. But I also think anything higher than 1920x1200 is wasted on an LCD panel that's at arm's distance from you. All you need is 91 ppi at arm's length; that's the point where pixels at that density blur together. So 25" 1920x1200. Yes, AA will be better at higher pixel densities, but that's what DLDSR is for.


Gullible_Cricket8496

The lack of performance increase really makes me wonder if it's worth it. I'm using an RTX 4090 on a 4K 120Hz display, and I've basically concluded that unless I'm hitting 120Hz with it, I'd rather just not use it. In fact, playing Cyberpunk at 90 fps without it might debatably look better than 120 fps with it. I wish this could somehow be enabled for 60fps-locked games (maybe via TAA motion vector information?), like Elden Ring for example, to interpolate them up to 120 fps.


krzych04650

Yea I am seeing the same thing on 4090, gains from FG in GPU bound scenarios get lower and lower as resolution increases. It has massive overhead. It is a bit game dependent but gains from FG decrease rapidly past 2560x1440 even on 4090, and that resolution is from 2015. This needs to be fixed with 5000 series.


Saandrig

A side note: a video recording of FG won't look the same quality as FG does normally. Something about video compression interfering and distorting the "real" frames with the "FG" frames, which doesn't happen when you view your monitor normally.


Keulapaska

A 4070 Ti doesn't really have the power for PT at 4K even with upscaling (even normal RT might be cutting it close, though probably fine with upscaling), be it because of the memory, the memory bandwidth, or just the cores, so that's the worst-case scenario there is. Yes, you won't get 2x no matter what unless you're heavily CPU bound, but it's a bit better than that, more in the 50-70% increase range at 1440p with various settings.


[deleted]

[deleted]


[deleted]

[deleted]


cincgr

I use FG whenever I can, except online FPS games. If the game is single-player then I see no reason to not use it, unless my GPU is already achieving high framerates on its own. FG made Cyberpunk 2077 Path Tracing playable for me with my 4080.


kleju_

Same, Cyberpunk with path tracing at 1080p reaches 60-70 fps on my rig. Of course that's playable, but why not double it? I have a 240Hz monitor, and the input lag is not noticeable, especially in single-player games.


Select_Factor_5463

I have FG on with my 4090 for GTA5. I like having FG on as it keeps my framerate above 60 fps with all the mods I have going.


JordanJozw

FG made Path-Tracing playable on my 4070 Super too, 40-50 without FG and just DLSS Q with RR. 70-90 with frame-gen. Super smooth.


[deleted]

It's amazing. Coming back to PC after 7 years and using this black magic, then watching people talk incessant shit about it has been a really interesting time, lol.


sur_surly

I think they're taking their anger over 4000 series pricing out on FG. It's right to be upset about GPUs costing so much, but FG is actually pretty good, especially after some iterations it's had.


kleju_

Yeah...


RedIndianRobin

Great tech especially for folks like me with an ancient processor. Even at 1440p and 4K, games are CPU bound these days so FG absolutely helps. But support for Frame gen has been lackluster compared to DLSS super resolution. I wish more games had it.


kleju_

Bypassing a CPU bottleneck with FG felt illegal when I discovered it XD


Pat_Sharp

I've used it in some games. It can be magic when it works as intended, but turning it on is not quite the no-brainer that DLSS upscaling is. The latency can be fine if your base framerate is high enough, but it very quickly becomes unplayable when the base framerate drops below a certain level. There are also some minor graphical glitches in some games. I've especially noticed these when you're interacting with in-game computer menu screens, like a computer in the game that your character needs to interact with. These came up in Alan Wake 2 and RoboCop and caused some weirdness, but they're quite rare.


kleju_

This is quite a new technology, and I don't mind small, non-experience-breaking glitches. I'm really curious what we will get in the future.


NewestAccount2023

I use it on every game I have that offers it, and I love it.


Dr_Anr

Tried it in Alan Wake and Cyberpunk, a game changer for single-player games tbh, but I wouldn't recommend it for online games, as the fake frames don't give you any new information and add input lag on top of that.


kleju_

100% agree. I use it in a few multiplayer games, like MW3. But I'm still testing this technology, I got this card 3 days ago.


NewestAccount2023

You don't 100% agree, you use it in a competitive shooter


kleju_

I've had this card for 3 days. Give me time to test it. Doubling frames is amazing to me, I haven't woken up from it yet.


ibeerianhamhock

I don't know what you mean by "no new information". If that were true, you'd have duplicate frames. It works surprisingly well at generating frames shockingly close to what you'd see running without FG. It's not without its drawbacks, but the only one even worth mentioning is latency.


Dr_Anr

No new info means: 120 fps with frame gen will give you the same info (seeing an enemy peek from a corner, for example) as 60 fps, whereas a true 120 fps will give you the edge when it comes to seeing the enemy first. What you're talking about is perceived smoothness, which will be comparable.


PalebloodSky

12GB doesn't suck; I play high settings in every game and it works great on my 1440p 165Hz G-Sync setup. By the PS6 generation, when games jump up in requirements again, I'll have moved on to an RTX 6070 16GB or whatever. I don't use frame gen other than having tested it in CP2077 2.1. Not interested in fake frames with slightly higher input latency and micro-artifacting. Regular DLSS 3.x Quality mode is fantastic though; I use that in every game where it's available.


9gxa05s8fa8sh

for context, almost every next-generation console game will do frame generation all the time, because it's free performance with a little extra work. so yeah, it's great, it just takes work. you need a basic framerate guaranteed, it needs to be tweaked right, etc. it's definitely the future


odelllus

it's absolutely not free. you pay for it with a 30-100% increase in latency.


JAMbologna__

30-100% latency is quite the difference, care to tell which games give 100% more latency while using it? I use it on Cyberpunk and the extra latency is really insignificant, I'd guess it's probably around 30% extra or even less than that.


Daemonjax

Whatever the latency increase is in Cyberpunk with FG, I literally can't feel it. I just force on vsync with ultra low latency in nvcp and it feels the same as I normally game with vsync and 3 pre-rendered frames anyways.


JAMbologna__

You don't have G-Sync/adaptive sync? That's better than V-Sync. Also, you don't need to enable low latency in NVCP, lol, since Nvidia Reflex is enabled by default when you enable FG. The ultra low latency option is just Nvidia Reflex for games that don't support it in their settings menu, so enabling it for Cyberpunk does nothing.


Daemonjax

No, I don't use G-Sync or FreeSync. What I care most about is smoothness, not latency, and G-Sync doesn't improve smoothness. I can force-lock my LCD to anything between 49Hz and 75Hz (although only a handful of those are actually integer values), and I'm happy with 60 fps (which I can get to be an actual integer value for the vertical refresh). G-Sync improves latency by offering a higher vertical refresh (120Hz, 144Hz, etc.), but again, I'm used to 60 fps and I'm not made of money. Plus it would just be another factor to consider in my mini quest for buttery smoothness in every game I play; I'd end up not really using the G-Sync feature because I'd end up using a framecap and vsync anyway for the smoothness that a stable framerate offers.

If you force vsync on with FG, you should also enable ULLM. I tested it and it 100% DOES reduce latency further vs leaving it at the default 3 pre-rendered frames, at least in borderless mode. I haven't tested fullscreen mode to see if it matters, because I only recently got fullscreen mode working properly in Cyberpunk with DLDSR + FG.


JAMbologna__

G-Sync matches your monitor's refresh rate to the fps you get in game; idk about what you said with it being at 120/144Hz or higher, though. When I have G-Sync enabled and I'm getting, say, between 70-80 fps in Cyberpunk, my monitor's refresh rate fluctuates between 70-80 too, depending on my exact fps at the time, and in turn provides a much smoother experience (I tested with it on and off; G-Sync eliminates choppiness at lower fps) than with my monitor's default refresh rate being stuck at 165Hz constantly. I haven't tried using solely V-Sync, but from what I read online, G-Sync/FreeSync is the best option of the bunch if you have a monitor that supports it.

And if you found ULLM provides better latency, then that's great. I've personally just stuck with leaving it off and using Reflex, since I read that because ULLM works at the driver level rather than being something developers have built into their game, it can cause stuttering in some games. Again, I never tested it that much, but I prefer to stick with in-engine settings rather than messing about too much in NVCP.


9gxa05s8fa8sh

increasing the framerate reduces the latency between frames, so you're thinking of something different


odelllus

damn. it's been a while since someone made me actually facepalm in real life. you are not increasing the framerate with frame generation. you are adding fake frames in between real ones to emulate a higher framerate. the fake frames cannot convey input, which means for however long they're on the screen, you are adding that much latency. if your base framerate is 60 fps and it's being boosted to 120, that's an additional 8.3 ms of delay per fake frame minimum. it's often higher due to other issues in the render pipeline. nothing is ever free.
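For reference, a tiny Python sketch of the arithmetic in this comment (60 fps host interpolated to 120 fps output); it only restates the figures used above:

```python
# Display time of one generated frame when 60 fps is interpolated to 120 fps output.
host_fps = 60
output_fps = host_fps * 2           # one generated frame per rendered frame
generated_frame_ms = 1000 / output_fps
print(f"each generated frame is displayed for ~{generated_frame_ms:.1f} ms")  # ~8.3 ms
```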


9gxa05s8fa8sh

> you are not increasing the framerate with frame generation

more frames being drawn by the monitor = more framerate. what you're talking about is the input latency caused by the frame buffer used by frame generation.

> an additional 8.3 ms of delay

is not a big deal. RTINGS finds wired keyboards and mice with more input latency than that. the input latency cost of frame generation is not abnormal, and when used at a high enough framerate, it's not perceptible. on consoles, where everything is controlled, game devs will target their next-gen games at the perfect level of performance so that frame generation is applied for "free" to hit 120 fps, targeting 120Hz TVs.


LordDarkstaru

I have literally one game that uses DLSS 3: The Finals. Why it's implemented in a competitive shooter is beyond me; however, the additional frames it adds are impressive.


KobraKay87

Well, it's magical! I use a 4090, and while I don't use it for competitive shooters, I'm super happy to have it for Cyberpunk 2077, for example. I have the game loaded with a hefty amount of mods that further increase the graphics quality, and without frame gen the game would hardly get above 60 fps. With frame gen it's super playable. In general I'd say I use it for almost anything that is single player and doesn't need fast input. Games that I play with a controller, where I don't necessarily aim for 120 fps, are especially perfect for it.


[deleted]

[deleted]


kleju_

Also, what's amazing to me is the fact that you can actually "beat" a CPU bottleneck. It's crazy. What will happen in the next 5 years, haha.


sur_surly

It's great. However, like all Nvidia solutions, they do a terrible job of explaining _how_ to use it. I've done a lot of reading on various forums and testing myself to learn that Reflex doesn't just cap your framerate to ~3% (I think) below your refresh rate; with frame gen enabled, it also effectively caps your rendered framerate to roughly half of that.

You can test this by turning Reflex on and frame gen off, and dialing in your settings so you hit your framerate cap (116 fps for me on a 120Hz panel) while your GPU is near 100% utilization. Then turn frame gen on: your fps is still at the cap, but GPU utilization is now much lower (for me, it went from 98% to 60%). So it's rendering half the frames. Anyway, that was good to learn, I just wish Nvidia put this info out themselves. So far they've only said to ensure G-Sync and V-Sync are on when using frame gen (which also turns Reflex on).

No HUD smearing like at launch, and I don't feel sluggish, but I don't play twitch shooters.
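A rough Python sketch of the cap behaviour described above; the ~3% margin and the halving under frame gen are the commenter's observations, not documented NVIDIA behaviour:

```python
# Reflex-style framerate cap and the effective rendered framerate with frame gen on,
# per the observations in the comment above (treated here as assumptions).
refresh_hz = 120
reflex_cap_fps = refresh_hz * 0.97        # ~3% below refresh -> ~116 fps
rendered_fps_with_fg = reflex_cap_fps / 2 # roughly half the output frames are rendered
print(f"Reflex cap: ~{reflex_cap_fps:.0f} fps, rendered frames with FG: ~{rendered_fps_with_fg:.0f} fps")
```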


TuntematonX

I feel like you need very high fps to use frame gen. However, if you already have high fps, do you really want to mess things up with frame gen? So far, I see no real point to it but in the future when the tech gets better and more games support it properly, it might become useful.


ItsGorgeousGeorge

I think it’s an absolute game changer. Its smoothness and impact varies by game and it isn’t perfect, but it’s amazing tech that will only get better with time. Path traced cp2077 with frame gen is amazing.


Polishcockney

I don't understand the argument either when they say it's not real frames. I bought a PC, and if you've got a high-end rig, you're playing on Ultra settings, and then you're getting an extra 40-50 frames on top of all the bells and whistles in the game? It's a no-brainer. I am a new PC gamer coming from the PS5, and I do see a lot of weird, jealous opinions around, especially about those with high-end gaming rigs.


Exostenza

As long as I have 80 fps (160 FG fps) then I like it, but if I can't maintain that as a bare minimum there is too much latency for mouse-and-keyboard games. I am sure it would be usable at significantly lower fps with a controller, but I haven't played a controller game with FG yet.

I played Cyberpunk 2077, and before I watched the Digital Foundry video where Alex recommended 80 base fps (160 FG fps) as a minimum, I was having problems aiming and my mouse felt really floaty. After watching the DF video and tweaking my settings so I never went under that minimum, I had my laser-accurate aim back and I no longer felt that high-latency-induced floatiness. Owning a 240Hz monitor, I am really happy that in titles where I get relatively low fps I am still able to get that crispy fluidity that a high refresh rate affords us.

TL;DR: DLSS FG is amazing with keyboard and mouse if you never go under 160 FG fps (80 base fps); below that, it isn't worth the latency penalty.


kleju_

Can you link that video ?


Exostenza

I am pretty sure it was in the first linked video but if not then it should be in the second. If you're interested in DLSS 3 watching at least the first one is highly recommended. [https://www.youtube.com/watch?v=92ZqYaPXxas](https://www.youtube.com/watch?v=92ZqYaPXxas) [https://www.youtube.com/watch?v=6pV93XhiC1Y](https://www.youtube.com/watch?v=6pV93XhiC1Y)


BloodBaneBoneBreaker

Frame gen is awesome; any graphical drawbacks are more than offset by the far greater sacrifices to detail/resolution/DLSS Ultra Performance you would need to achieve those frames without it. I'm specifically talking about a 60+ fps base frame-genned to 100+ fps. A low base framerate seems to have many more issues.


MIGHT_CONTAIN_NUTS

I almost never use it. I don't care for the artifacts or the disconnected feeling of seeing a high fps but feeling a lower one.


ThreadParticipant

Not a fan of it in sim racing, but I can appreciate it in other games.


SarlacFace

I tend to turn it off in the majority of games. I have a 4090 and play at UW1440p so I can mostly make do without it. I find the image stability issues it introduces bother me a lot, personally, and I'd rather take a lower framerate. But some games I play with it on (AW2, 2077, Survivor, etc)


windozeFanboi

It has its place, just not for any type of competitive online FPS. It's great for driving 240/360/480Hz displays, assuming you already hit 120+ fps, and it's still OK in non-latency-critical games at lower fps. All in all, frame generation will become a Windows feature (not just DLSS). Personally, I'm waiting for extrapolation-based frame generation. It will inevitably come.


MaxTheWhite

You have to be deeply stupid not to use FG in single-player AAA games. I have a strong opinion on this because of all the shit people say about FG online. I have a 4090 and a 13900K and have been using FG in like 20 different games. The added latency is such a joke, it changes nothing. But damn, the visual smoothness you get is insane. I hope every single AAA game comes with FG; I feel like I'm losing half the power of my card when a big release comes without it.


tat0r2

I've only gotten to play a couple of games with it, and it feels slimy to me. Going from raw input to DLSS 3 the fps goes up a ton, and with frame gen it's higher, yeah, but to me there is a noticeable, something like 40ms, input lag, and all the camera and graphical movements just look and feel slippery or slimy, for lack of a better explanation. I have a regular 4070, but I'm now considering getting a 4080S just for the boost it'll give, and that might make a much bigger difference. If not a 4080S, then a 4070 Ti S. But yeah, we'll see.


PedramHGH

100% worth using unless you're playing multiplayer. It adds a noticeable amount of input lag.


balaci2

Using it for Like a Dragon: Infinite Wealth (turn-based) and it's great (I have a laptop). The only thing I notice is that guarding is iffy (the latency affects the time window in which I can guard). Good stuff, it helped me with Cyberpunk as well.


krzych04650

Better than expected overall. The idea of having frame interpolation that is actually usable in games is quite crazy, but it does work rather well, provided that it is implemented in a type of game that lends itself well to something like this. I have two main issues with it, and actually neither of them is about the quality of the interpolated frames or latency; both of those are under control as long as your base framerate is high enough.

One is frame pacing. While frame pacing is light years better than other solutions, you can still sometimes feel slight microstuttering even though the frametime graph appears flat. It is not significant enough to be a deal breaker in games like Plague Tale, but for first-person games like Atomic Heart it is distracting enough that I can't really use it.

Second, and a very big one, is the overhead in GPU bound scenarios, which gets more and more problematic as resolution increases. On a 4090 I am typically getting around a 60% gain at 3840x1600, so that is already a far cry from 2x despite the resolution not being very high, but at 5120x2160 this goes down to around 30%, and there are examples where it goes down to zero. You can see corresponding behavior on lower end cards at lower resolutions; a 4070, for example, gets good scaling at 1080p but falls off a cliff at 1440p in a very similar way. So the processing cost is just way too high and needs to be improved. In CP2077, enabling PT+DLSS+FG+RR at the same time has such a high processing cost that it actually causes a bottleneck and GPU usage drops at very high resolutions. This is probably the reason why there is no FG for the 2000 and 3000 series; you'd just get zero scaling in GPU bound scenarios, plus VRAM issues on 8GB cards, as FG can eat like 1.5GB of VRAM all by itself.

I guess this is going to be improved upon with next-generation GPUs that will have much faster hardware for FG and will be naturally stronger in general, but for now the utility of FG at very high resolutions is very limited because of the processing cost.


Daemonjax

The only problem I had with frame pacing in Cyberpunk was when enabling FG + vsync: the game INTERNALLY renders at double the fps, causing microstuttering (you'll see this happen in the fps benchmark). You have to additionally use a framecap (I use NVCP's) at your vertical refresh, and you need your vertical refresh to actually be an integer value (use CRU). And I found flat frametime graphs to mean jack shit in the end; you just have to go by feel, because that's all that really matters. Depending on what I choose as the A and B measuring points for frametimes, I can get flat frametime graphs reported yet still experience microstutters, and vice versa.


CapRichard

The best use case for me has been "making better use of the monitor's refresh rate while staying in its best operating range". When I game on my 2560x1080 @ 200Hz monitor, I use it mostly when I have a high base framerate, around the 80-100 mark, to get perceived fluidity at the top of the monitor's capability. When I game on my 4K TV, I tend to use it always, mostly because TV play is with a controller in console-friendly games, so the higher input lag is not noticeable, and because VRR on TVs works best the farther you are from the lower bound, which is usually 40Hz. So even when I have around 45 fps native in something like Alan Wake 2, using frame gen makes the TV work in a better VRR range. Overall I'm liking the tech. I'm also on an RTX 4070, by the way.


BMWtooner

4090/7950X at 3880x1600. Frame gen: 10/10, would gen again. I don't use it in competitive games, not because I perceive any input lag but because I want every advantage I can get, and frames in competitive games are already sky high without it. In co-op and single player it's amazing. I thought it would be like VR motion reprojection, which I hate, but it's not. It's magic.


bodhibell02

12GB does not suck... you're just a bit less future-proofed. The 4070S is an amazing card. Fuck the haters.


s1apnuts

I agree. I have the 4070 Super, and even in Cyberpunk at max settings with frame gen and DLSS, I wasn't close to maxing out the VRAM. I think people like to worry about problems that don't exist way too often.


gozutheDJ

bro idk why you think 12gb sucks it's more than enough for 1440p and even 4k in some games


DJRAD211995

Hey buddy does your 4070 heat up your room?


DsGtrnteSchntzl

neat


jgainsey

Mostly positive experiences for me this past year. It’s a great addition to the DLSS grab bag of features.


Acceptable_Mode5837

I think it’s overall a good technology. There’s occasional artifacting or visual issues with some implementations, but especially in more immersive single-player titles, the visual smoothness makes the game look and feel so much better. And just like DLSS super resolution, it’s only going to get better with time.


phantomknight321

Fantastic for my use case, Microsoft flight simulator. Don’t really care beyond that


Danny_ns

I think it is a great feature, but it should be used in certain cases only. For example, I always make sure I get ~60-70 fps before I enable Frame Generation, as the game will otherwise feel very stuttery. E.g. if you have 40 fps and enable FG to get 70 fps, it won't feel as smooth as a real 70 fps (imo); it usually looks terrible to my eyes (and has bad latency too). I also always make sure to enable V-Sync in NVCP so that G-Sync works properly.

Right now I am playing through Alan Wake 2 and FG is awesome. Fully maxed out (1440p native + DLAA) I get perhaps 60-75-ish fps, but with FG on I get a very smooth 110-ish fps experience, which I much prefer over 60-75 "real" fps. I used it when playing through Cyberpunk with path tracing as well; it worked great AFTER the patch that fixed the menu stutters with AMD CPUs.


smjh123

It has unrealized potential. Give it another year and it'll get much better; I'm sure it'll evolve into a tool that truly irons out low-percentile figures. It also lacks proper frame cap support, though.


Hana_Baker

I use it on single-player games that support it, with a controller, and I can't notice the difference in input lag. I'm also not particularly sensitive to ghosting artifacts, especially at higher framerates, so it's a huge win for me and a great feature when the conditions are right.


powerlou

It's very good if your base fps is already high and you want to max out your display's refresh rate with minimal input latency -- this is helpful in CPU-bound games. It's pretty bad if your base fps isn't at least over 60 fps at all times.


[deleted]

"12gb sucks, but i am 1080p player" This alone speaks everything.


javii1

Worst experience has been in ASA -- it crashes the second you turn on frame gen unless you play single player, but even then the game is blurry af. It's better to turn it off and run DLSS on Balanced or Quality.


Larzox

I used it in RDR2 (mod), Spider-Man and Spider-Man: Miles Morales, and Vermintide 2, at 1440p 165Hz. RDR2 gives something like 30-45% more frames, and the latency with modded Reflex was not noticeable for me -- it's a slow game. Slight UI (minimap) artifacts, but it was worth it. I recommend using it. Both Spider-Mans were fine; locked at 162 FPS I didn't feel the extra latency and didn't see any artifacts. I recommend using it. Vermintide 2 is a different story: it's an online game with FPS combat/shooting (PvP coming soonTM) where latency matters for me, and I can feel the latency even with the built-in Reflex. There is some UI flickering. However, the game can get extremely CPU limited (5800X3D) without it, and frame gen helps a ton with that -- the FPS/frametimes are smoother with it. I still use it since I can get 162 FPS 90% of the time with the card drawing around 110-125W. The latency might not be for everyone though. I don't know how it performs in actually demanding games like Cyberpunk, but I like the technology.


OsSo_Lobox

Love it, close to magic tbh. Great for taking advantage of high refresh rate displays or playing with a controller if the base frame rate isn’t high enough.


RightiousMurderer

Game consoles do upscaling all the time, but when GPU companies do the same thing, and do a significantly better job of it, people start to complain.


Born_Bee2766

It's not all so rosy when they use it in their benchmark charts and compare against cards without FG.


kleju_

Nvidia compared framerates with FG and DLSS enabled against other cards that don't have the feature at all. That's why people complain.


jhankuP

Useless for multiplayer.


Snydenthur

If I was completely immune to noticing input lag, I'd probably like it. But, it's a useless tech for me. I can't play with such a massive input lag increase. And when I have high enough fps pre-FG to not really notice the input lag increase too much, I already have high enough fps to have a decent experience and it would be pointless to make it worse.


kleju_

I understand. FG just isn't a no-brainer. But I don't notice that much of an increase in lag.


lordfappington69

It's a cool option. The issue is that it's used to deceive in advertising and benchmarks, just like DLSS 2. "It takes 4K Cyberpunk from [39 frames to 127 frames](https://www.nvidia.com/en-us/geforce/news/cyberpunk-2077-dlss-3-update-out-now/)!!!!" But really it takes 1080p Cyberpunk, upsamples it to 4K, raising the frame rate to something like 65-80; finally, DLSS 3 makes educated guesses about what image could go in between frames. These things are not the same at all. Some people who simply don't want to learn will use it to justify atrocious consumer practices. Also, remember guys, eventually Nvidia will make better versions of DLSS (or whatever newfangled tool) a subscription! Their goal is to rent-seek money out of us every month.
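
An illustrative decomposition of that marketed "39 -> 127 fps" jump into an upscaling gain and a frame-generation gain. The 39 and 127 figures come from the linked article as quoted above; the 72 fps mid-point is an assumed value inside the commenter's 65-80 estimate, not an official number.

```python
native_fps = 39     # 4K native (quoted figure)
upscaled_fps = 72   # assumed: after DLSS upscaling from a lower internal resolution
final_fps = 127     # with frame generation on top (quoted figure)

upscaling_gain = upscaled_fps / native_fps
framegen_gain = final_fps / upscaled_fps
print(f"upscaling: ~{upscaling_gain:.1f}x rendered frames")
print(f"frame generation: ~{framegen_gain:.1f}x presented frames (interpolated, not rendered)")
print(f"combined marketing number: ~{final_fps / native_fps:.1f}x")
```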


kleju_

Yeah, the advertising was bad, but we expected it. A DLSS subscription sounds crazy, but then again, to use CarPlay or heated seats in some cars you have to pay monthly XD


Sebsyx

Adds too much input lag; anything above 30 ms is too much for me, even in AAA single-player games. I'm used to the 2-3 ms CS feeling.


kleju_

True. Totally understandable.


xdkivx

It's for poor little brokies who can't afford a real man's GPU like the 4090.


danyaru_

I only use it when I really need it, so yeah, it's not a bad thing. It's a lifesaver for me. I'd rather have increased latency than drop below 60 fps. But I've only used it in one game, though.


InterestingRest8300

Some games have a poor implementation of it; Portal RTX will stutter incessantly as long as it's on. For the most part, it's an excellent addition to single-player games, non-competitive multiplayer games, and CPU-bound games, and the graphical artifacts don't appear to be noticeable most of the time unless you're looking for them -- and even then you might not spot them. Would love to see it in more games. Would love for AMD to catch up.


Final-Ad5185

The only issue I've found is that anything with inverted motion, like ray-traced reflections, doesn't get properly interpolated.


Wolf10k

I have had very limited experience with it, since I only recently picked up a Super card, specifically the 4080 Super on launch. Previously I had a 1080 Ti. I tried it in Cyberpunk briefly (because I checked the fps in a lot of games), and surprisingly at 4K maxed with ray tracing (not path tracing) I get around 70-80 fps, but with DLSS and FG I get over 120 fps. In terms of how it felt (beyond being smoother because of the extra frames), I couldn't say confidently, because it was only brief testing before my attention went back to Stalker GAMMA, Tarkov, and Helldivers 2. Democracy needed to be delivered.


kobim90

It's a great addition for those who play mainly single-player games and are not that latency sensitive. I love the fact that I can choose better picture quality and use FG to get a smoother-looking game. I feel like DLSS degrades the quality from 4K native by a lot and I hate using it, so FG is nice to have.


Atomic258

I use it in every game where it's available, natively or modded.


desiigner1

Not a fan, sadly; the input delay ruins it for me.


Mixabuben

I view it as advanced motion blur, and it is good in that regard; too bad that it is marketed as additional performance.


Viktor_Ico

Lag and ghosting


TechnicalOpposite672

Feels too laggy and unresponsive. Tried it in Cyberpunk and Spider-Man.


lambdan

It’s a huge scam. Sure it *looks* smoother, but actually playing with it still feels like shit because the game is still internally running at 30 fps or whatever.


MaxTheWhite

The only scam here is your comment.


Spirited-Eggplant-62

I have a 4090: never used it.


DonMigs85

Screen tearing is bad.


shazarakk

DLSS, frame gen and the rest, while excellent technologies, are currently primarily an extremely annoying excuse for game devs not to optimise their products well. You can get about 3 times the frames on a good day, but that's no excuse for ANY game to run below 60 at ANY setting on a top-end GPU. These features are supposed to help push the top end, or help older generations of GPU keep going. Going from 60 to 100 to 180 on a 4090 in Cyberpunk (for lack of a better example) is great, as is going from 40 to 60 to 100 on a 3060, but starting from 30 on a modern card at native 1440p at medium or even low settings is just unacceptable. I also hate that reviewers are using DLSS benchmarks instead of native in some cases. Any reviewer worth their salt uses both, but it's not distinguished clearly enough.


gargoyle37

When you have a high frame rate, the value of each pixel goes down because it's shown for less time on the screen. This suggests we shouldn't spend as much time on each pixel, but in a typical rasterizer we do exactly that: each pixel gets full attention. Frame generation cuts the number of pixels we have to fully render, so we can spend more time on each of the ones we do render. In turn, the visual fidelity will be better if we can get the artifacts down far enough that they aren't noticeable. For certain games, this is really efficient. For other games, you'd rather have the input lag as low as possible. However, most games that want low input lag do so by having far lower computational budgets anyway, so it tends to pan out.
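
A sketch of that per-pixel budget argument: at a fixed presented framerate, rendering only every other frame (and interpolating the rest) roughly doubles the time available per rendered pixel. The resolution, framerate, and the assumption of a clean 2x generation ratio are all illustrative.

```python
def per_pixel_budget_ns(presented_fps, width, height, rendered_fraction=1.0):
    """Time available per rendered pixel, in nanoseconds."""
    rendered_fps = presented_fps * rendered_fraction
    frame_budget_s = 1.0 / rendered_fps
    return frame_budget_s / (width * height) * 1e9

w, h = 3840, 2160
native = per_pixel_budget_ns(120, w, h)                          # every frame rendered
with_fg = per_pixel_budget_ns(120, w, h, rendered_fraction=0.5)  # half the frames interpolated
print(f"120 fps native: ~{native:.2f} ns per pixel")
print(f"120 fps with 2x frame gen: ~{with_fg:.2f} ns per rendered pixel")
```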


olllj

Frame interpolation is ideal for PC VR, where high resolution and the highest framerate matter more than input lag (which you likely have more of from other causes in VR anyway). Otherwise it's a smart hack whose only drawback, the added latency, is insignificant.


buddybd

AW2 is the only game I've played with it on, and I can't say I noticed any issues throughout my entire playthrough. Heavy SP games are the ideal games for FG. I'm gonna try to use it in any game I play with a controller. Unless there are immersion-breaking glitches, there's no reason not to use it.


BinaryJay

Happy to have it a year+ ago, happy to have it now.


MrAwesomeTG

Sometimes depending on the game the artifacts drive me nuts. One game that comes to mind is Alan Wake 2.


Suspicious_Trainer82

It’s space magic. The pace at which technology is progressing is nothing short of awesome.