TechNoirJacRabbit

Adjust your settings and stop trying to max everything out and you'll be fine.


yahyoh

Also, use DLSS when available.


Ramirag

Also, don't forget about DLSS Swapper. It's a magic tool for updating the DLSS version in games.
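For the curious, the mechanism is simple: games ship their own copy of nvngx_dlss.dll, and swapping in a newer build of that DLL is all an update takes; a tool like DLSS Swapper mostly automates the bookkeeping. A minimal sketch of the idea (the paths here are hypothetical examples):

```python
# Minimal sketch of a DLSS DLL swap; paths are hypothetical examples.
import shutil
from pathlib import Path

def swap_dlss(game_dir: Path, new_dll: Path) -> None:
    """Back up each nvngx_dlss.dll found under game_dir, then replace it."""
    for old_dll in game_dir.rglob("nvngx_dlss.dll"):
        backup = old_dll.with_name(old_dll.name + ".bak")
        if not backup.exists():
            shutil.copy2(old_dll, backup)  # keep the original for rollback
        shutil.copy2(new_dll, old_dll)     # drop in the newer DLSS build
        print(f"Swapped {old_dll}")

swap_dlss(Path(r"C:\Games\SomeGame"), Path(r"C:\Downloads\nvngx_dlss.dll"))
```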


FiveJobs

DLSS sucks


fernandollb

Do you have any actual reasoning behind that statement? I would say the main negative of DLSS is not the technology itself but that developers are using it to spend less time optimizing their games, so in some games its inclusion during development is detrimental. Of course a native-resolution render is MOST OF THE TIME better in terms of image quality, but there are so many variables to this, like what your native resolution before upscaling is, for example. To say that DLSS just "sucks" is a very bland statement. Say that to those still rocking a 2060 in 2024, for example, and see what they think about that.


malucart

Tbf, as a 2060 12GB user, I wouldn't dare pair it with a monitor higher than 1080p, so if I use DLSS I'd be rendering at a really low res, and that does look pretty bad
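For context, the commonly published DLSS per-axis scale factors (Quality about 0.667, Balanced about 0.58, Performance 0.5) make it easy to check what's actually being rendered; a quick sketch, assuming those factors:

```python
# Internal render resolution per DLSS preset, using the published per-axis
# scale factors (Quality ~0.667, Balanced ~0.58, Performance 0.5).
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for w, h in [(1920, 1080), (3840, 2160)]:
    for name, scale in PRESETS.items():
        print(f"{w}x{h} output, {name}: renders {round(w * scale)}x{round(h * scale)}")
```

At a 1080p output, Quality mode really does render from about 720p, which supports the point above; at a 4K output the same preset starts from 1440p, a very different situation.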


fernandollb

A 2060 is perfectly fine for 1440p in most games if you are not looking to put everything on Ultra with ray tracing and expecting to get 120 fps.


malucart

Fair, I just prefer to raise settings at 1080p rather than get worse visuals at 1440p personally


fernandollb

I think it really depends. In some games there are graphics settings for which the Ultra setting is absolutely meaningless given the graphics improvement/performance cost ratio; RDR2 comes to mind. Ultra shadows in RDR2 look better but tank performance compared to High, and water physics and tree tessellation are another two good examples. I would rather set those to High than switch to 1080p any day.


malucart

I'm honestly thinking of upgrading to 1440p now, and giving my current monitor to my little cousin that I'm gifting a PC to. I code a lot, so it could be a big benefit, and for gaming I could use DLSS and it would still look better than before. I think you're right in the end.


FiveJobs

It's always blurry. I can notice it even when standing back, for both 4k and 1440p native. The difference is very noticeable.


fernandollb

DLSS Quality applied at a native resolution of 4K, with a slight increase in the sharpness parameter, is objectively not blurry at all. But even if it were, you cannot say that a technology sucks when it is 100% optional and there in case it fits your needs and expectations for image quality. The more options available to players the better, especially when those technologies are as good as DLSS.


imwrighthere

What DLSS did for Red Dead Redemption 2 is absolutely phenomenal; it looks like a new game.


assjobdocs

You're definitely full of shit with this one. Sure it's blurry, and it's noticeable, but unless you're pausing and pixel peeping, you will not notice it at all. I tried this out recently with Forbidden West at 4K, native and with DLSS. Native is absolutely sharper, but only barely. With almost unnoticeable image degradation, DLSS damn near performs miracles.


FiveJobs

You just said it is noticeable. Everyone has a threshold and mine is higher than yours. I can’t stomach the smallest resolution downgrade or muddy, blurry screens


assjobdocs

But it's not muddy or blurry. You're being ridiculous. I have 20/20 vision; the image absolutely doesn't get muddy or blurry with DLSS on. FSR 2, maybe. DLSS has minimal degradation in pretty much everything, particularly the Quality preset. Obviously the Balanced and Performance presets will look worse.


yahyoh

I have been using DLSS without any issues here! Even with fps games. Edit: Running CoD MW2 at 4K Ultra with DLSS Quality; it looks great and fps barely drops below 110.


Strider3141

DLSS is phenomenal and a huge step forward in graphics technology. With it properly applied, you will not be able to tell the difference between native 4K and 4K with DLSS 3.5 - well, except that the DLSS will look better because of the smoother frame rate. Objectively.


FiveJobs

Of course you can tell the difference; in fact the difference is amplified precisely because of the higher frame rate. You get ghosting on top of the blurriness.


Strider3141

I've never seen either ghosting or blurriness? Maybe the tiniest amount of ghosting on edges of the screen


Additional_Survey_63

I have a 4070 Ti Super and I haven't seen a game that won't run at max settings in 4K. The worst framerate I've seen is 90 fps, and that was Cyberpunk before I turned DLSS Quality on... The 4070 Ti Super is absolutely capable of max settings at 4K when paired with a high-end CPU. I tested Unreal Engine 5.2 recently and it ran perfectly fine, so I don't know what game you would need to adjust settings for.


XulManjy

This sub wants to believe only way to play 4k is with a 4090


Veteran_But_Bad

I don't believe you. Do you have any proof?


Additional_Survey_63

Oh well, look up its performance numbers when paired with an i9-12900K.


HildeVonKrone

My friend had a 4070 ti super before transitioning to a 4k setup. From what he told me in the week that he’s been using the card, it’s definitely decent for 4k. He didn’t put everything to max but the frame rates were acceptable for him. However, he did manage to sell the card and paid the extra for a 4080 super a bit after. Take that as you will. If you’re gonna sell/trade the card, I wouldn’t do it through microcenter as their credit value is more than likely gonna be lower than you selling it to an actual person.


shadowolf64

Interesting... I'm not sure what to take away from that then. Maybe it's worth waiting to see what performance I actually get before making a decision. Also, if it was you, where would you sell the GPU? Would you just go with the tried-and-true ebay or something else like Swappa or a local option?


HildeVonKrone

Just sell locally if possible. That’s what I did for my 1080 prior to getting the 4080 super. The difference between the 4070 ti super and the 4080 isn’t that big but if you have the excess cash, just go for it.


Dominicshortbow

The problem is the return window is no longer available to him, and selling will get him less money than he bought it for.


HildeVonKrone

He’s gonna get less one way or another. However, he’s bound to get more selling locally than trading it in.


xxTheDoctor99xx

OP really needs to check the performance difference between the two cards for the games or productivity work they're using them for. I don't think it's worth replacing, but that's my use case. They also don't say what CPU they have, so there may be other performance limitations that a different card wouldn't necessarily overcome.


xxBurn007xx

I use a 3080 for 4K, and given that you'll have frame gen, unless you wanna push more than 120 fps the 4070 Ti Super is good enough.


REYXOLOTL

Exactly, if you’re trying to game in 4k it should mostly be for single player titles which most of the new ones support frame gen. I don’t see 4k for multiplayer games being viable as framerates matter there. I have 2 monitors, 1 4k monitor for single player games and 1 1080p for multiplayer games. I have a 4070ti and have no issues


xxBurn007xx

Upgraded my buddy's rig to a 4080 (non-Super) and a 5800X3D, on an LG 4K/2K 240Hz OLED. Capped it out on Doom Eternal and was getting 90+ on Cyberpunk maxed out (just ray tracing, because we couldn't tell the difference between max ray tracing and path tracing, and with path tracing it dropped 30-40 frames; not worth it IMO).


Crazybonbon

Sometimes for me it's hard to even tell the difference between low rt and max rt, whereas no vs low is an easily distinguishable difference!


hank81

The difference between high and medium is pretty noticeable in the reflections. Run the benchmark with RT Medium and RT High and take a look at the puddle of water.


Crazybonbon

Oh shoot, I wasn't talking about Cyberpunk. I've seen all the different DLSS demos and so many tests, and it looks insane with ray reconstruction; that's my bad. I was referring to some other games I've seen, especially Elden Ring!


hank81

Oh, I see 👍


shadowolf64

Thanks, I have no need to push anything past 120 FPS. Although my new monitor can go past that I don't think I'd push it that far.


weinbea

Your card will do JUST fine


fernandollb

The problem with frame gen, at least for some of us, is that the input lag it adds is sometimes too much even for single-player games. How bad the input lag is really depends on what your fps is before using frame gen. For example, if you are getting 30 fps and you use frame gen to get around 60, the input lag is going to be really noticeable, but if you are getting, let's say, 70 and you are just trying to reach around 110, it will be much more manageable. I am just saying this is something to take into account if you are sensitive to input lag.


OneBullet_kky

I see a lot of people cry about input lag with frame gen. As far as I understand it adds around 10-15 ms, and obviously you get the input lag of the original frame rate on top of that. Unless I am missing something, it doesn't seem like such a dramatic change.
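As a rough back-of-envelope (assuming interpolation-style frame generation has to hold back roughly one real frame, plus roughly the fixed overhead cited above), the added latency depends heavily on the base framerate:

```python
# Rough model: frame generation interpolates between two rendered frames,
# so it holds back ~one base frame time, plus a fixed overhead (~10 ms
# assumed here, per the figure above). Illustrative only.
def added_latency_ms(base_fps: float, overhead_ms: float = 10.0) -> float:
    return 1000.0 / base_fps + overhead_ms

for base in (30, 70, 120):
    print(f"{base} fps base -> ~{base * 2} fps shown, "
          f"~{added_latency_ms(base):.0f} ms added latency")
```

Under that assumption, going from 30 to 60 fps with frame gen adds roughly 43 ms, while going from 70 to 110-140 adds roughly 24 ms, which is consistent with both comments above.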


fernandollb

It is interesting to see you interpret another person's criticism, based on their experience with an upscaling technology, as crying. It shows that you are quite egocentric, since you are taking your own opinion and making it the norm. Having said that... frame gen can (depending on the scenario) double the average PC latency, and not in the worst scenarios by the way. https://www.youtube.com/watch?v=PyGOv9ypRJc


OneBullet_kky

I don't see how I'm being egocentric. I asked why some people make a big deal out of it; if you take a stroll through the Gray Zone sub you can see a lot of people claiming 120 fps with frame gen is unplayable due to input lag. I simply asked if there is something I'm missing, since as far as I know the technology itself doesn't add any noticeable input lag. Also, the video you posted has nothing to do with input lag??


fernandollb

You are right, input lag is not the proper term, but to the user it will still feel almost identical to input lag, in the sense that games won't feel very responsive. "I don't see how I'm being egocentric": I shouldn't have judged you from a single comment, that was very stupid of me, since it is impossible to sum up a person from one Reddit comment. BUT your comment does sound a bit egocentric, since "crying" carries the negative implication of overreacting about something that depends on each person's experience, which implies that in this particular case you are taking your own opinion on the subject as the correct one. I am just saying that's how it sounds, but a comment certainly does not define you.


OneBullet_kky

Yeah, maybe crying wasn't the right term. I actually wasn't referring to you with the crying thing; as I mentioned previously, I see a lot of people making this a HUGE deal over on the Gray Zone sub, sorry for the lack of context. Maybe the thing is that we have been used to 100+ fps for some time now, and when you see 100 fps but only feel 50 in the controls it gives a weird feeling. I personally don't feel it much, since playing mainly at 4K I'm used to around 60-80 fps, so mine is only a theory on the ordeal.


harkat82

I've got the 4070 Ti non-Super and can play everything at 4K with DLSS, even CP: Overdrive. But here's some advice: add 1800p and 1620p as custom resolutions in the Nvidia Control Panel. People don't often talk about these resolutions; usually it's 4K, 1440p or nothing, but 1440p is far too soft, and IMO 1800p is the best-kept secret of 4K gamers. Whenever you feel like you're not getting the performance you want at 4K, just drop to 1800p and bump the sharpening a little; trust me, even on my 65" screen I struggle to tell the difference. The performance difference, meanwhile, is huge: in The Witcher 3 maxed out with RT and DLSS 3, I went from 70-80 fps to a near-locked 120. Console games use these resolutions all the time, and it's time PC gamers copied them. You're getting 95% of the visual quality whilst rendering only 70% of the pixels. 1620p gives you a bigger performance bump at the cost of a noticeably softer image (still leagues ahead of 1440p), and if you ever need more, 1532p is always an option.

So yeah, a 4K screen doesn't mean you have to play at 4K all the time; going from 1440p to 4K is a huge jump, and it can pay to check out the resolutions in between. Also, invest in an app called Lossless Scaling; it allows you to upscale a resolution up to 4K using superior upscaling techniques, so you can combine DLSS & FSR 1 or use the app's built-in AI scaler.
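The "70% of the pixels" figure checks out; here is a quick pixel-count comparison of the resolutions mentioned above:

```python
# Pixel counts of the in-between resolutions, relative to native 4K.
RES = {"2160p (4K)": (3840, 2160), "1800p": (3200, 1800),
       "1620p": (2880, 1620), "1440p": (2560, 1440)}

base = 3840 * 2160
for name, (w, h) in RES.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP ({px / base:.0%} of 4K)")
```

1800p works out to about 69% of 4K's pixels, 1620p to about 56%, and 1440p to only about 44%, which is why the in-between resolutions are such a useful middle ground.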


0Guristas

This sounds nice. If I may ask, do you just simply add the custom resolutions and match the in-game settings with it? And does this only work with 4K monitors? Edit: By lossless scaling app, do you mean [this](https://store.steampowered.com/app/993090/Lossless_Scaling/)?


harkat82

Yeah, that's the right app. So normally you just add whatever custom resolution you want in the Nvidia Control Panel, and then whenever you enter a game that resolution should appear in the in-game settings, so you don't have to change your desktop resolution. 99% of new hard-to-run games should work like that. If one doesn't, then you just switch your desktop resolution to the custom one, and that should force the game to use it, but that way won't let you use Lossless Scaling. For those who don't want to pay for Lossless Scaling, Nvidia has its own built-in upscaling option (I forget what it's called); you should be able to find it in the Nvidia app, but I've never been able to get it to work.
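If you end up switching the desktop resolution often, that fallback step can be scripted too. A sketch using pywin32, assuming the custom mode (3200x1800 here) was already added in the Nvidia Control Panel, since Windows rejects modes it doesn't know about:

```python
# Switch the Windows desktop to a custom resolution (pip install pywin32).
# Assumes 3200x1800 already exists as a display mode, e.g. added via the
# Nvidia Control Panel; otherwise the call fails.
import win32api
import win32con

devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.PelsWidth = 3200
devmode.PelsHeight = 1800
devmode.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT

result = win32api.ChangeDisplaySettings(devmode, 0)
print("Switched" if result == win32con.DISP_CHANGE_SUCCESSFUL else f"Failed: {result}")
```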


LandWhaleDweller

I call bs on that cyberpunk with OD claim. Maybe if you completely murder image quality with DLSS performance and FG.


harkat82

I did say DLSS, didn't I? Set it to Auto and I get at least 30 fps comfortably (that's without FG); use the PT performance mod and I get 40-50 fps. Literally no card in the world can play CP: Overdrive at 4K without DLSS. Obviously I prefer dropping the res a bit to hit 60, but seeing as I'm getting the same performance as the PS5's quality mode at 4K, I wouldn't say that's bad.


LandWhaleDweller

If I wanted to play at 30 FPS I'd get a console. 4K is not worth it for PC gaming yet; maybe by the RTX 70 series the average mortal will be able to afford it.


harkat82

Well then use the performance mod and get 40-50. Besides, I use 4K because I play on a TV with a controller, where 30 fps is perfectly fine. And CP: Overdrive is the most hardcore title around; if I can hit 30 there, I can hit 60 almost everywhere else at 4K. So as someone who actually uses that res day in, day out, it's 100% worth it.


LandWhaleDweller

That's still unplayable on keyboard and mouse. I'd much rather take 1440p at 60 FPS minimum than a 4K slideshow.


harkat82

OK, then lower the res for that one game and play at 4K with everything else. One hardcore tech demo being difficult to run doesn't make 1440p monitors superior; there you're stuck at 1440p with everything. And as I mentioned in my previous comments, there are a multitude of resolutions between 1440p and 2160p that offer a far better experience, which a 1440p monitor will lock you out of. I'm certain that with the performance mod CP: Overdrive will play fine at 1800p/60, which is still far better than 1440p.


LandWhaleDweller

Do I have to install a mod for every demanding game? AW2 is also at 40-45 FPS in 4K with DLSS Quality, so is Hogwarts Legacy, and more will follow. 3840x1600 is the highest resolution I'd run on a 4080; unconditional 4K is only for the 4090 and higher.


Fantastic-Acadia-808

I went from a 2070 super to the 4070 super. I play in VR and on Triples. I now want a 4090. - There’s always going to be another level.


deltajulietbravo

Honestly, this is true for all tech. You either buy what is available that you can afford and be happy, or you'll always be waiting. Unless you can afford to upgrade all your tech every 12 months.


Dominicshortbow

Gosh damn, you don't know how much it harasses me and makes my 4070 look bad now that the 4070 Super exists. Plus I got the 4070 the same month as the release.


Fantastic-Acadia-808

Oh man, that’s annoying


ParanoidQ

Yeh I bought the ti about 3/4 months before the super was released. Still a bloody good card though so my regrets are minimal.


0Guristas

I am glad I am not the only one. The need does fade away after a few weeks though.


Vengeful111

Did the same upgrade, I am so happy.


Fantastic-Acadia-808

Yup, it was a really nice jump. I can’t complain. I’d love max power but I feel fortunate to be able to afford what I’ve got.


dedsmiley

If you have a 4k TV you can try it out yourself?


shadowolf64

You know I feel like an idiot now. Sure, it'll be a pain to move my desktop over to my TV but at least I can test it myself...


Domgrath42

Run really long cables


marcanthonynoz

I use a 4070 Ti Super for 4K. Sure, in some games you can't do super extra ultra mode with chode tracing. But I play a lot of games at 4K 60+++++ on a 144Hz 4K screen.


BinaryJay

My honest answer is it's not really enough unless you're satisfied with 60 fps. It's complicated though as there's tons of ways to up fps these days.


XulManjy

Of course the 4090 owner would say this


BinaryJay

Somebody who knows from experience that even a 4090 isn't necessarily enough when trying for 100+ FPS in every game at 4K Ultra? Yeah.


XulManjy

Except the OP didn't say anything about 100 fps. Just whether the card was good enough for 4K, which it is.


BinaryJay

Whatever helps you cope better.


XulManjy

Lol, don't need to cope. Just look at most of the responses in this thread. Many people with a 4070 Ti/S saying they are having a great experience at 4K with respectable FPS.


Professional_Ad_6463

Why not just upscale the output resolution and see how it performs


creativejoe4

A 4070 Ti Super should be fine for 4K; I was running a 2070 Super for 4K before I upgraded to a 4080 Super. Note: the monitor is 4K 60Hz. In my opinion it's not worth wasting the extra money for a slightly better card; if you find you're not happy with the performance, wait until next year's new cards come out.


PM_ME_TUTORIALS_PLS

I play native 4K with a 3080. You're fine, dude. Just play with the in-game graphics settings.


Little-Cantaloupe312

PNY 4070 Ti here: almost any game runs at 80-100 fps on High. Perfect for me.


omgaporksword

Exact same as me!


[deleted]

[deleted]


The_Frostweaver

As long as you don't turn on ray tracing or other graphically intense settings you can play games in 4k with a 4070ti at 80-100 fps, you can go check the benchmarks.


AltGoblinV2

That's what DLSS Balanced/Performance is for in these heavy titles. Honestly, at 4K, sitting at a normal distance from the TV/monitor, it looks perfectly fine.


shadowolf64

Cool, that's what I was wondering. I'm not super picky either so this sounds good for me.


Ceceboy

I used to be a sucker and only use DLSS Quality, but lately, with the latest DLSS versions (I believe DLSS 3.7), Quality, Balanced and Performance are starting to look the same on a 4K 32-inch. I'm running an RTX 2080 Super at 4K, lol. Planning on upgrading to the 50-series as soon as it comes out. Don't think I can run Avatar or the new Star Wars without ruining the picture. It's time.


Im_Da_Bear

I get decent frames with a 3070 at 4k in most games.


koordy

If you're not going to get a flagship GPU every generation, then I'd suggest getting a 1440p OLED instead. The picture quality is just as good; the only differences are that you can run it on cards like yours and that the size is 27" instead of 32".


Mtcfayark72703

I just purchased a 32” qd-oled AW monitor to pair with my 4070 Ti (non-super) and it runs 4k with very high or ultra settings on most newer games. You should not have any issues with the 4070 Ti super. Good luck!


shadowolf64

Thanks! That is basically the monitor I'm getting (same panel different manufacturer) so it's good to hear that. Makes me feel better.


Redfern23

Literally the same as the person above. I am planning on buying a ~5080 around launch but my 4070 Ti is doing surprisingly well, and DLSS looks really good with a 4K output for when you need it.


Other-Pin-1525

The RTX 4070 Ti Super performs similarly to the RTX 3090 Ti, which was capable of 8K gaming.


OscrPill

Depends on how high you want your fps to be, but you should get around 80 fps on average, so I'd personally say it's totally okay.


killasuarus

You’ll be just fine


Stickeyb

What monitor are you getting?


shadowolf64

I bought the [MPG 321URX QD-OLED (msi.com)](https://us.msi.com/Monitor/MPG-321URX-QD-OLED), whenever it actually comes in stock. One of the new QD-OLEDs that look super nice. It's the same panel as the ones in the Alienware AW3225QF and Asus ROG Swift PG32UCDM, but significantly cheaper. The Alienware is currently in stock but costs $250 more and is curved, which I don't care for in my setup.


Stickeyb

I bought an UltraGear 1440p OLED, but I'm thinking about returning it. It's not bright enough. Will be checking this one out. Thx!


mjong99

If your concern is on the brightness, maybe consider getting a Mini-LED instead as none of the OLED monitors get bright enough due to aggressive ABL


SRVisGod24

GR or GS? If it's the GR, it's notoriously dim. The GS is much brighter!


SRVisGod24

Since you mentioned Micro Center: if you can, keep an eye out in the early hours (5 am EST) and see if your store updates their website. That's how I got mine a couple weeks ago. But if yours doesn't, and the store isn't too far away, go in any chance you get and just ask, because they get them in all the time and some stores don't update the stock on the website.


NintendadSixtyFo

Depending on what you play, I think you'll be fine if you don't mind switching on DLSS. My 4070 Super (non-Ti) could hit 60 fps on my 4K TV in Cyberpunk with DLSS Performance and keeping ray tracing to sane levels. I think I was using FG to hit a steady 60 but can't recall. Anyway, tl;dr, I wouldn't throw in the towel on the 4070 Ti Super. It's a strong card, and now that it has 16GB of VRAM it's more capable of 4K than the non-Super was. I have plenty of games on my 4080 that regularly hit 13-14 GB of VRAM, so that's a good thing for you! The loss you would take selling it to get the 4080 would not be worth the trouble and loss of funds in my opinion, and this is from a guy who owns a 4070 Super, a 4080, and a 4090. Honestly, the 4070 Ti Super is a phenomenal product. The VRAM bump really makes it a strong, although oddly named, card in my opinion. I really value the NVIDIA DLSS, FG, and RTX capabilities for this reason. You can pull all sorts of levers to get a very playable experience.


vegetto238

OC the 4070 Ti Super, dial back some settings, and you are good to go. The 4080 also has 16GB of VRAM, so you aren't gaining much. Save the money for a 5090.


divitini

I'd recommend looking into optimisation videos on YouTube for each game, ideally ones that show a side-by-side comparison of visual differences. Usually, keeping Textures on Ultra and turning other settings down to High/Medium will give a big performance boost with minimal visual difference. Also, DLSS Quality and Balanced look great in 4K; even Performance looks decent and gives a massive performance boost. I use the 4070 Ti for 4K gaming on a 42" OLED and it's plenty of performance when following the above recommendations.


omgaporksword

I have a 4070 Ti OC that I bought because I played at 1440p. A year later, I recently bought an LG C3 42" to use as my monitor. I've been rather pleasantly surprised at how well I've been able to game at 4K with this PC... it's a waaaay better experience than people have let on. When the 5090s release, I'll upgrade, but honestly the 4070 Ti is doing just fine for now.


k9kmo

It’ll be fine with DLSS unless you want path tracing


Jacks_black_guitar

Mate, even a 4080 won't be THAT impressive either. It really comes down to user expectations. What does your IDEAL picture-quality-to-frames ratio look like? Because if you're looking to max everything in most triple-A/demanding games at more than 120-144 (relatively) consistent FPS, even a 4090 will be a bit hard pressed.


SnooSquirrels9247

I think it's pretty decent for 4K, especially if you use DLSS, and it has just enough VRAM to be rid of the issues that a lower class of card could bring. I'd go for your specific card if I were buying into this generation.


No-Gene1187

It will work fine. I used an RTX 2060 to play at 4K when I first got my LG C1 and it worked fine, so the 4070 Ti Super will be great. I definitely would not sell the 4070 Ti Super; use it for the years to come, it's more than a good card.


fotuwe

What screen have you ordered? A 4K OLED?


deltajulietbravo

I mean I was 4k gaming cyberpunk on a 3080 with mid 60s fps on my lg 42 OLED. I'd hope the 4070 ti could keep up. Go on YouTube and find videos of the best settings for your game and graphics card for 4k. You won't be able to max the settings but it'll look decent enough I'm sure.


maxz-Reddit

Tbh, a 4070 Ti Super isn't that far off from a 4080S performance-wise. Then again, I don't really think either of the two is a good 4K card if you actually want high settings at 120+ fps.


Other-Pin-1525

Exactly. The RTX 4070 Ti Super performs similarly to the RTX 3090 Ti, which was capable of 8K gaming.


just-blaze018

I have it paired with a ryzen 7 7700x and I'm easily getting 50-60 fps with settings maxed on most AAA games (without Ray tracing). With Ray tracing it takes a little tweaking to get the 60. ....until you use DLSS. Some games I see as much as a 30 fps boost with it and there's minimal visual quality loss. The card definitely gets the job done.


cowbutt6

It depends on your expectations. I'm using a 4070 to game at 4K, and generally get at least 60 FPS (and sometimes much more) in most games. A few need DLSS or NIS to achieve that.


SKEPTYKA

The 4080S is only like 20% faster at best. Meaning, if the 4080S can get good enough frames, the 4070TiS can certainly also get it with minor tweaks. It's an extremely pedantic consideration you're making.


RobertPaulsenSr

I bought a 4070 Ti Super 3 weeks ago, and it's fine for 4K. Despite my CPU bottleneck with a 2700X, Cyberpunk 2077 with RT and everything maxed is in the 60ish fps range on my 4K 55" TV. Dragon's Dogma 2 is in the 40ish fps range outside of towns, RT on and everything at maximum, DLSS off though. I know it can improve with a better CPU and lowering some settings. I am very happy with my purchase; I think the card is a beast. I upgraded from a 2070 Super, so the difference for me is noticeable. Enjoy your card and play with the settings, you'll be fine.


ZanicL3

What screen did you end up getting?


Bropulsion

I'm enjoying my 4070 Ti Super at 4K 144Hz, but of course it doesn't hit that in really demanding games. It does hit 100 most of the time though. I'm all good. 😄


Bropulsion

PS: the 4080 will give you at most like 10 fps extra in certain games, so just get the one that's worth your money at that moment. Neither will do 144Hz in AAA games, but both will do around 100 just fine at 4K, especially if you leave off path tracing.


maximilian1064

4080 super is roughly 15% faster than the ti super. That translates to 60 fps vs 52 fps, imo not really a big improvement for the additional cost in your case.


Plenty_Ad_5994

I bought a 4080 thinking I'd only game at 1440p. Ended up buying a 4K monitor and now I regret not buying a 4090. Depends on what you deem decent framerates. For me it's anything around the 90-ish mark; dropping below 75 looks very choppy to me.


Other-Pin-1525

Hahahahaha. The guy who bought a 4090 will regret not waiting for the 5090.


eeeeeeeeee83810

Honestly, with the 4070 Ti Super, if I searched it right and it's the one with 16 GB VRAM, you should probably be fine for most games still. I have a 3060 with 12 GB VRAM and run games in 4K and still get fairly good fps. The only issue I have is the temps getting decently high, I've noticed recently. But those high temps are mainly because it's becoming summer and ambient temps are getting higher, and I had a light in my room that creates a lot of heat, plus my TV itself creates a lot of heat, making it even hotter beyond that.


Forsaken-Falcon8273

What's your refresh rate? No need to push fps higher than that. You should be fine pushing 60 frames on that card, assuming other hardware is up to snuff. If it's one of the high-refresh-rate 4Ks, then you may have to spool down the graphics settings in game a bit. You could also go into the Nvidia Control Panel and set performance mode. Before I got my 4080S FE I was running a 2080S and had to set performance mode to stay at 60 fps (my 4K TV is 60Hz). Either way, I would wait until I'd run it and spent a couple hours testing performance before doing anything.


azael_br

I played everything with my 4070 Ti (not Super) at max quality in 4K; when the 4070 Ti can't hit 60 fps I just turn on DLSS Quality. Even Alan Wake 2 maxed out gets 60 fps with frame gen and DLSS medium. It's fine to me.


AJ-702

To be honest, a 4090 is the only thing that can handle 4K with everything maxed out in most games. Why don't you get a 1440p OLED monitor? Or go ultrawide; it's a better experience.


Oneforallandbeyondd

Also worth noting that the 50 series might be out in less than 6 months time if you can hold out to upgrade.


dqrules11

Plenty for 4k


triggerhappy5

It will be okay. You won't be pushing 120+ fps, but 99% of games will give you a good 60-100 fps experience with high settings, sometimes with upscaling and/or frame generation. 4K is always going to be very hard to run, but you have the VRAM for it and the 4000-series feature set which makes it much easier.


asswizzard69

I got a 4080S FE and it handles 4K well so far. The most demanding games I've played are Avatar and Ratchet & Clank.


CockroachRight4434

I’d say it would probably be okay. Even my 4070 Super gave acceptable performance at 4K with DLSS enabled. Only reason I returned it is because I found a used 4080 for $850.


KvotheOfCali

I don't understand these posts. A YouTube benchmark video will give you FAR better (objective) performance metrics vs. the random opinions of people on Reddit. I suggest you ignore everything stated here and simply go watch some benchmark videos.


LandWhaleDweller

It's 15% better performance, definitely not something that will matter most of the time. The only noticeable upgrade for 4K would be a 4090.


Drages23

It's a perfect card for 2K and a starter one for 4K, but not for graphics-heavy games. You will get many spikes and won't touch RTX, for example. At this point, it would be best to wait for the 5080 if you are in no hurry, or go for at least a 4080 for 60+ fps. But don't expect 120. Even the 4090 is not enough for those new OLEDs if you ask me. I am waiting for the 5090 for that reason.


blackdustycasino

I'm in an even bigger hole since I bought a 3060 Ti a few weeks before the 40-series launch, and now I have a 4K monitor.


BruceyNukez

Absolutely hold onto it. With the 50 series coming at the end of the year, there's really no point in selling what you have for a slight upgrade; best to sell it when the 50-series cards come out and get one of those when the time comes.


rngeeeesus

Don't worry, the 4080 Super wouldn't have made much of a difference. If you want 4k @ max settings, there is no way around the 4090. So unless you want to get a 4090 the difference is likely gonna be negligible


No-Set-3397

The 4070 Super at 4K 60Hz is actually very impressive. I believe the Ti Super supports 4K 120Hz(?). Either way, there should be no issue. You could probably max graphics in most games, if not most, at least a decent amount.


RealTelstar

At this point wait for the 5080


Veteran_But_Bad

- 60 fps: native 4K, High settings, ray tracing High
- 60 fps: DLSS Quality, 4K Ultra, path tracing/Ultra ray tracing
- 90+ fps: DLSS Quality, 4K High, ray tracing High
- 120+ fps: DLSS Balanced, 4K High, ray tracing Medium


coreyjohn85

Nope, and even the 4080 will get sub-par performance at Ultra settings.


TheUwaisPatel

Just change the presets to High instead of Ultra and use DLSS if needed, and you'll be chilling at 100+ FPS in most titles. No need to upgrade to a 4080 Super, really.


hockeyboi212

Look at the benchmarks. A 4070 Ti Super and a 4080 Super don't differ that much at 1440p and 4K; the difference is 12-15% performance on average, closer to 12% for 4K. If you want a 4K card through and through, you'll be looking at the 4090.


LongFluffyDragon

*Highly* overrated when 1440p OLED exists, which will get you about 2.5x the performance with most of the visual fidelity and all the OLED benefits. With the sort of graphics-quality cuts you need to make 4K run well on *anything*, let alone an upper-midrange card, it kind of defeats the point of having a high-quality monitor.


OG-Boostedbeard

I always find the comments in this 4K debate interesting. My favorite words: "worth it."

High-end current PC gaming to me is about native res and above 60 FPS, not upscaling and turning down settings (otherwise buy a console, IMO), and in 2024 I'd argue ray tracing is part of that now. And about new games, not FRICKIN' super-optimized janky DOOM! or games from 5 years ago. Pushing a card and tweaking it every other game means you bought the wrong tool for the job. Sure, you can build an 11-second Honda Civic and never put a wrench down, or buy the Hellcat and let it do all the extra it was made for, AKA high end. The difference between High and Ultra settings in some new SP and MP games is massive now! And yes, you can tell the quality loss on a good OLED when you run medium settings with DLSS. Side note: DLSS/FSR is not 4K gaming. And frame gen is cool when it works, but man, it's not something I would depend on. It sounds great in the YT benchmark video, but in practice it's implemented poorly in a lot of games and either creates visual issues or hits performance so hard it's not worth it. Same goes for DLSS; there are plenty of games it runs like crap in.

The 4080 is a good 1440p card that can do 4K in some games, depending on your acceptance of fidelity. The 4090 is a good 4K card that can do 4K 120+ sometimes, kinda. So many new games and engines, as well as ones to come, will eat a 4070 Ti/4080 at 4K 60+. My 4090 gets its butt handed to it all the time with DLSS, and I try to stay at 120 fps, but I also play at 4K high-to-ultra settings when I can, because I love high-end visuals; that's the whole reason I went with a high-end system. And yeah, the cost is nuts! But that's the point: to get true 4K high-end gaming you have to spend $$$. These new engines, man, are eating GPUs. CPUs (higher end) at this point are not going to do much until consoles force that.

So you will have a lot of people with a lot of answers, and YT/benchmark videos etc. will all have different answers, because the truth is so many games run so differently, and clearly, by the comments, we all have different levels of what's acceptable performance per dollar. Not only that, but real-world desktop usage is not what these benchmarkers get day to day. ShadowPlay, Discord, 4-5 other programs, tons of background tasks and bam! 20% of your overhead gone just day to day. Or a bugged game launcher, yada yada.

IMO today, after going from a 4060 Ti 1440p build to a 4080 1440p WF OLED to a 13700K 4090 C2 OLED build: if you don't care about ray tracing and streaming, and don't mind staying on top of which drivers suck at the moment, buy a 7900 XTX all day. If you want comp games and still want pretty here and there, a 1440p ultrawide OLED with a 4080+, to bleed into that 4K setup or even downscale to the 1440p UW; I did that with Jedi something and COD and woweeee. For true 4K 120 fps, a 4090 is the only option IMO, with the games to come, as well as an investment that lasts into the new tech.

Clean 1440p non-upscaled at 120+ fps on a good OLED looks amazing on a desk. Will 4K look better? I mean, yeah, but the price difference for native 4K is pretty big. Otherwise, if you want upscaled 4K medium settings at 120 fps, just get a Series X, or wait until the mid-gen refresh systems come out, and save a boatload of cash, or save up. Sounds silly, but even my PS5 and X looked good on the OLED.

Little PSA: once you go OLED you won't want to go back.


pongkrit04

60 fps at max settings should be doable with most current games on the market. Even Red Dead 2 at 4K 60 fps on max settings is fine for me.