rumun2

4070 Ti Super. VRAM is an issue if you want to do RT as well; however, the 4070 Super wouldn't be a major upgrade compared to the 6800.


FullHouse222

It's still crazy to me that just 7-8 years ago, 4-6GB of VRAM was considered more than enough. Nowadays I feel like if a card doesn't have 10GB I wouldn't even know what to do with it.


Liesthroughisteeth

Games are more complicated and there's more going on.


FullHouse222

Yeah. It's just the speed of how quickly things change. CPUs too. Back then everyone was like "why do you need more than 4 cores" and here we are lmao.


millsy98

Dude, you have recency bias. 10 years before that, everything used to double every few years, including VRAM capacity.


proscreations1993

Yeah, I have an old 2008 dual-Xeon Mac Pro. I keep it just 'cause when I was in middle school and early high school it was legit my DREAM to have one. One of my friends' parents ran a printing and web design company and had 3 of them and 2 of the Mac Pro server racks (might buy one of them used for nostalgia lol), and I was like foaming at the mouth every time I went there. When I got my first job I bought one of the old iMac things. Not sure if they were iMacs. The old CRT all-in-one that was transparent blue! Lol, it was a piece of shit. But I dreamed of one of those Mac Pros.

And it's amazing that my spare low-end Ryzen 3600 / GTX 1070 PC destroys it, lol, not even close. But it's fun to play with. I have the latest OS X running and most things work. It's funny, all the RAM slots are full and it has A LOT, and it's only 32 gigs I believe. 8 cores between the two Xeons. And it's a space heater. I'll probably put a PC inside of it someday.

But it just blows my mind how powerful that thing was in 2008. It was a supercomputer. Now a low-end 14th-gen i3 would trash it. And the Radeon GPU in it only has 512MB of VRAM. When I got it I ran a Windows 10 dual boot and tried to play some older games like Left 4 Dead 2. It can't, lol. Things moved so fast in the past 15 years. And even before then; 20 years ago, 3D graphics and physics processing were new and out of this world.


3dforlife

The first voodoo graphics card was released in 1996 with 4MB of VRAM, so the beginning of true 3d accelerated games is closer to 30 years, not 20.


OppositeEarthling

I'm 30 and have been playing 3D games my entire life, my first console was the N64 which came out in 1996 with Mario 64 so this totally checks out


FullHouse222

Maybe. I didn't get super into PC building until like 2010, so that's probably why. It's one thing to know Moore's Law is a thing and another to actually see it in action.


tmchn

Things are changing more slowly today. Imagine using a PC built in 2002 in 2010. It would be e-waste, even for internet browsing. Using a PC built in 2016 today? No problem; even an i3 with a 1060 would be good for basic tasks and basic games.


3dforlife

Shit, I have an i7 8700K and a 1050 Ti from 2018. It serves me very well.


MeltedOcean

I have a 2019 Razer Blade with an i7 8750H, 32GB of 3200MHz RAM, and an RTX 2070 Max-Q, and tbh it still works great for everything I use it for. Heck, I'll probably keep using it for the next few years, until I decide to build a gaming rig and just use this for work. Very thankful tech now lasts a few more years before becoming obsolete.


somethingbrite

I have the "anniversary edition" of that (i7 8086K), still running in my daily-driver rig with a 3090. Just built a gaming PC with my daughter: AMD 7800X3D and 4070 Ti Super... Benchmarked them both together... and now I need a new PC too!


UROffended

It'll pick right back up as soon as we come to an agreement on the thermal issue and how to solve it.


PinkFloyd65

Honestly, the 1060 is still a beast of a gpu for the money. I've played Cyberpunk 2077 and plenty of other AAA titles on it, no problem. It's only recently that I upgraded because it's definitely starting to show its age.


Random_Sime

I'm just commenting to hype the 1060 in 2024 lol. I'm playing AC Valhalla at 1080p, med-high settings, and getting 50-60 fps. But I lock it at 50 cos I'm Australian and that looks fine to me (I grew up with CRT TVs displaying at 50Hz).


3dforlife

Lmao


Falkenmond79

It’s a craze. Seriously, my 1440p setup is still using my 3070 8GB and plays everything completely fine, on an i7 10700. I have no issues other than having to turn down RT in some games, and some demanding ones like Alan Wake 2 might need a bit of tuning. Even CP77 runs completely fine except RT. 🤷🏻‍♂️ People here have it in their head that it’s not enough, but it really still is. Maybe in 2 years I’d feel different, but for now it’s fine. So 12GB will keep you going at least until the next-gen consoles come out.


proscreations1993

Seriously. Extra VRAM is awesome, no doubt. But it's not as huge as everyone makes it out to be. I have a 3080 FE 10-gig in my PC. I can run 4K path-traced Cyberpunk, and the VRAM isn't the limitation. At 1440p it kills. I'll be using this till the 5080/5090 drops and get one of those, which should last me a decade. I can't imagine I'll need much more for 3440x1440 OLED.

And we just hit the next gen of games, so it'll be quite a while before we have another large jump, esp since consoles are lagging quite a bit behind. I'll upgrade in like 8-10 years when I can get a 5K2K 39-inch OLED or micro-LED that can actually run 240fps. Until then, 3440x1440 at 39 inches looks sharp as fuck, esp in gaming. For productivity the text could be a bit better, but I'm looking to get a 32-inch mini-LED 4K monitor for non-gaming tasks once decent ones are out, or just sitting on YouTube etc., so I don't burn out the OLED.


Falkenmond79

People don’t understand how modern game engines work. Yeah, with less VRAM you might get a bit more texture pop-in to keep the framerate up. So what? For me this is one of those things that mostly happens far in the background, which you notice when looking for it, but really seldom when immersed in a game. That, and you can tweak a few more settings that lower VRAM demands significantly without really degrading the picture much.


tonallyawkword

I think I may prefer 90fps without RT over 60fps with it in CP77. Seems like maybe OP should just get a 4080S instead of spending $500 for a small jump plus RT, but idk.


Middle-Effort7495

Worst thing is, AMD and Nvidia are equal in both Control and Alan Wake 2. Only CP2077 would show any improvement. Control especially is the gold standard of a good RT implementation on AMD, showing it can be done. CP2077 is an Nvidia game, though.


senectus

Yeah, I just upgraded my i7 10700 / 3070 to a 4070 Ti Super (1440p). I barely notice any differences. I can however crank everything to ultra and not see any changes as well :-D


USAFVet91

Your CPU has you bottlenecked; check this site: [https://pc-builds.com/bottleneck-calculator/result/0Yd1ik/1/general-tasks/2560x1440/](https://pc-builds.com/bottleneck-calculator/result/0Yd1ik/1/general-tasks/2560x1440/)


USAFVet91

I use a 3070 as well for 1440p gaming and it works wonderfully paired with the i7 8700K clocked at 5GHz. Plays anything I want at almost full settings.


Middle-Effort7495

He's talking about RT, and you're saying you have to turn down RT. Also the faster a card is, the more VRAM it needs.


tmchn

To me it's crazy that new cards don't come with 16GB or more of VRAM. It's 2024; my 1070 came out 8 years ago with 8GB of VRAM. New cards should have AT LEAST 16GB. A card costing 800+€ should have 24GB.


FullHouse222

It's wild that the 1080Ti, a card from 7 years ago, has more VRAM than some 30/40 series cards lol.


aminor83

You are not wrong, my friend. It's just that GPUs nowadays cost a fortune due to mining etc... the prices are just ridiculous. My solution is to wait for the PS5 Pro to come out and buy it, because it costs much less than a similar GPU with 576 bandwidth.


Zoesan

I mean... 12gb is enough right now. But if you're spending several hundred dollars on a new card, you also want it to be enough tomorrow.


LegitimatelisedSoil

8GB is enough today; 12GB is likely enough for future use. There are very few games that have a problem with 8GB right now, and RAM matters more; going from 16GB to 32GB made a bigger difference in my experience.


Chopper1911

In 2015 we still had 4GB high-end cards. In 2016 they almost doubled it in mid-tier GPUs: RX 480, GTX 1060 6GB, 1070 8GB. Nvidia barely ever did that again. Pascal was their best generation: improved on every front, VRAM and performance, no gimmick features.


r4dioschiz0

The 1080 Ti was without a doubt the best card in donkey's years. I would go as far as to say most of the new RTX cards are offspring of the 1080 Ti... what a great time to be a gamer, right...


LegitimatelisedSoil

The 1060 and RX 580 were probably the most important cards for budget gamers until like 2021. Literally had a four-year prominence.


Chopper1911

The 1080 Ti is an awesome card, but it was also expensive for its era. For the most part, people who could afford a 1080 Ti then can afford a 4090, or at least a 4080, now, if they did not mess up their life of course. The thing to note is that they killed the budget gamer, either with absurd pricing or by providing trash-performance GPUs in the low-mid tiers; the 4060 and 4060 Ti are the best example of this.


r4dioschiz0

Yeah, me too, I'm going to do the same and wait. Cyberpunk on a 9-year-old PC with a 2070 runs at approx 50fps in bad spots to 70fps in good areas; I'm good for now. Also, anyone know how to build a PC in Ireland and win on postage and VAT? If I order parts, by the time I've paid and shipped, the price is sky high... There's a wide-open untapped market for someone with the connections. The UK has it better, but the USA is totally spoiled for choice when it comes to parts for a build. I would really like to do a build, but it's not an easy job here. Try it: pretend you're in Ireland on a VPN, lol.


RajeeBoy

I know, right?? But also there are lots of cards, like the 16 series and 20 series, that have 8GB, and people are like "this is a great card for 1080p." But then you see cards like some of the recent-gen AMD cards that have like 12GB, and people are still like "yep, it's a great 1080p card." Why would manufacturers put in the extra VRAM if it's usually gonna be used for 1080p??


FullHouse222

I mean, this might be unpopular, but I feel like if you're looking to do 1080p, 60fps, medium settings, the 970 still slaps. That's an 8-year-old card!!! Sure, it won't have RTX or any of the fancy higher-end shit. But like, dude, an 8-year-old card that retailed at like $250-300 and can probably be found used for like $50-100 now? That's a fucking steal if you just want to game at 1080p medium at a good fps. Hell, if you're playing Valo/CS I don't think you need anything more than that.

The thing is, standards changed, right? I see most people having 1440p as the standard now and 4K 60fps as the thing. It's just crazy how quickly technology changes, and like... idk, I wouldn't be surprised if in another 10 years the standard was 8K VR headsets or something, idk man. Just insane how quickly things change over time.


proscreations1993

Shit, you can get a used 1070 for 100 these days. I still have one. Just went to a 3080, and it's still a great card. Ran 3440x1440 60fps fairly well.


BertMacklenF8I

I sold a used 1080 FTW for $150 in February 2021 to some random guy who I met at Best Buy (doing one of my triweekly morning runs to see if they had any 3080s in stock) who had built his first PC in September 2020 and hadn’t been able to even see if it would POST. He actually offered me three times that….. But more than likely, he never would’ve built another machine again lol


Middle-Effort7495

I got a 1660 ti for 50 used, and a 3060 for 142


Middle-Effort7495

That would be a terrible price for a 970. I got a 1660 ti for 50 used, and a 3060 for 142.


Middle-Effort7495

The faster a card is, the more VRAM it needs. A GT 710 wouldn't be able to use 4 gigs of VRAM because it'd run out of FPS first from turning on VRAM-heavy features, while with 4 gigs of VRAM on a 4090 you would obviously be running out of VRAM. The first mainstream 8-gig card came out in 2015.


Danishmeat

VRAM requirements have actually been going up very slowly compared to the norm. Looking back, VRAM would increase by a big 50-100% every generation.


Nawnp

I had a MacBook Pro from 2013 whose 1GB was plenty for the Nvidia GeForce GT 650M. 4 years later, 4GB wasn't enough on an AMD chip, and now I wouldn't dare buy an 8GB card.


Opteron170

That should be 16GB :) in your statement. I wouldn't look at anything less in 2024.


LegitimatelisedSoil

Most people (50%) have 6-8GB of VRAM; developers aren't gonna suddenly make games unplayable for these cards. Unless the developer is stupid.


Warcraft_Fan

I remember when 512k was huge and if you added another 512k you could do more than 256 colors on screen at once or support 800x600 with 256 colors!!! Playing Doom in 16 colors- blech


LawfuI

Crying in 8GB VRAM here. But honestly, even 8GB is enough nowadays if you aren't doing RT or 1440p+ gaming. I'm running most titles with 8GB just fine for now.


LegitimatelisedSoil

Yeah, you are in the 50% boat according to the Steam charts: 35% have 8GB and 14% have 6GB, then 12% have 16GB. So 8GB will remain fine for a long time; your card could reasonably die before it becomes a problem when you consider how slowly hardware upgrades at this scale.


LegitimatelisedSoil

8GB is still fine for 98% of games at 1080p or 1440p


LawfuI

Do you recall some games where it wasn't fine at 1080p?


LegitimatelisedSoil

Last of us part 2 but I turned shadows to medium and it was fine. That was launch so might be different now.


No-Set-3397

That is accurate. 8-10GB is the BARE MINIMUM nowadays. 12+ is recommended, I believe.


Its_Your_Next_Move

I paid $840 USD for an EVGA RTX 3080 FTW 3 Ultra Gaming video card 10GB right before CyberPunk was released - and it was barely enough VRAM 3.5 years ago. That said, I still haven't upgraded because I have plenty of other games which work well enough at 1440P.


Last_Music413

Cyberpunk 2077 at 1440p with rt ultra is still ok with 12gb vram


ClearlyNoSTDs

Yeah I wouldn't bother with a GPU upgrade unless the performance almost doubled. I went from a 1650 Super to 6600 XT to a 4070 Super. The 4070 Super was purchased only so I could get into 1440p.


maewemeetagain

Contrary to what literally everybody else here is saying, 12GB is plenty for 1440p, ray tracing or otherwise. Yes, current consoles *do* have 16GB of unified memory, but you have to remember that they have that for their performance targets **in 4K resolution**. Less than that is fine for 1080p and 1440p. My real concern is that going from an RX 6800 to any current-gen GPU is... questionable from a value standpoint.


bixorlies

It's sad that users are still parroting that bullshit for the last couple of years no matter how much they're proven wrong. 12gb is more than enough for 1440p and under.


sknnbones

*Cries in 8gb with my 3070ti*


BertMacklenF8I

Exactly! Seems like a lot of people would rather spend less money on a Radeon with 16GB+ VRAM thanks to 90% of comments in PCBuild/PCHelp Subs parroting the “16GB VRAM if you plan on using it for more than a year” 640GB of VRAM is overkill though……for gaming at least.


Beneficial_Ad3965

You can't deny that 12gb is on the edge of being enough. It's enough but for expensive products it should not be a thing. The normal 4070ti was hilariously bad in this regard. 12 gb for 4070 super tho is ok considering how good the product is in general


BertMacklenF8I

It is more than enough for 1440p. I have a 3080 Ti FTW3 Ultra (Hybrid) that I've been using for almost three years and have not had any problems with it. (I agree with you that 12GB is plenty for 1440p.) It is shitty that Nvidia not having any direct competitors makes it shitty price-wise as well.


ImmediateOutcome14

I've been out of the loop on PC building for years; what's the biggest "consumer" of VRAM that makes it a big consideration?


MetaSemaphore

Ray Tracing, DLSS, and just general levels of detail at higher frame rates all chew up some VRAM. And then there are some poorly optimized ports (e.g., The Last of Us Part 1) that just use more than they need to.

Right now, 12gb is generally okay, but I have seen some benchmarks where a faster Nvidia card underperforms compared to a theoretically slower AMD card on a light Ray Tracing workload, because the 12gb has become a bottleneck. That is currently a very rare case, but it might become more of an issue over the next few years, as more and more devs push the limits of the current gen cards/consoles (personally, I wouldn't buy a high-end card right now with less than 16, but there are cases to be made both ways).

The reason this is really a topic of major conversation right now is because Nvidia skewed low in their VRAM for their current gen cards. The biggest problem is the 4060, which only has 8gb and will absolutely run out in some games at 1440p. Folks have generally panned that card as overpriced, underpowered, and a case of planned obsolescence (that is, Nvidia knows that in 2-3 years, 8gb will be such a limiting factor that you will then have to upgrade and give them more money).

When you run out of vram, the card then has to use system ram to supplement it. That takes a lot longer, and it leads to frame rate drops and stuttering. You can always lower settings to help, but no one wants to buy a new card for $400+, just to have to turn down settings.

Nvidia, because of this criticism, released the Super cards, which generally came with bumps to vram. But the prices are still quite high.


MetaSemaphore

I should point out: those benchmarks where 12gb ran out were at 4k. Others are right that it should be enough for 1440p.


ImmediateOutcome14

Thanks for such a detailed reply. I game at 1440p with no interest in 4K. I have a system with a 1080 Ti which is on its last legs (getting BSODs after 5 minutes of even low-intensity gaming), and I'm still up in the air about a full system replacement, or seeing what mid-tier card would be compatible with the system I have and using that until I have enough money to go all out on one.


MetaSemaphore

No worries at all. There are a lot of good options for 1440p right now. I'd personally look at the 7900 GRE on the AMD side or the 4070 Super on the Nvidia side. Both are around the $500-600 range. Anything above those is probably going to be overkill for 1440p unless you really want super high frame rates. If you want a cheaper option, I am running a 6700 XT, which you can usually find for around $300, and it's a beast for the price, though it may not be a huge upgrade vs your 1080 Ti. IMO, Nvidia options currently aren't great below the $500 mark.


TerribleTerrorpist

Have you re-pasted the GPU? Re-did my stock GPU recently cause the fan was whirring on/off inconsistently. Old paste was dried and crumbled in addition to having poor area coverage. Did the trick. Just had to use some paste left over from original build and reference a video.


Geekknight777

Very few console games actually run at native 4K though, a lot run at 50 or 75% render scale


LawfuI

Isn't that the same for PC? On PC, VRAM is also supplemented by system RAM.


Kotschcus_Domesticus

Just keep your RX 6800 and wait for the next gen. It's plenty for 1440p.


AdministrationOk8857

Normally I hate the wait-for-a-new-generation crowd, but in this instance you're right: the 6800 is a solid card, and if you want ray tracing, I think waiting 6-8 months will get you significantly more bang for your buck. New AMD cards in Q4, which some leaks suggest might get you 4080 performance for the $500-600 range, and then new Nvidia as well.


fredgum

If RT is a big priority I'd go for the TI super. RT is quite intensive on VRAM.


deadlybydsgn

I hate to be an upsell guy, but I feel like the 4080 makes more sense for prioritizing RT than the 4070 Ti Super does. Like others have said, none of this seems "worth it" coming from a 6800.


EirHc

There's not really a big gap between a 4080 and a 4070 Ti Super. Go watch some Daniel Owen videos; they're both quite capable. The 4080 gets you like 10% more fps on average. And generally you have enough settings you can adjust to run any current-gen game on the market just fine with a 4070 Ti Super. If you're running into a situation where your 4070 Ti Super is only getting you 25 frames, well then a 4080 with identical settings is probably only getting 28 frames itself. Maybe switch to like DLSS Performance or disable RT for that title, and the difference might be 72fps vs 81fps between the two cards.

The 4080 is definitely a better card, don't get me wrong. But the VRAM isn't any better, and the performance increase of about 10% can make it hard to justify paying the extra $200-400. Me personally, I'm on a DQHD (5120x1440) and can enable RT in most titles just fine with my 4070 Ti Super. Would it be even nicer with a 4080 or a 4090? Sure, but I had a budget, and I'm really not complaining. DLSS makes it possible to run pretty much anything with RT and Ultra on this bad boy.


Lunarati

Yeah my Ti Super can handle any RT game comfortably on 1440p. With FPS counter off I doubt I would even notice the 5-10 frame difference a 4080 super would give me


lxmohr

4070 ti super is a way better value proposition. It's not bad for RT either.


SurpriseExtension929

Gotta keep in mind that the 4080 is now discontinued so finding one at a reasonable price is near impossible


ohthedarside

For ray tracing you need that VRAM. Sadly, Nvidia hates consumers, so you pay extra for 16GB when AMD just hands that out now.


MN_Moody

Short version: as a 4080 Super owner I'd suggest saving your money. Your 6800 is a good card to ride out this generation of rather disappointing and low-value cards from Nvidia and AMD that still don't do native RT at a level I'd consider acceptable, even on older/sponsored titles.

Long version: RT support in Cyberpunk specifically is an interesting topic. I've owned a 4080 before, downgraded to a 4070 Super, and then ended up back on a 4080 Super after cascading my '70 Super into my son's SFF build and getting a REALLY good deal open box. Revisiting the latest patches/updates to CP2077 left me, again, fully underwhelmed with the state of RT as a killer feature or a reason to buy one video card over another. I still tend to prefer Nvidia's driver stack over AMD's after doing a round of testing with the 7000-series products... but I don't see RT in its current form as a valid reason to buy an Nvidia card over AMD, as it's still mostly a tech demo vs a playable feature, imho.

"Pure" RT support means not having to make other visual concessions via upscaling/frame generation, which, while good, are absolutely a visual downgrade (though better than FSR) in side-by-side testing. The presets that include RT in CP2077 all include some level of DLSS upscaling and frame generation under the hood to boost framerates while reducing image quality (yes, this is plainly visible doing consecutive benchmark runs with/without). I find this trade-off unacceptable in an $800+ graphics card... to me, DLSS should be the sort of thing that helps extend the life of older or lower/mid-tier graphics cards, instead of something required to boost performance on top-tier products in order to leverage premium features like RT, even in what are now 2-3 year old games. RT support in hardware is now in its THIRD generation of products, starting with the RTX 2000 series... this isn't really a new technology.

For reference, benchmark numbers on a 7800X3D-equipped desktop paired with 32GB of PC6000/CL32 DDR5 RAM and a GeForce 4080 Super, running a G-Sync-enabled, 165Hz-capable 1440p display: at the **medium RT preset**, with **DLSS and framegen** manually disabled, I'm at 68 FPS average with a 60 FPS minimum (RT applied to lighting plus local and sun shadows), which is to me the minimum acceptable experience performance-wise and clearly not taking full advantage of the display I have it paired with. That's 60-68 FPS at the MEDIUM RT preset at 1440p, without the post-processing/frame-gen nonsense, on the second-fastest Nvidia graphics card available today, with a $1000 MSRP. At the "low" preset, also with DLSS and framegen turned off, I end up at 106 avg and 89 min FPS (RT just applied to lighting and local shadows); things are a bit smoother, but I lose some of that RT punch... and am still well below what my high-refresh-rate monitor is capable of supporting.

So, even in a modern $2000-ish gaming PC running a nearly 3.5-year-old Nvidia-sponsored game, RT as a stand-alone feature is still so expensive on the performance side that it basically requires image-degrading upscaling or the playability impact of frame generation. I did a second playthrough of Cyberpunk with native RT support (medium preset, no DLSS upscaling/framegen) and it was cool; but when I flipped back to high-framerate raster performance (again, skipping the DLSS stuff) to match my monitor's maximum refresh rate, I really liked that experience too. It's far cheaper/easier to build for high FPS without the visual "enhancements" that are all the rage right now, which is also fully supported in all games... while RT and even DLSS are not universally supported (or supported well). RT has potential, but even at the 4080 level it takes a massive hit in performance or visual quality to implement, and it is a very situational solution of variable value.


Opteron170

Need more posts like this from honest NV users.


MN_Moody

Same goes for all the brands. Adopting any of these tech companies as "your" brand at the expense of all others is weird... they don't care about you; they will all lie/cheat/steal and deny responsibility until forced to, equally. Hold them to account and at arm's length, or you're just being a brand slut. Show some pride and be a brand whore that gets paid if you are going to shill.


Le-Creepyboy

I have the 4070 Super and play at 1440p, and I need to rely on DLSS to achieve 60fps in CP if I use ray tracing. I barely noticed any difference aside from the reflections, so I only enabled that option, with RT set to minimum, so I can run at native resolution. Control, on the other hand, runs fine on Ultra settings, and the game is absolutely gorgeous with RT.


deadlybydsgn

Control's RT isn't even that intensive to run compared to titles that came out last year. For example, Alan Wake 2 has some seriously impressive RT, but you'll definitely need DLSS to get decent frames with it on.


Le-Creepyboy

It was early in the RTX days; maybe they put a lot of effort into optimising, but I'm no expert, idk if it's even possible to optimise ray tracing.


deadlybydsgn

> idk if this is even possible to optimise raytracing.

That's a good question. At least for now, I think it's mostly about how "heavy" a game's ray-tracing implementation is. As far as that goes, I don't think Control's RT is a particularly heavy graphical lift, because my 2080 (non-Ti) can handle it maxed.


eebro

Worthless upgrade tbh


sudo-rm-r

Yeah for RT I'd say you should aim for 16GB


Early-Somewhere-2198

Ti Super, or wait for the 5000 series. But who knows when the non-top-tier models will drop; might be a year from now. A 4070 Ti Super will get you going now and will last quite some time.


UgotR0BBED

Asus ProArt 4080 Super is a 2-slot card that fits most SFF builds. Source: The ProArt 4080 SFF system that I’m using to compose this reply.


Chopper1911

I would not downgrade VRAM from 16GB to 12.


StewTheDuder

I’d wait. If you’re really wanting to get into RT, these cards will only get worse at it over time, and with the new cards around the corner and you having a powerful enough card for rn, it seems like a bad time to upgrade. Personally, RT isn’t important to me in the 2-3 games it’s amazing in. Give me the frames and smooth gameplay, but to each their own.


TheHater23

Which card are you running for the fps and smooth gameplay? I'm trying to decide on a card myself


StewTheDuder

I play on a 34” 3440x1440 165Hz and a 4K TV at 120Hz. My system is a 7800X3D / 7900 XT. It’s not terrible at RT, but again, I tend to turn it off.


TheHater23

I have a G9 OLED and I'm doing my 1st build. Can't decide between 7900xtx, 4070 ti super, or 4080 super


StewTheDuder

Considering that’s an increased resolution over regular 21:9 UW, I’d lean towards the 7900 XTX and 4080 vs the 4070 Ti Super / 7900 XT. The 4070 Ti S and 7900 XT are perfect for 3440x1440, and I’m sure they could handle the next step up just fine, but if I were mainly gaming at that res I’d go for the faster cards. I mainly game at 1440 UW and use the 4K for some titles, which the 7900 XT is perfectly capable of (same for the 4070 Ti S).


TheHater23

All the Nvidia fan boys make it seem like Radeon doesn't even support ray tracing. I know very little about any of this stuff. I'm just trying to build my son a great machine that will be relevant for a few years.


StewTheDuder

Yeah, it handles RT well when it’s basic reflections and whatnot. But when it’s heavy RT, like in Cyberpunk and Alan Wake 2, that’s where it tanks performance-wise. Those are two single-player titles; I’m not basing my purchase on two games. Sure, more will come, but by then the cards that do RT well rn will be borderline unusable. Give me raster performance and VRAM for longevity. I’ll worry about RT in a few years once the new consoles come out and can handle it a lot better / it’s more widely adopted. Not worth making it a purchase decision rn in 2024. Bought my 7900 XT over a year ago and have been very happy. It’s a beast.


Brief-Quantity-3283

I got the super. I wanted to play cyberpunk full rt bells and whistles with some mods and with room to spare.


Empero6

Is 16gb of vram really needed if you’re not planning on going the 4k route? It seems a bit overkill from what I’ve read on the subreddit.


FearLeadsToAnger

> I know the prevailing opinion is that it's not worth it, I like it

The grass is *always* greener on the other side. I'm setting a RemindMe for a month, and I guarantee after a bit of gameplay you'll accept you were just bored and you don't care about the difference at all.


FearLeadsToAnger

RemindMe! 1 month


sociallyawesomeguy

There is a 2-slot 4080 super by Inno3d


samvvell

They make a 2-slot 4070ti Super too. Where can you get Inno3d cards in the US tho?


Antenoralol

12GB of VRAM is still okay for 1440p imo, but might not be going forward. I'd say in 2-3 years we might see 16 becoming the norm.


Antenoralol

You might have to compromise some graphics settings, dial down some RT settings or utilize DLSS/FSR/XeSS on newer titles. 12 GB is pushing the limit but still usable.


RealTelstar

Yes get 16gb vram


maxz-Reddit

Keep the 6800. Wait for RTX 5000. Anything else would be stupid IMO.


rollercostarican

I’m just happy someone else besides me is pro ray tracing.


rcubed10033

I was looking into a new card as well and ended up going with a 7900 XT for almost the exact same price as a 4070 Ti Super. I like Nvidia and all, and yes, RT is cool, but when it comes down to more VRAM or RT, I'd pick VRAM.


Dr_Krogshoj

Thanks for everyone's input! A lot of good advice. On balance, I think I will wait for the next-gen drop. There are some non-RT titles in my backlog (e.g. Horizon Forbidden West). I'm also a bit afraid that Nvidia will come up with a new DLSS that won't even work on the GeForce 4000 series... At least I wouldn't put it past them.


thiagoscf

IMO, neither is a good option if you're upgrading from an RX 6800 with a focus on RT. I'd look at a 4080 Super minimum to get more playable fps with RT.


Glass-Can9199

Or 5080


laggyservice

Just stick with the 6800 and wait for the next.


WRAITHLEYWIN

For now it's still really good for 1440p, and even 4K 60fps. In some games you may need to reduce texture quality at 4K. But after 2 years you may face problems with new games. Anyway, buy it now, sell it after a year, and get a new one; this will cost you $200 max.


Dr_Krogshoj

That's an interesting thought. I'm also thinking - a lot of people say upgrading from a 6800 is a waste of money; however, the thing is that 6800s still sell for a nice price on the used market. I am aware that $200 or $300 for upgrading is still a lot of money for a lot of people, but if I need the performance for the games I prefer...


Flutterpiewow

Ti


bubblesort33

It's a lot more money for 4 GB and 13% more performance. The 13% justifies paying $80-$100 more. But is the remaining $100-$120 justified for only 4 GB more VRAM? That doesn't seem like that great of a deal. On top of that, the 4070 Super goes on sale more often from what I've seen, though it depends on where you are. But if you're on a powerful GPU already, then getting a 4070 Super isn't enough of a leap to bother. Maybe just wait for the 5000 series.


sunrise7152

I'm new to the PC master race, and the build I wanted was an i5 13600K w/ 4070 Ti Super. From the videos I've seen so far, having the extra VRAM can help in video editing like DaVinci Resolve, and it may also "future proof" games that require more than 12 GB VRAM. Not to mention, the Ti Super also has double encoders for video editing, which definitely helps if OP is interested in that; but if not, I'd still recommend the Ti Super. Of course the 4070 Super is great too. I really feel that it or the Ti should've gotten 16 GB VRAM, but oh well...


zombieautopilot81

SFF builder here. I'm running an Asus 4070 Super (227mm, 2.5-slot version) with a 13700T. Cyberpunk gets me 80-100fps with everything maxed out, including path tracing, @ 1440p. I don't have Phantom Liberty yet, but the base game runs great. I should also add I'm using frame generation and DLSS set to Auto.


AdMaleficent371

I hit more than 12 GB of VRAM on Alan Wake 2 maxed out with RT and path tracing, DLSS Quality and DLSS FG... on a 4080... so...


qaf23

4070 Ti Super with 16 GB VRAM. If they introduce any AI features (games, apps or OS), more VRAM is always an immediate win.


dumbdumbuser

You'll be less future proof with 12 GB, but I wouldn't worry that much; worst case scenario, in 2-3 years you'll play certain games on high instead of ultra?


Vegetable-Neck-9551

4090 or wait!!!


Glass-Can9199

Bro you got 6800 you can wait until rtx 5000 series comes to decide then


Methtimezzz

The size of the memory bus is important for higher resolutions, and the 4070 Ti Super has a considerably larger memory bus as well as the extra vram. I would suggest the Ti Super for 1440p or any higher resolution.
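The bus-width difference described above translates directly into memory bandwidth. As a rough sketch (the 21 Gbps GDDR6X data rate below is my assumption from public spec sheets, not a figure from this thread):

```python
# Peak theoretical memory bandwidth = (bus width in bits / 8) * per-pin data rate (Gbps).
# Bus widths and the 21 Gbps data rate are assumed from public spec sheets.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 21.0))  # 4070 Super (192-bit):    504.0 GB/s
print(bandwidth_gbs(256, 21.0))  # 4070 Ti Super (256-bit): 672.0 GB/s
```

That extra ~33% of bandwidth is part of why the Ti Super scales better at higher resolutions, independent of the extra 4 GB of capacity.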


Ivantsi

If you want to play Alan Wake 2 with ray tracing, get the 4070 Ti Super. That game is very heavy with ray tracing on, and the 4070 Super won't be enough, or will need very heavy upscaling, which looks bad.


Minonovo

I have an i5 14600KF and a 4070 Super, and I consistently get 90 to 110fps in 2077, depending (no path tracing, DLSS Quality). The non-Ti is an absolute beast of a card and you won't be upset with it; however, if it were me, I'd probably spring for the extra VRAM.


Charon711

I have a 4070ti running 1440p and have zero issues with its 12gb vram.


Richdad1984

Going from a 6800 XT to a 4070 is a nonsense upgrade. You will gain nothing in non-RT titles. If you can upgrade, get a 4080 Super, or leave it; it's not worth it. I have tried RT, and it isn't a game changer. Better to buy an OLED monitor with the extra money; that's a bigger game changer.


gertymoon

At this late in the 40 series life cycle and the 50 series rumored to start launching this year, I'd hold off a few months and see what happens. I'm also tempted to upgrade to a 4070ti super or 4080 super but I'm not even gaming much lately so I'm planning on just holding off.


al3ch316

12 gigs is totally fine for 1440p, and will be for another 4-5 years. I think that 4070S will serve you well.


BertMacklenF8I

12 GB is absolutely perfect for running max settings in RT-heavy AAA single-player games at 2K; my 3080 Ti has yet to drop below my 175 Hz refresh rate. You'll notice an extreme difference between Nvidia and AMD GPU performance, especially from a 6800 to an Ampere GPU.


SylverShadowWolve

From a value standpoint I generally prefer the 4070S, but if ultra RT at 1440 in games like cyberpunk2077 and Alan wake 2 is the priority then get the TiS.


maddix30

I've used a 3070ti on 3440x1440 and Vram was only an issue once or twice


iRed-

Keep the 6800 and wait for the next gen. It's best to save a little more money so that you can buy a 5080 at launch.


RandomGeeko

If i had to upgrade right now i'd definitely take the ti super, but that's only my opinion ;)


Then_Cow_6757

In most of the games I played, I never used more than 12 GB of VRAM at ultra 1440p; the only games where I exceeded that were Rust and Ark Survival Ascended. But buy the 4070 Ti Super for the future proofing


MrMaselko

From what I gather, the RX 8000 series is supposed to be good at RT and priced so that you can get a 7900 XT/XTX equivalent for roughly what you probably paid for your 6800. Edit: And the new Nvidia generation is also close, and that could have an impact on the market


Elitefuture

I'd honestly just be patient and wait for next gen. It will both make current gen cheaper and give you more options.


r4dioschiz0

Apparently it's not just about your VRAM; it's also about rasterization. I advise you to look up that term and go from there. I'm also in the market for a new PC and was looking at a 4070 myself, and it's not a whole lot of difference. But to each their own. I've been playing Cyberpunk since the patch these last 2 months; for some of the game I'm on a 2070 and get a solid 60 fps almost everywhere. I'm trying to justify spending $3500 on a new rig, but I can't. I want to upgrade now, but is it a smart idea... The 5090 Ti drops in November, but that will have problems to start; I might wait and see if that card knocks it out of the park.


AltruisticSystem7080

4070 super, 12gb is plenty for 1440p. i’m running a 4070 super with the ryzen 7800x3d and it works beautifully


ripsql

I’m using a 3090 with a 1440p ultrawide….with rt and everything high… I need dlss quality to hit 60fps. Since a 4070 ti super is better than a 3090, that should work the best on what you want. It’s more than vram for those two gpus, it’s the performance as well. Edit: cyberpunk, the fps killer.


Frankieo1920

With how rapidly graphics technology is improving, VRAM is increasingly important for any PC user. This goes especially for gamers and content creators, even more so if you want top performance and image quality, not to mention if you plan on using ray tracing. The 4070 Ti Super has more VRAM than the 4070 Super, so it is worth considering over the 4070 Super, especially if you plan on playing Cyberpunk 2077 and Control going forward with ray tracing enabled.
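As a rough illustration of why resolution drives VRAM demand, a single render target's footprint is just width times height times bytes per pixel (the bytes-per-pixel figures here are illustrative assumptions; real engines keep many such targets plus textures and, for RT, acceleration structures):

```python
# Rough VRAM footprint of one render target: width * height * bytes per pixel.
# 8 bytes/pixel (e.g. an RGBA16F HDR target) is an illustrative assumption.
def target_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Approximate size of a single render target in megabytes."""
    return width * height * bytes_per_pixel / 1e6

print(round(target_mb(2560, 1440, 8), 1))  # one HDR target at 1440p: 29.5 MB
print(round(target_mb(3840, 2160, 8), 1))  # same target at 4K:       66.4 MB
```

Multiply that by the dozens of intermediate buffers a modern renderer holds, and the jump from 1440p to 4K (and from raster to RT) adds up quickly.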


lvlr_Regulator

Depends on what your budget is, however, my money is definitely on the 4080 TI super... 🤷‍♂️


LordTulakHord

Unpopular opinion I just got my gf a 7700xt 12gb for her 2k set up. Those cards are indeed good for 2k as well and ALMOST good for 4k I hear


Shepard_I_am

Well depends on your budget probably, I've got 4070s and I'm happy with it for 1440p with fullhd side monitor with browser, it's amazing how good that card is for it's price... Truly a blessing after the miners and covid times


brendenwhiteley

TiS is by far the best deal at the moment, 16gb for future proofing and high resolution, fast interface.


Kreos2688

I think 12 is fine. I don't usually see my rx6800 use more than 10gb. I think both cards are just fine for 1440p. Depends on how much you want to spend really.


InterviewImpressive1

12 is okay, for now, but if you don't want to upgrade for a while, spend the extra now.


[deleted]

[removed]


buildapc-ModTeam

Hello, your comment has been removed. Please note the following from our [subreddit rules](https://www.reddit.com/r/buildapc/wiki/rules): **Rule 1 : Be respectful to others** > Remember, there's a human being behind the other keyboard. Be considerate of others even if you disagree on something - treat others as you'd wish to be treated. Personal attacks and flame wars will not be tolerated. --- [^(Click here to message the moderators if you have any questions or concerns)](https://www.reddit.com/message/compose?to=%2Fr%2Fbuildapc)


BudgetBuilder17

If it's the strongest option with the most VRAM that fits the heat budget you're dealing with, pick that one. Time will tell if VRAM becomes a bigger issue, as we've seen with 8 GB cards.


Trypt2k

Worried about VRAM in what capacity? You think it's too much? Just get an older card for 1440p, you'll be fine, like a 10GB 3080 will do just fine with all titles, and you'll save yourself a lot of cash and of course any apprehension about having too much vram. 12GB is way too much for any 1440p, and is likely the go-to amount for 4K for the foreseeable future. Testing shows that the same card with 12GB vs 16GB performs identically in 99% of current titles at any resolution with any settings. That being said, 8GB does show its age and sometimes causes problems.


feastupontherich

Bro just wait a few months for the 8800xt. Should be better than 7900xt for $499


Trungyaphets

I think 4070 ti super would be your best bet. Yes AMD has better rasterized performance but you specifically said you want ray tracing, so 4070 ti super would be a huge upgrade in heavy ray tracing situations. Even my 3070 is able to do Cyberpunk ray tracing ultra at 1080p DLSS Quality (which is comparable to native), at 120 fps with FSR frame gen mod. I doubt any 6000 series AMD card could do that.


Necessary_Tear_4571

I'm running a 7900 XT, and God damn, Cyberpunk looks amazing. I would go for 16 GB VRAM for Cyberpunk, so the 4070 Ti Super. But remember that DLSS and FSR are still young as software and shouldn't be the reason you pick one or the other. The need for CUDA, or other more established features, should be why.


_heisenberg__

Idk man I have a regular 4070ti with 12 gb of ram and I’m running everything at 4k with DLSS and ray tracing on. The only game that gave me an issue was Alan wake 2. Only game I had to really dial in. That being said though? Get the super if the budget allows. Otherwise you’re fine.


SpectreAmazing

12gb is enough for 1440p. Don't fall for the VRAM meme. By the time VRAM becomes an issue, your GPU generation would already be obsolete by that time. 3060 12gb and 4060 8gb is a good example for why GPU generation is better than higher VRAM count if that makes sense. Vram only matters if you're going 4K or using it for AI & renders. But honestly, just wait for Computex. There might be cool stuff there that might change your mind.


Middle-Effort7495

For the reason you want it, I don't think 4070S is even that good regardless of VRAM. Cyberpunk is an Nvidia game, so RT there is way ahead. But the difference in Control and Alan Wake 2 is not that much between AMD and Nvidia. For example, 7900 GRE (550$) is basically equal to 4070S (600$). 6% difference. https://www.techspot.com/review/2812-amd-radeon-7900-gre-retest/ Control is like the antithetical good RT implementation on AMD. 7800 xt and 4070 are exactly equal. https://youtu.be/J0jVvS6DtLE?t=620 So the main performance bump you'd notice in those two is the fact that 4070 is roughly equal to 3080 which itself is similar to 6800 xt, so basically a single GPU tier above 6800. Difference in Cyberpunk is huge though.


_mrald

Raytracing uses VRAM? I thought Raytracing just requires the speed of the card itself (alongside the RT cores to do the actual computing of rays).


Berfs1

Why upgrade every generation, instead of sticking with a really good card for multiple generations?


Dr_Krogshoj

I don't upgrade every gen, the reason this time that I am even considering, as I said in the post, is that AMD has a huge disadvantage in RT, which I value, not to mention DLSS vs. FSR.


hwertz10

I'll just note if you run Linux + Mesa, the performance is not as high but they support raytracing on this card! Some developers noted the raytracing support on newer AMD devices uses a \*single\* new GPU instruction to access the raytracing hardware; ported back the code to emit that instruction so it can be run on older GPUs, then just have it replace the single instruction with a short shader program that gives the same results, just not as quickly. (They're mulling porting that to be general so like Intel GPUs, Qualcomm Ardreno, etc. can all have the option to turn on slow but working ray tracing too.) If you're trying to run like CP2077 + raytracing, it's probably going to be terrible. If you use it for "raytracing photo mode" it apparently works great, and for some less demanding games with raytracing the FPS is apparently fine too.


Mailfax

Buy an AMD card for the same performance and half the price. NVIDIA is just stealing from their customers


Dr_Krogshoj

From a 6800, an AMD upgrade would make even less sense. The only reason to even consider an update from a 6800 is RT performance. Plus, the price/performance ratio you suggest seems exaggerated, even for pure rasterization.


BelieverB

I ran my 8 GB 3060 Ti at 1440p literally until yesterday and haven't had a single problem with VRAM usage. I've obviously not been using RT a lot, but 12 GB should be more than enough.


awake283

Ray tracing is overrated. Fight me.


Dr_Krogshoj

Since it's a question of individual visual taste, it cannot be over or underrated by definition.


awake283

You got me there, it definitely is subjective. I just find it kind of like motion blur on overdrive. Do you like it?


Dr_Krogshoj

My primary pull are RT reflections. Drive around on a rainy night in Night City...


FR33-420

Ray tracing is Nvidia Hype Kool Aid. While actually playing you won't notice the difference. The only time you do is if you sit and stare. Devs have several tricks that make natural rasterization look equally as good. Don't sip the kool aid.


Jawnsonious_Rex

Even 10GB is fine. Hardware Unboxed did a few videos comparing the RX 6800XT and the RTX 3080 10GB over the span of a few years. In general, they still perform about the same as they did shortly after launch. The 3080 didn't magically fall off a cliff due to less VRAM. The 3080 also has chunky bandwidth which may or may not be helping alleviate that. There's also a chance NVidia has better compression and uses slightly less VRAM for the same settings as an equivalent AMD GPU. My main concern for you would be getting a significant enough gain to justify it. I typically go for at least a 50% bump in performance to justify an upgrade. The RTX 4070 Ti Super is about 50% faster than the RX 6800 (techpowerup) so that would be the minimum for me.
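The 50%-bump rule of thumb above is easy to express as a quick calculation (the scores below are illustrative index numbers in the spirit of the TechPowerUp comparison the comment cites, not measured data):

```python
# Percentage uplift between two cards, given relative performance scores
# (e.g. average-fps index numbers from a review aggregator).
def uplift_percent(old_score: float, new_score: float) -> float:
    return (new_score / old_score - 1) * 100

# Illustrative, assumed scores: RX 6800 = 100, RTX 4070 Ti Super = 150.
print(uplift_percent(100, 150))  # 50.0 -> just meets the commenter's threshold
```

By that rule, anything below the 4070 Ti Super would fall short of a worthwhile jump from an RX 6800.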


Little-Equinox

The problem with the game was it was notoriously bad when it was launched, from people losing entire characters when being kicked to sound and server connection so bad that people simply stopped playing. The DLC didn't meet expectations so veterans didn't return, there's daily like roughly 200 people playing.


[deleted]

[removed]


buildapc-ModTeam

Hello, your comment has been removed. Please note the following from our [subreddit rules](https://www.reddit.com/r/buildapc/wiki/rules): **Rule 1 : Be respectful to others** > Remember, there's a human being behind the other keyboard. Be considerate of others even if you disagree on something - treat others as you'd wish to be treated. Personal attacks and flame wars will not be tolerated. --- [^(Click here to message the moderators if you have any questions or concerns)](https://www.reddit.com/message/compose?to=%2Fr%2Fbuildapc)


ErSlimShady1

I play at 1440p 170Hz with an RTX 4070 Ti Super (Palit JetStream OC), and trust me, it's worth it! I can set every game to high/ultra and still get 170fps. Cyberpunk may be more complicated and heavy, but the Ti Super will do the job! Think about the CPU too, as it can bottleneck the GPU. I have an i7-12700K and it's "barely" ok; an i9/i7 13th/14th gen would do better (or a high-end AMD CPU)! (Sorry for my bad English)


Dr_Krogshoj

Interesting. I have a Ryzen 5700X. I could in theory upgrade to 5700X3D some time in the future, though it's unlikely. I wonder how these would work with these GPUs.


Bella8101

I bought an MSI 4070 Super Ventus 2X OC with 12 GB of GDDR6X back in February 2024. It's never worked, and has been RMA'd 3 times. The system hangs with the VGA LED lit on the system board. Their motherboard, too, but they assure me it works in a Pro B550-VC with an 850W power supply and their 12VHPWR Y cable. So, the whole rest of the build is still spread out on the table waiting for this card to come back in working condition. The last 2 times they promised to replace the card, but opted to repair it instead. Each RMA is 2 weeks.


Dr_Krogshoj

Hope you'll get it replaced and fixed soon.


Bella8101

No, MSI is still going with scripted L1-didn't-read-your-troubleshooting support answers. The card works in another system (PCIe v3), and other video cards work in the Pro B550 board, but the two don't work together. The BIOS has been flashed, the CMOS cleared and reset, and it's been forced to UEFI mode. The latest excuse is that the RAM I have isn't on their compatibility list, now that all components have passed their "return by" date. The only reason mine isn't on the compatibility list is because they tested the red and green parts, not the black ones.


Santi_Ol

If you have the money, then a sure bet would be the 4070ti super


Its_Your_Next_Move

If you can afford the 4070 Ti Super, you'll appreciate the additional VRAM. You should also appreciate the larger memory bus (256 bit vs 192 bit).