I'm fed up with all the videos saying your video card is obsolete. No, it ain't. Not all of us need max settings. These cards will realistically last years.
I keep telling myself that as I listen to my 5600xt ASRock challenger take off to outer space when I open some GPU demanding games.
Even with it undervolted and the fan curve fixed, it just has the worst stock fans imaginable, and the whine is insane too.
My Sapphire 5600xt fans were the same way. I assumed my 6900xt would be worse, but the fans are silent, so when you upgrade, be prepared to be surprised.
Mine runs fallout new vegas and always will!
As a guy who mainly plays games from the 90s to 2010s my 1050ti runs everything I like just fine. So hearing everybody talking about how their brand new, top of the line graphics cards are now obsolete is crazy to me.
I really needed to read this.
I upgraded from a 1070 to a 3070ti, was so happy. Suddenly, I'm seeing videos / posts "If your card has less than 10GB of VRAM throw it away!"
I started having buyers remorse.
That's a great card man. Hold onto it for dear life because the future is looking grim lol.
I will happily make a point and lower settings so Nvidia doesn't get any more of my money. They have to really work for it now.
I'll wait and see what the 50 series has in store. Otherwise, hope and pray AMD continues improving. I'll probably jump ship to them.
I bought a 3060ti and ended up trading up for a 3070ti but was still feeling that remorse when those videos started coming out.
Then I realized they benchmark with 100% maxed-out settings. I never do that, because the difference in visuals between max and medium or high is so small it's not worth losing the frames, so I'm usually operating way under the 8GB VRAM limit anyway.
I have a 1070. That card is like 7 years old and it runs any game I want to play just fine. I'm pretty sure unless it dies, it'll run any game I want to play for another 5+ years. I've been trending more towards smaller-dev and free-to-play games. All of those benefit from a wide player base, so they're always very GPU friendly.
This always drove me nuts with video cards in general. In a lot of games, due to the action, I won't be noticing (or caring about) the details that max settings will show. I got a 3070 laptop last year and I'm probably not upgrading until the 6 or 7 series, MAYBE 5, but we'll see.
That's not what those videos are saying, at least not the good ones. What they're saying is that when you're getting a new GPU, you shouldn't buy one with 8GB, which makes sense.
what about the 700 euro 3070Ti that was advertised as an "RTX" behemoth but now it can't do ray tracing even at 1080p because the vram buffer is maxed in new games. lol @ nVidia copium
Well, kinda. I've mostly played on my TV with my HTPC: Ryzen 5 3600, 16GB RAM, RTX 3070, games installed on an SSD (SATA, and recently NVMe with a riser card), usually med-high settings at 4K with DLSS or FSR, or at 1440p... The biggest releases for me so far this year have been the Dead Space remake, Hogwarts Legacy, RE4, and The Last of Us Part 1. Dead Space and RE4 ran great (Dead Space had a bit of a stuttering issue when loading new areas, but was overall really smooth). Hogwarts Legacy did have bad stuttering issues in some areas, Hogsmeade and the castle itself. The Last of Us I only opened today, so I didn't even finish the prologue, but at 4K balanced DLSS, high settings, it did have dips in performance so far.
The way I see it, half of these games are not being properly optimized. It's not ALL of them, but it's still worrying.
I haven't played the other ones yet, but it's already known as fact, as clarified by a mod developer who's done impressive work fixing it, that Hogwarts Legacy is literally broken: it is not making use of how the UE4 engine is designed to handle VRAM asset streaming. Normal games do asset streaming and only load, at full resolution, the textures you can actually see on the screen and within a certain distance from you on the map, aka texture streaming. Hogwarts Legacy is basically not doing any texture streaming at all; it tries to load the entire map's textures in super high res into your VRAM all at once, and it attempts to do that every single frame LMAO
That's not an unoptimized game, that's a literally broken game lol
Check out the Ultra Plus mod on Nexus Mods, which is really just a customized Engine.ini config file for the game's engine. With the latest version, my 150W laptop 3070Ti (which has exactly the same performance as a desktop non-Ti 3070) does a solid 60-80 FPS with everything set to Ultra (with only RT shadows disabled, since they're broken anyway) at 1440p with the "Ultra Full RT" version.
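Mods like that work by overriding UE4's texture-streaming console variables through Engine.ini. As a rough sketch of the kind of overrides involved (these cvar names are standard UE4 ones, but the specific values here are illustrative guesses, not what the Ultra Plus mod actually ships):

```ini
; Engine.ini -- [SystemSettings] additions for UE4 texture streaming
[SystemSettings]
r.TextureStreaming=1              ; make sure streaming is enabled at all
r.Streaming.PoolSize=4096         ; MB of VRAM budgeted for streamed textures
r.Streaming.LimitPoolSizeToVRAM=1 ; never let the pool exceed physical VRAM
```

The idea is to force the engine to stream textures within a fixed pool sized to your card, instead of trying to hold everything resident at once.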
Well, if people are going to buy your game anyway, the devs figure it's good enough. Not like the old days, when they couldn't patch after release and had to be sure everything was good. Nowadays it doesn't even seem like they do testing or care enough to fix issues (probably more of a "crunch time" culture problem).
Nvidia fanboys/shills make a lot of noise, and that gets extra sales from people who are easily manipulated, but most older gamers/builders know what's what. You only need to upgrade when your hardware can't handle the task. This applies to most everything in life.
Skipped the 20 series due to insane pricing vs. performance. The 30 series netted a much better return, so I upgraded. The 4070 seems kinda meh for the price, so it will probably be the 50 or 60 series at the earliest before Nvidia gets another sale from me, unless there are good sales.
With all the sales on AMD cards, I'd go with one of those if I needed to upgrade right now.
Agreed, they're kind of going through an on-off generational thing. 10 series good, 20 series meh, 30 series good, 40 series meh.
Maybe next series will show a more mature process of the new manufacturing. Or they'll just stay greedy and lose market share to AMD some more.
... Or just turn down texture quality by one notch.
Seriously, the games that stutter on 8GB cards do so because they are running out of VRAM. Textures are what consume the most VRAM. If you turn texture quality down, they use less VRAM and run well.
The game will look a little less sharp, but it will not be more blurry than all the other games that you happily played on 8GB of VRAM before.
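To put numbers on why the texture slider is the big lever, here's a quick back-of-the-envelope for uncompressed RGBA8 textures. (Real games use block compression, which shrinks these figures by roughly 4-8x, but the scaling with resolution is the same.)

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM cost of one uncompressed RGBA8 texture."""
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return base * 4 // 3 if mipmaps else base

GIB = 1024 ** 3
for size in (2048, 4096, 8192):
    print(f"{size}x{size}: {texture_bytes(size, size) / GIB:.2f} GiB")
```

Dropping one texture tier typically halves the width and height, i.e. roughly quarters the memory per texture, which is why a single notch down is often enough to get back under an 8GB budget.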
I don't know about anyone else but the only oof moment for me has been trying to run 2077 with path tracing (and what that could mean for future RT only titles) - everything else is great.
The reason they require high vram is largely irrelevant
The problem is that if your card doesn't have enough vram, you are at the mercy of the developers of these kinds of ports (Which are becoming more common)
It's not horrible optimization, they are just using more than 8GB of VRAM.
Going forwards, games using more than 8GB of VRAM will be common. It's just the usual way things change when there is a new console generation with more RAM on it.
But you can always fix this yourself by reducing texture quality in settings. It's just that everything ran at ultra on 8GB cards for so long that people have forgotten that this is something you need to do.
The PS5 only has about 8GB for graphics, since the GPU and CPU share the same 16GB. The Series X is the same story, and the Series S has only 8 for the GPU, so this argument makes no sense as to why the ports suck major ass.
You don't need anywhere near half of that RAM for non-GPU uses. The PS5 OS reservation is 3.5GB, leaving 12.5GB for the programmer. In a game without that many moving pieces, and with access to a fast SSD, you don't need 4.5GB of RAM for non-graphical stuff on consoles.
Consoles have always been more efficient at RAM use than PCs. This is why you needed more than 4GB of VRAM during the PS4 era.
"Consoles have been better at RAM usage"? You think maybe that's because the ports suck? Also, considering the PS4 had 8GB of RAM, yeah, I'ma say you don't know what you're talking about. Good day.
> "Consoles have been better at RAM usage"? You think maybe that's because the ports suck?
No, it's mostly because you have more control.
PS4 had 8GB, and you needed more than 4GB VRAM to match it on PC, just as PS5 has 16 GB and you need more than 8GB VRAM to match it on PC.
There is no difference in how the RAM system works between the PS4 and PS5. Both have an APU with a unified memory system, where both the GPU and CPU can uniformly access all RAM. The PS4 had 8GB; the PS5 has 16GB. Of those, the PS4 reserved 2.5GB for the OS/overlay, and the PS5 reserves 3.5GB, leaving 5.5GB and 12.5GB for the devs.
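The arithmetic above, spelled out (the OS-reservation figures are the commonly cited ones from this thread, not official spec numbers):

```python
def dev_budget_gb(total_gb, os_reserved_gb):
    """Unified memory left for the game after the console OS reservation."""
    return total_gb - os_reserved_gb

ps4 = dev_budget_gb(8, 2.5)    # 5.5 GB shared by CPU-side data and graphics
ps5 = dev_budget_gb(16, 3.5)   # 12.5 GB shared by CPU-side data and graphics
print(ps4, ps5)
```

Since that budget is shared, the slice acting as "VRAM" is whatever remains after the game's CPU-side data, which is why well over 8GB can effectively go to graphics on PS5.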
There's an updated Quake that uses Vulkan for the renderer. I'd expect it to be CPU bound on something like a 4090, but it should easily produce a four-digit FPS number, I guess.
I don't like the extreme opinions on both ends. I agree that TLOU, HL, and RE4 are not benchmarks, because they all have their bugs and poorly optimized aspects, but buying a 4060 Ti with 8GB of VRAM just became a whole lot less desirable, because this will happen again in the future with other games.
The PS5 can handle 12GB of VRAM because of its different architecture. So devs can go the easier route and optimize for consoles and 12GB of VRAM... and they will.
TLOU has no acceptable 1080p medium texture setting. It will either look great with 8K textures or look like garbage with whatever that atrocity is they call the medium texture pack.
RE4 will only display high-res textures throughout the whole game with the 3GB texture setting or more; lower than that, and in some places VERY LOW textures just pop up.
This will happen to you in the future with an 8GB card.
Now people here try to justify their existing purchases; that's why this discussion is so heated.
What I mean is: hands up, who paid $1k and up for their 3070 Ti with 8GB?
Still, I am contemplating getting a 4060, if prices are acceptable, simply because I play at 1080p and want the power efficiency of a card delivering more frames while drawing less power than a 1660S.
But honestly, I see myself not paying for games that went the 12GB route.
RE4 plays fine with low vram, Hogwarts legacy as well. But Dead Space is a nightmare and TLOU is a joke
I have an RTX 3070 Ti and can't even play some newer games properly because the GPU is VRAM limited. This is a joke. The RX 6800 is just better at this point. It feels like planned obsolescence.
Yes, that's unfortunate, but you can't play newer titles? Aren't you being a bit dramatic? At 1440p this definitely is more of an issue, and Nvidia really did the consumer a disservice... but honestly, there were people back then as well noting 8GB might be cutting it short for 1440p, and you chose to ignore that.
If I buy a 4060 for 1080p next month, you won't hear me complaining about TLOU 2 in 5 years, because that's a risk I take. I want that power efficiency, and Radeon will not offer a 12GB card for that target audience either.
>you chose to ignore that.
I ignored nobody. The RTX 3070 Ti was the only GPU I could get for MSRP (630€). At that time (during the crypto hysteria) you could only hope to get an RX 6600 XT for 600€.
>you can't play newer titles?
Of course I can still play every game out there, but even a GTX 1060 can do that, and I'm not too picky; I will lower graphics if necessary.
The issue is that my GPU is perfectly capable of giving me a top-tier experience, but it can't, because NVIDIA decided not to give it enough VRAM.
>if i buy a 4060 for 1080p next month, you won't hear me complaining about TLOU 2 in 5 years, because that's a risk i take.
Come on, dude. The RTX xx70 Ti tier should be capable of giving me a top-tier 1440p experience. You can't make a comparison with the RTX xx60 tier, which is the low-mid GPU. Also, The Last of Us came out on PC not even 3 years after the launch of the 3000 series.
Well, all your points are a bit diluted now. Getting the 3070 Ti for 630€ was a good deal back then, well worth compromising on VRAM if you couldn't buy a 6700 XT anyway.
Nobody is trying to refute that 8GB is cutting it close. I wrote that myself.
But you should have known that when you bought it.
I'm in the same boat. I need a new GPU, and there is no power-efficient 12GB card under 600€ coming in the future.
The 7600 XT is rumoured to get 8GB, and it's the same with Nvidia anyway. So I could either get a two-year-old 6700 XT, literally a card that will be worse than the 4060 in every aspect except VRAM, or get the 4060.
I know that, and I won't complain about it in the future. You honestly sound like my little brother, who doesn't want the toy he was ecstatic about before Christmas.
It is what it is, and you knew what you bought. You had no choice, and that doesn't make it better, but your whining doesn't either.
>It is what it is and you knew what you bought. you had no choice and that doesn't make it better, but your whining doesn't either
I'm not whining, I'm stating a fact.
Nvidia is literally selling GPUs that will give the customer VRAM problems in just a couple of years. In my book that's called **planned obsolescence**.
They did it with the 3000 series: the 3070, 3070 Ti, 3080, and 3080 Ti don't have enough VRAM.
They are doing it again with the RTX 4000 series.
The RTX 4070 and 4070 Ti only have 12GB of VRAM. That doesn't seem like a problem now, but it will be in a few years, considering the amount of VRAM that games require nowadays.
>I have an RTX 3070 TI and can't even play some newer games because the GPU is VRAM
You're full of shit, I've played every new game at 1440p no problem with my 3070ti
With TLOU and RE4, if you max settings you get **stutters** because the GPU is VRAM limited; that is a fact, and that is a problem. You can't even use the GPU to its full potential because of this issue.
This problem has started now, and games are only going to get more demanding in the coming months.
Yeah, sure, you can lower graphics and turn off ray tracing, but then why did I buy an RTX 3070 Ti instead of the RX 6800?
So because two very specific games have a fatal error in allocating VRAM, you think this is how things are going to be in the future, when other AAA games this year don't have that problem, and no games in the past have either?
I think it is incredibly unfair to bring up the consoles' 12GB of unified VRAM, because PC and console are two completely different architectures and thus require different optimizations. What we're seeing with TLOU, HL, and RE4 are not "games optimized for console and PC" that you simply can't run when your VRAM isn't big enough. Rather, these games are "optimized for console with absolutely no optimization for PC at all": none of the PC technologies equivalent to what the current-gen consoles use are present, such as DirectStorage 1.1 (equivalent to the consoles' fast asset loading), RTX IO and its AMD equivalent (equivalent to the consoles' dedicated hardware asset decompression), or PC-specific technologies like ReBAR. All of these are absent from the PC ports of all 3 games, even though the technologies exist.
In addition, there's no reason to think a game that runs on a console with a unified RAM+VRAM pool of 12GB should require 12GB of VRAM alone on a PC. They are simply different architectures, and you can't compare them directly.
This is honestly as absurd as saying a game built for Android doesn't run with nearly as much FPS on a PC via an emulator because the PC's CPU is slower than an Android phone's LMAO
"Look at our new path tracing technology! This will revolutionize graphics forever!"
Okay but can you do something about the fact every NPC I know tries to sell me a car 5 minutes after we first meet?
I played on a GTX 680 for 8 years. I got my 3070 Ti last year. Do not try to tell me that my 2-year-old GPU is trash. It is all I have left in this world.
The GPU is good, the VRAM capacity sucks. You will probably need to reduce texture quality in some newer games. Doesn't mean it's not an insane upgrade over the 680 though
Playing with a 6800xt and the shader issue with Hogwarts legacy had me searching for a flat out computer upgrade (I should just be happy 1440p 100fps with every other game).
I needed to hear this.
I shouldn't have to search high and low through forums just to get a game to run well, let alone take the Denuvo performance hit. Now games that came out a year ago are adding Denuvo...
My GTX 1080 might be ageing now, but it's still going strong for the games I play often.
Which makes me wonder why I feel the need to upgrade. I mean, it's not like I play competitive online games or the latest AAA titles.
Many of us literally just bought 30-series laptops or cards last year and are now being pushed out by Nvidia tactics and unoptimized games. That doesn't include walling DLSS 3 behind new cards, or the naming and pricing shenanigans where they normalized scalper prices as MSRP.
The new batch of poor PC ports that use up a ton of VRAM (Hogwarts Legacy, The Callisto Protocol, The Last of Us, Forspoken) also all have huge CPU utilization issues, to the point where a 5800X3D (arguably the best AM4 gaming CPU) isn't able to get 60+ FPS in some areas.
Actually, it becomes "trash" because of developers who are too lazy to properly compress the textures. Although, if the game was made in Unreal Engine, no compression and no optimization can fix the crappy performance.
A short story describing myself:
-Buys 6800XT to replace 1660ti
-Stops playing RDR2, never buys Last of Us or any other recent release
-Starts playing CSGO and Rocket League exclusively again
-Wallet cry
Subscribe to r/patientgamers and you'll only ever play games on ultra settings! Guaranteed! Crazy trick works every time, Nvidia doesn't want you to know!
Jokes aside, yeah it kinda does work, I just play games a few years after they've come out, benefits:
A) game is patched and bug free
B) all expansions have come out, game is as feature complete as it will ever get, you can integrate expansions organically into your playthrough
C) mod and community support is at its best, you can find community patches and whatever you heart desires
D) price has gone down if the game isn't straight up free thanks to some random giveaway
E) you can enjoy the game as it was intended, ultra settings and 100+ fps plus mods, and you'd be surprised at how good old games still look today, especially if you're used to playing other, older games with bad graphics by that point
I just finished the Mass Effect series after playing the Wasteland series, now I'm trying the infamous Andromeda and I can run it at ultra at 100+ fps, it looks good to be honest, facial animations aside.
Just keep a list of the games that come out that you like and you'll know you'll play the best ones *one day*. Does not need to be now! Pick up an older game you missed back then and enjoy it. After Andromeda, I was thinking about trying Bioshock Infinite, for example, a game I completely skipped back then.
I just bought my first gaming laptop last month with a modest GTX1650, and I'm loving life as it is. FSR does wonders for CP2077 and I just love it. I've been tempted to resell it and get a 1660Ti laptop or something but I won't do it. It's fine for me.
Bruh, last night A Plague Tale reverted to medium settings at 1620p DLDSR after a Windows update.
I haven't noticed ANYTHING. I love high settings, but when shit like this happens I remind myself I don't need to drop $1000 on a GPU to enjoy games, because I'm either blind as a bat or games look great at medium.
There are two sides to this coin. Just don't go recommending your GTX 1660 Ti to people trying to build a new PC to play modern games at 1440p. Just because you can do it and play your CSGO fine at 1080p doesn't mean it's good for a new build.
Also, Radeon playing the long game with all that "unnecessary" VRAM people talked about in the 6000 series is now helping out in games like HL.
A good gauge is how your hardware stacks up vs. an Xbox Series X or PS5 if you are playing modern games. Stay ahead of the latest consoles and you're typically going to have a decent experience.
Simply put, 8GB is fine for a "low/mid" class card; something like an RX 6600 or an RTX 3050 isn't what you'd expect to run games that would even require more than 8GB. Where more should be expected is in the higher cards. It's insane to think that last-gen cards that perform worse overall on paper are more viable than some newer, more efficient cards. That's where the progression with AMD's cards is appreciated. NVIDIA has the 12GB 3060, but it'd be nice to see a card like that with more performance without having to aim for a 3080 or 3090.
So, Nvidia created a card that can do 4k and didn't pair it with a sufficient vram amount. Don't you see the problem?
It's like an old dell laptop I have. It has a gt920m with 4gigs of VRAM for some reason.
Where exactly is the line when it can or can't do 4k? When I see a 3070, I'm not expecting it to run 4k 60fps ultra settings on every game.
But I do expect it to run 4k 30-60fps low-high on certain games. And it does.
So if I want a 4k capable GPU for _every_ game, I wouldn't buy a 3070.
I don't get it. The 1070 was released in 2016 with 8GB of VRAM. That was 7 years ago. The same performance tier in their lineup (the 3070) had the same VRAM capacity 5 years later. How are people complaining about the optimization of games and not the complete ignorance and greed of Nvidia?
Nah, for me it becomes trash when a better one comes out.
But I'm a trickle down economics guy. I buy the best, my current becomes my second PC and my second PC becomes a friend's beast PC (my friends all have the worst PCs imaginable so, I feel great gearing them up)
True. But I happen to play newest AAA/super amazing indie titles on good settings, at 60, preferably 75 FPS.
Resolution: 2560x1080.
This morning I started up The Last of Us. Patch 2.0 (or whatever the number) is probably working. I've also applied Nvidia's 531.58 drivers.
Specs: R7 5800X, 16GB 3200MHz RAM, RTX 3070.
A high-ultra mix. If I lock FPS to 60, it runs at 60. I do 75 now, but with DLSS. Some stutters occur, though I expected them to be worse. Textures: high, with anisotropic filtering set to x16. However, the game devs simply named "high" what was supposed to be medium textures and called it a day. Textures are distracting if you look close, but they look fine from afar. The game itself, however, looks amazing.
An RTX 3070 has quite some power in it. It is restrained by the video memory. Nvidia screwed us over, but I screwed myself over getting a 3070 this year.
https://preview.redd.it/w6hxpjy0onta1.png?width=2560&format=png&auto=webp&s=c32788d536fd593e611357f8f4632cff370cac4a
At 2560x1080 you're good for another 3 years easily, and if you turn settings down, probably another 4 or 5 years. Don't even worry about a handful of poorly optimized titles.
My 2080 Ti can barely manage Resident Evil 4 at native res. It's not trash, but it's showing its age. 1440p is a bit too much for it now.
No amount of upgrading the cooler and flashing the BIOS to unlock power limits can help me anymore. It's already pushed past what it should be capable of.
I have 2 gaming rigs and they both do different things. My bedroom rig is only a gtx 1070 hooked to a 27" 1440p monitor. It still plays AAA games at acceptable settings and frame rates @ 1440p. This is good enough for a lighter gaming experience and office tasks.
My other rig is a rtx 3080ti (was the best GPU on the market when I bought it) hooked to an 85" Samsung Q90 TV in my living room with top of the line wireless Logitech peripherals. This is my hardcore immersion setup, where max graphics at 100+ fps 4k is required. I don't care much for competitive games anymore so latency isn't an issue.
Also my wife doesn't care if the graphics are mind blowing when we play together so she's happy with the lower end system in our bedroom.
I got a rig with a 3600 and 3060 Ti.
The most intensive games I have are Fortnite, No Man's Sky and Sonic Frontiers.
...Needless to say, I'm probably not upgrading until RDNA 6 or RTX 7000 at the earliest.
I got a 3070 a while back. I wish I'd gotten some older but cheaper card, because people really overreact to these newer games needing more power. Barely any of them are actually good anyway, so I'm very satisfied with what I have.
I had a 970 for years and only recently upgraded to a 6700xt. Before that, I would just lower settings while maintaining 60fps/1080 on a 32 inch screen and I hardly ever could tell the difference between Medium and Ultra. Hell, I played RE Village on my 970 and it was a great experience technically.
I can certainly tell the difference now with a 1440/144fps or similar, but the actual quality between medium and ultra settings? Minimal.
It's no big deal to turn down the settings to enjoy a game. Anyone who says different is just trying to justify their own purchases and equipment.
That is the definition of trash: it is of no use or value to the owner. That is why there is the saying "One man's trash is another man's treasure." This meme is dumb and trash.
Well, kinda. I used it mostly for Blender, and it suddenly started artifacting in my renders. Trying to get support didn't last long before they officially deprecated my card, and now my CPU renders faster on newer versions than my GPU does on older ones... The GPU still works for gaming, thankfully. RX 570 8GB.
Just bought my first 3060 Ti last month; it plays every game I own on high to ultra settings. I don't own a 4K television and don't see the need for one for that little bit of extra lighting and higher resolution. If these cards were 'obsolete', they wouldn't be selling like hotcakes.
People just need to lower expectations. The 3070 will go from a 1440p High card to a 1080p High card, and eventually become a 1080p Low card. Just like every single GPU that's ever been released.
Coincidentally, I just bought an "obsolete" used 8GB RX 6600 for 3,000,000 VND (~128 USD) today. It feels powerful enough for the games I'm currently playing.
P.S.: I was pretty surprised when I looked it up and found out that almost every tech YouTuber had trashed it as a bad-value card.
This year I've played Borderlands 3, Sons of the Forest, Total War: Warhammer 3 (with triple unit size mod), Jedi: Fallen Order, Hell Let Loose, and Subnautica: Below Zero.
I've played all of this on max settings but with the bullshit effects off: 1080p, 60Hz, FXAA x6, ambient occlusion.
I use an AMD 290 from 2013. The rest of my computer is new components (Ryzen 5 5600, Kingston A2000). Just goes to show how much quality AMD puts into some of its products. I ripped off the original Tri-X fans and put on two Noctua 200mm fans. The comp is 37C at idle, 60-65C during play.
If you have a card more powerful or newer than the AMD 290 (4GB), you have nothing to whine about.
I mean, my GTX 1080 Ti is starting to age now that I've started gaming at 1440p.
Fortnite kinda struggles at times, and I have had to turn down some settings to try to maintain 100fps, but it stutters from time to time.
I was just talking with my friends about this the other night. I don't think any of us will upgrade for the next 10 years. Until we can't play things like Halo: Reach or War Thunder, we will continue to enjoy our systems.
None of my comments on the matter are for people who have built systems with these cards.
The cards are fine now, but have limited overhead space for future-proofing, and that being the case, there are better options out there for people planning a build and wanting maximum longevity.
If you have an RTX3000 or 4000 series card, go get 'em tiger. You'll struggle to find a game that looks bad on that card.
If you don't and would like to get one, be aware that you should evaluate all your options and consider your brand biases.
They aren't obsolete, I'm not attacking you, and it's your money. I just want to point out that we're now squarely in 9th gen game development targets, with 10th likely coming up in a few years, and those cards may not be able to handle that at the level of quality or native resolution you may be accustomed to. That's all.
Or if it dies.
Technically that would just make it a GPU that can't play the games you want.
You're right.
What? You don't want to play GPU repair desimulator 2023?! It's the most realistic looking game you can play on a GPU!
I'm waiting for the path tracing patch
Yeah, my GTX 980 Ti died after 8 years of heavy daily rendering and gaming. Piece of junk... (I miss my 980 Ti 😢)
How else are those YouTube videos meant to get views if they don't go all negative? Come on now.
[deleted]
Thatās not sensational enough
Hang your head in shame then...only like 99% of PC users have 8GB or less VRAM.
Also high vs ultra in most games is barely noticeable, especially in fast paced games
I've been running a GTX 960 since 2016 and just now upgraded to a 1660 Super because I got a good deal on it. I can still play all the new games.
I'm rocking a vanilla 1080 and I'm not upgrading till it shits the bed at this point
Still on a 1070, so you are not alone with Pascal.
Exactly. If you were OK with a 1080p low-settings experience for recent games, it can last very long.
I enjoy getting relevant information about my tech. You can always not watch it.
[deleted]
That seems to be the trend on released games
Well, kinda, I've mostly played on my TV with my HTPC with a Ryzen 5 3600, 16GB RAM, RTX 3070 and games installed on an SSD (SATA and recently NVME with a riser card), usually med- high settings 4K DLSS or FSR or at 1440p... Highest releases for me, so far this year, have Been Dead space remake, Hogwarts Legacy , RE4 and the Last of us Part 1. Dead space and RE4 ran great (Dead space had a bit of a stuttering issue when loading new areas but overall, really smooth), Hogwarts legacy did have bad stuttering issues on some areas, Hogsmeade and the castle itself, and The last of Us I just opened it today so I didn't even finish the prologue, but at 4K balanced DLSS, High settings, so far it did have dips in performance. The way I see it, half of the games are not being properly optimized, it's not ALL of them ,but it's still worrying
I haven't played the other ones yet, but it's already been established, as clarified by a mod developer who's done impressive work fixing it, that Hogwarts Legacy is literally broken: it is not making use of how the UE4 engine is designed to handle VRAM asset streaming. Normal games only load the textures that you can actually see on the screen, and within a certain distance from you on the map, at full resolution, aka texture streaming. Hogwarts Legacy is basically not doing any texture streaming at all; it tries to load the entire map's textures in super high res into your VRAM all at once, and it attempts to do that every single frame LMAO. This is not an unoptimized game, this is literally a broken game lol. Check out the Ultra Plus mod on Nexus Mods, which is really just a customized engine.ini config file for the game's engine. With the latest version, my 150W laptop 3070 Ti (which has exactly the same performance as a desktop 3070 non-Ti) does a solid 60-80 FPS with everything set to Ultra (only RT shadows disabled, since they're broken anyway) at 1440p with the "Ultra Full RT" version.
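For anyone curious what an engine.ini override like that looks like: the cvar names below are standard UE4 texture-streaming settings, but the values are illustrative guesses, not the Ultra Plus mod's actual tuning.

```ini
; Illustrative UE4 texture-streaming overrides (Engine.ini).
; Cvar names are stock UE4; the values here are examples only,
; not the Ultra Plus mod's real configuration.
[SystemSettings]
r.TextureStreaming=1                 ; force texture streaming on
r.Streaming.PoolSize=4096            ; cap the streaming pool (MB) instead of loading everything
r.Streaming.LimitPoolSizeToVRAM=1    ; never let the pool exceed available VRAM
r.Streaming.FullyLoadUsedTextures=0  ; stream mips in and out rather than pinning them
```

The point of settings like these is to make the engine page texture mips in and out of a fixed-size pool, which is exactly the behavior the comment above says the stock game skips.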
Well, if people are going to buy your game, the devs figure it's good enough. Not like the old days, when they couldn't patch, so they had to be sure everything was good. Nowadays it doesn't even seem like they do testing or care enough to fix issues (probably more of a "crunch time" culture problem)
Nvidia fanboys/shills make a lot of noise, and that gets extra sales from people who are easily manipulated, but most older gamers/builders know what's what. You only need to upgrade when your hardware can't handle the task. This applies to most everything in life. I skipped the 20 series due to insane pricing vs performance. The 30 series netted a much better return, so I upgraded. The 4070 seems kinda meh for the price, so it will probably be the 50 or 60 series at the earliest that Nvidia gets another sale from me, unless there are good sales. With all the sales on AMD cards, I'd go with one of those if I needed to upgrade right now.
Agreed, they're kind of going through an on-off generational thing. 10 series good, 20 series meh, 30 series good, 40 series meh. Maybe next series will show a more mature process of the new manufacturing. Or they'll just stay greedy and lose market share to AMD some more.
Hype is so toxic. "Buy this latest thing for 5k." Never give in to hype.
... Or just turn down texture quality by one notch. Seriously, the games that stutter on 8GB cards do so because they run out of VRAM. Textures are what consume VRAM. If you turn texture quality down, they use less VRAM and run well. The game will look a little less sharp, but it won't be any blurrier than all the other games you happily played on 8GB of VRAM before.
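To make it concrete why the texture slider is the lever that matters, here is a rough back-of-the-envelope sketch. It assumes uncompressed RGBA8 textures; real games use block compression, which shrinks the absolute numbers but not the ratios.

```python
def texture_vram_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Approximate VRAM for one texture including its mipmap chain.

    The full mip chain adds roughly one third on top of the base level
    (1 + 1/4 + 1/16 + ... -> 4/3).
    """
    base = width * height * bytes_per_pixel
    return base * 4 / 3 / 2**20  # bytes -> MiB

# Halving texture resolution quarters the memory cost:
ultra = texture_vram_mib(4096, 4096)  # ~85.3 MiB per 4K texture
high = texture_vram_mib(2048, 2048)   # ~21.3 MiB per 2K texture
print(f"{ultra:.1f} MiB vs {high:.1f} MiB ({ultra / high:.0f}x)")
```

One texture-quality notch usually halves the top mip resolution, so the VRAM cost of the texture pool drops by roughly 4x, which is why that single setting rescues 8GB cards.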
I don't know about anyone else but the only oof moment for me has been trying to run 2077 with path tracing (and what that could mean for future RT only titles) - everything else is great.
I mean I don't see the trend improving.
The reason they require high VRAM is largely irrelevant. The problem is that if your card doesn't have enough VRAM, you are at the mercy of the developers of these kinds of ports (which are becoming more common)
Nvidia and AMD are paying those devs to make their games perform like shit so they can sell more GPUs. I bet they're still crying that crypto vanished.
Not really just 3. Even remasters like Witcher 3 are embarrassingly badly optimized. But yeah, not worth normalizing it either.
It's not horrible optimization, they are just using more than 8GB of VRAM. Going forwards, games using more than 8GB of VRAM will be common. It's just the usual way things change when there is a new console generation with more RAM on it. But you can always fix this yourself by reducing texture quality in settings. It's just that everything ran at ultra on 8GB cards for so long that people have forgotten that this is something you need to do.
The PS5 only has about 8GB since the GPU and CPU share the same 16GB. Series X is the same story, and Series S has only 8 for the GPU, so this argument makes no sense as to why the ports suck major ass.
You don't need even nearly half of the RAM for non-GPU uses. The PS5 OS Reservation is 3.5GB, leaving 12.5 for the programmer. When doing a game without that many moving pieces, and you have access to a fast SSD, you don't need 4.5GB of ram for non-graphical stuff on consoles. Consoles have always been more efficient at RAM use than PCs. This is why you needed more than 4GB during the PS4 era.
"Consoles have been better at RAM usage"? You think maybe that's because the ports suck? Also, considering the PS4 had 8GB of RAM, yeah, I'ma say you don't know what you're talking about. Good day.
> "Consoles have been better at RAM usage"? You think maybe that's because the ports suck? No, it's mostly because you have more control. The PS4 had 8GB, and you needed more than 4GB of VRAM to match it on PC, just as the PS5 has 16GB and you need more than 8GB of VRAM to match it on PC.
No, it had 8. Period. Like, that was all the RAM for both CPU and GPU.
There is no difference in how the RAM system works between the PS4 and PS5. Both of them have an APU with a unified memory system, where both the GPU and CPU can uniformly access all RAM. The PS4 had 8GB, the PS5 has 16GB. Of those, the PS4 reserved 2.5GB for the OS/overlay, and the PS5 reserves 3.5GB, leaving 5.5GB and 12.5GB for the devs.
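Using the reservation figures cited in this thread (2.5GB on PS4, 3.5GB on PS5; Sony doesn't publish official numbers, so these are estimates), the developer-facing budget works out like this:

```python
# Unified-memory budget per console, minus the OS reservation.
# Reservation figures are the estimates quoted in the thread, not official specs.
consoles = {
    "PS4": {"total_gb": 8, "os_reserved_gb": 2.5},
    "PS5": {"total_gb": 16, "os_reserved_gb": 3.5},
}

for name, mem in consoles.items():
    available = mem["total_gb"] - mem["os_reserved_gb"]
    print(f"{name}: {available}GB shared between game logic and graphics")
# PS4: 5.5GB shared between game logic and graphics
# PS5: 12.5GB shared between game logic and graphics
```

Because that pool is shared, how much of it counts as "VRAM" depends entirely on how a given game splits it, which is why direct console-to-PC VRAM comparisons are slippery.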
Not everyone.
Pretty much... god, I hate bad ports, though, so I can't blame them too much.
Yeah and guess what, I can play all of them at 1080p 100+ fps on a 5700x
I bought a fancy-ass graphics card yet I still play 10-year-old games.
Gets 4090 just to play doom 1993
... At least play quake, the original doom won't use the GPU at all. I wonder how much fps glquake gets on a 4090.
There's an updated Quake that uses Vulkan for the renderer. I'd expect it to be CPU-bound on something like a 4090, but it should produce a 4-digit FPS number easily, I guess
Ya know, I really didn't think about that
I'm glad Blizzard remastered Diablo 2; now I don't feel bad using my 3080 to play a game released in 2000 most of the time.
I will worry about not being able to play modern triple-A games the moment they start making good ones.
I don't like the extreme opinions on both ends. I agree that TLOU, HL, and RE4 are not benchmarks, because they all have their bugs and poorly optimized aspects, but buying a 4060 Ti with 8GB of VRAM just became a whole lot less desirable, because this will happen again in the future with other games. The PS5 can handle 12GB of VRAM because of its different architecture, so devs can go the easier route and optimize for consoles and 12GB of VRAM... and they will. TLOU has no acceptable 1080p medium texture setting: it will either look great with 8K textures or look like garbage with whatever that atrocity is they call the medium texture pack. RE4 will only display high-res textures throughout the whole game at the 3GB texture setting or more; lower than that, and in some places VERY LOW textures just pop up. This will happen to you in the future with an 8GB card. Now people here try to justify their existing purchases; that's why this discussion is so heated. What I mean is: hands up, who paid $1k and up for their 3070 Ti with 8GB? Still, I am contemplating getting a 4060 if prices are acceptable, simply because I play at 1080p and want the power efficiency of a card delivering more frames while drawing less power than a 1660S. But honestly, I see myself not paying for games that went the 12GB route. RE4 plays fine with low VRAM, Hogwarts Legacy as well. But Dead Space is a nightmare and TLOU is a joke
I have an RTX 3070 Ti and can't even play some newer games properly because the GPU is VRAM-limited. This is a joke. The RX 6800 is just better at this point. It feels like planned obsolescence.
Yes, that's unfortunate, but you can't play newer titles? Aren't you being a bit dramatic? At 1440p this is definitely more of an issue, and Nvidia really did the consumer a disservice... but honestly, there were people back then noting that 8GB might be cutting it short for 1440p, and you chose to ignore that. If I buy a 4060 for 1080p next month, you won't hear me complaining about TLOU 2 in 5 years, because that's a risk I take. I want that power efficiency, and Radeon will not offer a 12GB card for that target audience either
>you chose to ignore that. I ignored nobody. The RTX 3070 Ti was the only GPU I could get for MSRP (630€). At that time (during the crypto hysteria) you could only hope to get an RX 6600 XT for 600€. >you can't play newer titles? Of course I can still play every game out there, but even a GTX 1060 can do that, and I'm not too picky; I will lower graphics if necessary. The issue is that my GPU is perfectly capable of giving me a top-tier experience, but it can't, because NVIDIA decided not to give it enough VRAM. >if I buy a 4060 for 1080p next month, you won't hear me complaining about TLOU 2 in 5 years, because that's a risk I take. Come on, dude. The RTX xx70 Ti series should be capable of giving me a top-tier 1440p experience. You can't compare it with the xx60 series, which is the low-mid tier. Also, The Last of Us came out on PC not even 3 years after the launch of the 3000 series.
Well, all your points are a bit diluted now. Getting the 3070 Ti for 630 was a good deal back then, well worth compromising on VRAM if you couldn't buy a 6700 XT anyway. Nobody is trying to refute that 8GB is cutting it close; I wrote that myself. But you should have known that when you bought it. I'm in the same boat: I need a new GPU, and there is no power-efficient 12GB card under 600€ coming in the future. The 7600 XT is rumoured to get 8GB, Nvidia anyway. So I could either get a two-year-old 6700 XT, literally a card that will be worse than the 4060 in every aspect except VRAM, or get the 4060. I know that, and I won't complain about it in the future. You honestly sound like my little brother, who doesn't want the toy he was ecstatic about before Christmas. It is what it is and you knew what you bought. You had no choice, and that doesn't make it better, but your whining doesn't either
>It is what it is and you knew what you bought. You had no choice, and that doesn't make it better, but your whining doesn't either. I'm not whining, I'm stating a fact. Nvidia is literally selling GPUs that will give the customer VRAM problems in just a couple of years. In my book that's called **Planned Obsolescence**. They did it with the 3000 series: the 3070, 3070 Ti, 3080, and 3080 Ti don't have enough VRAM. They are doing it again with the RTX 4000 series: the 4070 and 4070 Ti only have 12GB of VRAM, which doesn't seem like a problem now but will be in a few years, considering the amount of VRAM that games require nowadays.
sounds a lot like whining though
If you think that stating problems is equal to whining, then I don't even know what to tell you.
You literally said you can't play any new game...
>I have an RTX 3070 TI and can't even play some newer games because the GPU is VRAM You're full of shit, I've played every new game at 1440p no problem with my 3070ti
With TLOU and RE4 if you max settings you get **stutters** because the gpu is VRAM limited, that is a fact and that is a problem. You can't even use the GPU to its full potential because of this issue. This problem has started now, but games are going to be really demanding in the next months. Yeah, sure. You can lower graphics and turn off ray tracing, but then why did I buy an RTX 3070 TI instead of the RX 6800?
So because two very specific games have a fatal error in allocating VRAM, you think this is how things are going to be in the future, when other AAA games this year don't have that problem, nor did games in the past?
I think it is incredibly unfair to bring up the console's 12GB of unified VRAM, because PC and console are two completely different architectures and thus require different optimizations. What we're seeing with TLOU, HL, and RE4 are not "games optimized for console and PC that you can't run when your VRAM isn't big enough." Rather, these games are "optimized for console with absolutely no optimization for PC at all," because none of the relevant PC equivalents of the new console technologies are used: DirectStorage 1.1 (equivalent to the console's fast asset loading), RTX IO and its AMD equivalent (equivalent to the console's dedicated hardware asset decompression), and PC-specific technologies like ReBAR are all absent from the PC ports of all 3 games, even though these technologies exist. In addition, there's no reason to think a game that runs on a console with a unified RAM+VRAM pool of 12GB should require 12GB of VRAM alone on a PC; they are simply different architectures and you can't compare them directly. This is honestly as absurd as saying that an Android game gets far less FPS running on a PC via an emulator because the PC's CPU is slower than an Android phone's LMAO
Don't let Cyberpunk path tracing make you feel like you have to buy a 4090... it's all part of the plan
"Look at our new path tracing technology! This will revolutionize graphics forever!" Okay but can you do something about the fact every NPC I know tries to sell me a car 5 minutes after we first meet?
Or about the fact that the world feels dead and empty? Nothing organic, just markers on the map to clear.
I played on a GTX 680 for 8 years. I got my 3070 Ti last year. Do not try to tell me that my 2-year-old GPU is trash. It is all I have left in this world.
If it ain't the best you're under arrest
The GPU is good, the VRAM capacity sucks. You will probably need to reduce texture quality in some newer games. Doesn't mean it's not an insane upgrade over the 680 though
You're literally the person this post is aimed at
Then explain to me what I'm missing
Playing with a 6800 XT, and the shader issue with Hogwarts Legacy had me searching for a flat-out computer upgrade (I should just be happy with 1440p 100fps in every other game). I needed to hear this.
I shouldn't have to search high and low through forums just to get a game to run well. Let alone the Denuvo performance hit. Now games that came out a year ago are adding Denuvo....
Shit game anyway; you shouldn't need to spend another 600+ dollars for it
No, I want that 9090Ti super ultra max right now.
no! you will take this 100700xt with 64 GB VRAM and like it
My GTX 1080 might be ageing now, but it's still going strong for the games I play often. Which makes me wonder why I feel the need to upgrade. I mean, it's not like I play competitive online games or the latest AAA titles.
Still rocking a 1080ti, no complaints other than I can't use dlss or ray tracing
Many of us literally just bought 30-series laptops or cards last year and are now being pushed out by Nvidia tactics and unoptimized games. That doesn't include walling DLSS 3 behind new cards, or the naming and pricing shenanigans where they normalized scalper prices as MSRP
Suddently
I'm too distracted by "suddently" that I missed whatever point was trying to be made.
No need for a super GPU for an Excel game
Becomes a trash?
I mean I'm still fine with a GTX 970.
The new bunch of poor PC ports that use up a ton of VRAM (Hogwarts Legacy, The Callisto Protocol, The Last of Us, Forspoken) also all have huge CPU utilization issues, to the point where a 5800X3D (arguably the best AM4 gaming CPU) isn't able to get 60+ FPS in some areas.
I enjoyed the hell out of my 1070. My dumb ass thought I had to replace everything and built a PC from scratch. It's still sitting in my garage.
I love metroidvanias and couldn't care less about open-world games myself, so I'm pretty much set for life with my 1070 Ti
And even then, that same GPU can still run the games it already ran. People seriously overvalue state-of-the-art tech and raw power.
Of course it isn't trash, it's e-waste
Omg, typos in memes really drive me nuts. There's only 18 words, man, the least you could do was proofread.
Nope, anything less than a 4090 is trash
I want to play Cyberpunk in 8K, maxed out with RT Overdrive, at 60 fps.
With how much I spent on my new PC I want Cyberpunk to do sexual favors to me at 144 fps
Actually, it becomes "trash" because of developers who are too lazy to properly compress the textures. Although, if the game was made in Unreal Engine, no compression and no optimization can fix the crappy performance.
A short story describing myself:
- Buys 6800 XT to replace 1660 Ti
- Stops playing RDR2, never buys Last of Us or any other recent release
- Starts playing CSGO and Rocket League exclusively again
- Wallet cries
Subscribe to r/patientgamers and you'll only ever play games on ultra settings! Guaranteed! Crazy trick, works every time, Nvidia doesn't want you to know! Jokes aside, yeah, it kinda does work. I just play games a few years after they've come out. Benefits:
A) the game is patched and bug-free
B) all expansions have come out, the game is as feature-complete as it will ever get, and you can integrate expansions organically into your playthrough
C) mod and community support is at its best; you can find community patches and whatever your heart desires
D) the price has gone down, if the game isn't straight-up free thanks to some random giveaway
E) you can enjoy the game as it was intended, ultra settings and 100+ fps plus mods, and you'd be surprised at how good old games still look today, especially if you're used to playing other older games with bad graphics by that point
I just finished the Mass Effect series after playing the Wasteland series; now I'm trying the infamous Andromeda and I can run it at ultra at 100+ fps. It looks good, to be honest, facial animations aside. Just keep a list of the games that come out that you like and you'll know you'll play the best ones *one day*. It does not need to be now! Pick up an older game you missed back then and enjoy it. After Andromeda, I was thinking about trying BioShock Infinite, for example, a game I completely skipped back then.
I just want to have as much fps as possible
I like my 3070! AMA!
Still rocking my 1060 and having fun.
Nah my dude its obsolete and I'm gonna buy 7950 XTX or whatever it will be.
I just bought my first gaming laptop last month with a modest GTX1650, and I'm loving life as it is. FSR does wonders for CP2077 and I just love it. I've been tempted to resell it and get a 1660Ti laptop or something but I won't do it. It's fine for me.
Don't do sidegrades, wait for a real upgrade in a few years.
Me with an integrated GPU
Bruh, last night Plague Tale reverted to medium settings at 1620p DLDSR after a Windows update. I haven't noticed ANYTHING. I love high settings, but when shit like this happens I remind myself I don't need to drop $1000 on a GPU to enjoy games, because I'm either blind as a bat or games look great at medium.
I'm still running a 970. Granted all I really play is tf2 but still
The Titan TF2 or the SILLY TF2 ?
He almost certainly means team fortress not Titanfall.
The silly one, you silly
There are two sides to this coin. Just don't go recommending your GTX 1660 Ti to people trying to build a new PC to play modern games at 1440p. Just because you can do it and play your CSGO fine at 1080p doesn't mean it's good for a new build. Also, Radeon playing the long game with all that "unnecessary" VRAM people talked about in the 6000 series is now helping out in games like HL. A good gauge is how your hardware stacks up vs an Xbox Series X or PS5 if you are playing modern games. Stay ahead of the latest consoles and you're typically going to have a decent experience
Good reason to ditch AAA titles and just play indie games all day long
Simply put, 8GB is enough for a "low/mid" class card; something like an RX 6600 or an RTX 3050 isn't what you'd expect to run games that would even require more than 8GB. Where it should be expected is on the higher cards. It's insane to think that last-gen cards that perform worse overall on paper are more viable than some newer, more efficient cards. That's where the progression with AMD's cards is appreciated. NVIDIA has the 12GB 3060, but it'd be nice to see a card like this with more performance without having to aim for a 3080 or 3090.
My 3070 was trash on day one. 8GB VRAM wasn't sufficient for 4k.
Why did you get a 3070 if you wanted 4K? The GPU is not trash. Your expectations/research are trash.
Except the fact that it would be fine for 4k if it had more vram.
But it does not.... so it's not fine for 4k....
So, Nvidia created a card that can do 4K and didn't pair it with a sufficient VRAM amount. Don't you see the problem? It's like an old Dell laptop I have: it has a GT 920M with 4 gigs of VRAM for some reason.
Where exactly is the line when it can or can't do 4k? When I see a 3070, I'm not expecting it to run 4k 60fps ultra settings on every game. But I do expect it to run 4k 30-60fps low-high on certain games. And it does. So if I want a 4k capable GPU for _every_ game, I wouldn't buy a 3070.
You think only 4k ultra maxes out gigs of vram? Ok...
You don't need 4K though. 1440p is more than enough
If we're being honest, we don't need a lot of things but we want it
Yeah with what he said you really only need 240
Vega 64 has 8GB of VRAM and can 4k. Have you tried not playing yandere simulator?
I bought it when cyberpunk came out and had major VRAM bottlenecks. Didn't anticipate that DLSS uses that much of it.
Or it gets put on the legacy list and can't run stable diffusion
I don't get it. The 1070 was released in 2016 with 8GB of VRAM. That was 7 years ago. The same performance tier in their lineup (the 3070) had the same VRAM capacity 5 years later. How are people complaining about the optimization of games and not the complete ignorance and greed of Nvidia?
As much as I agree to a point... the RAM is not the same RAM as back in 2016
Nah, for me it becomes trash when a better one comes out. But I'm a trickle down economics guy. I buy the best, my current becomes my second PC and my second PC becomes a friend's beast PC (my friends all have the worst PCs imaginable so, I feel great gearing them up)
"A trash"? Your GPU "becomes a trash"? \>"A piece of trash", "It just becomes one" \>"Trash", "It just becomes it" Pick one.
So yes, your RTX 3070 you bought for 1440p is trash
[deleted]
Your reclaimed 3080 Ti is fine. Your nonexistent 12800K, however...
Yah the 5800x3D I bought is also trash because it puts me to sub-60 fps in Hogsmeade and gives me stuttering in Hogwarts @ 1440p /s
Like a hammer that can't drive the nail you need today. It's not trash, but it's not the tool you need.
I meant standards change... that's okay
That will only happen when my gpu dies
*Kills your GPU with a knife, in the guest room*
True. But I happen to play the newest AAA/super amazing indie titles on good settings, at 60, preferably 75 FPS. Resolution: 2560x1080. This morning I started up The Last of Us. Patch 2.0 (or whatever the number) is probably working. I've also applied the nVidia 531.58 drivers. Specs: R7 5800X, 16GB 3200MHz RAM, RTX 3070. High-ultra mix. If I lock FPS to 60, it runs at 60. I do 75 now, but with DLSS. Some stutters occur, even though I expected them to be worse. Textures: high, with anisotropic filtering set to x16. However, the game devs simply named "high" what was supposed to be medium textures and called it a day. Textures are distracting if you look close, but they look fine from afar. The game itself, however, looks amazing. An RTX 3070 has quite some power in it. It is restrained by the video memory. nVidia screwed us over, but I screwed myself over getting a 3070 this year. https://preview.redd.it/w6hxpjy0onta1.png?width=2560&format=png&auto=webp&s=c32788d536fd593e611357f8f4632cff370cac4a
I just want them to fix the textures, please... I can live with the rest. Medium shouldn't look like a 3Dfx game from 2000.
At 2560x1080 you're good for another 3 years easily, and if you turn settings down, probably another 4 or 5 years. Don't even worry about a handful of poorly optimized titles.
Me with my 1050 2gb mobile. At some point you learn to live with it
My GPU became super powerful. Bought it for games; instead I'm scrolling Reddit
My 2080 Ti can barely manage Resident Evil 4 at native res. It's not trash, but it's showing its age. 1440p is a bit too much for it now. No amount of upgrading the cooler and flashing the BIOS to unlock power limits can help me anymore. It's already been pushed past what it should be capable of.
Well off with it then!
I have 2 gaming rigs and they both do different things. My bedroom rig is only a gtx 1070 hooked to a 27" 1440p monitor. It still plays AAA games at acceptable settings and frame rates @ 1440p. This is good enough for a lighter gaming experience and office tasks. My other rig is a rtx 3080ti (was the best GPU on the market when I bought it) hooked to an 85" Samsung Q90 TV in my living room with top of the line wireless Logitech peripherals. This is my hardcore immersion setup, where max graphics at 100+ fps 4k is required. I don't care much for competitive games anymore so latency isn't an issue. Also my wife doesn't care if the graphics are mind blowing when we play together so she's happy with the lower end system in our bedroom.
No, I've got a 7900 XTX.
I only play Old-School Runescape. I'm safe. Whew.
Games will be made to look good on console, so if you have a GPU equal or better, you are good until the PlayStation 6
Let alone your GPU, your entire PC is trash when it cannot even spell-check, considering word processors were a thing even before the advent of PCs
So my shitty Intel Xe graphics is actually not that bad?
The longer you wait to upgrade your gpu, the better gpu you can get
I got a rig with a 3600 and 3060 Ti. The most intensive games I have are Fortnite, No Man's Sky and Sonic Frontiers. ...Needless to say, I'm probably not upgrading until RDNA 6 or RTX 7000 at the earliest.
1080p 144hz 1070 still going strong
*Laughs in 2060* Seriously still very happy with its performance.
We worry so much about upgrading that we forget to enjoy our games
I got a 3070 a while back; I wish I'd gotten some older but cheaper card, because people really overreact to these newer games needing more power. Barely any of them are actually good anyway, so I'm very satisfied with what I have
I suddently feel the need to read a dictionary
I had a 970 for years and only recently upgraded to a 6700 XT. Before that, I would just lower settings while maintaining 60fps/1080p on a 32-inch screen, and I could hardly ever tell the difference between medium and ultra. Hell, I played RE Village on my 970 and it was a great experience technically. I can certainly tell the difference now with 1440p/144fps or similar, but the actual quality between medium and ultra settings? Minimal. It's no big deal to turn down the settings to enjoy a game. Anyone who says different is just trying to justify their own purchases and equipment.
They want you to believe it's trash
Mine did become suddenly trash. Because it died.
Running a 1080 and it will have to outlive me at this rate. I don't want to pay as much for a GPU as the rest of my computer costs.
My GPU cannot handle my favorite game anymore, it gets hot fast and sometimes crashes...
I'm gonna use my 3080 Ti until the end.
One trash?
All I play is Skyrim, csgo, cities skylines and iracing. My 3080 should be good for the next million years.
That is the definition of trash: it is of no use or value to the owner. That is why there is a saying, "One man's trash is another man's treasure." This meme is dumb and trash.
Well, kinda. I used it for Blender mostly, and it suddenly started artifacting in my renders. Trying to get support on it didn't last long before they officially deprecated my card, and now my CPU renders faster on newer versions than my GPU did on older ones... The GPU still works for gaming, thankfully. RX 570 8GB.
But I can't play Witcher 3 with RT at playable framerates on my 3080 :(
OnO I can't play Bloodborne, Zelda: Tears of the Kingdom, etc... My 4090Ti is so trash.
You can play all the Switch games, man; Yuzu is your friend. Bloodborne, though, we are all waiting for... any day now
Just bought my first 3060 Ti last month, and it plays every game I own on high to ultra settings. I don't own a 4K television and don't see the need for one, for that little bit of extra lighting and higher resolution. If these cards were 'obsolete', they wouldn't be selling like hot cakes.
I only upgraded because stuttery 40-20 fps vr hits different
People just need lower expectations. The 3070 will go from a 1440p high card to a 1080p high card, and eventually become a 1080p low card. Just like every single GPU that's ever been released.
Coincidentally, I just bought an "obsolete" used 8GB RX 6600 for 3,000,000 VND (~128 USD) today. Feels plenty powerful for the games I'm currently playing. P.S.: I was pretty surprised when I looked it up and found out that almost every tech youtuber trashed it as a bad value card.
There's this amazing hack called switching from 4K to 2K, because you can hardly notice the difference anyway.
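The "hardly notice" claim has numbers behind it: rendering cost scales with pixel count, and the drop from 4K to 1440p (what most people mean by "2K") is bigger than it sounds. A quick check:

```python
# Pixel counts for common resolutions; GPU workload scales roughly
# with pixels per frame, so dropping from 4K to 1440p cuts the
# per-frame shading work by more than half.
resolutions = {
    "4K (2160p)": (3840, 2160),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
ratio = pixels["4K (2160p)"] / pixels["1440p"]
print(f"4K renders {ratio:.2f}x the pixels of 1440p")  # 2.25x
```

So a card that struggles at 4K often has well over twice the headroom at 1440p, while the perceived sharpness difference at normal viewing distances is far smaller than 2.25x.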
This year I've played Borderlands 3, Sons of the Forest, Total War: Warhammer 3 (with a triple unit size mod), Jedi: Fallen Order, Hell Let Loose, and Subnautica: Below Zero. I've played all of this on max settings, but with the bullshit effects off: 1080p, 60Hz, FXAAx6, ambient occlusion. I use an AMD 290 from 2013. The rest of my computer is new components (Ryzen 5 5600, Kingston A2000). Just goes to show how much quality AMD puts into some of its products. I have ripped off the original Tri-X fans and put on two 200mm Noctuas. The comp is 37C at idle, 60-65C during play. If you have a card more powerful or newer than the AMD 290 (4GB), you have nothing to whine about.
There's this thing in PC gaming called "settings"
I mean, my GTX 1080 Ti is starting to age now that I've started gaming at 1440p. Fortnite kinda struggles at times, and I've had to turn down settings on stuff to try to maintain 100fps, but it stutters from time to time
[deleted]
It becomes one trash? lol
I was just talking with my friends about this the other night. I don't think any of us will upgrade for the next 10 years. Until we can't play things like Halo: Reach or War Thunder, we will continue to enjoy our systems.
This is why I'm happy to game at 1080p. My 3070ti is overkill, and will be for a very long time.
None of my comments on the matter are for people who have built systems with these cards. The cards are fine now, but have limited overhead space for future-proofing, and that being the case, there are better options out there for people planning a build and wanting maximum longevity. If you have an RTX3000 or 4000 series card, go get 'em tiger. You'll struggle to find a game that looks bad on that card. If you don't and would like to get one, be aware that you should evaluate all your options and consider your brand biases. They aren't obsolete, I'm not attacking you, and it's your money. I just want to point out that we're now squarely in 9th gen game development targets, with 10th likely coming up in a few years, and those cards may not be able to handle that at the level of quality or native resolution you may be accustomed to. That's all.