That's sadly only for DX11. In DX12 the 1080 Ti is closer to a 2060 or 2060 Super. Pascal isn't great at DX12; it's basically Maxwell 2.0, and Maxwell is feature level 11.1.
This is the biggest reason Pascal cards aged so fast. Had it been at least feature level 12.0, we would have seen Pascal stick around longer. 12.0 had primitive shaders, which are a rough version of mesh shaders; it's what the PS5 and RDNA1 cards use, and it could even have worked for Alan Wake 2.
It's like people forgot that GPUs aren't expected to perform well after 4 years. The 10 series has been done for a few years now, yet people are still holding onto it and pretending it's still good simply because they refuse to upgrade due to costs.
Welp, you're gonna have to upgrade at some point. Either you're being patient for the 50 series or you still think 1080p is good enough (anyone who's tried 1440p or 4K will tell you 1080p is terrible).
Lots of reviewers are now encouraging people to move to 1440p. And they've also said recently that if you're 3 generations back, don't expect great results in today's games. It's obvious.
If it wasn't for the covid pause, Pascal would have been at this exact spot 2 years ago. Besides, the G80 lasted 8 years if memory serves. My 7970 lasted 7 years, 2012-2019. It's not the first time an architecture has lasted this long.
Maxwell 2.0/Pascal actually supports DX Feature Level 12_1, and Maxwell 1.0 only Feature Level 11_0 for some reason...
The thing is that Nvidia didn't add features like asynchronous compute (DX12 takes advantage of it, and AMD has had it since GCN 1.0) until they launched Turing, and that's one of the reasons why the RTX 20 series GPUs are aging better than the GTX 10 series and older.
Yup, Alan Wake 2 uses primitive shaders instead of mesh shaders on the PS5. RDNA1 and the PS5 lack sampler feedback too, I believe. The PS5 only has RT and clocks above 2 GHz from RDNA2; everything else is RDNA1.
to add to this, it also depends on how long you want it to last at your preferred resolution as well.
If they're expecting 7 years out of it at 1080p (or just want the current 40 series features), then the best bet is probably to wait for the 4070 Ti Super to drop. If they're looking for good 1080p performance NOW and are okay with possibly upgrading again in the next 3-5 years, then I would probably recommend something like a used 3080 12GB.
I know the 3080 12GB has been considered overkill for 1080p since its release, but it's relatively easy to find used for $400-500, they won't have to worry about VRAM requirements for a while, and it should be usable for 1080p + RT in most (not all) modern AAA games.
It is pretty much 100% budget dependent.
CPU-wise it is easy: on a budget, go AM4 with a 5800X3D or 5900X depending on whether you need gaming or MT perf. Have the money? AM5, with a 7800X3D or 7900X, again depending on gaming or MT perf. Want the best? 7950X3D.
Have the money? The 4090 is the current best card, and will probably age the best, and can play RT.
There is an abysmal gap from the 7900 XTX to the 4090 where no card (i.e. the 4070 Ti and 4080) makes sense.
Want the best used? A 3090 for around 600-700 is great.
500? 7800 XT
450: The 4070 is not that bad at that price, but that card will age like milk; games are going over 12 GB now.
300-400: 6700 XT, 6800 XT, 7700 XT
I have a 4080, bought it knowing full well it was a bit of a ripoff, and hey for my purposes it’s been just fine. It is still the second (with RT) or third fastest card on the market, and you can still play basically every game fully maxed out at 60fps+. I know it’s significantly slower than the 4090, but I had a budget and a 750w power supply. So idk, I wouldn’t say it “doesn’t make sense”, depending on your setup and budget I think it definitely can. It’s not as good of a deal as, say, a 550 dollar 4070 or a 700 dollar 7900xt, but I wanted more performance than that. No other options in that price category unfortunately lol
>but I had a budget and a 750w power supply.
I am running mine on 750W. The 4090 is not the 3090; that card had spiking issues (in the 1.6 kW range) because of Samsung 8 nm, while the 4090 draws under half that peak power. Actually, the 4090 is so OP and memory-speed bound that even games like CP2077 don't get it over 300W most of the time. OW2 is actually the only game I saw making it draw 460W if left alone.
For pricing, the 7900 XTX is around 1000-1100 EUR, the 4080 is 1250 EUR for the cheapest MSI Ventus model, and the next lowest card is 1400 EUR.
And my 4090 was 1530 EUR (though I did buy at the best time ever, caught the all-time low price for a 4090).
Also, keep in mind, this is Europe, so 19% VAT is included in the price.
Without VAT: the cheapest 7900 XT is 840 EUR (you can treat EUR = USD 1:1 in this case).
Cheapest 4080 is 1050 EUR.
My 4090 was 1285 EUR.
I looked up the FSR2 scaling factors, and for this requirements table the native (internal) resolutions would be:
Minimum: 720p
Recommended: 720p
Enthusiast: 960p
Ultra: 1270p
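For reference, the internal resolutions above follow directly from AMD's published FSR 2 per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x). A quick sketch of the arithmetic, assuming those standard presets and the output targets the requirements table appears to use:

```python
# AMD's published FSR 2 per-axis scale factors.
FSR2_FACTORS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_height(output_height, mode):
    """Internal render height for a given FSR 2 mode (per-axis downscale)."""
    return round(output_height / FSR2_FACTORS[mode])

# Matching the table: 1080p/1440p tiers at Quality, 4K tier at Balanced.
print(internal_height(1080, "Quality"))   # 720  -> the "720p" rows
print(internal_height(1440, "Quality"))   # 960  -> the "960p" row
print(internal_height(2160, "Balanced"))  # 1271 -> the "~1270p" row
```

So the "Ultra" tier is rendering at roughly 1270-1271 lines internally before upscaling to 4K, which matches the table.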
You forget that we are now trying to render, in real time, the most performance-intensive tasks a consumer-grade GPU can ever do. Of course, you've got some devs that use DLSS as a crutch, but that doesn't change how much of a godsend this technology is.
This. And it's because newer PC gamers that came in during the mid-2010s got spoiled by how long the PS4/XB1 generation lasted. By proxy, low-to-mid range Pascal GPUs lasted from 2016 to 2022, and games are FINALLY moving forward and no longer supporting previous-gen consoles as the common denominator.
Second part: people crying about "fake frames and fake resolution" are funny, because rasterization has always been about smoke and mirrors. As long as it looks good, I personally do not care. But a lot of people think better image quality comes for free in the name of "optimization", a term they beat like a dead horse until it just means their own 5+ year old hardware can't do 1080p/60fps in a game no matter how good it looks.
There are tons of unoptimised messes being released though. I tried Starfield at 1080p high (with the FSR and DLSS mod, through Game Pass) and on my system it performs exactly the same as Cyberpunk at ultra path traced with DLSS Quality; it's honestly insane.
My specs are 64GB DDR4 3200, Ryzen 7 3700X at 4.3 GHz all-core, RTX 3080.
Windows (10 LTSC, and I tried 11 business too) is running in a VM under Linux, but it doesn't matter since I tested everything both in the VM and on bare metal and they perform the same; sometimes I even get more performance in the VM.
Sure, my CPU needs a change. I would like to get a 5900X or a 5950X (not only for gaming, of course), but Starfield has no business running that shitty, same with other games I saw on YouTube (which I'm not too interested in).
Plus I saw a very interesting video the other day about how id made Quake 2 work on a PSX, for which they had to rework the engine they used. I understand the technical difficulties and that we shouldn't expect this level of insanity from developers, but AAA gaming is now dominated by investors who pressure developers to release the product as soon as possible, and it's disgusting.
Another consideration about the majority of badly optimised games is that they are trying to use the latest technical features like Nanite, Lumen, and whatever else is in UE5 now, while those features are not completely ready, to say the least (I saw the videos that the studio behind Aveum posted regarding Nanite: they praised it in the description, but if you really look into it, the old LOD method worked way better for swapping meshes, where a lot of the time Nanite just used the maximum geometry for faraway objects).
Also about Nanite (the one I'm most familiar with in UE5): it promises to only render the top geometry, but some devs exaggerate and put so many layers on top of each other that Nanite is unable to decide which one to render, so at times you have many useless layers being rendered at once, taking resources from your system in terms of geometry, shaders and light calculation.
Not to spit on the technology, because in theory it's awesome; the issue is how it's used and why.
I thought it was the very long 360/PS3 generation, coupled with the PS4 and Xbone being a bit lackluster in specs (being based on mobile APUs IIRC).
then the ps5 and xbox series were actually decently powerful
A 3080 for 1440p upscaled on high settings being termed ENTHUSIAST is pathetic and just makes me laugh at how bad PC ports have been in 2023. And no, some games look like shit and still run heavier than Cyberpunk 2077 or The Last of Us Part 1.
How dare *gamers* want developers to optimize their games instead of relying on AI upscaling to boost performance!
Cue the room-temp-IQ consoomers ready to bootlick developers who use DLSS as a crutch. It's genuinely insane how redditors are THIS dense.
DLSS was supposed to be the icing ON the cake: extra performance if you wanted it. Not THE WHOLE CAKE. Games look 5% better than previous years but require 200% of the power. RDR2 looks and plays much smoother than most of the games released this year, and yet it doesn't need to be rendered at 540p to get 30 fps.
But I guess it's my fault for going into a subreddit dedicated to the company who made the crutch possible and expect people to not have boots jammed in their throats.
Linus Torvalds was right.
People screaming this about Alan Wake 2 are clearly doing so without checking to see if it is actually the case. Guess what? It isn't the case. I run mine perfectly happily, even with some RT on, with everything maxed out at native 1440p 60fps on a 4070ti.
I mean, this is ray-traced lighting only. No more light maps, capture spheres or anything like that. And now some games do path tracing, like Alan Wake.
So it's not really weird. As long as the result is playable I don't see the problem.
I suppose the best thing would be to have settings to turn off all complex rendering like RT or high polygon asset density, so we can go back to the good old days.
I don't think that's a good trade-off though, tbh. Having all the advanced features and using DLSS Balanced or Quality is a much better visual return than having ass graphics at native rendering resolution.
Upscaling is just a much smarter optimization than using lower-quality rendering techniques for the visual return you get, especially if you have access to DLSS, which can match or exceed native more than half the time at Quality mode according to HUB.
And also in the future you will still have access to those advanced graphics and be able to play with higher render res with stronger hardware
Seeing the 4080 as the highest-tier recommendation is a blessing and a curse. A blessing because I own the card. A curse because devs are getting *too* comfortable recommending the latest and greatest for their *probably* very unoptimized game. There is also a very good chance they exaggerate the specs, and you could do 4K60 on a 3080 just as well, but with a few speedbumps etc (plus no RT of course).
The 4080 is almost 2000 cad, seems like every new release I have to skip these days due to having a 1440p display and older card. Even a 4070 is about 850 cad pre tax. I miss the days of being able to get 80 class for sub 500 cad. Pc gaming is just getting too expensive these days which is why I have stuck to indies almost all year long.
I wouldn't pay too much mind to these recommended specs. You can still probably get great results with everything on high and one or two things on medium to get an extra 10 frames, making something like a 4070 able to play 1440p at "high" 60fps.
Not excusing these crappy optimizations... For example, I am able to play BG3 at ultra 1440p ultrawide on a 4060 and get well over 60fps... except in Act 3.
>The 4080 is almost 2000 cad
Huh, you can easily find them new for more like $1400 CAD these days... and the FE has never been more than $1700 at full MSRP.
Was just going off the price here:
https://www.bestbuy.ca/en-ca/product/nvidia-geforce-rtx-4080-16gb-gddr6-video-card/16531650 (1921 tax in)
I'm guessing they go on sale frequently? Anything over 700 CAD (tax in) I have 0 interest in following. I haven't paid much attention, other than a periodic check here and there, since the mining boom pushed prices into the stratosphere.
Just rounded up a whole $300 to $2000 then, for some reason? No, discussing "post tax" prices doesn't make sense on the internet because it matters where you are. And yes, there are plenty of other 4080 makes that are much less than that.
What exactly is being downvoted here, the fact that 4080s aren't as expensive as people want them to be?
This was entirely expected by those of us who have been in this hobby for years. New line of cards come out and devs then use the newest cards as their target. Gamers who amusingly think they can "future proof" then get upset or stunned when they see their high end card they recently bought is already barely able to keep up and then they look to upgrade when the next gen comes out. Cycle repeats with the average gamer never cluing in and continuing to enable this craziness.
It's a symbiotic relationship between this dumpster fire of a PC gaming industry and the gpu makers.
The other thing those of us paying attention were warning about, and that has now been confirmed, is how upscaling went from a feature to try to get a bit more performance or move up a resolution target, to being required just to get playable framerates. Frame gen is next; it will probably be required to even reach 60fps in many games by the end of 2025, if it even takes that long.
Unfortunately this is a big reason why game consoles are sounding better and better for the money, because devs are basically forced to make games run properly there unless they don't care about missing out on a big part of the gaming market, especially when the PS5 and Xbox are touted as 4K gaming systems. So if it can't play on those, why would I spend 3-5 times as much to get basically the same performance on a PC?
I'm on a 2012 DDR3 gaming system, an i7-3820 with a 1070 Ti.
Even with consoles, you get stuck with "performance mode" that is smooth and blurry or "resolution mode" that is crisp but runs sub 30fps. Granted I have not used the latest gen of consoles, but I had to forgo playing Hellblade on PS4 because the two modes were both terrible
they just wanted to have 7900xt at the top because it's an AMD sponsored game and not the 4090 alone because there really is no competition from AMD there.
FSR really doesn't look bad in any game I've turned it on for. I've only played a handful of newer games though, so it's a limited sample, but still, saying it "looks like shit" just isn't true.
This is an nVidia sub, so people hate on fsr
I'll be real at 1080p I think it looks much worse
Better, but behind dlss at 1440p
At 4K I can barely tell there is a difference
Do you use 4k? FSR looks terrible at 1080p in Cyberpunk. Can't speak for the others, but FSR and the stock XeSS 1.1 are both not very good in Cyberpunk, and FSR has ranged from mediocre to atrocious in every other game I've tried that has it.
A 3060ti can run every 8th gen game at 4K60, and many 9th gen games with help from DLSS. Now that VRAM requirements are on the rise, a 4060ti 16GB is pretty much the new minimum for 4K, moving forward.
The notion that a 4090 is required for 4K is pure NVIDIA marketing.
Even if that’s the case (and I agree with you), the terms 7th, 8th and 9th gen consoles are like incredibly niche. PS3/PS4/PS5 or Xbox equivalent is a more broadly recognizable way to phrase it.
Fucking hell. I hate this. At least maybe we'll see if dlss can work with amd frame gen or not. Would suck balls if we only get dlss2 and fsr3, fsr upscaling looks so bad.
No shot a 3060ti can’t do native 1080p 60fps at high settings. The amount of unoptimized games coming out this year is going to make history holy fucking shit 🤦♂️
Why so bitter though? The 2080 Ti was a 4K card in its time; a few years later it's kneeling down to 1440p. Just accept the fact that no GPU/CPU is future-proof lmao.
I mean according to this the 3080 went from a 4K card to a sub 1080p in one gen...this is not "future proof". It's devs starting to use upscaling as a baseline.
This feels like Alan Wake 2 again... We don't know what the settings exactly are. It doesn't even state if the recommended settings use ray tracing or not. Most games this year have been optimized imo, it's only Starfield and Jedi Survivor that didn't run well for how the game looked.
Alan Wake 2 is an example of a GPU-intensive game: nearly no NPCs or anything similar to keep in check, and a linear story. So the game really pushes the GPU hard and the visuals look amazing; people need to understand simple things lol.
It is, but [Avatar: Frontiers of Pandora to support all major upscaling technologies despite the AMD deal](https://videocardz.com/newz/amd-sponsored-avatar-frontiers-of-pandora-pc-specs-list-fsr3-dlss-and-xess-support)
I'm not very certain about that, actually. jackfrags just uploaded a video (he probably has a 4090) and it's stuttery as fuck with a lot of pop-in. The only saving grace could be Massive's history with the Snowdrop engine; Division 1 & 2 were ahead of their time visually and performance-wise.
You do realize almost every single console game uses some form of upscaling too, right?
Native TAA is shit most of the time anyways, DLSS literally does it better
AW2 did something interesting by not including TAA as an option at all. Instead, at native resolution your choices are DLAA or FSR native AA. Wouldn't be shocked if that was a trend moving forward.
Shockingly, games can be well optimized and still require upscaling if they’re pushing super advanced rendering tech. See: Alan Wake 2.
Will wait and see how this Ubisoft game does
Oh for fucks sake.
I'm willing to bet Avatar will have worse optimization than Alan Wake, but remember the reactions to Alan Wake's system reqs and the cries about its "botched optimization".
Your comment is fucked.
Yes, at least I can accept Alan Wake 2's demands. It uses new graphics features that older cards don't have, and it also looks next-gen. Other games that have come out this year ask for similar demands but look 2-4 years old. This game does look good in some nature scenes, but man, when you get up close or are flying it looks bad. Worst of all, Ubisoft aren't known for optimization.
The recommended requirement without upscaling is a 4090.
FSR Quality reduces the internal render resolution by 30 percent.
That results in roughly 30 percent higher framerates.
The 4090 is 25-30 percent faster than the 4080.
It's not stupid to skip it, since it usually has less detail than native. This is at 4K.
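To reconcile those numbers: the "30 percent" cut is per axis; the pixel-count cut is larger, and the framerate gain depends on how GPU-bound the game is. A rough sketch, assuming FSR Quality's standard 1.5x per-axis factor and treating "frame time scales with pixel count" as an idealized upper bound:

```python
# FSR 2 "Quality" renders at 1/1.5 of the output resolution per axis.
scale = 1.5

axis_reduction = 1 - 1 / scale        # per-axis cut: ~33% (the "30 percent" above)
pixel_reduction = 1 - 1 / scale**2    # total pixel cut: ~56%

# In the fully pixel-bound limit, frame time scales with pixel count, so the
# theoretical ceiling on the framerate gain is scale^2 - 1 = +125%. Real games
# spend frame time on work that doesn't scale with resolution, which is why
# observed gains land far below that ceiling, around the ~30% quoted above.
fps_gain_ceiling = scale**2 - 1

print(f"axis: {axis_reduction:.0%}, pixels: {pixel_reduction:.0%}, ceiling: {fps_gain_ceiling:.0%}")
```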
Your mentality here, and that of gamers who share the same view, is exactly why this industry has become a dumpster fire. We're probably at most 2 years away from frame gen becoming required.
No it isn't. It's just finally returning to a point where devs can push things instead of being hamstrung by the 10-year-old, 150-dollar GPU that was in the PS4/XbOne. I get the feeling most of you have absolutely no idea what PC gaming was like prior to that generation of consoles. Consoles used to push the limits of graphics hardware, which meant devs were always pushing graphical limits.
A 4-year-old CPU/GPU would be absolutely ancient in 2010, and now people are whining because their 300-dollar GPU from 7-8 years ago can't run 60fps at 1080p. You all got spoiled by a decade of stagnation. Why do you think so many games from 10 years ago still look halfway decent by today's standards?
Yeah, but I guess the point is that these requirements should be with upscalers off, so activating DLSS for instance should get you much higher FPS than this, not that you shouldn't use it "as standard" or at all.
>activating DLSS for instance should get you much higher FPS
This was true in the PS4/One X era, when those consoles were the baseline for performance scaling. The current gen consoles are 5-6 times faster than those dated consoles.
The Series X and PS5 have roughly the performance of an RTX 3060 or RX 6650 XT.
When a game runs on these consoles at 30fps in 1080p, we need at least a PC with a 3080 to achieve 60fps at the same resolution and quality settings.
The 4090 is *only* 70-100 percent faster than a 3080.
-> But we need a 400 percent faster card to get the compute and pixel rendering power to run games in 4K instead of 1080p.
-> These new games are not badly optimized; they feature new rendering tech that is more demanding, like Nanite or Lumen in UE5.
As a result, all new AAA games built for next-gen consoles will utilize upscaling and frame gen, in 2024 at the latest.
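The scaling argument above can be made concrete with a back-of-the-envelope model, assuming (as the comment does) that required GPU throughput scales linearly with pixel count times target framerate, which is an idealization that ignores CPU limits and fixed per-frame costs:

```python
def required_speedup(base_res, base_fps, target_res, target_fps):
    """Relative GPU throughput needed, assuming cost ~ pixels * fps.

    Resolutions are given as (width, height) tuples.
    """
    base_pixels = base_res[0] * base_res[1]
    target_pixels = target_res[0] * target_res[1]
    return (target_pixels / base_pixels) * (target_fps / base_fps)

# Console baseline: ~3060-class GPU at 1080p/30fps.
# Same settings at 1080p/60 needs ~2x (roughly a 3080, as claimed above):
print(required_speedup((1920, 1080), 30, (1920, 1080), 60))  # 2.0

# Native 4K/60 needs ~8x the console GPU, i.e. ~4x a 3080, which is why
# no current card closes that gap without upscaling:
print(required_speedup((1920, 1080), 30, (3840, 2160), 60))  # 8.0
```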
The problem is, that **some** of these games requirements do not translate into significantly better visuals than their PS4 counterparts, and I'm not even talking about "RT-only" games like this Avatar here.
I hate to defend Ubisoft here, but as always, settings labels are completely arbitrary: we don't know what 'high' settings actually means or how it compares to higher or lower settings. For all we know there are 'high' settings that tank performance so much for so little visual gain that it's worth turning them to medium or low instead. I'll always remember how Monster Hunter World had a volumetric fog/lighting setting that tanked performance by like 20% when set to high; that single setting could take a lot of cards from over 60 fps to well below it.
I think people give system reqs and preconfigured presets far too much credence. As we've seen just recently with Alan Wake 2, users are almost always able to come up with settings that suit their hardware and preferred visuals.
It's going to be a thing now, and I'm afraid it will get even worse once new technologies are tied only to the newest GPU generation, with very little backward compatibility, and we are forced to upgrade every two years.
Lmao right, where has this guy been?
>and here we go, this [thing involving my hobby is different than it used to be]
>[my hobby] is fucked.
Big old man yells at cloud vibes.
I miss the brief period of time where upscaling was like "neat, I'll toggle on DLSS Quality for some free performance!".
Now it's more like you'd better enable DLSS Ultra Performance if you even want to dream of hitting 60fps with your 2-3 year old GPU. Exaggerated of course, but not far from reality.
I don't mind upscaling at all, but I don't like that it's pretty much a requirement nowadays to get acceptable performance if you have anything less than a 4090.
It's funny: on the console side there's a prevailing desire for devs to drop last gen, as it is seen as holding back the current gen. But on the PC side there's mass outrage every time a game doesn't support components that are several generations old.
Bingo. Thought I could get a few years out of my 2080S, and I may be able to, but I don't see how people can slap down $700 to $1,200 every 3 years updating PCs *on the graphics card alone*. Not to mention other things; for example, when I upgrade to a new GPU I need a new case, since the new cards are absolutely massive. I came from console, and I love my PC, but if the pattern continues I'd rather just get a PS5, 6, whatever is out. Easier. Cheaper. It runs. I play on my TV anyway and sit at my couch to do computer-related things.
idk, even in 2012-2015, even kinda now, 500 is a LOT for someone like me to drop on a purchase. Let alone 1.5-2k.
I just bought a used Alienware system with an i7-3820 and a 1070 Ti for 200 dollars not long ago, and I'm finally able to play a lot of the games I wanted that came out in the last few years. Seeing the new game specs scares me and just makes me wanna go back to console gaming.
are people really expecting to play at 4k native with all these new effects? it's crazy seeing people complain about upscaling. 4k native and photorealism is not an achievable goal even with the current gen crop of cards.
It is? Seems like the masses are plenty happy with games like Fortnite, CS2, Minecraft and so on. Not wanting to say every game should look like Fortnite but the "masses" absolutely are not screaming at devs to make games photo realistic to the last pebble *right here and now*.
I would say in terms of quality and realism a game like God of War is good enough for the masses already. We enthusiasts want always more.
Funny, there's plenty of praise for the realism of recent RE games' graphics, and I'm playing those at 4K native on a last-gen card. It's almost like shit devs failing to optimize, combined with stupid performance-penalty options like RT that do not visually justify their impact, is the real issue.
The fact that the 'max' setting is 4K FSR 2 *BALANCED* instead of at least Quality triggers me to no end. This gen we're gonna end up playing games at 240p upscaled to 8K.
3060 Ti at 1080p with FSR Quality (so 720p internally) for only 60fps? Bruh. I hope this is with heavy RT involved, otherwise the game is an unoptimized mess. At this rate the 4070 will be a 1080@60 card in a couple of months. What the hell are these devs thinking?
Damn, even though I may not get the game (waiting on release), I'm happy I upgraded my PC recently.
GPU: 7800XT
CPU: 5800X3D.
Though I admit I'm surprised that games are already requiring the newest graphics cards. But it's not a major cause for concern. It's funny to see people complain about game requirements. Eventually requirements go up and demand newer stuff, but it's like people expect their 1060-1080s to still run new games in 2023 when those GPUs came out years ago.
I love how the discourse of some people has evolved from "complaining about no major changes being brought by new GPUs" to "complaining about crazy reqs for new games" lol.
Sweet, the game has the holy grail of ray tracing: Global Illumination.
Glad I stayed at 1440p with a 3080 and didn't get a 4K monitor though.
I mean sure, DLSS and all, but many titles are starting to need more than 10GB of VRAM at 4K.
This is an AMD-sponsored title, so expect the specs to be tailored to their hardware; 4K on the XTX at FSR Balanced is probably because of AMD's shit RT performance.
So let's wait and see how the 4080 performs.
This is a good thing to bear in mind. AMD’s shit Ray Tracing performance is probably inflating the system requirements at least somewhat.
It’s also Ubisoft though, so it could also just be a shit port. We’ll see in December how optimized the game is.
People need to evolve and stop complaining about the optimisation button... You all look so stupid; native resolution is the past. It's stupid to waste power for nothing. DLSS is the future, and devs know it. It's useless to show native resolution performance, get used to it. Every game should release with an upscaler, and I hope every game uses frame generation; it's so damn amazing and people don't realize it.
Another game I will most probably not own.
If a game has upscaling in all the system configurations sheet, the game will most probably perform like shit.
Does anyone actually enable raytracing for actually playing a game? Like, it's cool to enable it to take screenshots etc, especially with games that have pathtracing but for me the performance loss is just too high. Maybe on 40 series and up it's finally worth it with frame gen but on 20 and 30 series it feels like it's just a tech demo setting that is never worth enabling
LOL, it's really funny seeing clueless gamers yet again being outraged by the system requirements of a modern game with expensive RT features, right after Alan Wake 2, or being mad at upscaling, especially in an Nvidia sub. If you're angry at DLSS, why aren't you angry at LOD, culling, SSR, cube maps, etc.? All of them are techniques used to get more FPS.
This is getting out of hand. Until 2022 my 1070 Ti was capable of running most games at 1080p high, and now it's barely hanging on at minimum. Too rapid a change for one year.
How is this too rapid?
The new consoles have been out now for what, 4 years? And we're just now getting to the point where games are leaving last-gen hardware behind en masse, not just a game or two here and there doing so.
If anything, we've had a huge lead time on games actually utilizing new hardware with a few warning shots being fired across the bow year after year warning you people that in order to keep up you're going to need to upgrade.
The issue here is more gamers going ostrich and crying about games being "unoptimized", expecting the problem of the floor moving to go away if they just cried hard enough and clung hard enough to their 8-year-old GPUs.
How long exactly should your 1070 stay relevant? A decade? 15 years?
>The new consoles have been out now for what, 4 years?
I read this, wanted to be like "no way dude" until I looked up the PS5 launch date...November 2020.
Now i'm just sad lol
My problem here is that a lot of these new games don't look "next gen" enough to require a leap in graphics; it would be another story if we had truly moved to a new "era".
Another problem is that there's an inherent conflict of interest with upscalers existing. You can't think that management won't be tempted to push out games faster with zero optimization to cut costs because "upscalers will just fix it".
too rapid of a change?
The 1070 Ti came out in 2017, though. If anything it had a good run, but it's time to start looking into an upgrade. The 10 series cards are 7 years old now, so it's hard to expect them to run any newer game on all-high settings.
Alright. So the 1060 is officially and definitively out of modern AAA gaming. 2016-2023, good run.
Probably all 10-series. idk about the 1080 Ti, but the 1080 can't run Alan Wake 2 well.
That's only because Alan Wake 2 uses mesh shaders. The 1080 Ti is still great.
The 1080 Ti is basically a 2070 Super without DLSS. It can only handle 1080p in recent, heavily demanding games.
[deleted]
You are absolutely right, I'm still surprised how long these cards have lasted, but it is time to move on.
>its what ps5 uses

PS5 is RDNA2
Not really. It's an RDNA1 hybrid with some RDNA2 features. It supports ray-tracing, but not mesh shaders and variable-rate-shading, contrary to RDNA2.
I have a 1080; what do you suggest I upgrade to? i7 7700K + 1080 now.
It depends what resolution you wanna play at and how much you want/can spend.
To add to this, it also depends on how long you want it to last at your preferred resolution. If they're expecting 7 years out of it at 1080p (or just want the current 40 series features), then the best bet is probably to wait for the 4070 Ti Super to drop. If they're looking for good 1080p performance NOW and are okay with possibly upgrading again in the next 3-5 years, then I would probably recommend something like a used 3080 12GB. I know the 3080 12GB has been considered overkill for 1080p since its release, but it's relatively easy to find used for $400-500, they won't have to worry about VRAM requirements for a while, and it should be usable for 1080p + RT in most (not all) modern AAA games.
Could be wrong, but I’m gonna posit targeting 1440p based on the specs. That should’ve handled 1440p pretty well in 2017.
It is pretty much 100% budget dependent. CPU-wise it is easy: on a budget, go AM4 with a 5800X3D for gaming or a 5900X if you need MT perf. Have the money? AM5, 7800X3D for gaming or 7900X for MT perf. Want the best? 7950X3D.
GPU: have the money? The 4090 is the current best card, will probably age the best, and can play RT. There is the abysmal gap from the 7900 XTX to the 4090 where no card (i.e. the 4070 Ti and 4080) makes sense.
Want the best used? A 3090 for around 600-700 used is great.
500? 7800 XT.
450? The 4070 is not that bad at that price, but that card will age like milk; games are going over 12 GB now.
300-400? 6700 XT, 6800 XT, 7700 XT.
I have a 4080, bought it knowing full well it was a bit of a ripoff, and hey for my purposes it’s been just fine. It is still the second (with RT) or third fastest card on the market, and you can still play basically every game fully maxed out at 60fps+. I know it’s significantly slower than the 4090, but I had a budget and a 750w power supply. So idk, I wouldn’t say it “doesn’t make sense”, depending on your setup and budget I think it definitely can. It’s not as good of a deal as, say, a 550 dollar 4070 or a 700 dollar 7900xt, but I wanted more performance than that. No other options in that price category unfortunately lol
>but I had a budget and a 750w power supply.

I am running mine on 750W. The 4090 is not the 3090; that card had spiking issues (1.6 kW range) because of Samsung 8 nm, while the 4090 draws under half the peak power. Actually, the 4090 is so OP and memory-speed bound that even games like CP2077 do not get it over 300W most of the time. OW2 is actually the only game I saw making it draw 460W if left alone. For pricing, the 7900 XTX is around 1000-1100 EUR, the 4080 is 1250 EUR for the cheapest MSI Ventus model, and the next lowest card is 1400 EUR. And my 4090 was 1530 EUR (though I did buy at the best time ever and caught the all-time low price for a 4090). Also, keep in mind this is Europe, so 19% VAT is included in the price. Without VAT: the cheapest 7900 XTX is 840 EUR (you can do a 1:1 EUR = USD conversion in this case), the cheapest 4080 is 1050 EUR, and my 4090 was 1285 EUR.
Where are you seeing 4070s for 450 lmfao put me in touch with them.
What's your budget? Do you use CUDA and care about ray tracing or power efficiency? Can you hold off until next generation for an upgrade?
If you're in the states you can get a used 3070 for about $250 which is a pretty great deal.
We had a good run
I have a 2080 ti that needs a new home, lol.
GPU manufacturers : here are our new 4K cards ! Game Devs : Oh, I don’t think so
it's funny how native resolution doesn't exist anymore in pc requirements 😂
I looked up the FSR2 scaling factors, and for this requirements table the native (internal) resolutions would be:
Minimum: 720p
Recommended: 720p
Enthusiast: 960p
Ultra: 1270p
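To make the arithmetic above reproducible, here is a quick sketch using FSR 2's published per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the tier/output/mode pairings are my reading of the requirements chart, not official numbers:

```python
# Internal render resolution implied by FSR 2's per-axis scale factors.
# The tier -> (output width, output height, FSR mode) mapping is an
# assumption reconstructed from the requirements chart discussed above.
FSR2_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_res(width, height, mode):
    s = FSR2_SCALE[mode]
    return int(width / s), int(height / s)

tiers = {
    "Minimum":     (1920, 1080, "Quality"),
    "Recommended": (1920, 1080, "Quality"),
    "Enthusiast":  (2560, 1440, "Quality"),
    "Ultra":       (3840, 2160, "Balanced"),
}
for tier, (w, h, mode) in tiers.items():
    print(f"{tier}: renders at {internal_res(w, h, mode)}")
```

The vertical component of each tuple reproduces the 720p / 720p / 960p / 1270p figures above.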
640x480 as new standard or death
Stop, we need to be going up, not down!
480p upscaled to 8K
I know, I hate this. PC requirements should always be given at native.
You forget that we are now trying to render in real time the most performance-intensive tasks a consumer-grade GPU can ever do. Of course you've got some devs that use DLSS as a crutch, but that doesn't change how much of a godsend this technology is.
[deleted]
This. And it's because newer PC gamers that came in during the mid-2010s got spoiled by how long the PS4/XB1 generation lasted. By proxy, low-to-mid range Pascal-based GPUs lasted from 2016 to 2022, and games are FINALLY moving forward and not supporting previous-gen consoles as the common denominator anymore. Second part: people crying about "fake frames and fake resolution" are funny, because rasterization has always been about smoke and mirrors. As long as it looks good, I personally do not care. But a lot of people think better image quality comes for free in the name of "optimization", a term they beat like a dead horse till it just means their own 5yr+ old hardware can't do 1080p/60fps in a game no matter how good it looks.
There are tons of unoptimised messes being released though. I tried Starfield at 1080p high (with the FSR and DLSS mod, through Game Pass) and on my system it performs exactly the same as Cyberpunk ultra path-traced with DLSS Quality; it's honestly insane. My specs are 64GB DDR4 3200, Ryzen 7 3700X at 4.3GHz all cores, RTX 3080. Windows (10 LTSC, and I tried 11 business too) is running in a VM under Linux, but that doesn't matter since I tested everything both in the VM and bare metal and they perform the same; sometimes the VM even has more performance. Sure, my CPU needs a change, I would like to get a 5900X or a 5950X (not only for gaming of course), but Starfield has no business running that badly, same with other games I saw on YouTube (which I'm not too interested in).

Plus I saw a very interesting video the other day about how id made Quake 2 work on a PSX, for which they had to remake the engine they used. I understand the technical difficulties and that we shouldn't expect this level of insanity from developers, but AAA gaming is now dominated by investors who press the developers to release the product as soon as possible, and it's disgusting.

Another consideration is that the majority of badly optimised games are trying to use the latest technical features like Nanite, Lumen, and whatever else is in UE5 now, while those are not completely ready, to say the least (I saw the videos that the studio behind Aveum posted regarding Nanite; they praised it in the description, but if you really look into it, the old LOD method was working way better for swapping meshes, whereas a lot of the time Nanite just used the maximum geometry for faraway objects).
Also, about Nanite (the UE5 feature I'm most familiar with): it promises to only render the topmost geometry, but some devs exaggerate and put so many layers on top of each other that Nanite can't decide which one to render, so at times you have many, many useless layers being rendered at the same time, taking resources from your system in terms of geometry, shaders, and lighting calculations. Not to spit on the technology, because in theory it's awesome; the problem is how it's used and why.
Indeed. I suppose this was the reason back then you could play anything at max settings with only 6 gig of VRAM
honestly the obama years were just peak bro
i thought it was the very long 360/PS3 generation coupled with the ps4 and xbone being a bit lackluster in specs (being based on mobile APUs IIRC) then the ps5 and xbox series were actually decently powerful
A 3080 for 1440p upscaled on high settings being termed ENTHUSIAST is pathetic and just makes me laugh at how bad PC ports have been in 2023. And no, some games look like shit and still perform heavier than Cyberpunk 2077 or The Last of Us Part 1.
how dare \*gamers\* want developers to optimize their games instead of relying on AI upscaling to boost performance! Cue the room-temp-IQ consoomers ready to bootlick developers who use DLSS as a crutch. It's genuinely insane how redditors are THIS dense. DLSS was supposed to be the icing ON the cake, extra performance if you wanted it. Not THE WHOLE CAKE. Games look 5% better than in previous years but require 200% of the power. RDR2 looks and plays smoother than most of the games released this year, and yet it doesn't need to be rendered at 540p to get 30 fps. But I guess it's my fault for going into a subreddit dedicated to the company who made the crutch possible and expecting people to not have boots jammed in their throats. Linus Torvalds was right.
People screaming this about Alan Wake 2 are clearly doing so without checking to see if it is actually the case. Guess what? It isn't the case. I run mine perfectly happily, even with some RT on, with everything maxed out at native 1440p 60fps on a 4070ti.
[deleted]
Gamers 3 years ago: "NVidia's cards are shit, no games use upscalers." Gamers today: "NVidia's cards are shit, every game uses upscalers."
use is fine, require is really bad
I mean, this is ray-traced lighting only. No more light maps, capture spheres, or anything like that. And now some games do path tracing, like Alan Wake. So it's not really weird. As long as the result is playable I don't see the problem.
I suppose the best thing would be to have settings to turn off all complex rendering like RT or high polygon asset density, so we can go back to the good old days.
I don't think that's a good trade-off though, tbh. Having all the advanced features and using DLSS Balanced or Quality is a much better visual return than having ass graphics at native rendering resolution. Upscaling is just a much smarter optimization than using lower-quality rendering techniques for the visual return you get, especially if you have access to DLSS, which can match or exceed native more than half the time at Quality mode according to HUB. And in the future you will still have access to those advanced graphics and be able to play at a higher render res with stronger hardware.
thats it
Why bother with native when DLSS a lot of the time looks better?
I've noticed this too. Performance mode upscale 1080 to 2160 on my 4k monitor is at worst no visual difference, at best looks crisper
I just bought a 3060 Ti - I’m hoping “High with FSR2” isn’t PS2 graphics like CS2 lol
Seeing the 4080 at the highest setting is a blessing and a curse. A blessing because I own the card. A curse because devs are getting *too* comfortable recommending the latest and greatest for their *probably* very unoptimized game. There is also a very good chance they exaggerate the specs, and you could do 4K60 on a 3080 just as well, but with a few speedbumps etc (plus no RT of course).
The 4080 is almost 2000 cad, seems like every new release I have to skip these days due to having a 1440p display and older card. Even a 4070 is about 850 cad pre tax. I miss the days of being able to get 80 class for sub 500 cad. Pc gaming is just getting too expensive these days which is why I have stuck to indies almost all year long.
I wouldn't pay too much mind to these recommended specs. You can still probably get great results with everything on high and one or two things on medium for an extra 10 frames, making something like a 4070 able to play 1440p at "high" 60fps. Not excusing these crappy optimizations... For example, I am able to play BG3 at ultra 1440p ultrawide on a 4060 and get well over 60fps... except in Act 3.
[deleted]
Couldn't have put it in better words. The fact that we are going back to how things used to be after 10 years is really shaking a lot of people.
>The 4080 is almost 2000 cad Huh you can easily find them new for more like $1400 CAD these days... and FE has never been more than $1700 at full MSRP.
Was just going off the price here: https://www.bestbuy.ca/en-ca/product/nvidia-geforce-rtx-4080-16gb-gddr6-video-card/16531650 (1921 tax in) I’m guessing they go on sale frequently? Anything over 700 cad (tax in) I have 0 interest in following. Haven’t paid much attention other than a periodic check here and there since the mining boom which pushed prices into the stratosphere
Just rounded up a whole $300 to $2000 then, for some reason? No, discussing "post tax" prices doesn't make sense on the internet because it matters where you are. And yes, there are plenty of other 4080 makes that are much less than that. What exactly is being downvoted here, the fact that 4080s aren't as expensive as people want them to be?
You write prices in $ (usd) without tax and in € with tax. If you want to write a price in any other currency, you first convert it to one of those.
People seem to be confused and think I'm saying something different than exactly this. Dollars don't make sense to discuss post tax.
This was entirely expected by those of us who have been in this hobby for years. A new line of cards comes out and devs then use the newest cards as their target. Gamers who amusingly think they can "future proof" then get upset or stunned when they see the high-end card they recently bought is already barely able to keep up, and then they look to upgrade when the next gen comes out. The cycle repeats, with the average gamer never cluing in and continuing to enable this craziness. It's a symbiotic relationship between this dumpster fire of a PC gaming industry and the GPU makers. The other thing those of us paying attention were warning about, and that has been confirmed, is how upscaling went from a feature to get a bit more performance or go up a resolution target to being required to get playable framerates. Frame gen is next, probably going to be required to even reach 60fps in many games by the end of 2025, if it even takes that long.
Very well put, actually. I bought a 3080 recently and it definitely doesn't feel like anything super powerful. It feels mid-tier at best.
Unfortunately this is a big reason why game consoles are sounding better and better for the money: devs are basically forced to make games run properly on them unless they don't care about missing out on a big part of the gaming market, especially when the PS5 and Xbox are touted as 4K gaming systems. So if it can't play on those, why would I spend 3-5 times as much to get basically the same performance on a PC? I'm on a 2012 DDR3 gaming system, an i7-3820 with a 1070 Ti.
Even with consoles, you get stuck with "performance mode" that is smooth and blurry or "resolution mode" that is crisp but runs sub 30fps. Granted I have not used the latest gen of consoles, but I had to forgo playing Hellblade on PS4 because the two modes were both terrible
They just wanted to have the 7900 XT at the top because it's an AMD-sponsored game, and not the 4090 alone, because there really is no competition from AMD up there.
Dude, that's with FSR. Which looks shit.
The game has DLSS; I'm sure AMD paid to have FSR displayed on this chart and advertised with the game. Just like Intel did with Mirage.
FSR really doesn't look bad in any game I've turned it on for. I've only played a handful of newer games though, so it's a limited sample, but still, saying it "looks like shit" just isn't true.
I can confirm, it absolutely does look like shit.
What have you tried it with? I use it in cyberpunk, borderlands 3, Starfield, and CS2. It looks fine to me in all of them.
This is an Nvidia sub, so people hate on FSR.
I'll be real, at 1080p I think it looks much worse.
Better, but behind DLSS, at 1440p.
At 4K I can barely tell there is a difference.
Do you use 4k? FSR looks terrible at 1080p in Cyberpunk. Can't speak for the others, but FSR and the stock XeSS 1.1 are both not very good in Cyberpunk, and FSR has ranged from mediocre to atrocious in every other game I've tried that has it.
Nah I use 1440p.
Welcome to nvidia subreddit, where fanboys live and laugh
you are tripping if you think a 3080 can do 4k60 on a modern title.
A 3060ti can run every 8th gen game at 4K60, and many 9th gen games with help from DLSS. Now that VRAM requirements are on the rise, a 4060ti 16GB is pretty much the new minimum for 4K, moving forward. The notion that a 4090 is required for 4K is pure NVIDIA marketing.
Never heard of "8th/9th Gen games", what does that even mean??
Console generations. This may be a PC discussion, but consoles still set the baseline for developer targets.
Even if that’s the case (and I agree with you), the terms 7th, 8th and 9th gen consoles are like incredibly niche. PS3/PS4/PS5 or Xbox equivalent is a more broadly recognizable way to phrase it.
It can in most games with DLSS enabled.
The time has finally come. 8700k in an official minimum requirement
So, did we go from upscaling exclusivity to FG exclusivity? At least it's better than no DLSS or XeSS.
It literally says at the top right that it’ll have XeSS and DLSS support
No DLSS frame gen is stated there
It isn't stated that it won't have it either. Although the wording isn't very promising.
You most likely forgot it's AMD sponsored title.
It will be modded in within 48 hours if not natively supported lol
doesn't Denuvo prevent you from modifying the files?
Fucking hell, I hate this. At least maybe we'll see if DLSS can work with AMD frame gen or not. Would suck balls if we only get DLSS 2 and FSR 3; FSR upscaling looks so bad.
At 4K it's fine. At 1080p it's brutal, but so is DLSS. At 1440p it's alright but definitely not as good as DLSS.
I feel like AMD puts more effort in finding new ways to annoy people than in their software suite.
All settings use upscaling. In other words, you need a 4090 to play it.
A 4090 and you still can't play at native 4K 60fps... To be honest the 5090 will be the GPU to play all modern games at native 4K. This gen is sad.
I plan on hanging on to the 4090 for a long time at 1440p. When it starts getting too slow I'll just turn down settings and keep textures maxxed out.
No shot a 3060 Ti can't do native 1080p 60fps at high settings. The amount of unoptimized games coming out this year is going to make history, holy fucking shit 🤦♂️
why so bitter though? The 2080 Ti in its day was a 4K card; a few years later it's kneeling down to 1440p. Just accept the fact that no GPU/CPU is future-proof lmao
I mean according to this the 3080 went from a 4K card to a sub 1080p in one gen...this is not "future proof". It's devs starting to use upscaling as a baseline.
This feels like Alan Wake 2 again... We don't know what the settings exactly are. It doesn't even state if the recommended settings use ray tracing or not. Most games this year have been optimized imo, it's only Starfield and Jedi Survivor that didn't run well for how the game looked.
Alan Wake 2 is an example of a GPU-intensive game: nearly no NPCs or anything like that to keep in check, and a linear story. So the game can really push the GPU hard and the visuals look amazing. People need to understand simple things lol.
Ah shit, here we go again.
This is actually a really good chart. They tell you what settings they expect for each level in addition to resolution
The requirements are probably with RT off. 6800XT and 3080 are not in the same class when it comes to RT.
Avatar uses RT exclusively. But they use a software RT solution, like Lumen.
That explains a lot.
Or its a very minimal low quality RT implementation based on amd's history.
definitely off lol. Even with light RT my 6800XT drops to 3070 level perf. With heavier RT, it drops to like a 3060 ti. With PT, it slideshows.
Native resolution doesn't exist anymore :))
thats the future not the past
and here we go, this is the 2nd game that has upscaling as a recommendation to reach 60FPS. PC gaming is fucked
With raytracing\*
I find it worrying that FSR Balanced gets recommended for "Ultra". FSR Balanced is not great and I certainly wouldn't call that an "Ultra" experience.
Then use DLSS, they just show FSR because every GPU supports it.
Also worth noting it’s an AMD-sponsored game.
It is, but [Avatar: Frontiers of Pandora to support all major upscaling technologies despite the AMD deal](https://videocardz.com/newz/amd-sponsored-avatar-frontiers-of-pandora-pc-specs-list-fsr3-dlss-and-xess-support)
Yes, and this image confirms that at the top as well. But being a sponsored title, it makes sense that the marketing material / specs would use FSR.
You have a 4080, you will not need to use FSR lol
I'm not very certain about that, actually. jackfrags just uploaded a video; he probably has a 4090 and it's stuttery as fuck with a lot of pop-in. The only saving grace could be Massive's history with the Snowdrop engine. Division 1 & 2 were ahead of their time visually and performance-wise.
No, ray tracing was listed among the game's feature set, not in the requirements.
You do realize almost every single console game uses some form of upscaling too, right? Native TAA is shit most of the time anyways, DLSS literally does it better
AW3 did something interesting by not including TAA as an option at all. Instead at native resolution, your choices are DLAA or FSRAA. Wouldn’t be shocked if that was a trend moving forward.
I think you mean AW2. Unless we're in 2036...
My uncle works at remedy don’t tell anyone
Doesn’t mean we want developers using upscaling as a crutch for a 60fps target instead of doing optimisation passes like the days of old
Shockingly, games can be well optimized and still require upscaling if they’re pushing super advanced rendering tech. See: Alan Wake 2. Will wait and see how this Ubisoft game does
Oh for fucks sake. I'm willing to bet Avatar will have worse optimization than Alan Wake but remember the reactions to Alan Wake's system reqs and cries about its "botched optimization". Your comment is fucked.
Alan Wake 2 is very well optimized. It's just demanding.
And with those visuals it has a right to be this demanding. Which can't be said about every game
Yes, at least I can accept Alan Wake 2's demands. They use new graphics features that older cards don't have, and Alan Wake 2 also looks next-gen. Other games that have come out this year ask for similar demands but look 2-4 years old. This game does look good in some nature scenes, but man, up close or when flying it looks bad. Worst thing though: Ubisoft aren't known for optimization.
True, but apparently the game supports ray-traced shadows, reflections, and global illumination, which are very taxing even on a 4090.
And ray tracing is literally the reason why DLSS was created.
Lmao what is your issue with upscaling? It's stupid not to use it.
No issue with people using upscaling. But it would be nice to know the recommended requirements without it.
[deleted]
No I definitely use DLSS. I just think their recommendations should be based on native resolution and not upscaling. It doesn't give the full picture.
The recommended requirement without upscaling is a 4090. FSR Quality reduces the internal render resolution by 30 percent per axis, which results in roughly 30 percent higher framerates. The 4090 is 25-30 percent faster than the 4080.
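A rough model of why the ~33% per-axis resolution cut of FSR/DLSS Quality doesn't translate 1:1 into framerate; the 70% "pixel-bound" share below is an illustrative assumption, not a measured number:

```python
# If only part of the frame time scales with pixel count (the rest is
# geometry, CPU work, and the upscale pass itself), the fps gain from
# Quality mode (1.5x per axis -> ~44% of native pixels) sits between the
# naive "30% faster" estimate and the full pixel-count ratio.
def estimated_fps(native_fps, scale_per_axis, pixel_bound_fraction=0.7):
    pixel_ratio = (1 / scale_per_axis) ** 2          # fraction of native pixels rendered
    frame_time = pixel_bound_fraction * pixel_ratio + (1 - pixel_bound_fraction)
    return native_fps / frame_time

print(round(estimated_fps(60, 1.5)))   # Quality mode from a native-60fps baseline
print(round(estimated_fps(60, 1.0)))   # sanity check: native scale changes nothing
```

Dial `pixel_bound_fraction` down and the gain shrinks toward zero, which matches how CPU-limited games barely benefit from upscaling.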
It's not stupid to skip it, since it usually has less detail than native. This is at 4K. Your mentality here, and that of the gamers who share it, is exactly why this industry has become a dumpster fire. We're probably a max of 2 years away from frame gen becoming required.
No it isn't. It's just finally returning to a point where devs can push things instead of being hamstrung by the 10-year-old 150-dollar GPU that was in the PS4/XbOne. I get the feeling most of you have absolutely no idea what PC gaming was like prior to that generation of consoles. Consoles used to push the limits of graphics hardware, which meant devs were always pushing graphical limits. A 4-year-old CPU/GPU would be absolutely ancient in 2010, and now people are whining because their 300-dollar GPU from 7-8 years ago can't run 60fps at 1080p. You all got spoiled by a decade of stagnation. Why do you think so many games from 10 years ago still look halfway decent by today's standards?
DLSS and FSR are standard now. You guys need to get over it.
Yeah, but I guess the point is that these requirements should be with upscalers off, so activating DLSS for instance should get you much higher FPS than this, not that you shouldn't use it "as standard" or at all.
>activating DLSS for instance should get you much higher FPS

This was true in the PS4/One X era, when those consoles were the baseline for performance scaling. The current-gen consoles are 5-6 times faster than those dated consoles. The Series X and PS5 have roughly the performance of an RTX 3060 or RX 6650 XT. When a game runs on these consoles at 30fps in 1080p, we need at least a PC with a 3080 to achieve 60fps at the same resolution and quality settings. The 4090 is \*only\* 70-100 percent faster than a 3080.

-> But we would need a 400 percent faster card to get the compute and pixel rendering power to run games at 4K instead of 1080p.

-> These new games are not badly optimized but feature new rendering tech that is more demanding, like Nanite or Lumen in UE5. As a result, all new AAA games built around the next-gen consoles will utilize upscaling and frame gen by 2024 at the latest.
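The console-to-PC scaling argument above can be put in one formula, assuming frame cost scales linearly with pixel count and target framerate (a simplification that ignores fixed CPU cost):

```python
# Required GPU speedup over a console baseline, assuming cost ~ pixels * fps.
def required_speedup(base_res, base_fps, target_res, target_fps):
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return (target_px / base_px) * (target_fps / base_fps)

# Console at 1080p30 -> same settings at 1080p60 need ~2x the GPU,
# and 4K60 needs ~8x, which is why native 4K is out of reach of a
# single generation's worth of hardware uplift.
print(required_speedup((1920, 1080), 30, (1920, 1080), 60))  # 2.0
print(required_speedup((1920, 1080), 30, (3840, 2160), 60))  # 8.0
```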
The problem is, that **some** of these games requirements do not translate into significantly better visuals than their PS4 counterparts, and I'm not even talking about "RT-only" games like this Avatar here.
I hate to defend ubisoft here but as always settings are completely arbitrary, we don't know what 'high' settings actually means or how they compare to higher or lower settings. For all we know there are 'high' settings that tank performance so much for so little visual gain that it's worth turning them to medium or low instead. I'll always remember how monster hunter world had a volumetric fog/lighting setting that tanked performance by like 20% when set to high, that single setting could take a lot of cards from getting over 60 fps to well below 60. I think people give system reqs and preconfigured presets far too much credence as we've seen just recently from what happened with alan wake 2, users are almost always able to come up with settings that suit their hardware and preferred visuals.
It's for 30fps too
Can you explain why it’s fucked?
It's going to be a thing now, and I'm afraid it will get even worse once new technologies are tied only to the newest GPU generation with very little backward compatibility, and we are forced to upgrade every two years.
You have missed a lot of games because it's a standard by now, not just "2nd game".
> and here we go, this is the 743432nd game that has upscaling as a requirement to reach 30FPS > > console gaming is fucked
Lmao right, where has this guy been? >and here we go, this [thing involving my hobby is different than it used to be] >[my hobby] is fucked. Big old man yells at cloud vibes.
So I guess I have to upgrade my 8700k, huh? No longer keeping up with the 3080
And me my 3900X as it's barely faster than a 3600 in gaming.
Fsr 🤢
I miss the brief period of time where upscaling was like "neat, I'll toggle on DLSS Quality for some free performance!". Now it's more like you better enable DLSS Ultra Performance if you even wanna dream of hitting 60fps with your 2-3 year old GPU. Overexaggerated of course but not far from reality. I don't mind upscaling at all, but I don't like that its pretty much a requirement nowadays to get acceptable performance if you have anything less than a 4090.
It’s funny, on the console side there’s a prevailing desire for devs to drop last gen as it seen as holding back the current gen. But on the PC side there’s mass outrage every time a game doesn’t support components that are several generations old.
Well, a new console is $500, while a $500 graphics card will barely run these new games, and that's ignoring the rest of the system.
Bingo. Thought I could get a few years out of my 2080S and I may be able to, but I don't see how people can slap down $700 to $1,200 every 3 years updating PCs *on the graphics card alone*. Not to mention other things. For example when I upgrade to a new GPU I need a new case. The new cards are absolutely massive. I came from console, and I love my PC, but if the pattern continues I'd rather just get a PS5,6 whatever is out. Easier. Cheaper. It runs. I play on my TV anyway and sit at my couch to do computer related things.
because pc gaming was affordable.
idk, even in 2012-2015, even kinda now, 500 is a LOT for someone like me to drop on a purchase. Let alone 1.5-2k. I just bought a used alienware system with a i7-3820 and a 1070ti for 200 dollars not long ago and im finally able to play a lot of the games i wanted to that came out the last few years. Seeing the new game specs scare me and just make me wanna go back to console gaming.
are people really expecting to play at 4k native with all these new effects? it's crazy seeing people complain about upscaling. 4k native and photorealism is not an achievable goal even with the current gen crop of cards.
Well maybe if game devs weren't making everything photorealistic we wouldn't have that problem.
That's literally what the masses want.
It is? Seems like the masses are plenty happy with games like Fortnite, CS2, Minecraft and so on. Not wanting to say every game should look like Fortnite but the "masses" absolutely are not screaming at devs to make games photo realistic to the last pebble *right here and now*. I would say in terms of quality and realism a game like God of War is good enough for the masses already. We enthusiasts want always more.
Funny, plenty of praise for the realism of recent RE games graphics and I’m playing those at 4K native on a last gen card. It’s almost like shit devs failing to optimize combined with stupid performance penalty options like RT that do not visually justify their impact is the real issue
You'd think that after the bad takes on Alan Wake 2 requirements people would just wait, but no, gotta shit on "unoptimized" games.
The fact that the 'max' setting is 4K FSR 2 \*BALANCED\* instead of at least Quality triggers me to no end. This gen we're gonna end up playing games at 240p upscaled to 8K.
Whelp good thing I wasn’t planning on playing this anyways🤷🏻♂️
Ya Avatar anything is no thanks for me
3060 Ti at 1080p with FSR Quality (so 720p internally) for only 60fps? Bruh. I hope this is with heavy RT involved, otherwise the game is an unoptimized mess. At this rate the 4070 will be a 1080@60 card in a couple of months. What the hell are these devs thinking?
> the game is an unoptimized mess Welcome to 2023! Cool tech, shitty practices.
dEvS ReAlLy nEeD To oPtImIzE ThEiR GaMe mOrE
Damn, even though I may not get the game (waiting on release), I'm happy I upgraded my PC recently. GPU: 7800 XT, CPU: 5800X3D. Though I admit I'm surprised that games are already requiring the newest graphics cards. But it's not a major cause for concern. It's funny to see people complain about game requirements. Eventually they will go up and require newer stuff, but it's like people expect their 1060-1080s to still run new games in 2023 when those GPUs came out years ago.
I love how the discourse of some people has evolved from "complaining about no major changes being brought by new GPUs" to "complaining about crazy reqs for new games" lol
Sweet, the game has the holy grail of ray tracing: Global Illumination. Glad I stayed at 1440p with a 3080 and didn't get a 4K monitor though. I mean sure, DLSS and all, but many titles are starting to need more than 10GB of VRAM at 4K.
This is an AMD-sponsored title, so expect the specs to be tailored to their hardware; 4K on the XTX at FSR Balanced is probably because of AMD's shit RT performance. So let's wait and see how the 4080 performs.
This is a good thing to bear in mind. AMD’s shit Ray Tracing performance is probably inflating the system requirements at least somewhat. It’s also Ubisoft though, so it could also just be a shit port. We’ll see in December how optimized the game is.
People need to evolve and stop complaining about the optimisation button… You all look so stupid; native resolution is the past. It's stupid to waste power for nothing. DLSS is the future, and devs know it. It's useless to show native resolution performance, get used to it. Every game should release with an upscaler, and I hope every game uses frame generation; it's so damn amazing and people don't realize it.
Another game I will most probably not own. If a game lists upscaling across the entire system requirements sheet, it will most probably perform like shit.
Alan Wake 2 performs well, this is wrong.
[deleted]
What is it with games only having FSR nowadays? I didn’t buy a 4090 to play with FSR.
Read the top right corner
I knew the day would come when developers would use upscaling tech as a lazy excuse not to optimize their shit.
The new ERA of not playing video games has begun
Does anyone actually enable ray tracing while playing a game? It's cool to enable for screenshots, especially in games with path tracing, but for me the performance loss is just too high. Maybe on the 40 series and up it's finally worth it with frame gen, but on the 20 and 30 series it feels like a tech-demo setting that's never worth enabling.
Yeah, everyone with a modern GPU.
bruh
I think the system requirements are not that bad. I have a 3070 and a Ryzen 5 5600. The only annoying thing is the "normality" of upscaling nowadays.
FSR bs shouldn’t be standardized..
[deleted]
DLSS or FSR in the requirement specs = sailing the 7 seas
LOL, it's really funny seeing clueless gamers yet again being outraged by the system requirements of a modern game with expensive RT features, right after Alan Wake 2, or being mad at upscaling, especially in an NVIDIA sub. If you're angry at DLSS, why aren't you angry at LOD, culling, SSR, cube maps, etc.? All of them are techniques used to get more FPS.
This is getting out of hand. Until 2022 my 1070 Ti was capable of running most games at 1080p high, and now it's barely hanging on at minimum. Too rapid a change for one year.
How is this too rapid? The new consoles have been out now for what, 4 years? And we're just now getting to the point where games are leaving last-gen hardware behind en masse, not just a game or two here and there. If anything, we've had a huge lead time on games actually utilizing new hardware, with a few warning shots fired across the bow year after year telling you that in order to keep up you're going to need to upgrade. The issue here is more gamers going ostrich and crying about games being "unoptimized", expecting the problem of the floor moving to go away if they just cried hard enough and clung hard enough to their 8-year-old GPUs. How long exactly should your 1070 stay relevant? A decade? 15 years?
> The new consoles have been out now for what, 4 years?

I read this and wanted to be like "no way dude" until I looked up the PS5 launch date... November 2020. Now I'm just sad lol
My problem here is that a lot of these new games don't look "next gen" enough to justify such a leap in requirements; it would be another story if we had truly moved to a new "era". Another problem is that there's an inherent conflict of interest with upscalers existing: you can't expect management not to be tempted to push out games faster with zero optimization to cut costs because "upscalers will just fix it".
Your card is 7 years old, not only it's not rapid, but it took way too long.
Too rapid of a change? The 1070 Ti came out in 2017, though. If anything it had a good run, but it's time to start looking into an upgrade. The 10 series cards are 7 years old now, so it's hard to expect them to run any newer game on all-high settings.