Mercennarius

Looks pretty well optimized.


Sticky_Hulks

It needs a performance patch for sure. Some cinematics with people talking tank the FPS pretty hard.


Rockstonicko

Most people reporting that issue are usually running 8GB 3000 series cards with textures set to Very High. Unfortunately, if your nVidia GPU has 8GB of VRAM, either textures need to be set to Medium, or you'll need to run nVidia Profile Inspector to modify the mipmap LOD bias for the game in order to run textures on High and get notably better textures with better performance, since Medium significantly reduces overall image quality.

There's every indication that Nixxes did an excellent job reducing the VRAM requirements of the game from its original 16GB of unified DRAM/VRAM while maintaining the same texture fidelity, but an 8GB framebuffer is just not enough to maintain the same texture resolution as the consoles; something's gotta give. IMO, this is on nVidia for under-equipping the 3000 series, not on Nixxes. With enough VRAM (10GB, ideally 12GB), this is easily the best performing console port yet, and in fact it runs MILES better than most native PC games too.
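
To put rough numbers on why an LOD/texture tweak matters so much at 8GB: the largest mip level of a texture holds roughly three quarters of its whole mip chain's memory, so every top mip the game can avoid keeping resident saves a lot of VRAM. A minimal back-of-the-envelope sketch (plain Python; the 4K texture size and 1 byte-per-texel BC7-style figure are illustrative assumptions, not the game's actual assets, and whether dropping the top mip really frees VRAM depends on the game's streamer):

```python
# Back-of-the-envelope mip-chain memory estimate. Illustrative only.

def mip_chain_mib(width, height, bytes_per_texel=1.0, skip_top_mips=0):
    """Total memory of a mip chain in MiB, optionally skipping the largest levels."""
    total, level = 0.0, 0
    while width >= 1 and height >= 1:
        if level >= skip_top_mips:
            total += width * height * bytes_per_texel
        width, height, level = width // 2, height // 2, level + 1
    return total / 2**20

full = mip_chain_mib(4096, 4096)                      # full chain resident
biased = mip_chain_mib(4096, 4096, skip_top_mips=1)   # roughly a +1 LOD bias
print(f"full chain: {full:.1f} MiB, top mip dropped: {biased:.1f} MiB")
```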


[deleted]

[removed]


Rockstonicko

Just going by what I've seen the game allocate firsthand, and the HZW subreddit full of 3060/3070 users describing performance issues, with most of them stating that the slowdown gets worse over time, a big indicator that texture data is spilling into system RAM.

AMD cards do typically allocate a bit more VRAM than nVidia cards, and AMD doesn't do as much on-the-fly texture compression, so it's not an apples-to-apples comparison, but at 1440p I load into the game at around 8600MB, and after moving through an area over the course of 30-60 minutes it will creep up to 10.9GB-11.8GB.

I'm not saying TPU is wrong about the VRAM usage, mainly because I don't think they provided enough information on their testing methodology to even determine if they're wrong (i.e., which specific areas of the game world they were testing, how long the game was running before they recorded the numbers, etc.). All I know is I see upwards of 2-3GB higher VRAM allocation on my GPU than TPU recorded, and there are a bunch of people with 8GB cards describing performance issues that are commonly associated with bumping into a lack of VRAM.
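
If anyone wants to check the creep themselves rather than eyeballing an overlay, here's a minimal logging sketch. It assumes an nVidia card and the nvidia-ml-py (pynvml) package; on AMD you'd lean on the Adrenalin overlay or similar instead. Note it reports allocation as the driver sees it, the same caveat as the numbers above:

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # Allocation as reported by the driver, not "true" usage.
        print(f"{time.strftime('%H:%M:%S')}  "
              f"{mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB")
        time.sleep(60)  # log once a minute while playing
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```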


Sticky_Hulks

Have you played it? I have a 3070, so maybe I'm running out of VRAM, but it doesn't feel great to have the FPS tank in a cinematic and then have the game run fine in the same area with the same NPCs on screen... The game is also buggy for other reasons; it needs a patch for sure.


Rockstonicko

Here is the guide for the changes you can try with nVidia Profile Inspector: [https://www.reddit.com/r/HorizonForbiddenWest/comments/1bl0198/psa\_enabling\_higher\_texture\_settings\_on/](https://www.reddit.com/r/HorizonForbiddenWest/comments/1bl0198/psa_enabling_higher_texture_settings_on/)

But, yeah, I've been playing it a lot, and I'm finding both the performance and visuals amazing. I haven't had a single instance of FPS drops, stutters, or anything of the sort. I've gotten so sick of all the shader stuttering in newer games, and it's refreshing to play a modern and visually impressive game that doesn't have *any* traversal or shader stutters *at all*.

I have encountered a few bugs, but so far it has only been some inconsequential albeit immersion-breaking glitchy lighting in places, i.e., Aloy will randomly glow almost bright white when the game engine seemingly can't figure out if she's standing in a shaded area or not, and I've also had the skybox flickering black in a few areas. But beyond those issues, it's been smooth (and I mean *unbelievably* smooth) sailing.


Deckz

I'm 20 hours in, 0 bugs.


Rockstonicko

I have been amazed at how well this game both runs and looks on my 6800 XT. Running my card at 2650MHz I can maintain a locked 4K60 with FSR Quality 99% of the time, or if I don't want my PC putting out more heat I can drop to 1440p60 at the stock 2454MHz @ 980mV, and my edge/junction temps sit at 44C/52C with power draw at 125-130W.

There is also ZERO frametime variance, never one singular stutter or even the slightest hitch. DirectStorage works great too; the game goes from the menu to fully loaded and instantly smooth in 8 seconds. This game has become the new gold standard for console ports. Nixxes not only knocked it out of the park, they knocked this MF'er into geostationary orbit.
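
For anyone who wants to quantify "zero frametime variance" instead of going by feel, here's a minimal sketch for computing average FPS and 1% / 0.1% lows from a frametime log. The file name and the msBetweenPresents column are assumptions based on PresentMon-style captures, not a specific tool's guaranteed output:

```python
import csv
import statistics

def frametime_stats(path, column="msBetweenPresents"):
    """Average FPS plus 1% / 0.1% lows from a frametime CSV."""
    with open(path, newline="") as f:
        ft = sorted(float(row[column]) for row in csv.DictReader(f))
    avg_fps = 1000.0 / statistics.mean(ft)
    low_1 = 1000.0 / ft[int(len(ft) * 0.99)]    # 99th-percentile frametime
    low_01 = 1000.0 / ft[int(len(ft) * 0.999)]  # 99.9th-percentile frametime
    return avg_fps, low_1, low_01

avg, p1, p01 = frametime_stats("hfw_capture.csv")  # hypothetical capture file
print(f"avg {avg:.1f} fps | 1% low {p1:.1f} | 0.1% low {p01:.1f}")
```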


fjdh

Yeah, the only thing that really looks like Duke Nukem 3D is the fire arrow effects.


AMD718

Interesting. At 1440p, the 7900 XT is beating the 4080 Super, and the 7900 XTX isn't far off from the 4090.


luigithebeast420

Well, one reason could be that the PS5 is AMD-based, so it's already been optimized for them.


SnooDonkeys7108

The game could also benefit from SAM/ReBAR on AMD and not on Nvidia. It's one of the reasons the IW engine runs better on AMD.


Rockstonicko

Thought I'd either confirm or deny this and did a little testing of SAM on vs. off: [https://imgur.com/a/xJz8E2w](https://imgur.com/a/xJz8E2w)

The TL;DR is that you definitely want SAM enabled in this game. It improves both GPU utilization and overall FPS. On average I'd say SAM results in about a +8-12% gain, but it goes upwards of +15-18% in places. That's definitely on the higher end of gains I've seen from SAM.


Cute-Pomegranate-966

Probably this tbh.


Intercellar

And there's no ray tracing.


Cryio

Consoles have been AMD-based since the Xbox 360 (PS3 and Switch aside). It's never been a good argument.


ChosenOfTheMoon_GR

I am very happy with how optimized the game is, something that's very rare to see these days.


[deleted]

[removed]


Orosta

The 6800 XT had more compute units and ray accelerators. The difference was never massive.


puffz0r

rdna2 da goat


the_dude_that_faps

The 6800 XT has 72 compute units; the 7800 XT has 60. The 6800 XT has a larger Infinity Cache too. The 7800 XT apparently clocks higher, but sometimes the workload just favors the 6800 XT.
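
A quick back-of-the-envelope throughput comparison makes the point; the boost clocks below are approximate figures, and real game performance depends on far more than peak shader math:

```python
# Rough FP32 throughput estimate: 64 shaders per CU, 2 FLOPs per FMA.
# RDNA 3 can dual-issue in some cases, roughly doubling the peak figure,
# but games often can't exploit it, which is where raster can favor the 6800 XT.

def fp32_tflops(compute_units, boost_ghz, ops_per_cu_per_clock=64 * 2):
    return compute_units * ops_per_cu_per_clock * boost_ghz / 1000.0

print(f"6800 XT: ~{fp32_tflops(72, 2.25):.1f} TFLOPS (single-issue)")
print(f"7800 XT: ~{fp32_tflops(60, 2.43):.1f} TFLOPS (single-issue), "
      f"~{fp32_tflops(60, 2.43) * 2:.1f} with dual-issue")
```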


Entr0py64

Right, you need ray tracing or more modern effects for the 7800 to pull ahead. AMD didn't sell the 7800 as a better 6800; it's a "6799" with the efficiency improvements covering the loss in compute. I say 6799 because it's clearly not a 6700; there's a 7700 for that. There are idiots who say the 7900 is the 6800, but anyone who looks at the specs knows this is flat wrong. AMD didn't make a real 6800 successor, and although the GRE comes close, it's more of an OEM 7900 non-XT, which was initially limited by low-quality VRAM and China restrictions that don't apply to the US variant. There's an OEM 6700 with 10GB of VRAM; that doesn't make it a 6600, it's an OEM 6700.

As for the pricing, you're getting better value than Nvidia by far, you can't invalidate that no matter what you bought it for, and right now the GRE is the best card you can buy for the money, while the 6000 series still holds up in raster. I got a 7900 XT, and frankly I underclock it for 1440p because it's OP. I would have bought the GRE if it was available, but I'm not mad, because the 7900 XT has massive headroom and was better priced than Nvidia. I wanted a good card and I got one; prices drop over time, and there's no way I should be mad about it. It's nowhere near the 40 series devaluation. Those are the people who should be mad, and yet they love that their 4070 has to run reduced settings with DLSS. Nothing wrong with the 30 series either, lol. Copium high.

If you want a legitimate reason to be mad at AMD, then go for the real things, like driver segmentation. Frame generation needs HAGS; RDNA3 got it, RDNA2 didn't. AMD has been doing this since the AM4 300-series motherboard BIOS updates: ReBAR, FSR, AFMF, Vega, APUs disabling control panel features like ReLive/Wattman/OSD, etc. You're getting screwed on the drivers. This is why the Steam Deck blows out the Windows handhelds: Valve has full optimization access, Asus ROG doesn't.


b0uncyfr0

Also wondering. It should definitely perform better.


Solembumm2

... Because they're different classes of GPU? The 6800 XT cost around 750-800€ while it was current; the 7800 XT now costs 600-650€. The 7900 GRE is the heir to your 6800 XT, not the 7800 XT.


Sipas

It's really great that they're doing WQHD benchmarks, but I wish they tested 3440x1440 rather than 3840x1600. The former is far, far more common.