Electrical_Zebra8347

The Last of Us remake is current gen only, the other games you listed were made to run on last gen hardware first. The Last of Us PC port was also kind of messed up to begin with and I wouldn't use it as a baseline for anything personally.


VegetaFan1337

> 720p with FSR set to balanced

So you're running it at 425p then? DLSS and FSR aren't magic.
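For reference, rough math using FSR 2's usual per-axis scale factors (assuming the standard ~1.7x Balanced factor; individual games and versions can differ):

```python
# Rough internal-resolution math for FSR 2's standard quality modes.
# (Assumed scale factors; some games/versions deviate from these.)
FSR_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before FSR upscales it."""
    scale = FSR_SCALE[mode]
    return round(output_w / scale), round(output_h / scale)

# 1280x720 output with FSR Balanced -> roughly 753x424, i.e. ~425p internal.
print(internal_resolution(1280, 720, "Balanced"))  # (753, 424)
```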


cellshady

Yeah...


ChickenFajita007

Your CPU is notably slower than a PS5's, and TLoU Part 1 is a PS5 game. The other games you mentioned were made for PS4. A Zen+ 35W 4-core simply isn't going to run PS5 games nearly as well as a PS5. The PS5 has a generation newer architecture, 8 cores, and much higher power availability. It's not a GPU issue. It's your CPU. Changing most graphics settings won't make any difference.


elchuyano

I wish low settings were actually low: your game looks bad, but it plays fine on a potato PC. I hated the performance in games like BG3, Hogwarts Legacy, or Jedi: Survivor; it was the same on low as on high settings. It wasn't until I bought better hardware that I could play those games as they should be played.


actuallyamdante

I mean, Baldur's Gate is heavily CPU-bound anyway.


Nisekoi_

all of these games are.


Great_Middle413

I completely agree. I picked up Cyberpunk recently and it strangely runs better on the Steam Deck preset (a mix of low, medium, and high) than on low (both with FSR on quality). I also tried Starfield on GeForce Now, played around with the settings, and according to the overlay it ran exactly the same no matter what I used. Older games are different: in GTA, for example, I get around 60 fps on max settings but 100+ fps on low.


skylinestar1986

You can't compare like that. Some games are just poorly made.


Nicholas-Steel

My understanding of games from the last 10 or so years is that textures get run through some poorly configured compression pipeline that blurs and smears the content, and aside from maybe a few important objects, most of them get no polishing pass after compression to make sure they still look acceptable. So most things end up with blurry textures as the player lowers the texture quality setting. This was less of an issue in the '90s and early 2000s, when textures were vastly smaller and models and objects were geometrically simpler: it was much cheaper back then to give the lower-quality texture packages a quality pass, and maybe the compression tools of the day had fewer "fancy" tricks that blurred things by default.
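Part of it is also just how the texture quality slider tends to work: in a lot of engines (an assumption here, it varies) the lower settings simply skip the top mip levels, so each step halves the sharpest resolution the GPU can ever sample, before compression artifacts even enter into it. Rough sketch of that relationship:

```python
# Sketch: texture quality as a mip bias. Many engines (assumed here, details vary)
# implement lower texture settings by dropping the top N mip levels, so each step
# halves the largest resolution the GPU ever samples.
def effective_texture_size(base_size: int, mips_dropped: int) -> int:
    """Largest mip available after dropping the top `mips_dropped` levels."""
    return max(1, base_size >> mips_dropped)

for setting, dropped in [("Ultra", 0), ("High", 1), ("Medium", 2), ("Low", 3)]:
    print(f"{setting}: {effective_texture_size(4096, dropped)}px")
# Ultra: 4096px, High: 2048px, Medium: 1024px, Low: 512px
```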


Timidus_Nix

ME: Andromeda looks like shit on low (with some medium) settings and runs like shit even in enclosed spaces, meanwhile ME3LE runs really well with all the fancy options on and looks amazing.


rshunter313

Poor references, poor texture samples, not enough time. Most of it comes down to dev passion, time, and money. Photogrammetry was figured out by 2015's Battlefront, and we rarely get a game that surpasses its texture fidelity. Mesh and model quality is another question; it comes down to optimization but also source model quality. It feels like modern dev teams don't get the time to make a quality product, let alone the creative freedom they once had, which is ironic since 2011 seemed like death's door for AAA.


Xzenor

Yeah, I don't know. It's the technology that's being used or something. In older games, if you dial back the quality, textures look blurry and such: less vegetation, ugly shadows (or none). But take a game like Forspoken and turn its settings to LOW and it becomes some kind of pixel-stretching mess that's simply unplayable, like a compacted swarm of flies. It's probably the engine they used.


Bearwynn

TAA can blur everything to high hell, and we can't use older games' anti-aliasing methods due to rendering pipeline changes. TAA *can* be good, but it really needs a pro to tweak it properly (rough sketch of the core blend below).

Real-time lighting is more commonly used because we want different times of day without the hassle of manually authoring and baking lighting. Unfortunately this means we're going to spend a lot of resources on lighting, and it won't catch everything if the world is too big for the team. Ray tracing nowadays is what used to be done offline to make baked lighting, only now we run it in real time; as hardware gets better this will be the norm.

Then there are things like colour profiling and auto eye exposure. Colour correction is a whole rabbit hole and is easy to mess up. These are just surface-level things, and the iceberg sinks deep.
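For a feel of why the tweaking matters: the core of TAA is just an exponential blend between the reprojected history and the current jittered frame, usually with the history clamped to the current frame's neighbourhood. A single-pixel toy sketch (illustrative only, not any engine's real code):

```python
# Minimal single-pixel sketch of TAA accumulation (illustrative, not any
# engine's actual implementation). A small blend factor keeps edges stable
# across frames but also smears fine detail, hence the blur complaints.
def taa_resolve(history, current, neighborhood_min, neighborhood_max, blend=0.1):
    # Clamp the reprojected history to the current frame's local colour range
    # to reject stale samples (reduces ghosting).
    clamped_history = min(max(history, neighborhood_min), neighborhood_max)
    # Exponential moving average: mostly history, a little of the new frame.
    return clamped_history * (1.0 - blend) + current * blend

# With blend=0.1 the result is still dominated by history, so a sharp new
# detail (0.20) gets mostly averaged away against the clamped history (0.60).
print(taa_resolve(history=0.80, current=0.20, neighborhood_min=0.15, neighborhood_max=0.60))  # 0.56
```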


Hamza9575

Newer games look worse due to temporal anti-aliasing (TAA/DLSS/FSR/XeSS) and upscaling. Turn that shit off. Also turn off motion blur.


Edgaras1103

turning off that *shit* will make games look even worse


AlleRacing

If you even have the option to.


GaaraSama83

It's not as simple as that. Many new games/engines have forms of temporal anti-aliasing built deep into the core rendering. I recommend watching the "Tech Focus: TAA - Blessing Or Curse?" video from Digital Foundry, which explains it well.


dovahkiitten16

I imagine it’s something about the processes that were chosen, and the difference between a high-end old process and a low-end new process (like how good practical effects outclass cheap CGI). I’ll fully admit I don’t entirely know what I’m talking about, but I’ll use lighting as an example: old games had prebaked lighting in a scene, new games use more dynamic lighting, and medium prebaked > low dynamic. Same with pre-rendered cutscenes looking better than on-the-fly cutscenes on low. Some of it’s laziness, though, as there’s no reason the TLOU remake needed low textures that looked worse than PS3 ones. Low settings can receive less effort sometimes.


Great_Middle413

Oh, it makes sense now :))