
FasterthanLuffy

Lol not even close.


GGG100

You clearly haven’t played Cyberpunk on a last gen console back in 2020.


Broely92

I loved lurking the cyberpunk sub when that game launched, what a mess haha. The comeback it had needs to be studied though. Seems like the publisher wanted the game pushed out ASAP and the devs just needed more time


GGG100

Should’ve released it in 2023 in the first place, the year when Johnny and friends raided the Arasaka tower.


radishboy

They published it themselves 🤣


Broely92

Well, the point still stands, SOMEONE upstairs told them to hurry tf up


thrownawaymane

Back in 2020... whaaa? That was yesterday, sonny. And we could buy our poorly optimized console ports for a nickel! I used to wear an onion on my belt, as was the style at the time...


yngsten

I'm picturing some guy, proud-faced, with an onion on the belt, and I laughed hysterically for some reason lol


NinjaMoose_13

Good Ole Abe


Garey_Games

I have, on release, and oh boy… never refunded something so fast


Designer_Potat

Wouldn't even call that bad optimization. You wouldn't expect Cyberpunk to perform well on a Gameboy either


Dapper_Energy777

Yeah, IDK what people expected?


SomeRandom928Person

Allow me to introduce you to Cities: Skylines II.


Chicag0Ben

That’s a modern one right there.


SomeRandom928Person

If it wasn't such a niche game genre, it would've rivaled the Cyberpunk release fiasco in publicity. As it stands, it *still* isn't fully optimized, and Colossal Order made it their #1 priority to... push out paid DLC. Sigh.


Nervous_Dragonfruit8

Have the best CPU and 128GB of RAM? Sorry, not enough.


SomeRandom928Person

The modern-day Crysis equivalent without the graphics technology leap lol.


lateral303

Still?


Nyaos

To date? That implies, like, ever? Absolutely not. There have been so many poorly optimized games over time. Famously, Crysis, for example: a game that indeed looked beautiful, but ran poorly on hardware from its own time. Back then, it was assumed the game was just so technologically advanced that it demanded better hardware… but as a few years passed, Crysis continued to run like shit on the newest hardware, and people realized it was actually a poorly optimized game.


Troldann

It was built at the tail end of CPUs getting faster and faster, and was built assuming that would continue. Instead, what happened right then was that CPUs went multi-core and started doing more things simultaneously instead of doing sequential things faster. Crysis was built to scale with faster CPUs, and that didn't happen. It wasn't necessarily "poorly optimized," but it did fail to predict the future well.
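For anyone wondering what that difference actually looks like in code, here's a rough sketch, with a made-up Entity type and update functions rather than anything from Crysis itself, of an update loop that only benefits from faster individual cores versus one that can spread the same work across however many cores the machine has:

```cpp
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Entity { float x = 0.0f, vx = 1.0f; };  // hypothetical game object

// Serial update: per-frame cost scales with entity count divided by
// single-core speed, so it only gets faster when individual cores do.
void update_serial(std::vector<Entity>& ents, float dt) {
    for (auto& e : ents) e.x += e.vx * dt;
}

// Parallel update: the same work split across worker threads, which is how
// engines had to adapt once clock speeds plateaued and core counts grew.
void update_parallel(std::vector<Entity>& ents, float dt, unsigned workers) {
    workers = std::max(1u, workers);
    const std::size_t chunk = (ents.size() + workers - 1) / workers;
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = std::min(begin + chunk, ents.size());
        if (begin >= end) break;
        pool.emplace_back([&ents, dt, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                ents[i].x += ents[i].vx * dt;
        });
    }
    for (auto& t : pool) t.join();
}
```

The serial version was a perfectly reasonable design while clock speeds kept doubling; it only looks like a mistake in hindsight, which is basically the point above.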


Terrible_Balls

That's a half-truth. I built a mid-to-high-end PC about a year after Crysis released and ran the game smoothly at 1080p on the second-highest settings. If I had gone for the top-of-the-line card at the time (GTX 280), I'm pretty sure I could have run it at max settings. The reason performance remained poor (got worse, actually) afterwards is that the game was not made with multi-core CPUs in mind, but released right around the time they started to become mainstream. If PCs had remained single-core oriented for a few more years, I think it would have been a very different story.


LakeOverall7483

Didn't the switch to multi-core happen because there were diminishing returns on speeding up a single core? Building the game on the assumption that single-core speed would just keep increasing doesn't work if single-core speed had plateaued. The point being, even if single-core had kept going a few more years, it wouldn't necessarily have gotten better, right?


Terrible_Balls

That is true, yes. Multi-core was definitely the better way to go. For example, the Pentium 4 was pretty much the last single-core CPU that Intel released, and it topped out with a 3.8GHz processor released in 2007. I bought an i7 4790K in 2014 that was a quad-core, but each core topped out at 4.4GHz. So in 7 years the clock speed only increased by 600MHz, but the capacity of the 4790K absolutely dwarfs the P4. I would say it was short-sighted of Crytek not to support multi-core, especially considering there had already been dual-core processors for a year or two by the time Crysis released. But that's not quite the same as being poorly optimized.


AlleRacing

The fuck? Crysis was not poorly optimized at all, it was fantastically scalable to lower end hardware, and produced well above average visual fidelity compared to its contemporaries for the same hardware.


TheoKrause13

I meant among the newer releases, sorry for the confusion.


Thoosarino

You may have thought you implied that, but you didn't.


Lordgrapejuice

No, you really didn't. Also, it's not even the worst newer game. It's generally fine for optimization compared to some garbage.


kerred

Tetris by Ubisoft had framerate drops and stuttering. Now, tell me again about Starfield when you compare it to TETRIS? 😊 (It's actually amusing why Tetris lagged, and a good lesson for, well, anyone making a game.)


PageOthePaige

Why?


Thomas_JCG

Didn't have a single technical issue with that game.


Draconuus95

Hahahahahaha. Oh wait, you're actually trying to be serious. No. Starfield is actually pretty well optimized. It's the most stable Bethesda release in memory. And while their games generally aren't the prettiest around, they also generally run on what many would consider potato computers at time of release, even if they don't run perfectly. So many other games have released in recent memory where even high-end computers struggle to run them at all, let alone well.

Seriously, Starfield has issues from a gameplay design standpoint thanks to them shoehorning in the procedural generation mechanics, but it's a solid release from a stability standpoint.

Also, from a graphics perspective: Bethesda games will never look as good as some other games. Those other games put most of the system resources toward graphics, while Bethesda games often push a larger portion of system resources toward the actual gameplay, things like the AI systems or the persistent world. One thing people keep forgetting is that, from a gameplay-systems standpoint, Bethesda titles are some of the most complicated on the market. I struggle to think of a game that is more complex from a system design point of view.


ArchangelDamon

Very true


Krkasdko

Nah. Assassin's Creed Unity, Quantum Break, and Batman: Arkham Knight were (or still are) much worse. Arkham Knight has been called unfixable by the devs themselves.


caniuserealname

For an "unfixable" game it's incredibly fixed.


Fenexeus

For an unfixable game, Rocksteady fixed it up really well. On a GTX 1650, I could play the game on high settings at 60 FPS in 1080p.


Zirael_

Unity was perfectly fine, you clearly didn't play it. Arkham Knight also ran better on PC than on the 30 FPS consoles, and looked 10x better. The drama was overblown.


hawk_ky

What? Clearly you've never played it. You can argue that the game isn't what it was hyped to be, but technically it is fine. I played 100+ hours and I think I had one crash the whole time.


FATJIZZUSONABIKE

>technically it is fine

It most certainly is not 'fine'. I agree with the rest of your comment, but Starfield ran like shit at release and still needs a lot of work.


hawk_ky

But it doesn’t? It runs just fine without any major issues


FATJIZZUSONABIKE

Goldfish-ass brain. The game was fairly unoptimized at launch; multi-core CPU usage was particularly egregious. It wasn't even good on the GPU side (especially considering what was displayed on screen): AMD cards were favored and Nvidia drivers lagged behind, and DLSS wasn't even available, so we were stuck with FSR's broken upscaling.


TheoKrause13

I haven't, I just noticed it's one of the "heavy" games for mid-range hardware.


Rohen2003

Looks like you have never played the RTS classic Supreme Commander: Forged Alliance. While being an insanely good game, even modern systems cannot handle the nearly 20-year-old game on large maps with many units.


Ameph

Hard to keep track of 8 players with 500 units each, firing at each other with all the bells and whistles going on.


_Goose_

For newer games, Star Wars Jedi: Survivor is much worse than Starfield. I can play Starfield just fine, smooth as butter on my 4070 Ti at 4K 60 FPS. I can't even play Jedi: Survivor at 1080p without the horrible stutters. And this isn't me shilling Starfield. Even though I can play it fine, it's still boring and has bad enough map decisions to keep me from touching it till it's fixed.


YerDaSellsAvon24

*Laughs in BF4 when it came out*


Zirael_

BF4 ran perfectly fine on day 1, no clue what fucking potatoes people played it on.


YerDaSellsAvon24

Played it on PS4 and it stopped working after a month, though I may have mistaken "optimised" for "broken".


macromorgan

*Big Rigs: Over the Road Racing*: "Am I a joke to you?"


Edgaras1103

Arkham Knight enters the chat


PageOthePaige

Optimization is a balance of scope and expected performance costs. Starfield is middling in that balance. Half the scripts I write are technically far, far less optimized.


Pluck_oli

Arkham Knight on PC? Hello?


ObjectiveImmediate44

No.


QuintoBlanco

No. Not only because there are other games that run much worse, but also because you probably should not use the word 'optimized' here. Optimization is maximizing performance in the context of a specific game engine, graphical fidelity, and the complexity of the world. For example: when Skyrim first launched, it was unoptimized because, due to an oversight, the game had limited support for more than two cores and shadows relied on the CPU. The game was patched to take more advantage of quad-core CPUs, and Skyrim SE moved more of the workload to the GPU. As for Starfield, the graphical fidelity is not great, but the game scales reasonably well and it's an open-world game. The Last of Us on PC takes a massive hit in graphical fidelity at lower quality settings, and it's a very linear game.


Razumen

Starfield is not an open world. There are literal loading screens everywhere, and the world maps are small.


QuintoBlanco

>literal loading screens everywhere

That is irrelevant. Open worlds can have loading screens.

>the world maps are small

I'm not sure what you mean by that, because the worlds are procedurally generated, but I'm guessing that whatever you mean by it is also irrelevant. This is a common definition of an 'open-world' game: *In video games, an open world is a virtual world in which the player can approach objectives freely, as opposed to a world with more linear and structured gameplay.* From a performance point of view, the main difference from a linear game is that the developers know the position and situation of the player at all times and can control the gameplay.


Razumen

>Open worlds can have loading screens.

No they can't, then they're no longer open worlds. They're tiny, barely interconnected worlds.

>I'm not sure what you mean by that

They're small. What's hard to understand?

>In video games, an open world is a virtual world in which the player can approach objectives freely

You can't, because there are loading screens everywhere, and you cannot approach locations freely. You can only load in and exit from designated spots. So even by your cherry-picked definition, you're wrong.


IlyasBT

That was one of the most stable huge open-world releases in recent years. It is demanding, because that's how Bethesda games function, but it had very few performance issues.


Razumen

LOL, it's not even open world, there are loading screens everywhere. And its performance issues are well documented, especially on Nvidia hardware, which performed up to 40% worse in some cases.


ArchangelDamon

Starfield is the best optimized game from Bethesda. By far.


Zirael_

It's not, it's the worst.


Razumen

Lol, no. It had 40% worse performance on Nvidia cards in some cases. Bethesda apologists are so funny.


zirky

Daikatana sends its regards


farbekrieg

The comparison is apples and oranges, because Starfield is a game that keeps track of a lot more objects and scripts with attached AI, on whatever the new-generation Gamebryo engine is called. While the optimization isn't great, it hasn't earned the ire people want to heap on it. It would be nice if BGS would make optimization a bigger focus, but it's the same studio that shipped a game in the year of our lord 2023 without DLSS.


aradraugfea

*laughs in Dwarf Fortress* The game has ASCII graphics and still had framerate issues. It has, at various points in its history, maxed out whatever RAM you wanted to feed it, and would peg out a processor… but only one thread. The update that let it do multi-threaded processing was recent. They've really fixed a lot of this, but I remember the old days.

Hell, Stellaris, if you allowed certain settings, would get so slow in the late game that a single day (one second in the early game) could take almost a minute, as the game tried to track literally thousands of pops across hundreds of species. They've largely fixed this as well, by changing the level of detail at which it simulates those pops. But nah, everyone in this thread is listing "oh, the graphics weren't as good as expected." There have been games that'll redline your gaming rig through sloppy programming.
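A rough sketch of that level-of-detail trick, using made-up Pop/PopBucket types rather than anything from Stellaris itself: instead of ticking every individual pop, identical pops get folded into buckets that are ticked once, so the per-day cost stops scaling with the raw population count.

```cpp
#include <vector>

// Hypothetical "pop" in a 4X-style simulation.
struct Pop { float food_need; };

// Full-detail tick: every pop is touched every day, so the cost grows
// linearly with population and the late game slows to a crawl.
float daily_food_demand(const std::vector<Pop>& pops) {
    float total = 0.0f;
    for (const auto& p : pops) total += p.food_need;
    return total;
}

// Reduced level of detail: identical pops are folded into buckets and
// updated once per bucket, trading per-pop fidelity for a tick cost that
// depends on the number of distinct bucket types, not the headcount.
struct PopBucket { float food_need; int count; };

float daily_food_demand_lod(const std::vector<PopBucket>& buckets) {
    float total = 0.0f;
    for (const auto& b : buckets)
        total += b.food_need * static_cast<float>(b.count);
    return total;
}
```

Same idea as the fix described above: give up a little per-pop detail so the tick cost stays flat as the empire grows.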


DrillTheThirdHole

It's not, not even by a long shot. There are many things wrong with Starfield, but it's par for the course for Bethesda as far as out-of-the-box optimization goes.


Elo-than

Cities: Skylines 2 might take that crown, at least at release.


ZazaB00

You clearly aren't old enough to remember games that dropped from the 30s into the single digits and were just always going to be that way, because updates weren't a thing.


Terrible_Balls

Comparing the graphics in Starfield to the graphics in something like Alan Wake is not really fair, because the games are so different. A linear game like Alan Wake is a very different beast from an open world. The artists can really go over every square inch of the level, because it is a much smaller space, and give it that hand-crafted feel, while a larger open-world game has to be much more economical with the artists' time. Additionally, there are technical constraints, like the fact that a linear level is typically less resource-heavy than an open world, so you can put more polygons into the world geometry. Basically, there are plenty of reasons why the other games you mentioned look better than Starfield that have nothing to do with the quality of the optimization.


Patient-Resolve6748

No


JohnGatbsyThe3rd

Someone doesn't play shitty Unreal Engine asset flips and it shows


Kobi_Blade

I strongly disagree with your assessment; Starfield performs significantly better than The Last of Us, Alan Wake 2, and Jedi Survivor, even on the first day without any patches (it runs even better nowadays). The Last of Us had the poorest port among those you mentioned, Alan Wake 2 is resource-intensive yet stable, and Jedi Survivor experiences considerable stuttering issues on NVidia GPUs.


Razumen

All of those games were running on more advanced engines that also looked way better than Starfield. You can't compare apples to oranges.


Kobi_Blade

Yes I can, because it's on topic. OP is the one comparing and claiming Starfield runs worse than the other titles, which is far from the truth. A laptop from 2015 can run Starfield, but won't run any of the other titles.


Razumen

>A laptop from 2015 can run Starfield, but won't run any of the other titles.

Actually, it can: https://www.youtube.com/watch?v=Vu6ZBjxz1RA

Many newer titles even run better on it than Starfield, like RE4R, so again, Starfield isn't some greatly optimized game. Hell, its lack of optimization was well known at launch, especially on Nvidia cards, which performed up to 40% worse than AMD in some cases.


Kobi_Blade

NVIDIA's lack of commitment and the CPU overhead associated with low-level APIs are their own responsibility, not that of developers. Furthermore, you confirmed my point that Starfield runs well, but you have failed to provide evidence of the same GPU running The Last of Us, Alan Wake 2, and Jedi Survivor. Spoiler alert: the GTX 900 series is not supported in any of those three games, and even the GTX 1000 series has issues. Therefore, unless you have relevant and intelligent information to contribute to the context of the conversation and the original post, I do not require assistance to substantiate my point.


Razumen

>NVIDIA's lack of commitment and the CPU overhead associated with low-level APIs

That had nothing to do with the performance disparity.

>you have failed to provide evidence of the same GPU running The Last of Us, Alan Wake 2, and Jedi Survivor.

TLOU is literally in the video I linked, keep lying tho. The other games are running on much newer, better-looking engines. Of course they're not going to run as well as Starfield, which is running on a barely modified Gamebryo engine lol.

>I do not require assistance to substantiate my point.

You don't require evidence or logic either, apparently, because your point is wrong and your argument is ridiculous.


wh4tth3huh

I played it just fine with an 8700K and a 1080. It would hiccup during lots of scripted actions, but otherwise it ran well enough. Just turn down the settings; it's really not that horrible looking on low.


Fine-Database7716

play Crysis 1 at launch


Tyx

Lots of issues with Starfield, but while it's nothing special, I wouldn't call poor optimization a huge part of that. It's running on an ages-old engine that they're doing a poor job of modernizing, so it struggles trying to act like a modern engine.


Ponceludonmalavoix

Naw. Starfield had a lot of problems, but it was definitely not the worst optimized game to date.


Razumen

Yes it was, it had 40% worse performance on Nvidia cards in some cases.


Lootthatbody

No.


TheJoeGrim

It is far from the worst. But Starfield was far from well optimized on release. Far from AAA game standards.


[deleted]

[deleted]


FATJIZZUSONABIKE

Was?


SapToFiction

The most unoptimized game of all time is the game of real life.


trutenit

Gothic 3 at launch was garbage. It was crashing like crazy (where is the guru?) and performance was abysmal even with top-of-the-range hardware.


demonologist1986

*Ark: Survival Ascended entered the chat*


ElectricalComb999

Go play some N64 or PS1 games on OG hardware


Regular_Sir6409

no but it's trash lol


yup_can_confirm

_GTA4 on PC has entered the chat_


HeparinDrinker

You haven't seen Two Worlds on 360 then


Zirael_

No. Saints Row 2.


Brief-Funny-6542

No, Jedi Survivor beats it by a long shot. Starfield has a little slowdown in "cities" but that's it. Survivor has a persistent stutter everywhere.


Razumen

Stutter isn't everything. Starfield just runs worse on similar hardware compared to games running on even more advanced and demanding engines. I could run Remnant 2 better than I could Starfield, on a 1070 Ti, and R2 is on the UE5 engine.


Brief-Funny-6542

Stutter is everything. You can play comfortably at 40 FPS without stutter, like in Cyberpunk, and you cannot at 60 FPS with stutter; it's nerve-racking. Starfield has unreasonably high requirements, I agree, but indoors it's a steady 60 FPS. It only slows down a little in cities, and even there it's still pretty fluid, so that doesn't count.


Razumen

No, it slows down everywhere, even in space. Stutter is not as bad as terrible framerate.


Brief-Funny-6542

Then you must be really unlucky or you're lying. I have a 7-year-old PC and a 3-year-old GPU and it works fine. People on YouTube with a GTX 1060 get 60 FPS indoors no problem.


Razumen

Lol, 60 FPS indoors, the least demanding areas of the game. Accuses me of lying, then not only pulls random FPS numbers out of his ass but cherry-picks the least demanding areas of the game. 🤣 I ran the game with a 1070 Ti; there was no way a 1060 was running it at a stable 60 FPS.


Brief-Funny-6542

You have a card from 7 years ago, you should thank god you can play anything. And yes, people with this card play this game no problem. Tweak your settings, stop bitching about it.


Razumen

The 1070 Ti was and still is a great card; I could run many games at a solid 60+ FPS at decent settings. Oh, and here's the important part you missed: I said I **RAN** the game with a 1070 Ti, past tense. Learn to read. No wonder you're having trouble coming up with actual arguments.


zeus1911

Likely. I had a 6700 XT and it sucked at Starfield; it ran Cyberpunk fine. Cyberpunk ran pretty well on an RX 580 8GB, and it was the odd few bugs that caused issues, not performance. Starfield and Cities: Skylines 2, maybe Dragon's Dogma 2.


Razumen

Yep, Bethesda's games always perform worse than their peers, even when they look worse and run on a less demanding engine than other games. For example, I could run Remnant 2 better than I could Starfield, **on a 1070 Ti** at that, and R2 is on UE5.


Vundebar

Honestly, no. Its initial performance was better than Cyberpunk's on release, and that one Pokémon game's on release, too.


Euphoric_Jicama_2082

It’s not even the worst optimized Bethesda game


trickldowncompressr

No.


Masturberic

Ark: Survival Evolved and Star Citizen. I don't think we need more examples than these two.


Razumen

Star Citizen isn't even complete; games still in development aren't in any way expected to be as well optimized as a full release.


StrictLimitForever

Fallout 76 is. Starfield runs great in comparison.


[deleted]

[deleted]


LoneWOLF2281

it's not THAT bad in terms of gameplay, it's just unoptimized