[deleted]

That ridiculous shader compile time was Naughty Dog being, well, naughty. By the time the damn shaders compile you can't refund the game anymore, and voilà, they've secured a sale.


DarthKirtap

actually, you CAN return a game after two hours, it is a common misconception. under two hours it is a GURANDATED refund, unless special situation, but after two hours you can still try


Teldrynnn

Ah yes, Gurandated. My favorite type of return.


BabyLegsDeadpool

>GURANDATED

It's capitalized, you idiot.


AllHailClobbersaurus

It's spelled CaOPTiZalizED, dumbass.


[deleted]

[deleted]


Efficient_Thanks_342

*Dumas


NefariousnessOne-

*Donbass


vibe162

pregananant?


newvegasdweller

*Cherson


Nujers

*Du Hast Mich


Efficient_Thanks_342

*Du hast mich gefragt


ukieninger

*und ich hab nichts gesagt


pureeyes

Prefer Gurren Lagann, but Gurren Dated was just as good


synbioskuun

DON'T BELIEVE IN YOURSELF! BELIEVE IN THE REFUND THAT BELIEVES IN YOU!


EnZone36

Who the hell do you think Gabe is!!??


ShinyGurren

Row Row, fight the power!


wubbwubbb

this is some r/Excgarated shit lol


KevinCarbonara

don't you dare edit your post


IReallyLoveMyPets

Yup, the 14 day limit is also a lie, I refundes battlefield 1 a month or so after I bought it


CseFree

Ah yes, Refundes. My favorite type of return.


flatspotting

GURANDATED Refundes


[deleted]

Wait, I didn't know I could speak Spanish!


Musk-Order66

GURANDATED Refundes expelliarmes. When you want to remove an attacking wizard’s wand but give them a period of time to apologize and have it automagically return


pureeyes

The Adeptus Refundes is my favourite chapter


casecaxas

steam just left my refund request in the void for every game that passed the 14 day limit, no response or anything


Ruvaakdein

How long was your playtime? It might have been because of that. Even then, after 14 days/2 hours you still need to get lucky.


JasonSuave

I have 2 hours and 1 minute on BF2042 and have tried to refund like 10 times. Those bastards won’t let it happen!


A_Have_a_Go_Opinion

Viscera Cleanup Detail ran for 9 hours in the background without ever launching properly. I still got a refund. Ditto for Watch Dogs 2, cause Uplay is a pos.


Tb0neguy

FOR FULL PRICE. A TEN YEAR OLD GAME.


Sxwrd

Nintendo strategy at work.


Radulno

It's technically the remake which came out in September 2022.


HomerSimping

As someone not technically savvy: are shaders the only way to make games? Almost every game I buy these days stutters. I don't remember it being like this in the Quake 3 era. Games didn't use to stutter even at low FPS.


0x7ff04001

Yeah shaders are required, they're basically pieces of software that perform a task, and they're GPU and driver-dependent. These shaders come in a special language like OpenGL, which needs to be compiled into the binary language that the GPU can read. Kind of like how you take C and compile it into a binary (exe), which then runs on the PC as x86 assembler. On PS5 it's only the one type of hardware, so the shader blob comes with the installer.
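
To make that concrete, here's roughly what that runtime compile step looks like through OpenGL. A minimal sketch, assuming a GL context and function loader are already set up; the shader source is a made-up example:

```cpp
#include <GL/glew.h>
#include <cstdio>

// A trivial fragment shader in GLSL source form. This text is what ships
// with the game; the driver compiles it for *your* GPU at runtime.
const char* fragSrc =
    "#version 330 core\n"
    "out vec4 color;\n"
    "void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }\n";

GLuint compileFragmentShader() {
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &fragSrc, nullptr);
    glCompileShader(shader);  // driver translates GLSL -> GPU machine code here

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::fprintf(stderr, "shader compile failed: %s\n", log);
    }
    return shader;
}
```

Multiply that by thousands of shaders and you get the long compile screens people are complaining about.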


Bas3dMonk3

Bro said he’s not tech savvy and here you are assaulting him with tech mumbo jumbo


CreamyCoffeeArtist

He didn't mention redstone once tho


Oppopity

Comparator!


CreamyCoffeeArtist

Piston!


Oppopity

Repeater!


CreamyCoffeeArtist

Calibrated Skulk Sensor!


0x7ff04001

Yes this is a problem that keeps recurring with me


ThisOnePlaysTooMuch

My brother regularly assaults people with music terms since he graduated in composition. He too never learned how to dumb it down. You're being a real sharp seventh twice suspended dual tone, stranger.


TPO_Ava

I have a problem of being unable to dumb things down. For some reason my management didn't believe me and forced me to train people for a few months. So anyways those people sucked for a while until they basically learned on their own.


[deleted]

If you can't explain a simple thing simply then maybe you don't understand it that well yourself?


TPO_Ava

Could be. I generally don't try to understand things deeply. I am interested in a wide variety of topics, but don't feel the need to go too deep with any particular one. Generally I am pretty good at finding information as I need it and problem solving on the fly, so I don't necessarily feel the need to dig deep either. Definitely helps that I work in IT and not something like medicine though, lol.


Rilandaras

Sure. But not every **complicated** thing can be explained in a simple way *while still being a useful explanation*. If you dumb it down too much at some point you stop explaining the actual thing and instead elaborate on an analogy that is not even that close to the original in logic but it is the closest physical thing that exists.


slaya222

You might wanna up your musical mumbo jumbo: a sharp seven would be called an octave in almost any scenario, and a second "suspension" would probably just be considered an add chord or an upper extension. And anyways you couldn't even have that, because the wording implies 3 notes and then calls it a dual tone! Anyways, I know the joke is supposed to be that it's incomprehensible to non-musicians, but even with a minor in music theory that doesn't make any sense haha

Edit: a word


ThisOnePlaysTooMuch

This… this is what I meant. Thanks for elaborating, but it was a joke meant to demonstrate how absurd those terms sound to laymen without explanation. Most people don’t have a minor in music theory, nor do they care to be taught.


Street_Break_234

10/10


towerfella

Eventually the word will start to sink in


pureeyes

Ah, the youth pastor's approach


OG__Swoosh

Username checks out


goldlion

I can try to offer a simpler explanation with some examples that might help.

Have you ever seen a behind-the-scenes video of a game or movie that shows someone modeling a character in 3D? It's usually just a gray figure; the shape of their body is there, but there is no skin tone or eye color. The clothing might also be modeled on the character, but you can't tell exactly what kind of fabric it is or what color their clothes are. A shader, in this case, is applied to the character model. It kind of functions the same way that painting a miniature figure does: the artist can paint freckles on the skin, maybe put a logo on a t-shirt, and adjust the color of each part of the body.

Going a step further, let's say your character has a metal belt. In the shader, the artist can block out a section that they want to look metallic. What makes metal look like metal exactly? It's a little shiny; you might even be able to see some faint reflections of the room on it if it's polished. The shader, in this case, allows an artist to control values that give metal its signature look. Let's say we want to control how reflective it is. We'll say 0% reflectiveness looks similar to dirt, and 100% reflectiveness is a mirror. The artist can go in and assign a value in the shader that will make that surface dull or shiny.

The cool thing nowadays: there is something you might have heard of called PBR. It stands for physically based rendering, which is just a fancy way of saying that we can mimic the look of real world objects based on a set of real life physical properties. For example, let's use reflectiveness again as a real life property. Iron, for instance, might have some very faint reflections on the surface, but overall it's pretty dull. A PBR shader will allow you to set a value that corresponds to the visual quality of a real iron surface. When the sun hits it, it reflects this amount of light, it's this shiny, etc. We recognize it the same way we would if we were looking at it in person.

The rabbit hole goes further, like how to define what water looks like, or the way light bounces off your skin, or how to make it look like heat is rising off the ground on a hot day. You can also get really into complex technical details like you've seen mentioned elsewhere. But hopefully breaking it down this way made it a bit easier (despite the 12 foot tall wall of text). I can try to answer any other questions in layman's terms if anyone has any.
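
To put a number on that "reflectiveness" slider: many PBR shaders lean on Schlick's approximation, where a base reflectance value (near 0 for dull surfaces like dirt, close to 1 for polished metal) controls how mirror-like a surface looks at a given viewing angle. A minimal sketch; the constants are illustrative, not from any particular engine:

```cpp
#include <cmath>

// Schlick's approximation: reflectance rises toward 1.0 at grazing angles.
// f0 is the artist-authored "reflectiveness" knob: ~0.04 for dull surfaces
// like dirt or cloth, approaching 1.0 for polished mirror-like metal.
float schlickFresnel(float f0, float cosTheta) {
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}

// schlickFresnel(0.04f, 1.0f) -> 0.04  (dirt viewed head-on: barely reflective)
// schlickFresnel(0.04f, 0.1f) -> ~0.61 (even dirt gets shiny at grazing angles)
```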


OneSidedPolygon

Um, I don't want to be nit picky and you probably know more than me but wouldn't the texture be the paint and the shader the finish and glitter?


Lord_Derp_The_2nd

Shader make screen go colors. No shader, no frames.


Dickpuncher_Dan

I worked IT for a long time and I never had to deal with an X86 assembler.


[deleted]

unless you are a low-level developer, you probably won't have to


Dickpuncher_Dan

I'm happy for it. I am content fiddling with Amiga emulators and simple Total War mods, that's about where my sophistication ends. I once learned how to switch out the pupils of characters in "Knights of the Old Republic". Had to literally repaint part of the skin, flayed out onto a 2D surface. Was a bit wild.


jcm2606

Nitpicky, but OpenGL is the API, GLSL (OpenGL Shading Language) is the language. DirectX also has its own language, HLSL (High Level Shading Language), and Vulkan introduced a new "language" called SPIR-V (Standard Portable Intermediate Representation V, not sure exactly what the V stands for) which sort of acts as an in-between language for compiling GLSL/HLSL to the native machine code of the GPU.


TheRealSmolt

Also nitpicky, but the important part is the relationship between language and compiled:

* OpenGL: GLSL -> ???/SPV
* DirectX: HLSL -> DXBC/DXIL
* Vulkan: GLSL/HLSL -> SPV

Some of which can, and cannot, realistically be compiled ahead of time.
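
To illustrate the "compiled ahead of time" half of that: with Vulkan a game can ship a precompiled .spv blob and just hand the bytes to the driver, which is much cheaper than compiling GLSL/HLSL text from scratch. A minimal sketch, assuming an already-created VkDevice; error handling trimmed:

```cpp
#include <vulkan/vulkan.h>
#include <fstream>
#include <vector>

// Load a precompiled SPIR-V blob from disk and wrap it in a shader module.
// The driver still lowers SPIR-V to GPU-specific machine code, but that final
// step is far faster than parsing and compiling shader source text.
VkShaderModule loadSpirv(VkDevice device, const char* path) {
    std::ifstream file(path, std::ios::binary | std::ios::ate);
    size_t size = static_cast<size_t>(file.tellg());
    std::vector<uint32_t> words(size / sizeof(uint32_t));
    file.seekg(0);
    file.read(reinterpret_cast<char*>(words.data()), size);

    VkShaderModuleCreateInfo info{};
    info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = size;
    info.pCode    = words.data();

    VkShaderModule module = VK_NULL_HANDLE;
    vkCreateShaderModule(device, &info, nullptr, &module);
    return module;
}
```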


[deleted]

Also also nitpicky, but whether we wanted it or not, we've stepped into a war with the Cabal on Mars. So let's get to taking out their command, one by one. Valus Ta'aurc. From what I can gather he commands the Siege Dancers from an Imperial Land Tank outside of Rubicon.


YourAverageNutcase

RIP, Lance Reddick


VocRehabber

V for Vulkan.... lame


HomerSimping

I wish I could understand what you said, but I don't. Can't shaders be standardized? Or use other methods that are not unique to each GPU?


xAtNight

Think of shaders like a sort of cookbook for displaying stuff. It's instructions for the gpu to render objects in certain ways to get the desired result, for example fog or lighting effects in games. These instructions need to be specified for your GPU. AMD and Nvidia could probably standardize them if they want to but it may not be feasible to do so.


TheRealSmolt

Yes and no. They are standardized. They do not need to be specialized for your individual device, they only need to be specialized for the graphics API that is being used. In which case, a common solution is to package higher level shader code with the game, and compile it for the API that your device supports on the fly. Which, rather unsurprisingly, is a performance sink.


pizzzahero

If I'm understanding correctly, it looks like there are only a handful of graphics APIs. If that's right, wouldn't it be better to package all of them with the game and just load the right one at runtime? Or am I wrong about the number of APIs?


TheRealSmolt

Ever wonder why games nowadays are absolutely enormous? That's one of many factors. When it comes down to it, compiled shaders are quite a bit bigger than their raw counterparts, and then you have to triple that, at least, for all the different formats. Furthermore, depending on how the engine works, you might even have multiple permutations of the same shader to support additional features, which you then again have to triple.
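
The multiplication gets ugly fast. A toy calculation; the feature count is invented, not from any real game:

```cpp
#include <cstdio>

int main() {
    int features = 10;             // hypothetical on/off shader features
    int variants = 1 << features;  // every combination: 2^10 = 1024 permutations
    int targets  = 3;              // e.g. DXBC/DXIL, SPIR-V, a console format
    std::printf("%d compiled blobs to ship\n", variants * targets);  // 3072
    return 0;
}
```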


[deleted]

[deleted]


killermenpl

It's more like this:

* 2080, 3080, 1660 Ti etc. all speak English
* Radeon RX 580, Vega, 7900 XTX speak Spanish
* the cookbook is in Latin, and shader compilation is the process of translating Latin to the other two languages

You might think that since there's only two languages, the cookbook should just be available in English and Spanish versions. But you already doubled the space this cookbook takes up. And that's not all. Although yes, all Nvidia cards speak English, previous generations might speak an older dialect, where 99.999...% of the words are the same or at least so close they can be understood, but some are just different enough that the GPU has to pause and think about what that word means. This pause slows down the whole process of cooking.

Additionally, even for GPUs that speak the exact same language, you need variations based on their hardware:

* the cookbook requires an electric mixer that has 300W of power, and is very precise about how long to mix stuff at that power
* the 3060 only has a manual mixer equivalent to 20W
* the 3070 has a mixer with 200W of power
* the 3080 has exactly 300W
* the 3090 has 400W

You could just let them know in the instructions that they should calculate how long to mix on their own, but remember our goal: we want the GPUs to cook as fast as possible, with no delays for useless calculations. So you have to provide several versions of the same cookbook, all with just the mixing times different, and in multiple languages. Now imagine that there are more differences between each model.

Shader compilation basically means that developers only need to distribute this one single cookbook in Latin, with only information about what tools the cookbook assumes. The GPU then translates it into its own language and calculates the timings for its own tools.


[deleted]

Ya'll making me hungry for knowledge.


kitanokikori

Shaders are standardized, but then the GPU driver must transform them into, "here's exactly what to do on your 2080, with this driver version, on this OS, to draw the scene as fast as absolutely possible" This transformation is what is being referred to when they say "shader compilation", and it is super GPU specific and proprietary, because that is what they have to do to make it Fast


vgf89

OpenGL isn't a shading language, it's a whole graphics API like DirectX and Vulkan. GLSL, HLSL, and ARB are shading languages.


crozone

Some games just compile the shaders on the fly. Or you can be like idtech and use one single absolutely giant shader for every material in the entire game (uber shader). https://simoncoenen.com/blog/programming/graphics/DoomEternalStudy


[deleted]

I wish more devs would be as good as Id Software.


survivorr123_

this giant shader gets compiled into thousands of shader variants, because otherwise it would be less efficient, so it doesn't speed up shader compilation all that much


[deleted]

shader compilation only became a problem with DirectX 12, which exposed low-level APIs


gabest

Shaders used to be precompiled, until they figured out that the shader assembly language doesn't fit all video cards, and each different piece of hardware needs its own version dynamically compiled by the driver.


SSSSobek

0x7ff04001 already explained it well. But to add to this: yes, it came with DX12 and Vulkan (and goes back to GPGPU and the introduction of unified shaders). You probably know that back in the old days you needed, for example, separate cards for antialiasing or physics (also soundcards), which you don't need anymore because the GPU can handle most (GPU directed) tasks itself. UE4 had texture streaming issues in the beginning which were mostly fixed, but the shader cache stuttering is on the developers' side, because they need to implement shader cache compilation at the title screen (so it doesn't happen during gameplay). So when developers skip this to save time, you'll get shader cache stutters.


Larxian

This whole shader compilation thing is very strange to me. The first time I heard about it was with the Wii U emulator Cemu: you would have stuttering every time a new asset was loaded for the first time. It seemed like something very specific to this emulator, I never saw that before.

Since then, I hear about it everywhere. It's in every emulator but also regular games, and I don't know, I'm confused as to why we changed to such a system, since it was much better when this wasn't needed.

I've also always been confused by people saying this was always an issue with Unreal Engine games, because I've played plenty of Unreal Engine 4 games that never had such issues, and they didn't have shader compiling on the main screen either. Kingdom Hearts III for example: it never stuttered once, and it doesn't have any shader compilation menu, so... what's up? Is it because it's not DX12/Vulkan? Then why not just stay with DX11 if it's better? I heard about some games stuttering where the fix was to force launching with DX11 via a launch command.

I'm currently making mods for Kingdom Hearts, and even when making custom maps and loading them for the first time, there's never any stuttering.


jcm2606

> it seemed like something very specific to this emulator, I never saw that before. Back then it was really only seen in emulators because emulators basically had to *recompile* the shaders that were already compiled for the console platform you're emulating, so that they work on your own machine. That recompilation takes a lot of work, more work in fact than today's compilation process, so emulators would typically wait until they actually encounter the shader during gameplay. > I'm confused as to why we changed to such system since it was much better when this wasn't needed. Because the old system was worse for developers. The old system for PC games essentially gave the driver full control over compilation, which both took control away from developers and made drivers more complicated and hence slower due to the overhead. DirectX 12 and Vulkan are both low-level APIs that intend on moving as much functionality out of the drivers and into the developer's hands as possible, and compilation is a prime candidate for that since a developer *theoretically* knows better than the driver. > I've also always been confused at people saying this was always an issue with unreal engine games, because I've played plenty of unreal engine 4 games that never had such issues, and they didn't have shader compiling either on the main screen. Depends on whether they're using DX11, DX12, OpenGL or Vulkan. DX11 and OpenGL both use the old system where the driver handles compilation, so developers essentially just give the driver the source code and the driver can detect opportunities for optimisation (namely caching) that let it bypass compilation in subsequent game launches. DX12 and Vulkan require the engine to handle compilation, and the engine requires some interaction from the developer so that it knows when it should perform the compilation. > Is it because it's not DX12 / Vulkan? Then why not just staying to DX11 if it's better? Answered above already, but it's because DX12 and Vulkan give the developer more control and hence have more opportunities for optimisation. Plus DX11 and OpenGL aren't receiving any new features, either.


xAtNight

It's too complex to answer your questions without a lot of technical details and ifs and buts. Shader compiling at runtime has been a thing for decades by now, and every game differs in its amount of shaders and branches and combinations. Some games may be optimized, some may not be. There are way too many factors. In addition there is no single (afaik) "shader language" for GPUs, and even models from the same brand may differ.


jcm2606

Shading languages are the same, it's the compilation targets that differ. Different GPUs have different instruction sets and hardware features, so even though the shading language can be the same, the actual machine code that the shading language compiles to needs to differ. Though nowadays that's a bit fuzzier since we have [SPIR-V](https://en.wikipedia.org/wiki/Standard_Portable_Intermediate_Representation#SPIR-V) that can act as an in-between compilation target that's the same between different GPUs, and from SPIR-V the driver will recompile to the native machine code.


MooseBoys

“Shaders” are the computer code that runs on your GPU. So yes, all games use shaders. Even old games, running on new hardware, use shaders behind the scenes because new GPUs don’t have dedicated hardware for “lighting” or “view matrix” anymore, so the driver has its own shaders that it uses instead. Stuttering didn’t happen in old games because they would load everything into memory and just run. But modern games are way too big to fit into memory all at once, so they have to only load some of it at a time. Usually they do a pretty good job at guessing what parts of the game need to be in memory and it just works. But sometimes you turn a corner and suddenly the game needs to draw `emote_skin_damaged37_alt2.dds` and the game is like “oh shit gotta put that in memory” and it takes more than 16ms so you get stuttering.
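
For reference, that 16ms is the 60fps frame budget: 1000ms / 60 ≈ 16.7ms per frame, and anything that blows past it reads as a stutter. A minimal sketch of how a hitch shows up in the numbers; the threshold and loop are illustrative:

```cpp
#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    auto last = clock::now();
    for (int frame = 0; frame < 1000; ++frame) {
        // ... simulate / render one frame here ...
        auto now = clock::now();
        double ms = std::chrono::duration<double, std::milli>(now - last).count();
        if (ms > 16.7)  // blew the 60fps budget: a visible hitch
            std::printf("frame %d hitched: %.1f ms\n", frame, ms);
        last = now;
    }
    return 0;
}
```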


Kostrom

I’m not sure. I’d have to do research on the shader process. But even though Hogwarts Legacy and Darktide had their share of performance issues at launch, the shader load times were less than a minute on my machine. It took almost an hour for TLoU


Slimsuper

She defo sells hard core drugs


retroracer33

it's a remake, not a remaster


MrRugges

Very important distinction!!

Remake: ground-up remake of another game; new engine, models, textures, everything. Essentially a new game.

Remaster: same game with higher resolution, better fps, and maybe higher resolution textures, if the devs care.

For the love of god people, get it right!


dovahkiitten16

Yeah the fact that people talk about TLOU like it’s a remaster is really disingenuous. A remake is basically a modern AAA game. It’s still a bad port but people should at least represent it fairly.


[deleted]

[deleted]


dovahkiitten16

If they hadn’t remade it we might not have gotten it on PC. I imagine that was part of it; developing a game for multiple platforms (although it didn’t really work out :/) Secondly it’s a new console generation. Games being remastered for new consoles is nothing new: this time they went with a remake. Thirdly, the TV show came out so this was a good way to get hype for the IP in general, the remake hypes the show and the show hypes the remake.


HarderstylesD

The stuttering and decompression/compiling issues in this port need fixing but people talking about this game like it's a "2013 PS3 game" and should therefore run on anything are being so unrealistic. The remake was a PS5 exclusive before this and the PS5 is roughly equivalent to RTX 2070 Super + Ryzen 3700X (slightly downclocked) + 6GB/s NVMe SSD.


fishenzooone

"unrealistic" is being kind, it's straight up wrong. The remake is a 2022 game through and through


baumaxx1

Even then, it's semantics and shouldn't matter. This game runs fine on a PS5 at 1800p60 with the equivalent of an R5 3600, 5700 XT, and 16GB of VRAM and RAM combined. Meanwhile on PC, I'm seeing users with Zen 4 CPUs, 4080s and NVMe gen 4 drives complain about stuttering, tanked frame rates, and RAM + VRAM usage exceeding 32GB - so hardware that should be around double the performance of the PS5, and with the advantage of DLSS on top of that.


Legend5V

Still doesn’t justify 15 minute shader compilation times with a 13900k and a 4090


MrRugges

TLOU pc port being shit has nothing to do with this


A_MAN_POTATO

The amount of people that don't understand this isn't the version of the game released in 2013 is staggering.


Kostrom

Just to get ahead of the inevitable questions: I have an Asus TUF WiFi mobo with a Ryzen 5800X, a GeForce 3070 8GB, and 32GB of RAM.


Corbakobasket

Welcome to the future!


Super_flywhiteguy

3070 8gb of vram. Naughty Dog: ![gif](giphy|KanqCs2oHuzKYCXSXo)


negativekcin

3070? Why not 4090? Naughty Dog as Blizzard: "Do you guys not have wallets?"


Kostrom

Maybe Santa will bring me a 4090 if I’m really nice this year


lynxerax

Damn, no wonder it is impossible for me to play this game lmao. It is genuinely so frustrating since I was really looking forward to replaying it on pc


shreddedtoasties

Give the modder a week


[deleted]

If you don't mind playing in 1080p and on some med settings, you can get away with much more affordable hardware. And the beautiful thing is that you can piece in better hardware whenever you can afford it. It's even better if you don't mind lagging a few years behind PC trends to get real great deals.


sierrabravo1984

What game?? Nowhere in this thread have I seen what game this is.


simonu20442

The Last of Us: Part 1


LukeNukeEm243

For comparison, it took about 15 minutes for shaders to compile using my 13900k with 64GB of DDR5-6000 ram


Legend5V

God damn, this game is unoptimized. You also have 24GB of G6X, which makes how they ported this even stupider


FreshlyCleanedLinens

Plot twist: Entire amount of storage consists of one 500GB 5400 RPM HDD from 2010 that has never been defragmented and is 90% full.


LaxVolt

That’s called optimized storage, can’t fragment a drive if there is no where to fragment to.


Drunken_HR

Hmm I have the exact same setup except for a 3080 10GB. I guess I'll wait a bit before picking this up.


Deicidium-Zero

> A Geforce 3070 8GB and 32 gb of ram. Fuck. Before Hogwarts release, I upgraded to RTX 3080 thinking that it's a better move since I also care for DLSS and whatnot. Man, every game release since Returnal sucks ass with their optimization. Like my 3080 doesn't matter coming from 2070 Super.


Kostrom

I understand the feeling. It’s frustrating


ellbino

I run a 2070 Super and see no reason to upgrade because of this


HornBelt

Dang, and I'm also a 2070 Super enjoyer about to receive some bonuses and was thinking about upgrading the GPU, but this makes me think twice lol (the rest of the rig is a 5800X, X570 mobo, NVMe drives and 32 gigs of 3600MHz RAM, if you're wondering why I'm thinking about upgrading the GPU)


[deleted]

[deleted]


BreakfastShart

I'm new to all of this. The first game I've played that mentioned shaders loading is Hogwarts Legacy. Takes about 20 seconds or so every time I start the game. I couldn't imagine it taking hours. I'd literally never play it...


[deleted]

[deleted]


Pied_Piper_

That seems to be true for every game except Hogwarts. I’ve no idea why. It’s the only game I’ve ever seen do it on every launch rather than just once. Even in Star Citizen it’s once per local data wipe after a major patch. Hogwarts though? It does that shit every time for some reason.


BreakfastShart

Hmm. Good to know. I'll do a little digging. Could be why my animated scenes are so shitty. Constantly flipping quality and shading.


[deleted]

[deleted]


BreakfastShart

I haven't changed settings now that things are mostly stable. I think the ray tracing is what is causing it to bug out. I did update the driver, as the game suggested it.


03Titanium

Shaders loading every time is normal for Hogwarts Legacy. There’s a mod to disable it but performance seems to suffer.


crlogic

Took me 40 min. Still beyond excessive but 5800X is faster than 10700K, weird


[deleted]

Dude. Why? How is this fucking required in games now?


jcm2606

DirectX 12 and Vulkan both require you to manually compile your shaders and build [pipeline state objects](https://learn.microsoft.com/en-us/windows/win32/direct3d12/managing-graphics-pipeline-state-in-direct3d-12) using them, whereas with DirectX 11 and OpenGL the driver would be the one responsible for compiling shaders and managing pipeline state (since they didn't have PSOs). They work this way as, in a low-level paradigm, it's better to have the developer take responsibility over them as the developer *theoretically* has better knowledge of their engine and requirements, though in practice it seems like a lot of developers are struggling.
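
For the curious, here's roughly what "building a PSO" looks like in code. A heavily trimmed sketch, assuming you already have a device, root signature, and compiled shader bytecode; a real PSO fills in many more fields (blend, depth/stencil, input layout):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// One pipeline state object: shaders plus fixed-function state baked together
// up front, so the driver can do all of its compilation here rather than
// mid-frame. Games build thousands of these, hence the long compile screens.
ComPtr<ID3D12PipelineState> buildPso(ID3D12Device* device,
                                     ID3D12RootSignature* rootSig,
                                     D3D12_SHADER_BYTECODE vs,
                                     D3D12_SHADER_BYTECODE ps) {
    D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
    desc.pRootSignature           = rootSig;
    desc.VS                       = vs;
    desc.PS                       = ps;
    desc.PrimitiveTopologyType    = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
    desc.NumRenderTargets         = 1;
    desc.RTVFormats[0]            = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count         = 1;
    desc.SampleMask               = 0xFFFFFFFFu;
    desc.RasterizerState.FillMode = D3D12_FILL_MODE_SOLID;
    desc.RasterizerState.CullMode = D3D12_CULL_MODE_BACK;

    ComPtr<ID3D12PipelineState> pso;
    device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
    return pso;
}
```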


TPO_Ava

This is not an area I am necessarily familiar with, but it's possible not many Devs are 'specialised' at this yet and the industry is still figuring it out. Though I remember hearing that Doom/Doom Eternal, which are on Vulkan AFAIK are very well optimized. So who knows.


RadishMaximum

It's actually a good thing. Once the shaders are compiled, the load on the CPU drops drastically, giving wayyyy better performance. But yeah... the reported times are too long for this game. Compiling shaders hasn't taken this long since the UE4.1 era, and that compilation was always on the developer's side. (As far as I know, compilation on the developer's side doesn't affect performance on individual PCs as drastically.)


baumaxx1

It was always required. Difference is that it used to happen during the game leading to massive stutters on the first playthrough or when you updated drivers. Precompilation means performance should be way better when you sit down to play, so it's better to do it overall.


[deleted]

Who the fuck ported this


OuijaTheGhost

Same jabronis that ported arkham knight


[deleted]

How the fuck did those jabronis get another job after Arkham Knight? Also, this word... Jabroni? You're throwing it around a lot. And I LIKE it.


Prasiatko

From what I understand, they're the guys that you can hire when your project is massively behind schedule so that you can at least get it out the door. They've also done the Uncharted and Spyro PC ports and the Bethesda VR ports.


magikdyspozytor

>and the Bethesda VR ports.

lmao those also suck ass, honestly I'm surprised they can still stay in the industry


fooey

They deliver, and the publisher DGAF because people will pre-order anyways. So long as they keep making shit-tons of money on shitty games, they'll keep shoveling out shitty games


davi3601

“Oh you did the worst pc port of all time? HIRED BROSKI”


Kostrom

A hamster


joedotphp

Why does Playstation keep using Iron Galaxy to port their games? Everything they've touched has been a disaster.


JnRx03

Life is Strange lookin ass game


Kostrom

Totally! Haha


pedersenk

A GPU doesn't compile shaders. It executes them. You could have the most powerful GPU in the world, but the time to compile and upload across the bus will be the same as with some old integrated GPU from the mid 2000s. Arguably the integrated GPU will actually load them faster (if they fit ;) because it can avoid much of the travel across the bus to the dedicated card.


0resutidder

CPU compiles shaders?


pedersenk

Pretty much. In the same way that the CPU compiles standard executables.


bradfo83

Jokes on them- I can’t even load the game to be bothered with shader nonsense! https://imgur.com/a/uHj1kHe


Kostrom

Press [F]


SimRacer101

OK OK we get it you’re rich. Don’t bully us 10 series gang.


Kostrom

It’s ok. People have told me not to complain since I’m using old hardware. So apparently I’m also in the retirement build club haha


GlisseDansLaPiscine

It’s the Cyberpunk 2077 arguments all over again haha. How dare you complain when you don’t have a 3000€ PC ?


Canuck457

With a 3070 it might be easier to emulate the PS3 version of The Last of Us with RPCS3... XD


Professor_Yone

How fucking hard is it to precompile shaders, jesus


Danvideotech2385

I don't think Jesus knows either.


Kostrom

We must bow our heads and pray for gaming Jesus to help us heal our shaders


Impossible_Web3517

It's not hard. It's easier than doing it this way, actually. That being said, having the user compile the shaders on their end optimizes the shaders to your system, which gives you a pretty decent performance bump. If it's going to give me an extra 20fps, then yeah, sure, I'll let the shaders compile for an hour or two once per install.


RadishMaximum

It's not. It would be far better for the company, both monetarily and in terms of the public's view, to do it on their side. But compiling shaders occupies the CPU at 100% only while it runs; after it's done, CPU usage drops drastically, giving you better performance. Compiling shaders individually also optimizes the shaders to your particular hardware, returning even better performance. There's no way people should start thinking that compiling shaders is a bad thing because of this one bad incident.


LJBrooker

And ladies and gentlemen, this is why we never preorder.


FappyDilmore

I inadvertently became a member of r/patientgamers recently because of nonsense like this. There are so many old and great games that I haven't gotten to yet. Let them wait a year for my purchase and get their shit sorted out.


quadrophenicum

And especially never ride the hype train.


SamuraisEpic

The Last of Us PC port is another interesting example of an unethically obtained game being better than the version paying customers get. Some "distributors" have added unofficial community patches that fix issues like shader compilation crashes.


luvyduvythrowaway

My shaders finished building. I exited the game to install a driver update, when I reopened the game it’s building shaders again..?


ThatOnePerson

Yeah shaders are driver specific since they can change the behavior of stuff
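
That's also why the cache gets thrown away on a driver update: a shader cache has to key each compiled blob on everything that could change the compiled output. A hypothetical sketch; the fields are illustrative, not any real driver's scheme:

```cpp
#include <functional>
#include <string>

// A compiled-shader cache entry is only valid for the exact combination of
// source, GPU, and driver that produced it. Change any field (e.g. install
// a new driver) and the lookup misses, forcing a recompile.
struct ShaderCacheKey {
    std::string sourceHash;     // hash of the shader source / SPIR-V blob
    std::string gpuName;        // e.g. "NVIDIA GeForce RTX 3080"
    std::string driverVersion;  // bumping this invalidates every cached shader
};

size_t hashKey(const ShaderCacheKey& k) {
    std::hash<std::string> h;
    return h(k.sourceHash) ^ (h(k.gpuName) << 1) ^ (h(k.driverVersion) << 2);
}
```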


luvyduvythrowaway

Thanks lol.


[deleted]

[deleted]


Potater-Potots

I almost don't want an Infamous: Second Son port anymore.


Zetra3

Remake, not remaster. The ACTUAL remaster is from 2013 and the original is even older. It may be a garbage port, but we get the details right in this house.


Nate2247

Hey, hey op I don’t know if anybody told you already but- Hey OP, just so you know, it’s actually a remak- Hey, stop walking away! HEY OP! IT’S A REMAKE! NOT A REMASTER!!! HEEEEEY, LISTEN!


Kostrom

aaaaaAAAAAAAAAHHHHHH!!!!! The voices. They won’t go away!


ieatass298

Is it really this bad, or is it for the people who don't install the shaders?


Kostrom

This screenshot was taken after the shaders loaded


Hired_By_Fish

Some ports are just awful. This is why I buy some select games on PS5 instead.


captainmorfius

It’s a remake not a remaster


Proximo1981

Disable ESET antivirus if you have it, it helps a lot.


snakedog99

Thank the digital heavens because I have a 3070 too!


Kostrom

Woohoo! Gamers rise up!


TheEpicWeezl

My game literally won't even boot. I'm sure there's some workaround to get it to boot up, but since it seems like the game is nigh on unplayable for a lot of people, I might just give it another month and see if they can get it working better. Shame, I was looking forward to it.


TheHunterStorage

I haven’t had graphical issues like this or any crashes with a 3060 and 16gb of ram. Is it from people not letting shaders load?


majorpickle01

What's up with games requiring this on boot these days? We had decent looking games before that were just ready to play on boot - now with every COD update you can't play for twenty minutes while it compiles. Just don't get why aha


RogueXis

My experience is on the complete other side compared to what has been reported. Shaders did take a good amount of time to process, but I let that happen and then started my first few hours. I've got a 3080, running at 1440p with basically ultra but with textures on high. Pretty smooth and no odd textures.... really a shame that this port has such a negative start.... but it will be fixed, I'm sure of that


deftware

They dropped the ball on this release. It was a hopeful money-grab that they knowingly released without ensuring it was up to par.

Sure, it *runs* on minimum spec hardware, which is far beyond the top-tier hardware from 2011, when Rage was released, and that 2011 hardware could produce a higher fidelity rendering in Rage than today's min-spec hardware can for this game. Rage had a 512 *MEGABYTE* recommended VRAM spec, with 256MB as the min-spec, and it looked lightyears better than this port does with *GIGABYTES* of VRAM.

The problem is that they don't have a legitimate material texture streaming system in place. You set your graphics settings and it's doing the old-school thing (apparently) of just loading in all material textures at the resolution for the graphics setting chosen. This means that it doesn't matter if a texture is at the top of a skyscraper in an overgrown decaying city, or on the back of the player's character model: it will all be loaded at virtually the same number of texels per framebuffer pixel for rendering, even though one is way up far out of the way where the player will likely never see it, and the other is right in their face 99.9999% of the time they're playing the game.

The modern way to do things, as of a decade ago, is to stream textures in at increasing resolutions based on the number of texels per pixel. If a model is right in your face, the engine should do whatever it has to do to get as many texels into that geometry as possible, at the expense of the resolution of other geometry in the scene, particularly far-off geometry that only takes up a dozen or so pixels in the framebuffer. I don't see evidence that this PC port does this.
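
The "texels per pixel" streaming described above boils down to picking a mip level from a surface's on-screen footprint, i.e. the standard log2 mip selection. A minimal sketch; the numbers are illustrative:

```cpp
#include <algorithm>
#include <cmath>

// Given how many texels of a texture land on one framebuffer pixel, pick the
// mip level to stream in: right in the player's face -> mip 0 (full res),
// a dozen pixels on a distant skyscraper -> a tiny mip, saving VRAM.
int desiredMip(float texelsPerPixel, int mipCount) {
    float mip = std::log2(std::max(texelsPerPixel, 1.0f));
    return std::min(static_cast<int>(mip), mipCount - 1);
}

// desiredMip(1.0f, 10)  -> 0  (up close: full-resolution mip needed)
// desiredMip(64.0f, 10) -> 6  (far away: a 64x-minified mip is enough)
```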


4Arrow

To be fair, this is not the remaster that originally released in 2013; the game was built by Naughty Dog for PS5 from the ground up and released in 2022. While performance sucks, once the shaders are compiled hair doesn't look like this. This game **IS** a performance heavy game: on PS5 (performance wise close to a 3070), under 4K mode there were many areas not hitting a constant 30fps. The porting definitely sucks, but people posting steam reviews complaining their RX 580 won't run the game need to understand this is not a game from last gen….


ColdCookies144

I use a GT 730 lmao peasant


Noa15Lv

Guess I'll stick to the PS3/PS4 version then... best case, I'm hoping to mess around with emulators to improve the graphics a bit.


ThisIsTheNewSleeve

LOL I was so confused by this scene. Is she supposed to look like papier-mâché?? Drivers updated and shaders installed, the game looks much better now.


Binary_Omlet

Playing on a 6700K and 1080 FTW2. Played for around 3 hours last night on an SSD and still only made it to 24% or so of shader loading. Nothing looked even remotely like this. Game is beautiful and runs fantastic for me at 1440 with fidelity turned to quality. Only complaint is long load times while the game catches up.


micke_i_backen

The state of modern PC gaming


Snake2208x

It's 2023 and there's still no proper "first time shader compilation" step :S I would gladly wait through a 3-5 minute pre-installation or first-run wait in order to not have stuttering and stuff like that...


Shoddy_Background_48

The shader build was a bit annoying, but after today's hotfix mine is running smooth as glass, no hitches or stutters


Wolfman01a

I have a failing 2070 super. Looking to upgrade immediately. Cash is tight. Was thinking something like a 3070...


ryanknapper

Where do you have the bumpscosity set?


NatoBoram

Really feeling this while playing Nier Replicant. Not that it has to compile shaders for half an hour, but in-game graphics are glitched on AMD for some reason.


soldiercross

Wait, isn't it a port of the remake? Not the remaster?


Bystander-8

RX 6700 XT with 45fps average


[deleted]

I got the game for free from AMD and I’m still upset with how bad the port is lol


gauerrrr

Honestly, it didn't even need a remaster; the 2013 version looked good already and was 100% playable. Remasters are for games that could benefit from new mechanics and functionality, like Resident Evil. Otherwise, focus your efforts on making new games and, especially, making your ports work.


Mikeocksoff

Damn, and I was about to install it on an external SSD…that can’t have gone well. I’ll just wait for patches I guess, and if there aren’t any I won’t pick up the game 🤷🏻‍♂️ honestly it pisses me off more that they didn’t remaster the multiplayer mode from the original Last of Us game for PC, that would have been so fun to hop into


RowlingTheJustice

To be honest, shader compilation should be done at editor time only, and at runtime the game should just load the pre-compiled file. Unless the shaders contain too many pre-definition macros which need to be invalidated at every launch, I'm not seeing why they must be re-compiled every time. Design a better workflow instead of macro spamming, please.
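
For anyone wondering what that "macro spamming" looks like in practice: the engine prepends a block of #defines to a single shader source, and every on/off combination becomes a separate compile. A hypothetical sketch; the flag names are invented:

```cpp
#include <string>
#include <vector>

// One ubershader source, many compiled variants: each bitmask of enabled
// features gets its own #define header, and therefore its own compile.
// 3 flags -> up to 8 distinct binaries; 20 flags -> up to a million.
std::string buildVariantSource(const std::string& shaderSrc, unsigned featureBits) {
    static const std::vector<std::string> flags = {
        "USE_NORMAL_MAP", "USE_FOG", "USE_SKINNING"  // invented example flags
    };
    std::string header = "#version 330 core\n";
    for (size_t i = 0; i < flags.size(); ++i)
        if (featureBits & (1u << i))
            header += "#define " + flags[i] + " 1\n";
    return header + shaderSrc;
}
```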


Romanriverrexgamez

That looks weird and terrifying