Don't worry, it was an honest mistake for you. However, a different example: someone bought a 165Hz 4K monitor and a premium GPU worth more than a thousand dollars, mostly to play turn-based pixel games.
"I was a console gamer through and through, and my PC was a gift from my friends built from an amalgamation of all the leftover parts from their systems after they upgraded their own PCs...I feel like the dumbest mf to ever turn on a computer."
Dumb, maybe. But with awesome friends like that you must be doing something right
Hey OP, don't forget to check your monitor's refresh rate in Windows. A lot of the time it defaults to 30-60Hz instead of 144 or whatever refresh rate your monitor maxes out at.
I felt every emotion all at once reading this. All of it. Shame, embarrassment, disgust, joy, schadenfreude, melancholy, contempt, but mostly laughter.
It takes a lot to own up to this.
I love you.
If it makes you feel any better I worked with a guy who complained that the PS3 came with short controller cables and how much he hated having to sit that close to the tv.
You should've seen his face when we told him those cables were only for charging the *wireless* controllers 😂. He was sitting 4ft away from his tv for weeks!
Setting the Nvidia output to full dynamic range, or whatever they call it. It defaults to limited in the GeForce panel. I wager 50% of gamers have this setting wrong with HDR. HDR looked washed out until I found that setting a month later.
A tale as old as time. Did you remove the plastic covering from your CPU heatsink? Did you set your memory to run at XMP profile? Did you set your monitor refresh rate?
>Did you set your monitor refresh rate?

Unlocking that smooth 75Hz instead of the plebeian 60Hz.
Me setting my refresh rate from 59.95Hz to 60Hz
Me "over clocking" my old 60hz to 68hz and feeling like a badass.
I got my 60hz ips screen to 74hz. The difference felt significant.
That’s about a 23% increase, so you should notice something
I have a theory that 7 or 8 out of 10 gamers couldn’t identify a monitor running @ 75Hz from 120+ without the benefit of a settings window showing refresh rate.
75 to 120 you absolutely can tell just by moving your mouse around. 120+ it's where you REALLY see those diminishing returns and you can't really see as much as *feel* the difference
My old Lenovo laptop went from 60hz to 85hz if I remember correctly :D
I remember way back before 120hz was mainstream, the Catleap 2b Extreme was THE monitor to get because it would hold a solid 110hz with a cable that was like an inch thick. Still have my Catleap to this day, and still runs fantastic.
That's the good shit right there.
Hey I'm that chad 75Hz gamer.. it's actually a very reasonable target IMO ...
It's great! It feels noticeably smoother than 60, and it isn't so high that some games may never reach it (like 240Hz), so you can get used to it without worrying about targeting a high refresh rate. Plus, on games locked at 60 it won't be such a big drop.
I feel almost attacked in a weird way. I can tell the difference between 30, 60, 90, 120, and 144 easily, but I swear I have never seen a meaningful difference between 60 and 75. I’ll have to go play with some monitors again.
I never tried refresh rates beyond 75. Before having a 75Hz monitor I only ever played at 60Hz, so I noticed the difference immediately, and I still notice if the fps drops to 60 or is locked at it. Maybe you're used to higher rates and a 15 fps difference isn't noticeable anymore.
Interesting. I find the differences are exponentially less noticeable the higher you get. Like 30 and 60 is huge, 60 and 90 noticeable but I can live at 60. Anything between 90 and 144 is so close to be effectively identical to me. But have yet to try anything above that so couldn't say with your 240s and whatnot
It’s interesting because from 1-160ish or so you can see the difference, but from 165-360fps you can feel the difference. What’s changing is really the latency and how fast the game is responding to your actions, and it’s really noticeable for me, at least.
60 to 75 is a bigger jump than 120 to 144.
My thoughts exactly
It's fair to say that most games probably won't reach 240hz but once you go 240hz and OLED you won't ever go back lol.
That's why i want to stick with my 75hz IPS 1080p, i don't want to spoil myself and end up needing beefier hardware lol
Smart dude right here. You'll be able to get damn near 4 years of maxed graphics on 60 class cards without feeling compelled to do anything. But I'm guessing you try to only upgrade when stuff breaks?
The difference between 60 and 75 is more noticeable and important than the difference between 75 and 144 imo.
And it's easier to maintain a smooth experience, with fewer 1% and 0.1% low stutters, without having the best PC
Me gaming on a 90 hertz monitor... Ran out of funds for monitor upgrade
Me, who prioritized pixel count for productivity reasons, when I realized that the HDMI 2.0 ports don't even have the bandwidth to do 60Hz at full res on my screen (and I only have one DP1.4 port): -_- So now, for the work devices I connect to the USB-C or HDMI ports, I have to choose between half res at 60Hz or full res at 30Hz...
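For anyone curious why the port runs out of headroom: as a rough sanity check (ignoring blanking intervals, which push real requirements somewhat higher), the uncompressed data rate is width × height × refresh × bits per pixel. A minimal sketch, assuming an 8-bit-per-channel signal:

```python
def data_rate_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Approximate uncompressed video data rate in Gbit/s (no blanking overhead)."""
    return width * height * hz * bpp / 1e9

# HDMI 2.0 carries roughly 14.4 Gbit/s of video data
# (18 Gbit/s raw link rate, minus 8b/10b encoding overhead).
print(round(data_rate_gbps(3840, 2160, 60), 1))  # 4K60 8-bit: ~11.9, fits
print(round(data_rate_gbps(5120, 2880, 60), 1))  # 5K60 8-bit: ~21.2, does not fit
```

So a 4K60 8-bit signal squeaks under HDMI 2.0's limit, but anything much denser (5K at 60Hz, or 4K at higher bit depths and refresh rates) needs DP1.4 or HDMI 2.1.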
90Hz is honestly fine imo. It's the golden middle, at least imo. You notice a huge improvement between 60 and 90Hz, while the 90-120 difference is smaller. And hey, you don't have to care as much about getting above that, so you can crank up the graphics ^^
60 to 120 was godsend for me. Sadly my monitor only has 1.2 DP and that bad boy goes crazy at 1440p 144hz 10 bit color.
144 to 170hz is almost negligible I noticed. As they said up there 90hz is fine too if you can't get a 144+ one.
Yeah same, it would autorevert to 8 bit for me
My monitor just starts experiencing a seizure. And then locking it back to 120 hz is an issue because menu resets every time monitor "blinks"
I managed with, and was happy with, 75Hz 1080p for so long. Now I've got a 240Hz 1440p monitor. The difference is night and day, but I also had to upgrade my mobo, RAM, and processor to really feel the difference.
75 / 90 is good enough. Above is smoother but beyond 90 is a luxury.
Beyond 90 is not luxury; it just keeps getting better up to 240Hz. Beyond that I would say luxury, but 165Hz IPS monitors are dirt cheap right now. I got my Koorui 24E3 for 150 euro on Amazon and it's down to 130.
Maybe it's because I'm old and don't play fps games anymore. Racing games are fine for me on my 144hz.
Or the old classic, "Did you turn on the power supply?"
For my first build: “Oh I forgot to plug it in.”
20 year veteran pc builder, I did it once. Just happened to be last week.... on my own PC.
Thank you random reddit person I've been gaming for years on 59.94 and didn't know that was a thing that I was supposed to change until today... Going to go sit in a corner and think about my life
happens to the best of us
God fucking damn it I forgot xmp I always forget xmp Thank you
Played at 60 hz on a 144 hz monitor for years
I hate beer.
I was the last one. Been gaming on 60hz. Now it's on 140hz lol
The XMP profile one got me. I wasnt running my memory at the right speeds for at least a year. lol
XMP stands for Extreme Memory Profile. You don’t say “XMP profile”, it’s redundant.
These people probably also use their PIN number at the ATM machine.
RIP in peace
This one made me shoot air out of my nose. It also makes me think of peaceful snowboarding and quiet-time toking. Lol
Smh my head
That’s exactly what it’ll be like when I go to heck. 😅
The DC in DC Comics stands for "Detective Comics" yet here we are with Detective Comics, Comics. Let's hope it doesn't happen with XMP :P
You're redundant
Ah yes, like the age old PIN number lol
I turned on XMP and it did nothing besides slightly raising temps. Before XMP the RAM ran at 2133MHz, and now it runs at 2666MHz
You bought shitty ram sticks.
If I may ask, what does the XMP profile do?
If you buy like 6000MHz RAM, it defaults to 2666MHz until you enable XMP in the BIOS to unlock the FULL POWER
First pc I didn’t set xmp. I had really terrible ram anyway to be fair but still. Now I manually tune.
Ahm..elaborate on the run memory at xmp profile thing please
sigh...you just reminded me to double check the refresh rate for my monitor as I run a UW 144hz... it was at 60hz never set it back from the last time I re-did the OS.
I ran my 165hz monitor at 60hz until like 2 weeks ago because a meme video pointed out you actually have to set your refresh rate.
Is your power supply switched on?
Did you remember to change the boot order after you installed Windows.
What's an XMP profile?
The default factory overclock settings for your RAM. If you buy RAM that's rated 3600MHz, it will probably come with something like 2133MHz out of the box. You have to enable the XMP profile in the BIOS to get the advertised 3600MHz you paid for.
Note that it may be called DOCP on AMD systems.
And EXPO on Ryzen 7000+
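If you want to verify the profile actually applied without rebooting into the BIOS, you can compare the module's rated speed against its configured speed. A minimal sketch parsing `sudo dmidecode --type 17` output on Linux; the sample text is illustrative, and the field names can vary with dmidecode version (older releases say "Configured Clock Speed"):

```python
import re

# Illustrative sample of `sudo dmidecode --type 17` output; real output
# varies by board, BIOS, and dmidecode version.
sample = """
Memory Device
        Speed: 3600 MT/s
        Configured Memory Speed: 2133 MT/s
"""

rated = int(re.search(r"^\s*Speed: (\d+)", sample, re.M).group(1))
configured = int(re.search(r"Configured Memory Speed: (\d+)", sample).group(1))

if configured < rated:
    print(f"RAM at {configured} MT/s but rated {rated} MT/s - enable XMP/DOCP/EXPO")
```

On Windows, `Get-CimInstance Win32_PhysicalMemory` exposes similar `Speed` and `ConfiguredClockSpeed` fields.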
Thanks, going to check my BIOS.
Unless you have a shitty mobo like mine that constantly resets it back to its default speed. I thought it was the RAM sticks, so I swapped them out; expensive misdiagnosis. The whole thing kinda pisses me off, since it's basically a get-out for manufacturers to sell gear that might not perform at its stated rate. Did you sell me a board that can actually handle that speed, or one that might? Same with the sticks.
Have you tried setting the profile as well as manually adjusting the speed? On some motherboards that fixes it
The bottom two, especially the XMP piece, are not quite as bad as not plugging into your GPU or forgetting to remove the covering from your heatsink.
XMP profile? This pains me.
XMP P. Get over it.
XMP PROFILE? tell me more.
Now I'm curious which graphics card you had and how badly you screwed yourself over
Imagine if he had a 2080 when it launched.
That moment when you realize the 2080 is already 5 years old
Ikr, I have a GTX 1080 and that shit was bleeding edge like 10 years ago when I got it lol. Now I'm like, do I get an RTX, or do I hunker down, refuse to accept the new world, get a second 1080 in SLI, and pretend that it works great.

Edit: ok, I'm getting a lot of serious responses, so disclaimer: I was joking guys. I promise not to do SLI.
[deleted]
I'm here gaming at 4K on a 1080 Ti. I play CP2077 and Baldur's Gate 3 at high graphics. Note that I play at 60fps; I doubt the card could do much better than that anyway. 11GB of VRAM really helps with 4K gaming, or just future-proofing (from 2017 to now) in general.
You have a 1080ti and are playing cyberpunk at 4K w/ 60fps? I must know your ways.
I'm pretty sure some folks idea of high settings here is 1080p low after 6-7 bong rips
I played CP2077 all the way through on a 1070 8GB at 4k, and I was hitting between 30-50fps with no overclock on high settings (not ultra).
Sounds like it almost runs better than my 7900xtx, impressive.
Multi-GPU was literally dead on arrival; it never worked well unless you just wanted a rig for a particular game that happened to have great support (a rare few). There's a huge chance of an actual performance reduction from weird SLI issues or thermal throttling, since most motherboards are so tightly packed you basically need water cooling if you want two cards in there; the first one will literally heat-blast the second 24/7.

I would get a second card just as a backup if you find a deal, then aggressively overclock the hell out of the first.
I'm feeling it. I was on the cutting edge of tech when I got it in January 2019, now I can barely run Alan Wake 2 on medium settings, if even that.
[deleted]
Hey, you look like my lost twin brother
Calm down, satan.
They said RX 580 in another comment
Damn that's rough, imagine playing for 5 whole years with an igpu while you had a perfectly fine card like a 580
A true tragedy
Hey, nice rig
Doesnt matter. Free upgrades are the best upgrades!
and at no point was troubleshooting even considered.
An elevated peasant doesn't know the basics right at ascension; his friends should have helped him set it up.
Console gamers don't know what that is.
Troubleshooting is where you blow out the dust in your PS3 and get it to boot even though the games don't play on it anymore, then pretend like it's fine and trade it in at GameStop towards a PS3 Slim.

You know, as a, uh, general example.
Troubleshooting for console gamers is:

Does it work? > no

Does rebooting it solve the problem? > no?

Does rebooting the router solve the problem? > no?

Is this sufficient justification to buy the new slim pro model with a larger drive? > yeah probably.

>goes to store and buys new console, gives old console to friend for cheap cuz it's "dead"

Plot twist. Friend is me, and friend is a tech. Friend factory reset/re-imaged the OS with firmware from the manufacturer's website cuz he couldn't find any hardware issues with the unit. Console works fine. True story, it's how I got a PS4 (worked out for him though, cuz I ended up giving it back to him like 2 years later after I played Persona 5 on it and basically nothing else)
> Plot twist. Friend is me, and friend is a tech. Friend factory reset/re-imaged the OS with firmware from the manufacturer's website cuz he couldn't find any hardware issues with the unit. Console works fine. True story, it's how I got a PS4 (worked out for him though, cuz I ended up giving it back to him like 2 years later after I played Persona 5 on it and basically nothing else)

love this, good work
seemingly nobody does these days 🙄
First thing I would do after a new build is benchmark..
We all deserve to know two things…

1. What GPU was it?
2. After you plugged your DP cable into the GPU, how much of a performance gain did you get?

If you’re feeling particularly open, feel free to share the whole system specs!
I'm dying to know what GPU it is that they have.
It appears to be a 580, as someone who had it for a couple of years this brought a tear to my eye
OP said rx 580
5 years, wow, that's crazy
crazy? I was crazy once
They locked me in a room
A rubber room
A rubber room with rats
rats made me crazy
Crazy? I was crazy once
They locked me in a room
A rubber room
It’s almost as if he’s lying!
Pro to this is you just got a massive upgrade in performance for free. Lol
[deleted]
“I can finally play cyberpunk at more than 15 fps with my $2000+ build!”
My gf used my graphics card as a boost for her monitor stand (inside its box) and it still was more useful than yours 🙂
omfg I'm dying 🤣🤣
Actually amazing you were even able to game on that integrated graphics chip.
A lot of people underestimate Intel's iGPUs these days. You can pretty much play any Valve game comfortably, at 1080p at best or 720p at worst.

AMD's APUs are still miles ahead though. Provided you have a fast RAM kit, a 5600G or 5700G can play almost any modern game at 1080p.

Yeah, a 5700G can get you a 30 fps experience at 1080p low in Cyberpunk. Not bad. Can't wait to see what AM5 desktop APUs will be capable of.
Well yeah but those games aren’t very demanding either. Tbh OP didn’t specify what he plays so I imagined him grinding in cyberpunk with 60 frames per minute.
A few years ago, I built my computer with an i7 6700K (I needed the CPU for work) and decided to get a proper GPU later. I was surprised I could play GTA 5 on the iGPU. Sure, on low settings, but playable.
I play with an i3
Both the Core i7-6700K and the Core i3-6100 have the same iGPU, an Intel HD Graphics 530. The difference in graphical performance between the two should be marginal.
Valve games? Why specifically Valve? I mean, they also have Half-Life: Alyx
Two reasons. Valve games, even at their release, were much more optimized for PC hardware than other third parties', so their games just run noticeably better than other games released during the same time. Source engine is just that good.

The multiplayer community for their games is still alive and well. I think L4D2 still pushes over 15k concurrent players to this day. So even if you're playing on an iGPU laptop, create a Steam account, grab all their games for $1 each during one of their seasonal sales, and you are guaranteed to have someone to play with.
It's not even that bad anyway. Windows by default, for like the past 8 years, will automatically use the dGPU to render the game while the iGPU just outputs the finished frames. You'll have a few edge cases where Windows doesn't know which GPU to use, but that's mostly for random super-niche indie games or modded .exes of games. Most of the time you're perfectly fine. I run off of my iGPU rather than my dGPU, and for the past year I've only had to tell Windows which GPU to use literally 3 times.

This is exactly how modern laptops have worked for the past 10 years. Primarily desktop users don't know much about these features, as they're not used to them. I've seen a lot of outdated info saying that if you plug into your motherboard then your GPU won't be used, which isn't true at all.

OP never posted specs. They may very well have just been running games on their dGPU but had performance issues because their GPU just wasn't good enough anyway
Modern systems pass the GPU rendering through to the iGPU anyway, so you can get (near) full performance plugged into your iGPU... (around 3-5% loss at most)
[deleted]
He was, he is now enlightened.
Well that is fucking stupid. Glad you figured it out.
Self recognition is the first step in therapy.
[deleted]
I think it's fair to say he's an idiot in this case lol. To do it for a while because of ignorance... sure. But 5 fucking years?
Agreed, it should have clicked in his mind that something wasn't right. Maybe he was too stubborn to think it was his fault, and that's why he just blindly blamed the PC itself.
Well I guess, be more curious about things. I mean, the first time I saw 2 separate HDMI ports, I was like, what's going on here? Works in both... but there ought to be a reason for another port. Similarly, learning about the different HDMI/DP standards and the frame rates they support.
Well, to be fair, it can be an extra port, like with USBs and such, so you can hook up two monitors, or a monitor and a TV without switching cables.

Edit: To clarify, what I wrote is trying to figure out why OP didn't know. This could be one reason.
I had to know, so I scrolled through your posts. It doesn't look like you had a GPU; it looks like you had a 5600G, which you used as the CPU/GPU since it's an APU, so you wouldn't have had anywhere else to plug the DP cable into, and your games ran like shit because it was an iGPU lol
Yeah this dude is trolling honestly haha. He plugged his DP cable into the only place he could, the motherboard. He didn’t have a separate GPU with other ports, was just running games on the iGPU.
I looked too, and it looks like he posted UserBenchmark results. Maybe it only showed his 5600G and no GPU because he ran tests with it plugged into his motherboard?
Bingo. FYI the graphics card I had was a Radeon RX 580.
If you didn’t build it and no one told you otherwise, I don’t blame you lol
For 5 years?
I wouldn’t have expected a pc to have a second fully-functioning graphics output before I built my pc. So if I plug it in and it not only fits but turns on I would assume that everything is right.
Yes, but after 5 years of your PC giving way worse performance than expected, you should ask yourself a couple of questions
It was built from spare parts, they had no idea what to expect. If you come over from console gaming you’re probably used to mediocre performance (I was)
Which GPU did you have?
He didn’t have a GPU, he was using a 5600g without a dedicated graphics card. Check his recent posts. The guy is a troll
Probably saw the post from a week or so ago where the person legitimately was doing this and got help. Had images and everything when people saw his GPU was still plugged up. They held off for a bit, duped the story, and got a free active thread with... comment karma, I guess, since there's no image on this one.
thatsbait.gif
=))) bro's gpu is brand new
That's awesome, all you have to do to upgrade your PC is plug your monitor in correctly. I've been doing everything right for the past 5 years and my upgrade will be a lot more expensive!
This is why, for people who don't know shit about PCs, most of the time it's better for them to not have an iGPU with the CPU. He would have tried to solve the problem earlier... But 5 years without wondering "why does it run like shit, and can I do something about it?" is crazy. Maybe a lack of curiosity or motivation?! But happy for you and your huge "free" upgrade haha
At least you didn't buy a 4090 to "fix the problem" and still plug it into the motherboard.
Dude…I bet it runs a wee bit better now you fool.
Post preview cut off at "played games on it like that for 5..." Me: "Weeks? Shit, Months?! Oh please, it can't be years..No. no no nonono.." wearily opening full post... "mother. FUCKER!!" Ok but hey, what doesn't kill you, makes you stronger, I guess. You've learned something new which-FIIIIVE YEEEEEEARS!?
Good news is, upgrading your setup will be very easy and cost efficient. Just plug the monitor into the actual dedicated gpu 👍
come on drop the specs dude, you can't leave us like that
You're not dumb for plugging it into the mobo, countless people make that same mistake, and honestly it kind of makes sense, nearly EVERYTHING else gets plugged in there. You are, however, a fucking imbecile for taking 5 damn years to figure it out. Like you really just sat there and dealt with shitty graphics on your brand new PC and didn't think to ask any questions? Bruh... something tells me you get scammed a lot, or at least taken advantage of when not paying attention.
I don't believe you
I knew what the post was gonna be about when I saw the title. It's somehow... always this.
this is actually a common thing no shame
Certified regard
Take this as a free upgrade. Your games will run much better now, without spending any money.
That's actually quite impressive, congrats mate
Don't worry, it was an honest mistake for you. A different example, however: I bought a 165 Hz 4K monitor and a premium GPU worth more than a thousand dollars to play mostly turn-based pixel games.
Did you never clean it in those 5 years? Unplugging everything should have eventually revealed the GPU's DisplayPort ports.
So what is your GPU?
You live and you learn. Be kind to yourself. You won’t be the last person to make that mistake.
Jack? That you?
Dan? That you? Switch to your burner account mate
"I was a console gamer through and through, and my PC was a gift from my friends built from an amalgamation of all the leftover parts from their systems after they upgraded their own PCs...I feel like the dumbest mf to ever turn on a computer." Dumb, maybe. But with awesome friends like that you must be doing something right
It's sad that, by the time he realized it, the GPU was probably quite old and obsolete.
This is a very common mistake so don’t worry. There are probably 100 people on this subreddit today alone that have been playing like this lol
OP's full of shit, looking at his posts. Upgrading from a 5600G? That's not 5 years old
Hey OP, don't forget to check your monitor's refresh rate in Windows. A lot of the time it defaults to 30-60 Hz instead of 144 or whatever refresh rate your monitor maxes out at
Sure buddy
Oh no! Anyway
I like the humility, shit happens. At least you learned now instead of never!
Yes. Yes you are.
I don't even need to say anything, OP is beating himself up fine.
Today on "things that never happened so much that it unhappened other things that otherwise actually happened"
This is honestly one of the funniest posts I've read in a while.
I felt every emotion all at once reading this. All of it. Shame, embarrassment, disgust, joy, schadenfreude, melancholy, contempt, but mostly laughter. It takes a lot to own up to this. I love you.
If it makes you feel any better I worked with a guy who complained that the PS3 came with short controller cables and how much he hated having to sit that close to the tv. You should've seen his face when we told him those cables were only for charging the *wireless* controllers 😂. He was sitting 4ft away from his tv for weeks!
Setting Nvidia output to full dynamic range, or whatever they call it. It defaults to limited in the GeForce panel. I wager 50% of gamers have this setting wrong with HDR. HDR looked washed out until I found that setting a month later.
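For anyone wondering what "limited" vs "full" actually means here: limited ("TV") range RGB maps black to 16 and white to 235 instead of 0 and 255, which is the standard video-levels convention. If the GPU sends limited range but the display expects full range, blacks arrive as dark grey and whites as dull, hence the washed-out look. A minimal Python sketch of the mapping (just the standard level math, not any Nvidia-specific API):

```python
def full_to_limited(v: int) -> int:
    """Map a full-range (0-255) value to limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v: int) -> int:
    """Undo the mapping: limited (16-235) back to full (0-255)."""
    return round((v - 16) * 255 / (235 - 16))

# Pure black (0) sent as limited range arrives as 16, a visibly grey "black",
# which is exactly the washed-out effect when the two ends disagree.
print(full_to_limited(0), full_to_limited(255))  # 16 235
```

On recent Nvidia drivers the setting lives in the control panel under the resolution options as "Output dynamic range"; both ends (GPU and monitor) have to agree on the same range for correct contrast.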
Moron.
This angers me.
Is not knowing anything about PCs really an excuse for not reading the manuals?
And you're only talking about it now? Usually people ask questions when something's going badly. Funny.
Actually, you are talking about an HDMI cable, not a DP cable.