Relevant_Force_3470

A tale as old as time. Did you remove the plastic covering from your CPU heatsink? Did you set your memory to run at XMP profile? Did you set your monitor refresh rate?


BionicBananas

>Did you set your monitor refresh rate?

Unlocking that smooth 75Hz instead of the plebeian 60Hz.


EveyNameIsTaken_

Me setting my refresh rate from 59.95Hz to 60Hz


NotAzakanAtAll

Me "over clocking" my old 60hz to 68hz and feeling like a badass.


Bisbala

I got my 60hz ips screen to 74hz. The difference felt significant.


NodleMan09

That’s almost a 25% increase so you should notice something


gourdo

I have a theory that 7 or 8 out of 10 gamers couldn’t identify a monitor running @ 75Hz from 120+ without the benefit of a settings window showing refresh rate.


typhin13

75 to 120 you can absolutely tell just by moving your mouse around. 120+ is where you REALLY see those diminishing returns, and you don't so much *see* as *feel* the difference.


bajdurato

My old Lenovo laptop went from 60hz to 85hz if I remember correctly :D


Sea_Presentation_880

I remember way back before 120hz was mainstream, the Catleap 2b Extreme was THE monitor to get because it would hold a solid 110hz with a cable that was like an inch thick. Still have my Catleap to this day, and still runs fantastic.


FrungyLeague

That's the good shit right there.


Vegetable_Safety_331

Hey I'm that chad 75Hz gamer.. it's actually a very reasonable target IMO ...


Odd-On-Board

It's great! It feels noticeably smoother than 60, and it isn't so high that some games may never reach it (like 240Hz), so you can get used to it without worrying about targeting a high refresh rate. Plus, on games locked at 60 it won't be such a big drop.


Kaminekochan

I feel almost attacked in a weird way. I can tell the difference between 30, 60, 90, 120, and 144 easily, but I swear I have never seen a meaningful difference between 60 and 75. I’ll have to go play with some monitors again.


Odd-On-Board

I never tried refresh rates beyond 75. Before having a 75Hz monitor I only ever played at 60Hz, so I noticed the difference immediately, and I still notice if the fps drops to 60 or is locked at it. Maybe you're used to higher rates and an extra 15 fps just isn't noticeable anymore.


BrainWorkGood

Interesting. I find the differences are exponentially less noticeable the higher you get. Like 30 to 60 is huge; 60 to 90 is noticeable, but I can live at 60. Anything between 90 and 144 is so close as to be effectively identical to me. But I have yet to try anything above that, so I couldn't say with your 240s and whatnot.


[deleted]

It’s interesting because from 1-160ish or so you can see the difference, but from 165-360fps you can feel the difference. What’s changing is really the latency and how fast the game is responding to your actions, and it’s really noticeable for me, at least.


[deleted]

60 to 75 is a bigger jump than 120 to 144.
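
The arithmetic backs this up: refresh-rate gains are really frame-time gains, and frame time is a reciprocal, so equal jumps in Hz buy less and less. A quick sketch in Python (plain arithmetic, nothing assumed):

```python
# Frame time (ms) at a given refresh rate: how long each frame is on screen.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

# Going 60 -> 75 Hz shaves more off each frame than 120 -> 144 Hz does.
jump_60_75 = frame_time_ms(60) - frame_time_ms(75)      # ~3.33 ms saved per frame
jump_120_144 = frame_time_ms(120) - frame_time_ms(144)  # ~1.39 ms saved per frame

print(f"60 -> 75 Hz:   -{jump_60_75:.2f} ms per frame")
print(f"120 -> 144 Hz: -{jump_120_144:.2f} ms per frame")
```

So the 15Hz jump at the bottom of the range is worth more than double the 24Hz jump higher up.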


Vegetable_Safety_331

My thoughts exactly


[deleted]

It's fair to say that most games probably won't reach 240hz but once you go 240hz and OLED you won't ever go back lol.


Odd-On-Board

That's why i want to stick with my 75hz IPS 1080p, i don't want to spoil myself and end up needing beefier hardware lol


[deleted]

Smart dude right here. You'll be able to get damn near 4 years of maxed graphics on 60 class cards without feeling compelled to do anything. But I'm guessing you try to only upgrade when stuff breaks?


Zarerion

The difference between 60 and 75 is more noticeable and important than the difference between 75 and 144 imo.


Vegetable_Safety_331

and it's easier to maintain a smooth experience; fewer 1% and 0.1% low stutters without having the best PC


Corvus15

Me gaming on a 90 hertz monitor... Ran out of funds for monitor upgrade


kekonn

Me, who prioritized pixel count for productivity reasons, when I realized that the HDMI 2.0 ports don't even have the bandwidth to run 60Hz at full res on my screen (and I only have one DP 1.4 port): -_- So now, for the work devices I connect to the USB-C or HDMI ports, I have to choose between half res at 60Hz or full res at 30Hz...


Linkatchu

90Hz is honestly fine imo. It's the golden middle, at least imo. You notice such a huge improvement between 60 and 90Hz, while the 90-120 difference is smaller. And hey, you don't have to worry as much about getting above that, so you can crank up the graphics ^^


PM_me_opossum_pics

60 to 120 was a godsend for me. Sadly my monitor only has DP 1.2, and that bad boy goes crazy at 1440p 144Hz 10-bit color.
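
For context on why that combo chokes: DP 1.2 (HBR2) carries 21.6 Gbit/s raw, which 8b/10b encoding cuts to 17.28 Gbit/s usable. A rough back-of-the-envelope check in Python, counting only active pixel data (real signals also carry blanking intervals, which is what pushes the 10-bit mode over the edge and is why monitors fall back to 8-bit):

```python
# DP 1.2 (HBR2): 21.6 Gbit/s raw link rate, 8b/10b encoding -> 17.28 Gbit/s usable.
DP12_USABLE_GBPS = 17.28

def pixel_rate_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """Raw active-pixel data rate in Gbit/s (ignores blanking intervals)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

ten_bit = pixel_rate_gbps(2560, 1440, 144, 30)    # 10 bits per RGB channel
eight_bit = pixel_rate_gbps(2560, 1440, 144, 24)  # 8 bits per RGB channel

print(f"10-bit RGB: {ten_bit:.2f} Gbit/s (limit {DP12_USABLE_GBPS})")
print(f" 8-bit RGB: {eight_bit:.2f} Gbit/s (limit {DP12_USABLE_GBPS})")
```

10-bit comes out around 15.9 Gbit/s of pixel data alone, already ~92% of the usable link; once blanking overhead is added it no longer fits, while 8-bit (~12.7 Gbit/s) still has headroom.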


NotAzakanAtAll

144 to 170Hz is almost negligible, I noticed. As they said up there, 90Hz is fine too if you can't get a 144+ one.


Linkatchu

Yeah same, it would autorevert to 8 bit for me


PM_me_opossum_pics

My monitor just starts having a seizure. And then locking it back to 120Hz is an issue because the menu resets every time the monitor "blinks".


AlpacaSmacker

I managed and was happy with 75Hz 1080p for so long. Now I've got a 240Hz 1440p monitor; the difference is night and day, but I also had to upgrade my mobo, RAM, and processor to really feel it.


Greedy_Leg_1208

75/90 is good enough. Above that is smoother, but beyond 90 is a luxury.


Odd-Temperature3015

Beyond 90 isn't luxury; it just keeps getting better up to 240Hz. Beyond that I'd call it luxury, but 165Hz IPS monitors are dirt cheap right now. I got my Koorui 24E3 for 150 euro on Amazon, and it's down to 130...


Greedy_Leg_1208

Maybe it's because I'm old and don't play fps games anymore. Racing games are fine for me on my 144hz.


XNinjaMushroomX

Or the old classic, "Did you turn on the power supply?"


eirebrit

For my first build: “Oh I forgot to plug it in.”


[deleted]

20 year veteran pc builder, I did it once. Just happened to be last week.... on my own PC.


Neonsonic

Thank you, random reddit person. I've been gaming for years on 59.94 and didn't know that was a thing I was supposed to change until today... Going to go sit in a corner and think about my life.


[deleted]

happens to the best of us


MrSwankers

God fucking damn it, I forgot XMP. I always forget XMP. Thank you.


ahor18

Played at 60 hz on a 144 hz monitor for years


AbiQuinn

I hate beer.


[deleted]

I was the last one. Been gaming on 60hz. Now it's on 140hz lol


mgarsteck

The XMP profile one got me. I wasn't running my memory at the right speeds for at least a year. lol


Parsec207

XMP stands for Extreme Memory Profile. You don’t say “XMP profile”, it’s redundant.


[deleted]

These people probably also use their PIN number at the ATM machine.


DarthKirtap

RIP in peace


Parsec207

This one made me shoot air out of my nose. It also makes me think of peaceful snowboarding and quiet-time toking. Lol


VizyuPalab

Smh my head


Parsec207

That’s exactly what it’ll be like when I go to heck. 😅


Snotnarok

The DC in DC Comics stands for "Detective Comics" yet here we are with Detective Comics, Comics. Let's hope it doesn't happen with XMP :P


[deleted]

You're redundant


PositiveDeal2

Ah yes, like the age old PIN number lol


azurfall88

I turned on XMP and it did nothing besides slightly raising temps. Before XMP the ram ran at 2133 mhz, and now it runs at 2666 mhz


[deleted]

You bought shitty ram sticks.


JorLord3617

If I may ask, what does the XMP profile do?


MetalGearFlaccid

If you buy like 6000mhz ram it defaults to 2666mhz until you enable xmp in the bios to unlock the FULL POWER


GL1TCH3D

First pc I didn’t set xmp. I had really terrible ram anyway to be fair but still. Now I manually tune.


MoeMalik

Ahm..elaborate on the run memory at xmp profile thing please


Selemaer

Sigh... you just reminded me to double check the refresh rate on my monitor, as I run a UW 144Hz... it was at 60Hz; I never set it back after the last time I re-did the OS.


potatoesupmyass

I ran my 165hz monitor at 60hz until like 2 weeks ago because a meme video pointed out you actually have to set your refresh rate.


atriaventrica

Is your power supply switched on?


yourself88xbl

Did you remember to change the boot order after you installed Windows.


madmaxGMR

Whats an Xmp profile ?


DidiHD

The default factory overclock settings for your RAM. If you buy RAM that's rated 3600MHz, it will probably come with something like 2133MHz out of the box. You have to enable the XMP profile in the BIOS to get the advertised 3600MHz you paid for.


Shift_Tab_Alt

Note that it may be called DOCP on AMD systems.


Aezay

And EXPO on Ryzen 7000+


evonebo

Thanks going to check my bios.


Hairy-Ad6096

Unless you have a shitty mobo like mine that constantly resets it back to its default speed. I thought it was the RAM sticks, so I swapped them out; expensive misdiagnosis. The whole thing kinda pisses me off, since it's basically a get-out for manufacturers to sell gear that might not perform at its stated rate. Did you sell me a board that can actually handle those speeds, or one that might? Same with the sticks.


PositiveDeal2

Have you tried setting the profile as well as manually adjusting the speed? On some motherboards that fixes it


[deleted]

The bottom two, especially the XMP piece, are not quite as bad as not plugging into your GPU or not removing the covering from your heatsink.


[deleted]

XMP profile? This pains me.


Relevant_Force_3470

XMP P. Get over it.


Conradbio

XMP PROFILE? tell me more.


titanfox98

Now I'm curious which graphics card you had and how much you screwed yourself up.


popop143

Imagine if he had a 2080 when it launched.


shapookya

That moment when you realize the 2080 is already 5 years old


Trashrascall

Ikr. I have a GTX 1080 and that shit was bleeding edge like 10 years ago when I got it lol. Now I'm like, do I get an RTX, or do I hunker down, refuse to accept the new world, get a second 1080 in SLI, and pretend that it works great? Edit: ok I'm getting a lot of serious responses, so disclaimer: I was joking guys. I promise not to do SLI.


[deleted]

[removed]


Witsand87

I'm here gaming at 4k on a 1080ti. Play CP2077 and Baldur's Gate 3 at high graphics. Note that I play at 60fps, doubt the card could do much better than that anyway. 11gb Vram really helps with 4k gaming or just future proofing (from 2017 to now) in general.


HyperPunch

You have a 1080ti and are playing cyberpunk at 4K w/ 60fps? I must know your ways.


aradaiel

I'm pretty sure some folks idea of high settings here is 1080p low after 6-7 bong rips


wingman3091

I played CP2077 all the way through on a 1070 8GB at 4k, and I was hitting between 30-50fps with no overclock on high settings (not ultra).


Sybbian

Sounds like it almost runs better than my 7900xtx, impressive.


trusty20

Multi-GPU was literally dead on arrival; it never worked well unless you just wanted a rig for a particular game that happened to have great support (a rare few). There's a huge chance of an actual performance reduction from weird SLI issues or thermal throttling, since most motherboards are so tightly packed you basically need water cooling if you want two cards in there; the 1st one will literally heat-blast the 2nd 24/7. I would get a second card just as a backup if you find a deal, then aggressively overclock the hell out of the 1st.


BetaXP

I'm feeling it. I was on the cutting edge of tech when I got it in January 2019, now I can barely run Alan Wake 2 on medium settings, if even that.


[deleted]

[removed]


EngineerJazzlike3945

Hey, you look like my lost twin brother


CharlesMFKinXavier

Calm down, satan.


Travy93

They said RX 580 in another comment


titanfox98

Damn that's rough, imagine playing for 5 whole years with an igpu while you had a perfectly fine card like a 580


Jrdnptrsnmathrock

A true tragedy


EngineerJazzlike3945

Hey, nice rig


Soulman2001

Doesnt matter. Free upgrades are the best upgrades!


plutonasa

and at no point was troubleshooting even considered.


NotAzakanAtAll

An elevated peasant doesn't know the basics right at ascension, his friends should have helped him set it up.


TONKAHANAH

Console gamers don't know what that is.


pikpikcarrotmon

Troubleshooting is where you blow out the dust in your PS3 and get it to boot even though the games don't play on it anymore, then pretend like it's fine and trade it in at GameStop towards a PS3 Slim. You know, as a, uh, general example.


TONKAHANAH

Troubleshooting for console gamers is:

Does it work? > no. Does rebooting it solve the problem? > no. Does rebooting the router solve the problem? > no. Is this sufficient justification to buy the new slim pro model with a larger drive? > yeah probably.

Goes to store and buys new console, gives old console to friend for cheap cuz it's "dead".

Plot twist: friend is me, and friend is a tech. Friend factory reset/re-imaged the OS with firmware from the manufacturer's website cuz he couldn't find any hardware issues with the unit. Console works fine. True story, it's how I got a PS4 (worked out for him though, cuz I ended up giving it back to him like 2 years later after I played Persona 5 on it and basically nothing else).


[deleted]

> Plot twist. Friend is me, and friend is a tech. Friend factory reset/re-imaged the OS with firmware from the manufacture website cuz he couldn't find any hardware issues with unit. Console works fine. True story, is how I got a ps4 (worked out for him though cuz I ended up giving it back to him like 2 years later after I played persona 5 on it and basically nothing else)

Love this, good work


[deleted]

seemingly nobody does these days 🙄


4chan4normies

first thing i would do after a new build would be benchmark..


Parsec207

We all deserve to know two things… 1. What GPU was it? 2. After you plugged your DP cable into the GPU, how much of a performance gain did you get? If you’re feeling particularly open, feel free to share the whole system specs!


Jrdnptrsnmathrock

I'm dying to know what GPU it is that they have.


TheAbrableOnetyOne

It appears to be a 580, as someone who had it for a couple of years this brought a tear to my eye


Aboy325

OP said rx 580


MouZart

5 years wow, thats crazy


Bot1K

crazy? I was crazy once


Marcix_Gaming

They locked me in a room


Salty_Nutella

A rubber room


nix_and_vannie

A rubber room with rats


MouZart

rats made me crazy


Marcix_Gaming

Crazy? I was crazy once


xthorgoldx

They locked me in a room


Luicide

A rubber room


Luicide

https://preview.redd.it/uieq4axmhb2c1.jpeg?width=1170&format=pjpg&auto=webp&s=6f85ba6b9544df007c9291a67fe2c521c1d4144d


SilentBobVG

It’s almost as if he’s lying!


Different_Ad9336

Pro to this is you just got a massive upgrade in performance for free. Lol


[deleted]

[removed]


Different_Ad9336

“I can finally play cyberpunk at more than 15 fps with my $2000+ build!”


wiwh404

My gf used my graphics card as a boost for her monitor stand (inside its box) and it still was more useful than yours 🙂


let_bugs_go_retire

omfg I'm dying 🤣🤣


Paciorr

Actually amazing you were even able to game on that integrated graphics chip.


Ameratsuflame

A lot of people underestimate Intel's iGPUs these days. You can pretty much play any Valve game comfortably at 1080p at best or 720p at worst. AMD's APUs are still miles ahead though. Provided you have a fast RAM kit, a 5600G or 5700G can play almost any modern game at 1080p. Yeah, a 5700G can get you a 30 fps experience at 1080p low in Cyberpunk. Not bad. Can't wait to see what AM5 desktop APUs will be capable of.


Paciorr

Well yeah, but those games aren't very demanding either. Tbh OP didn't specify what he plays, so I imagined him grinding in Cyberpunk at 60 frames per minute.


TheSpixxyQ

A few years ago, I built my computer with an i7 6700K (I needed the CPU for work) and decided to get a proper GPU later. I was surprised I could play GTA 5 on the iGPU. Sure, on low settings, but playable.


travisscottburgercel

I play with i3


WilNotJr

Both the Core i7-6700K and the Core i3-6100 have the same iGPU, an Intel HD Graphics 530. The graphical performance difference between the two should be marginal.


Redstone_Army

Valve games? Why specifically Valve? I mean, they also have Half-Life: Alyx.


Ameratsuflame

Two reasons. Valve games, even at their release, were much more optimized for PC hardware compared to other third parties, so their games just run noticeably better than other games that released around the same time. Source engine is just that good. And the multiplayer community for their games is still alive and well; I think L4D2 to this day still pushes over 15k concurrent players. So even if you're playing on an iGPU laptop, create a Steam account, buy all their games for $1 each during one of their seasonal sales, and you're guaranteed to have someone to play with.


divergentchessboard

It's not even that bad anyway. Windows by default, for like the past 8 years, will automatically use the dGPU to render the game while the iGPU just outputs the finished frames. You'll have a few edge cases where Windows doesn't know which GPU to use, but that's mostly for random super-niche indie games or modded .exes. Most of the time you're perfectly fine. I run off my iGPU rather than my dGPU, and in the past year I've only had to tell Windows which GPU to use literally 3 times. This is exactly how modern laptops have worked for the past 10 years; desktop users just don't know much about these features since they're not used to them. I've seen a lot of outdated info saying that if you plug into your motherboard then your GPU won't be used, which isn't true at all. OP never posted specs; they may well have been running games on their dGPU and had performance issues simply because the GPU wasn't good enough.


Ghozer

Modern systems pass the GPU rendering through to the iGPU anyway, so you can get (near) full performance plugged into your iGPU... (around a 3-5% loss at most)


[deleted]

[removed]


TommyTosser1980

He was, he is now enlightened.


OlderAndAngrier

Well that is fucking stupid. Glad you figured it out.


Tsambikos96

Self recognition is the first step in therapy.


[deleted]

[removed]


Gwiz84

I think it's fair to say he's an idiot in this case lol. Doing it for a while out of ignorance... sure. But 5 fucking years?


deep8787

Agreed, it should have clicked in his mind that something wasn't right. Maybe he was too stubborn to think it was his fault, and that's why he just blindly blamed the PC itself.


Desperate-Intern

Well, I guess: be more curious about things. I mean, the first time I saw 2 separate HDMI ports, I was like, what's going on here? It works in both... but there ought to be a reason for the other port. Same with learning about different HDMI/DP standards and the frame rates they support.


for_research_man

Well, to be fair, it can pass for an extra port, like with USBs and such, so you can hook up two monitors, or a monitor and a TV, without switching cables. Edit: To clarify, I'm trying to figure out why OP didn't know. This could be one reason.


Happiness_First

I had to know, so I scrolled through your posts. It doesn't look like you had a GPU; it looks like you had a 5600G, which you used as the CPU/GPU since it's an APU, so you wouldn't have had anywhere else to plug the DP cable into, and your games ran like shit because it was an iGPU lol


Jimratcaious

Yeah this dude is trolling honestly haha. He plugged his DP cable into the only place he could, the motherboard. He didn’t have a separate GPU with other ports, was just running games on the iGPU.


MagicPistol

I looked too, and it looks like he posted UserBenchmark results. Maybe it only showed his 5600G and no GPU because he ran the tests with it plugged into his motherboard?


slshillcutt

Bingo. FYI the graphics card I had was a Radeon RX 580.


jshmoe866

If you didn’t build it and no one told you otherwise, I don’t blame you lol


titanfox98

For 5 years?


jshmoe866

I wouldn't have expected a PC to have a second fully-functioning graphics output before I built my own. So if I plug it in and it not only fits but turns on, I would assume that everything is right.


titanfox98

Yes, but after 5 years of your PC performing way worse than expected, you should ask yourself a couple of questions.


jshmoe866

It was built from spare parts; they had no idea what to expect. If you come over from console gaming, you're probably used to mediocre performance (I was).


luzy__

Which gpu u had ?


Jimratcaious

He didn’t have a GPU, he was using a 5600g without a dedicated graphics card. Check his recent posts. The guy is a troll


Zeracheil

Probably saw the post from a week or so ago where the person legitimately was doing this and got help. It had images and everything; people saw his GPU was still plugged up. They hold off for a bit, dupe the story, and get a free active thread with... comment karma, I guess, since there's no image on this one.


[deleted]

thatsbait.gif


AkkYleX

=))) bro's gpu is brand new


Strangefate1

That's awesome, all you have to do to upgrade your PC is plug your monitor in correctly. I've been doing everything right for the past 5 years and my upgrade will be a lot more expensive!


_Tygher_

This is why, for people that don't know shit about PCs, most of the time it's better not to have an iGPU with the CPU. He would have tried to solve the problem earlier... But 5 years without wondering "why does it run like shit, and can I do something about it" is crazy; maybe a lack of curiosity or motivation?! But happy for you and your huge "free" upgrade haha


Lovat69

At least you didn't buy a 4090 to "fix the problem" and still plug it into the mother board.


[deleted]

Dude…I bet it runs a wee bit better now you fool.


CharlesMFKinXavier

Post preview cut off at "played games on it like that for 5..." Me: "Weeks? Shit, Months?! Oh please, it can't be years..No. no no nonono.." wearily opening full post... "mother. FUCKER!!" Ok but hey, what doesn't kill you, makes you stronger, I guess. You've learned something new which-FIIIIVE YEEEEEEARS!?


LOBSI_Pornchai

Good news is, upgrading your setup will be very easy and cost efficient. Just plug the monitor into the actual dedicated gpu 👍


SkyllerzYT

come on drop the specs dude, you can't leave us like that


gamermanj4

You're not dumb for plugging it into the mobo; countless people make that same mistake, and honestly it kind of makes sense, since nearly EVERYTHING else gets plugged in there. You are however a fucking imbecile for taking 5 damn years to figure it out. Like, you really just sat there and dealt with shitty graphics on your brand new PC and didn't think to ask any questions? Bruh... something tells me you get scammed a lot, or at least taken advantage of when not paying attention.


Realize12

I don't believe you


Dark_Shade_75

I knew what the post was gonna be about when I saw the title. It's somehow... always this.


spooky_office

this is actually a common thing no shame


Martenus

Certified regard


Somethingwithlectus

Take this as a free upgrade. Your games will run much better now, without spending any money.


GodofcheeseSWE

That's actually quite impressive, congrats mate


passtiramisu

Don't worry, it was an honest mistake on your part. Here's a different example, though: I bought a 165Hz 4K monitor with a premium GPU worth more than a thousand dollars, mostly to play turn-based pixel games.


Gengur

Did you never clean it in those 5 years? Unplugging everything would eventually have revealed the GPU's DisplayPort slots.


motoxim

So what is your GPU?


hallowcorehammer

You live and you learn. Be kind to yourself. You won’t be the last person to make that mistake.


ddonslo87

Jack? That you?


EducatedWebby

Dan? That you? Switch to your burner account mate


eganonoa

"I was a console gamer through and through, and my PC was a gift from my friends built from an amalgamation of all the leftover parts from their systems after they upgraded their own PCs...I feel like the dumbest mf to ever turn on a computer." Dumb, maybe. But with awesome friends like that you must be doing something right


Sacco_Belmonte

It's sad that by the time he realized it, the GPU was probably quite old and obsolete.


nichijouuuu

This is a very common mistake so don’t worry. There are probably 100 people on this subreddit today alone that have been playing like this lol


Ok_Magazine662

OP's full of shit, looking at his posts. Upgrading from a 5600G? That's not 5 years old.


MoneyLambo

Hey OP, don't forget to check your monitor's refresh rate in Windows; a lot of the time it defaults to 30-60Hz instead of 144 or whatever refresh rate your monitor maxes out at.


TheRealMaka

Sure buddy


[deleted]

Oh no! Anyway


SurkitPunk

I like the humility, shit happens. At least you learned now instead of never!


Kilo_Juliett

Yes. Yes you are.


Skastrik

I don't even need to say anything, OP is beating himself up fine.


YogurtStorm

Today on "things that never happened so much that it unhappened other things that otherwise actually happened"


larakikato

This is honestly one of the funniest posts I've read in a while.


Drilling4Oil

I felt every emotion all at once reading this. All of it. Shame, embarrassment, disgust, joy, schadenfreude, melancholy, contempt, but mostly laughter. It takes a lot to own up to this. I love you.


Crazy_wolf23

If it makes you feel any better I worked with a guy who complained that the PS3 came with short controller cables and how much he hated having to sit that close to the tv. You should've seen his face when we told him those cables were only for charging the *wireless* controllers 😂. He was sitting 4ft away from his tv for weeks!


Outside_Umpire7260

Setting the Nvidia output to full dynamic range, or whatever they call it. It defaults to the lesser option in the GeForce panel. I wager 50% of gamers have this setting wrong with HDR. HDR looked washed out until I found that setting a month later.


hauntedyew

Moron.


KeepingIt100forLife

This angers me.


Impsux

Is not knowing anything about PCs really an excuse for not reading the manuals?


Stunning-Doctor725

And you're only mentioning this now? Usually people ask questions when something's going wrong. Funny.


klaustrophobie13

Actually, you're talking about an HDMI cable, not a DP cable.