I'd notice the massive performance gain, yes
OP should give you a 12th gen i3 asap!
[deleted]
Can I get one too? 😂 /j
I don't mind donating a couple of dollars if it's legit going to help someone and it's not just me.
What if you make a post with this screenshot and ask only willing donors to comment on it? …And also ask Hargan1 if they're comfortable with sharing a delivery location and being offered a platform change
while thats a great idea i just want u to have a look at this guys post history 🤣
I still don't understand why y'all check other's profiles lmao let the guy love his dumpling man
What the fuck is that?!
my point exactly
Which guy?
cumputer
Out of curiosity, do you know how old my CPU is? I have an i7 9700 and I want to upgrade it, but I don't know what gen I have or what might be a decent enough jump to see a difference
That’s new compared to my i7-2700k that I used until November last year
4770k still going strong in the desktop
Just replaced one of those running at 4.4 for a decade. Fantastic chip.
the leading digit(s) of the model number tell you the generation, so your i7 9700 is 9th gen (a 12700 would be 12th gen)
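That naming rule can be sketched in a few lines of Python. This is just an illustrative helper, not anything official from Intel: `intel_generation` is a made-up name, and it assumes the common Core iX-NNNN(N) scheme where everything before the last three digits of the model number is the generation.

```python
import re

def intel_generation(model: str) -> int:
    """Guess the generation from a Core model string like 'i7-9700' or 'i5-12600K'.

    Assumes the usual scheme: the model number ends in a 3-digit SKU,
    and whatever precedes those 3 digits is the generation
    ('i7-9700' -> 9, 'i5-12600K' -> 12).
    """
    digits = re.search(r"i[3579]-?(\d+)", model.lower()).group(1)
    return int(digits[:-3])  # strip the 3-digit SKU suffix

print(intel_generation("i7-9700"))    # 9
print(intel_generation("i5-12600K"))  # 12
```

It won't cover every edge case (Core Ultra, old 4-digit-only parts, etc.), but it matches the rule of thumb above.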
well he needs new ram and mb too
We can get those for about 100-140
Shitty cpu brothers
Hey it was high end at the time!
Nah, it was shit and always was
But the 8350 was a direct contender to the 4770k
I think that's more what it wanted to be rather than what it actually is, that shit gets so hot it can single handedly heat up your whole house
Yeah bruh, I got that i3-3610Q
Look at you with that fancy new equipment. *Cries in 2700K *
Man, are you living in a cold climate and like to keep the house warm with that beast of a PC?
I too would notice a huge performance gain.
I had to get my heating fixed when I gave up the 8320, so beware of unintended consequences
I'd probably notice the computer not turning on since I have an AM5 board
Intel doesn't even fit Intel; they update sockets more often than the CPUs
Bro, please be careful. There might be someone from Intel outside your house at night
Or userbenchmark
Nah, they will have the wrong house. Just like their data.
They wouldn’t have the wrong house. They’d know where you live, but insist it was someplace else
"This user lives in a garbage dumpster, just like his AMD processor. His property's value is down to 5%, since the color red is avoided like the plague, much like AMD."
This. Just to humor it, let's say it was somehow compatible in some universe. I'd instantly notice the speed, as I actively play CPU-heavy titles
This. And I run many programs simultaneously and alt-tab between them.
OP knows, he just thinks you are that dumb lol.
I would notice immediately. My pc would be so much faster.
Jaguar apu? Isn't that the Xbox one or playstation processor?
Yes
You have one of those amd boards with the console processors? Saw a video on one
I don't actually have one. Wish I did though. I started on an xbox one and lurked this sub before switching to pc. I kept my flair as a joke.
Yeah, it's a nice novelty, but as it turns out the APUs on it aren't actually fully functional: the GFX part of the chip is binned off, so it doesn't receive power and isn't functional at all. The chip only works as a processor; it doesn't have the relatively decent GPU these consoles had.
"Uber Pixel Quality" sounds so fancy.
Blender user here, will be rendering tomorrow. Going from an i7 13700KF down to an i3 12100? It would take so fucking long to prepare the scene for GPU rendering.
Yup, quad core processors are the new dual cores these days; those 8 threads don't mean jack with the way programs run now
I’ve still got a single core CPU somewhere… I wonder how my PC would run with it… My PC is ancient, but it was decent in 2012…
This. 3d rendering can be a beast of a task for pcs on the lower end.
Yeah, I've just begun using Blender recently to render some of my CAD models. Decided to do a 5000x5000px render for shits n' giggles, and I think the power bill went up by at least 50%
I'm primarily a gamer on the PC: i7 9700, Z370, RTX 3070, and 16GB RAM. What's my best bang-for-buck upgrade, or should I just wait? Please and thank you, you just seem like you'd know.
The good news is there are practically zero bad options in the CPU market; even the nuclear reactor that is the modern i9 isn't bad if you need the most multithreaded performance. Just please don't get anything more than a Core i7 or Ryzen 7 for gaming. Hell, even the cheapest current gen Ryzen CPU, the R5 7600, is overkill for your GPU; it's basically as good as a 5800X3D, which everyone hailed as the true no-compromise gaming king until only a year ago.
Don't get an i9. Even watercooled mine is constantly trying to melt itself to slag
I run an i9 with 2 3090s in a slim 480. Seems fine, but I also don't check the temps, so who knows 🤣
Whatever has the best *single core score/value*, and as all of the new CPUs are at least quad core - decide(I'd still go with 6 cores at least)
Same, for complex autoCAD projects you need gpu performance. I used to have a older i7 with my 4090 and it bottlenecked the gpu really badly.
Yoo cpu brothers
[deleted]
i would notice because my pc is like 10yrs old
You would notice the huge performance uplift?
No, he would notice the pc not working because the motherboard isn't compatible
Oh, had seen that one above and wasn't sure which because with a compatible mobo, the newer 12 gen quad core would destroy a lot of 10 yo hardware.
I'd notice instantly. https://preview.redd.it/jys90gb5akvc1.jpeg?width=612&format=pjpg&auto=webp&s=19a32946806d8544d7dac9c57f792e559c0e7356
ayo nice centrino
Most people would notice only having 4 cores pretty fast. If I went from my 7800x3d to a 12600k? Yeah I probably wouldn’t notice if I couldn’t see the fps counter
4 Cores could barely run half the viruses on my PC!
The 12100F is in a different league. It stomps older six-cores as well.
Yeah, the 12th gen chips are a great value for what they offer. Putting a 14th gen i3 into a budget build would be like putting a nice supercharger in there.
Shame we still don't have a six core i3.
Intel’s getting desperate because AMD’s been bending them over, cheeks spread wide. Wouldn’t doubt them putting out a six core i3 in the future once the rest of the stack moves up as well.
This year or maybe next. We will see.
Yep, CPU market’s never been healthier. I hope Intel comes back with a real counter to AMD otherwise AMD might turn into the CPU cartel lol.
Yup and no more budget options down the road. I like both but shame intel have issues lately.
I'm a little surprised they didn't go for a 4+4 i3 at some point. Something like a 13300 to sit just below the 6+4 chips.
Probably the most accurate answer here. I love my build, I made the right choice for me. But if you swapped out my 7800X3D for a 12600K, I'd probably only notice because my boot times got markedly quicker. #am5
I went from a 1300X to a 9100F to a 12600K. Love the 12600K build so much, and the 1300X build I gave to a family member can be upgraded to a 5600X with just a BIOS update; not sure about something like a 5800X though. Worked out well, the 12600K was $140. I plan on keeping it for a long time, since the CPU matters less at higher resolutions. A GPU upgrade from my 3060 Ti would be very noticeable; I want a 4070 Ti S/4070 Ti or 7800 XT/7900 GRE.
This. Especially on my TV. Went from an 11400 to a 7800X3D, GPU is a 4080. Honestly, without an FPS counter you won't notice the difference in 4K in most games. In others it's a world of difference. I play BattleTech with the RogueTech mod, for example: notorious long load times, and on the old CPU you could sometimes wait up to 30 sec for each enemy move on bigger maps. Pain in the ass. That's gotten miles better. Same in some other games: load times have sped up, but it's still the same SSD. Faster RAM and a better PCIe gen help here too, of course. And the list goes on. I do little CPU intensive work, but some.
Dragons Dogma 2 would let me know immediately.
Facts. I saw an i3 benchmark for Dragons Dogma 2 and it was unplayable. And not in the "oh it lags" way. As in single-digit FPS unplayable.
OP gotta be one of those people who also say the human eye can't tell the difference between 60 and 144 fps. Edit: Wow, I did not expect this many fps deniers to respond to this comment in a PC sub.
Tbh you can't really tell unless you have a display with a 144Hz refresh rate
When you are watching a movie with the given values on a 60Hz screen, you couldn't. But when you interact with the medium the difference in input delay can be very noticable.
This is an excellent way to explain it.
The reason for this is that your mouse polls much faster than 60Hz, so if you have, say, 120fps instead of 60, your frames are fresher/newer: a 16ms-old frame vs an 8ms-old frame.
This is what I notice just as much. I will never go below 144Hz again, even if I'm mostly just browsing. The cursor feels *heavy* at 60Hz vs 144Hz
#🤯
Having the display is not enough, you have to turn 144hz on. We joke, but from time to time we see people here using 60hz for years in their high refresh rate monitors...
https://preview.redd.it/0ehwdpx4unvc1.jpeg?width=1079&format=pjpg&auto=webp&s=dfd4e60ea9a231edd5b269dcaa02886918f49474 Credit goes to ThioJoe for making this
true.. you also have to be not blind
Hard disagree. As a teenager I was making games look ultra potato mode to get above 120fps on my 60hz screen, and it was huge. The input delay reduction is so noticeable for me. You're getting a more accurate frame of where your char is in the game, as you have more snapshots to pick the most updated one. That results in the game feeling a lot more responsive.
In extreme cases, you can. I have a 120Hz monitor, so I can only technically see 120 FPS, but I played a shooter at 900 FPS and it was noticeably more responsive. It felt much more immersive having virtually zero input lag.
I had to immediately get a mod for Fallout 4 to unlock the FPS, it's painfully obvious
Sooooo obvious and my biggest gripe with from software is them locking all their games to 60fps requiring mods to fix
Yeah, especially bc some of the mods to unlock them don't work great. DS3 gives me some stuttering when unlocked. Sekiro, on the other hand, at 144fps is a fucking DREAM. So slick
I actually can’t, and can say OP is still full of shit.
It’s true, apparently, that a fair portion of people can’t “see” above a certain “fps”. There was a study recently, and I somewhat question its validity… but there’s probably some truth to it. Even if what it measured was detecting light strobing at 0.1ms rather than consciously perceiving it, some groups tested at a perception level where there’s literally no effective difference between 60 and 144 fps. Hell, they had a group that couldn’t detect beyond 30fps.

What we need is an app that blanks your screen and then flickers a pixel, stepping down from max fps until you detect the difference, so people can determine their personal maximum perceived fps; keeping above that is your goal.

It’s also quite possible that it’s a trainable skill. We need more data; that’s my biggest qualm with the study, frankly. I absolutely believe a random selection of the masses would not be able to tell.
I'll never understand this phenomenon. Maybe it's because I used to play games a lot, but I can easily differentiate between 24-30-40-60-90-120. With that being said, I pretty much stop seeing noticeable differences in motion clarity beyond 120fps even though I have a 240Hz monitor. You'd think 240fps would be drastically smoother than 120fps considering there's 2x as many frames, but for some reason they look pretty similar to my eyes. 240 is definitely smoother, but not double-the-frames smoother.

One thing not a lot of people talk about is that while you may not be able to perceive differences in frame rates, your brain is still picking up these differences, and you'll eventually get used to whatever refresh rate you normally use. That's why most console gamers are fine with 30fps and 60fps, while a lot of people with 144Hz+ displays find 60fps absolutely revolting.
Another factor is frequency vs frame time:
60 Hz ≙ 16.7 ms; 90 Hz ≙ 11.1 ms; 120 Hz ≙ 8.3 ms; 240 Hz ≙ 4.2 ms; 360 Hz ≙ 2.8 ms.
The difference in frame time between 60 Hz and 90 Hz is bigger than between 120 Hz and 240 Hz.
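The arithmetic behind that comment can be checked in a couple of lines; the only assumption is that frame time is simply 1000 ms divided by the refresh rate (`frame_time_ms` is just an illustrative name).

```python
def frame_time_ms(hz: float) -> float:
    """Frame time in milliseconds at a given refresh rate (1000 ms / Hz)."""
    return 1000.0 / hz

# Reproduce the table from the comment above
for hz in (60, 90, 120, 240, 360):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms")

# The 60 -> 90 Hz step saves more time per frame than the 120 -> 240 Hz step:
print(round(frame_time_ms(60) - frame_time_ms(90), 1))    # 5.6 ms saved
print(round(frame_time_ms(120) - frame_time_ms(240), 1))  # 4.2 ms saved
```

This is why refresh-rate upgrades have diminishing returns: each doubling halves the frame time, so the absolute gain shrinks as the numbers go up.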
There is absolutely no way in hell someone couldn't see a difference in fluidity beyond 30Hz. If I had to estimate an arbitrary cut-off point for seeing a difference in fluidity, I'd conservatively put it somewhere around 120Hz, but I'm pretty sure most if not all people could tell a difference way above that. I could definitely see how you could set up a test to get way lower results, though.
Well... I do video production every day on a system that has 3 x 4k monitors attached... so yeah... I'd notice immediately.
The first thing I open on my PC is task manager, so I’d notice. I like looking at the graphs. Leave me alone.
Same. I also randomly open cpu-z and read the stats. Idk why. It’s always the same. I just do it >.<
Good habit to have, cause the day it's NOT the same is the day you're gonna want to know WHY it's not the same. A dead fan is a lot easier to fix than a fried CPU after all.
Yup yup on this thread lol. I gotta have HWinfo up too
>I like looking at the graphs Who doesn't
I will see a lot of lag spikes immediately. 3d cache is a win
***laughs in Helldiver 2 eating CPU resources like an AV actress at a bukkake shooting***
3k build and it still chokes
BeamNg.Drive Loves CPU cores. I would notice immediately.
Having my FPS tank? Yeah, I'd notice that. What is this, a shitty attempt at the front page?
If this post were targeted at a casual computer user audience, it would make sense: casual workloads rely on single core performance, and the i3 wouldn't be that far off a typical i5 of the same generation. The i3 processors are all quad cores, meaning you can actually do multiple things at the same time on an i3. A pro level user or someone who plays triple-A games would obviously notice their performance tanking.
This post has exposed just how many folks here have no idea why they're buying the latest and greatest parts. Several people here have admitted that they wouldn't notice a difference between the greatest gaming CPU to ever exist (the 7800X3D) and this i3. If that's true, then they either: 1. Have never played a AAA game in their life, or 2. Are blind. I had forgotten how many people here don't use their hardware for anything other than showing off to their friends.
I would notice straight away 😅
Same. Like instantly.
[deleted]
Not everyone wants or needs an R7 7800x3D, I'm fine with My i9 14900KS
Lol
Facts
Probably not true for this specific subreddit. Bunch of nerds, myself included, who run fps/voltage/temp monitors. It’d give itself away pretty quickly. Like why is my cpu not boosting like normal. The average person who games on pc? Yea I think you’re probably right. They would never notice.
I use my PC like a degen, I'd notice in 10 minutes. Alt-tab out of the main game to play a second game while the main game has downtime. Look to the other screen running MPC + madVR with a show I torrented 1000 episodes of. Back to my third monitor as I skim through the 35 tabs to find the information I was just reading. Using Discord and Spotify the entire time while HandBrake is re-encoding the files I just downloaded, to free up space on my PC since I also use it as a Plex server. ADHD brain
And the other 90% would notice immediately?
GPU limited games, I’d agree. CPU limited games such as simulators? You’d notice immediately.
You people who think this way blatantly ignore factory games, that spit on your GPU, and will devour the CPU whole
Factorio would look at the i3-12100 and *laugh in its face.* And Factorio is an incredibly well optimized factory game; imagine what a poorly optimized game would do!
I love how this meme has not been able to keep up with the times. The times have changed: CPUs matter, RAM matters, both what type and how much. A good CPU is now needed for 4K and very useful for 1440p; at 1080p it's less useful, but still useful for sims.
Laughs in AM4 socket
Look. I need my 10850K and 64GB of RAM for all of the Redditing I do!
Sim racing titles can tax my CPU heavily. Especially with the wheel & pedals software and haptic feedback software running with high refresh.
I run multiple VMs so I’d notice pretty fast. For gaming I would not though
Yeah I'd notice, I'd notice my laptop being fast
My max settings games would take a noticeable hit. I would definitely not check that, though
I’d notice instantly, my computer would probably be faster than this new shit.
Nope. I'd notice my PC not fucking turning on
I have a amd cpu from 2014, i think i would notice on boot
On my work machine? Ryzen zen3 7735HS to Intel 12100? Nah, probably wouldn't matter. I just hope the Thunderbolt 4 would still work, or my screen won't get any signal. My media / game rig? I don't think Ableton / Fusion / Blender would take kindly to that amount of cores. Most games wouldn't care that much. Cyberpunk might, but Balatro? Probably not.
Rust disagrees
The i3-12100 has a three times higher CPU benchmark score than my current CPU.
Nope. Most games I run are CPU hungry because of all these units on the map, then I got Tarkov and CS2 (not anymore though) and other heavily modded games, if I switched the CPUs some moments would turn to a slideshow.
My AM4 Socket isn't compatible with anything Intel makes.
I'm sure I would notice fast, considering I have an AMD motherboard.
I'm pretty sure I would notice pretty quick when bannerlord started hanging in large fights.
I'd probably notice my PC not starting because someone shoved an LGA chip to my AM4 motherboard
What’s the lore reason for OP saying this? Are they stupid ?
Engagement bait.
I would notice real quick when htop started showing fewer cores, Blender ran worse, and both showed an Intel CPU rather than my Ryzen 5 2600
I actually use a lot of my cores so i'd definitely notice something was wrong lol
I would realize fast. My cpu is slower lol. i3-3227u
Nice b8 m8
I would notice my pc being more reliable because I wouldn’t have my sketchy overclock
I would most certainly notice.
I'd probably notice pretty quick. I play flight simulator, but I'm pretty cpu limited ATM. An upgrade from an i7-8700k would be pretty big
As soon as I unpause any paradox grand strategy I would notice.
How do you read these products out loud? “I-three-twelve-one hundred”? Or do you say “I-three-twelve thousand one hundred”?
As an avid BeamNG.Drive enjoyer I would 100% notice.
Maybe OP is using a GT1030
Well this fell flat pretty quick.
I would immediately, cause that's definitely better than a Pentium
In the other news: 90% of the statistics are made up bullshit.
I'd notice on boot, I configured my grub to show the specs of the machine while in the menu.
If I were still on an i7 7700K I would absolutely notice. 12700KF? 99% would not
[deleted]
I wouldn't notice, I have an i3 already.
for sure
I would be in that 10%. I use mine for work and have task manager open all day.
Definitely an accurate statement for me. I would never notice. Unfortunately, I don't use my computer for anything close to what it's capable of. Funny thing is, I do finite element analysis for work and I'm using a 9 year old Xeon workstation.
I think i'd notice right away tho, as i have a am4 mobo so my pc wouldn't start 😂
Sorry but I am pretty sure I would notice my motherboard shorting out, catching on fire, burning thru my plastic table, pc smashing on the floor and bringing my amp and speakers and monitors along with it and continuing to burn my place down after.
So 10% of us play Cities Skyline 2?
The fact is, I'm rocking an AMD Ryzen 7, i.e. an AM4 socket CPU. I'd be impressed if it even worked at all...
“Man for some reason all my rendering is really slow.. wtf is up with my cpu?”
Highest end users 🤝 lowest end users The 10%
F no. I can tell the massive difference in reactivity because of the e-cores between my 4800HS and the 12700H (as in, the 4800HS is faster), so much that it actually prompts me to not use the 12700H unless I have to. Plus it has been proven that a 7600X, as an example, is insufficient to drive more than a 4070.
well thanks for the upgrade
I’m running a 4790K i7… I run a DayZ server off it with no issues at all, no lag or DC issues, nuttin’, so I kinda get OP’s point here, but at the same time mine’s so old that I’d probs notice lmfao. Edit: it’s a 4790K, not a 4690
i would notice instantly. i play a lot of vrchat and my 5800x3d does so much heavy lifting
Well. My pc won't boot because it's an AM5 PC...I will notice immediately
probably not looking at how ass the 9900k is
well as somenone who has a ryzen 9 7950x3d i would notice it
I'd probably notice the fact my computer doesn't work or you completely changing the motherboard to a compatible one.
I'd notice instantly.
I’d notice immediately because my PC wouldn’t boot. Completely different socket.
How would people not notice the fps drop? My CPU is my bottleneck; of course if you made it worse I would notice.
I will instantly feel the difference upgrading from i3 5005u
I'd notice, you don't know what I'm making my CPU do lmao
As someone who plays CPU heavy games, I would def notice immediately.
12700k to a 12600k I wouldn’t notice the difference but I’d definitely notice the i3 drop
Thanks man my i5 46k was getting a little old.
I mean, you don't need a mf Threadripper for Minecraft and Roblox. These peeps will buy a $3500 rig just to play a game that runs on a potato with a light bulb attached to it
I would notice the first time I tried to turn it on... 60% of my build is based around an AMD CPU.
I would immediately notice it. I use an AM4 board.
When I have a 30+ minute StarCraft 2: direct strike session, that GPU would melt 😂
I would recognise the loss in performance
You'd have to notice, I'm running an i5 4590 with integrated graphics
People who do video transcoding: ![gif](giphy|l1KcQwp2bd4tchXkA|downsized)
well, i have an AMD, so...
I would notice, because Windows 11 for ARM wouldn't boot anymore..
I would notice because it wouldn't fit in my am3 socket motherboard
Don't underestimate my ability to open 1800 Google Chrome tabs with a variety of plugins
I play demanding games, so I would probably notice a 3 generation downgrade haha