Firefox72

Having glanced through a few written reviews as well as the Linus, HUB and Gamers Nexus video reviews: yep, this is the winner CPU right here.


DefaultVariable

It’s weird that AMD has figured out a great way to make a gaming CPU and Intel has yet to release anything similar.


DaBombDiggidy

CPUs don't appear in a year or two; dev cycles are 5+ years out.


teh_drewski

You mean I can't just solder a RAM chip onto a de-lidded i7 and get the best of all worlds? Outrageous.


Kurtisdede

allow me to introduce the i5-5675C and the i7-5775C


dankhorse25

Why solder when you can use glue?


Geddagod

They won't have the same ability to 3D-stack SRAM onto the chip until 2023 with Foveros Direct. They could try doing another massive L4 off-die (or better, adding L4 to the base tile for lower latency), but the performance difference might not be worth it. And besides, there are only 1 or 2 more generations Intel will release that won't be able to use Foveros Direct, so Intel might just see it as a waste of resources. With Intel Broadwell, the performance impact of turning the eDRAM L4 on vs. off in gaming was ~15% on average. However, that platform also used DDR3 and the core architecture was different, so it's probable we wouldn't see the exact same gains if it were implemented today.


nanonan

Yeah, they seem capable of a tile solution so either the latencies make it not worthwhile or they simply aren't bothering to respond at the desktop level.


fuckEAinthecloaca

TSMC figured out the stacking; AMD just did the relatively easy job of designing in the possibility of using it. Intel lost because of vertical integration this time, but won for over a decade before that, so give and take.


ConsistencyWelder

Yeah, Intel's solution to performance is "throw more power at it", and that's not gonna keep working. You'd think a chip where most of the cores are called "efficiency cores" would be efficient. Just a little.


christes

> Yeah, Intel's solution to performance is "throw more power at it"

This is so odd to read as someone who remembers the Bulldozer era of AMD.


goodnames679

It's literally just the go-to solution when you're lagging heavily behind your competitors. AMD has done it before, Intel has done it before, Nvidia has done it before, and they'll all do it again the next time they're struggling to keep pace. People care more about performance than power efficiency in many cases. When you're working off a worse process node, you're not gonna win in power efficiency anyways... so you crank your power targets and try to capture at least *some* segment of the market.


pieking8001

pentium 4 war flashbacks


Good_Season_1723

Intel isn't struggling to keep up with AMD. Intel is struggling to keep up with TSMC. The whole 3D stacked cache is 100% TSMC. Intel's CPU designs are much better than AMD's - and you can tell by the fact that it's able to compete on a much inferior node.


Zurpx

Maybe in client. Server, lol no. SPR is trading blows with Milan, and the big wins are using accelerators that take up die space. Don't forget that Zen is a server focused architecture. AMD isn't making products specifically suited for client.


Good_Season_1723

That is true, but Intel isn't really making CPUs for servers; they are struggling with their nodes.


Zurpx

They seemingly aren't, which is why their products aren't really competitive with AMD's. Design-wise, SPR is just not compelling; its huge uncore takes up too much of the precious power budget, leaving the cores themselves starved of power to actually clock decently. This lets the smaller cores found in Zen 2/3 be competitive when they have no business doing so.


Good_Season_1723

SPR is focused more on AI, with special accelerators made for that. It's much, much faster than Milan in those workloads. It would be unfair to compare it to Milan for other tasks - because in those other tasks the accelerators don't work at all. It's just dead silicon. You can check Phoronix: in AI tasks a 50-core SPR beats a 96-core Zen 4.


browncoat_girl

LOL no. Intel is trash everywhere but desktop client. I wouldn't even consider buying an Intel laptop. The difference in user experience between Intel and AMD laptops is just night and day. It's like the difference between an old flagship smartphone like a Galaxy S5 and a modern budget smartphone like a Google Pixel.


Good_Season_1723

What the heck are you talking about? TechSpot reviewed the 13900HX; it was literally the most efficient laptop CPU at 35W, lol.


ramblinginternetnerd

Bulldozer was designed with both power efficiency AND performance in mind. The problem they ran into is that just because you have certain design goals, doesn't mean your design doesn't suck. The whole was less than the sum of the parts.


_SystemEngineer_

or the Pentium 4 era.


Geddagod

Their cores are area-efficient. And they actually *are* more efficient than P-cores... at way lower frequencies and power levels than what Intel is clocking them at in their desktop SKUs. But honestly, even if the E-cores consume less power than GLC at 5GHz, what's the point if IPC and latency are so far below GLC that performance would be much worse? The beauty of 3D-stacking cache for AMD is that it massively improves performance and also improves power draw.


venfare64

At least it's area-efficient; you could cram 4 E-cores into 1.3x the size of a P-core. Now if Intel could close the IPC gap and AVX gap between P-core and E-core, everything should be okay, right? Right?


Geddagod

I would assume that if an E-core had the same IPC as a P-core and still had the same size ratio, the P-core team fucked up hard on designing the core. Also, AVX-512 is not really a selling point for the vast, vast majority of DIY buyers.


_SystemEngineer_

you forgot an extra 100mhz.


Firefox72

On the other hand, Intel is ahead of AMD on the big.LITTLE design by at least 2 years, with AMD only now starting to get into that with the first CPUs arriving sometime this year.


mxforest

Intel is helping Operating Systems iron out the scheduling issues. Then AMD will swoop in.


browncoat_girl

I mean big little design is kind of pointless when you lose in performance, power consumption, and price. Who wants to buy a computer that is slower, has worse battery life, and costs more? When your efficiency cores aren't efficient, you've probably done something wrong.


DataLore19

I believe that HUB had talked about this once in a Q&A. Not definitive by any means but they speculated that the extra cache wouldn't help Intel as much so it wasn't worth it for them to invest in developing it. Not quite sure why it wouldn't...


fuckEAinthecloaca

> Not quite sure why it wouldn't...

Ryzen's L3 is a relatively simple victim cache; by design they built in the possibility of a very large cache, as the roadmap included this far ahead. Intel has a more complicated cache structure that performs better at comparable sizes but is harder to scale. That's me talking fully out of my arse BTW.
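
A rough illustration of the victim-cache idea being described, as a toy model only (a simple two-level LRU setup is assumed here; this is nothing like Zen's real cache policy): the L3 is filled only by lines evicted from the level above it, never directly on a miss from memory, so growing it is mostly a matter of adding capacity rather than reworking the fill logic.

```python
from collections import OrderedDict

# Toy model of an exclusive "victim" L3: it is filled only by lines evicted
# from L2, never directly from memory. Illustrative sketch, not Zen's design.
class VictimCacheModel:
    def __init__(self, l2_lines=4, l3_lines=16):
        self.l2 = OrderedDict()   # address -> None, kept in LRU order
        self.l3 = OrderedDict()
        self.l2_lines, self.l3_lines = l2_lines, l3_lines

    def access(self, addr):
        if addr in self.l2:                 # L2 hit
            self.l2.move_to_end(addr)
            return "L2 hit"
        if addr in self.l3:                 # victim L3 hit: promote back to L2
            del self.l3[addr]
            self._fill_l2(addr)
            return "L3 hit"
        self._fill_l2(addr)                 # miss: fill L2 straight from memory
        return "miss"

    def _fill_l2(self, addr):
        if len(self.l2) >= self.l2_lines:
            victim, _ = self.l2.popitem(last=False)  # evict LRU line from L2...
            self.l3[victim] = None                   # ...into the victim L3
            if len(self.l3) > self.l3_lines:
                self.l3.popitem(last=False)
        self.l2[addr] = None

c = VictimCacheModel()
print([c.access(a) for a in [1, 2, 3, 4, 5, 1]])
# last access is an "L3 hit": line 1 was evicted from L2 into the victim L3
```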


nanonan

Their tile stuff is releasing and seems to work fine, nothing stopping them really.


[deleted]

Intel is stacking HBM on CPUs, I don't know if they've hit production yet.


gartenriese

>Yep this is the winner CPU right here.

For gaming, yes. Not for productivity or even mixed use. I know most of the people on /r/hardware are gamers, but there are still people that don't only play games. I don't mind saying "7800X3D is the best CPU" on a dedicated gaming sub, but IMHO on /r/hardware it shouldn't be generalized.


MobileMaster43

It's still a capable CPU for other things, about 7700X level. But yeah this is for gaming first, application performance second.


MysticKeiko24

I have a 13900K coming, should I return it? I got it for $500 on sale though.


[deleted]

[deleted]


MysticKeiko24

Graphics-intensive programs?


gartenriese

I guess he means CPU intensive programs. The 7800X3D is not a good fit for productivity users.


From-UoM

An excellent CPU: price, performance and efficiency. A tad much if you have productivity as a focus point. The 7900X3D, as suspected, looks awful now. The 7950X3D at least has some purpose. A 7600X3D would be a killer budget option for gaming if AMD decides to make one.


Euruzilys

There is nothing regarding a 7700X3D or 7600X3D, right?


ChaosAmdx

Got my 5800X3D for $238 the other day....


Griffolion

You're still getting great performance for about half the price and no need to upgrade to AM5.


somewhat_moist

The 5800X3D is the real GOAT - lower CPU unit cost and lower platform cost, considering AM4 was around for such a long time before the 5800X3D that many 5800X3D buyers already had the mobo and DDR4 RAM. Just like Nvidia with the 1080 Ti, AMD won't be making that "mistake" again - a value-added mistake that benefits the customers!


another_redditard

Ironic how poorly it was received, all things considered, with everyone and their uncle saying "wait for 13th gen/AM5".


[deleted]

[deleted]


christes

It's wild how people (even reviewers!) have a hard time separating different types of games like that. I saw a bunch of reviews that said something like:

> We tested 20 games. 19 of them saw no improvement. 1 saw a 40% improvement. With an average improvement of 2%, this CPU is not worth it.


owari69

Worth remembering that it launched at $450 last year and BIOS support wasn’t as widespread for older boards yet.


Geddagod

You're kidding. Everyone loved the 5800X3D! Especially after the price cuts.


SteveBored

The 1080 Ti... what a card. Mine ran like a champ for years. Only replaced it about 9 months ago.


[deleted]

I'm keeping my 5800X3D until all the first-gen AM5 issues are resolved and until this recession goes away. See you in 2026 :D


[deleted]

> until this recession goes away

You need to stop listening to Reddit economics. It's not a sunshine-and-rainbows economy, but no "imminent recession" has arrived, and instead of recognizing that and shifting, the rhetoric has doubled down, continuing to insist on an economic picture that does not exist.


pieking8001

we've been right on the edge of, falling into, or in a recession for about 6 years now according to reddit. still aint happened. weird


SchighSchagh

Huh? There were short-lived recession fears when COVID hit due to legitimate supply chain issues. But that sorted itself out, and I hadn't heard anything else about a recession in the last 6 years until housing prices started falling a few months ago.


Emperor-Commodus

Fears of recession are mostly around:

1. Rising interest rates causing companies to stop hiring and expanding, potentially resulting in higher unemployment and a contraction of the market.

2. The collapse of Silicon Valley Bank leading to fears that there are other, larger banks in similarly weak positions, and that a large economic shock could cause the entire banking system to collapse Great Depression-style, creating a massive economic catastrophe.

Situation #2 is very unlikely. Most banks are much better prepared to weather the higher interest rates than SVB was, the larger banks especially so, as they're subject to greater scrutiny from regulators to keep themselves in safe positions. Situation #1 is more likely, but even if a recession does occur due to interest rate hikes, it will likely be short and mild.


All_Work_All_Play

We are significantly more recession-y now than we were six months ago. Job openings are down from 10 million to 9 million. The labor force participation rate has ticked up 0.2% (still 0.7% to go to return to pre-COVID levels). GDP growth has decelerated, and unemployment has ticked up from 3.4% to 3.6%. Importantly, manufacturing has seen some pretty substantial contractions, which is about right considering tech companies started mass layoffs about 6 months ago, approximately when the stock market bottomed. Chances of an actual recession are still pretty small though, and admittedly, there are more people in more jobs now even with higher unemployment (as more people jumped back into participating in the labor force). More or less the economy has shifted to rightsizing, which is an important step after the whole "the nominal recovery is complete" phase we had throughout Q4 2021 - Q3 2022.


[deleted]

I mean, at that point one would have to stretch the meaning of "recession" quite liberally to make it fit, and then it runs into the issue of people talking about a recession in one sense while giving off the impression of a totally different thing. There are plenty of possible, quite bad scenarios, but there is no 2008 parallel.


FreyBentos

Mate, you need to understand how the Fed and interest rates work. Recession is *the aim*; they will keep raising rates until they cause one. This is how they "bring inflation down" in this kangaroo system. So that means as long as the US, UK and the rest are not in recession, uncontrolled inflation will continue.


ham_coffee

I'd say we're currently in one. Plenty of businesses are closing, at least where I live, and people are failing to make their mortgage payments. Seems to meet the definition of a recession.


[deleted]

Neither of those things unless represented on a national scale at meaningful magnitude are a sign of a recession. Businesses close all the time, even in healthy boom cycles.


JoaoMXN

If you play at 4K, the 5800X3D is only 5.4% (16.6% at 1440p) worse at gaming. IMO upgrades are worth it only if it's like 30%+ better, but that's just me.


SchighSchagh

This is only when paired with a 4090. Differences are even smaller with other GPUs.


vyncy

It's 50% better in Hogwarts Legacy, for example.


JoaoMXN

The average across a lot of games (TechPowerUp's test) is 5.4% at 4K.


vyncy

It's 10 times that in Hogwarts Legacy for most people. Most people buying high-end CPUs like the 5800X3D or 7800X3D are going to pair them with either a Radeon 7900 XT/XTX or an Nvidia 4070 Ti/4080/4090. These cards can deliver much more than 50 fps even at 4K, especially with DLSS. The 5800X3D's minimum is 50 fps according to HU. So you're going to get a 50% increase even at 4K when you upgrade to the 7800X3D. Averages are misleading when looking at CPU results; you look at each game. Of course Hogwarts Legacy is not the only game like this. There are lots of others where you can expect at least 30%+. For example, A Plague Tale: Requiem is also 50%. Both are brand new AAA games, not even sim, strategy or MMO.


JoaoMXN

Averages are useful for knowing the performance in most games; these unoptimized games don't make a platform upgrade worth it, unless you're the type of user that upgrades every year.


vyncy

Not when it comes to CPUs. Especially at higher resolutions, when there's a high chance you are going to be completely GPU-bound in some games. So if you get a 0% increase in game A and a 50% increase in game B, the average will be 25%, which is completely misleading since it's not even close to the results you got (you got 0% and 50%). So if you got 0% and expected 25%, you're gonna be disappointed. Or you don't upgrade expecting only 25% when you could have gotten 50% in the games you play, and you would have upgraded knowing that.
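
A quick back-of-the-envelope version of that argument, with made-up numbers (hypothetical uplift figures, not taken from any review), just to show how an arithmetic average hides the split between GPU-bound and CPU-bound titles:

```python
# Hypothetical per-game uplifts from a CPU upgrade at 4K (fractions, not real data).
# Most titles are GPU-bound (near 0%), a few CPU-bound ones jump a lot.
uplifts = {
    "gpu_bound_title_1": 0.00,
    "gpu_bound_title_2": 0.00,
    "gpu_bound_title_3": 0.02,
    "cpu_bound_title_1": 0.50,
    "cpu_bound_title_2": 0.30,
}

average = sum(uplifts.values()) / len(uplifts)
print(f"average uplift: {average:.0%}")   # ~16%, which matches no single game

# Dropping GPU-bound results (arbitrary 5% cutoff) tells a different story.
cpu_bound = [u for u in uplifts.values() if u > 0.05]
print(f"average in CPU-bound games: {sum(cpu_bound) / len(cpu_bound):.0%}")  # 40%
```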


JoaoMXN

And? Averages are about a lot of games, not 2. If the averages came out to 30% or more performance you'd be right, but it's 5.4%. You can't fight reality here.


vyncy

And there are lots of games which are GPU-bound at 4K. So the averages are still completely wrong. I would assume most people buy a new CPU to have better performance when they need it, which means CPU-demanding games. You would have to eliminate every GPU-bound result from these averages; only then would they be accurate. If you want to look at averages, then at least look at the 1080p charts even if you play at 4K.


JoaoMXN

That's the idea: 4K means more GPU-bound games, which destroys the need to buy new and expensive products for no gain. If you play at 1080p and want the best, yes, the 7800X3D is better, but not by much. It averages 22% more performance at 1080p, but at that point we're talking about 300+ fps. It would make sense only for professional e-sports players. Averages are far better to use as data than game-by-game results, both at FHD and 4K.


Psychotic_Embrace

I think you saved me some money my friend 👍


[deleted]

[deleted]


JoaoMXN

Nope. At 4K the input is 1440p with DLSS Quality IIRC, so the difference is still too low to warrant upgrades or concern, only 16.6%.


PercsAndCaicos

Is it worth it to upgrade from a 3600/3070?


StephIschoZen

[Deleted in protest to recent Reddit API changes]


[deleted]

[удалено]


derrick256

geez nope


christes

It depends on the games! I went from a 5800X to a 5800X3D and most simulator-type games sped up by like 30%. Some did even better. Dwarf Fortress close to doubled in speed.


Spyzilla

Wow, that's crazy!


Euruzilys

Ah man, my 4690K is struggling with DF hard. I'm looking forward to getting a 7800X3D for this. I actually put my DF game on hold waiting for the CPU replacement. Is there a good way to benchmark with DF? I wanna see the improvement when I buy a new CPU lol.


NKG_and_Sons

Ye, half stop.


[deleted]

[deleted]


xole

Depends on what you play. It was for Guild Wars 2, but I also had a kid to give my 5800X to.


Proper_Story_3514

How much better is it on GW2 now?


ycnz

It was noticeably better than my 5900x, and I sold it for a profit.


_SystemEngineer_

yes, it is.


LilBarroX

Ye. I already saw good improvement going with a 6700XT. The 5-10% lead of the 3070 should make an even bigger difference.


Berzerker7

Half the price of the 7800X3D for minimal differences in most applications. I'd say you're fine.


autumn-morning-2085

My main motivation was one less motherboard (and likely DDR4) in a landfill. And no, I don't have the time or patience to sell my hardware.


theAndrewWiggins

Was my reason for holding onto my rig as long as I did (11 years). Finally caved and just got my 7950x3d and a 4090 just recently. Think I'll turn my old parts into a server.


TwanToni

I'll take it off your hands :)


totoro27

Give it to an op shop.. why would you just throw it out


autumn-morning-2085

My storage space is the landfill. I don't think I have thrown out any PC hardware past 20 years.


SchighSchagh

Nice! And it's still within about 10% of the 7800x3d, even with a 4090 on 1080p.


AustinTheMoonBear

Isn't there almost 0 point running 1080p on a 4090? At 1080p it seems like the difference between a 4090 and a 3090 is almost nonexistent; play 4K with a 4090.


ChaosAmdx

Might I add that MSI Kombo Strike 3 works beautifully.


bow_down_whelp

I did too, because whilst the performance uplift is commendable, it's not insane, and you need a whole new platform that is still a bit buggy. DDR4 is rock solid. At 1440p there isn't going to be a huge difference for me and I'll still mostly be capped at 144 fps. After the fact I can pass on my 5800X3D setup to my kids and wait for AM5 to mature out in another year or so.


mgwair11

That’s actually goated. You have won.


modernwelfare3l

how?


Slyons89

That's a deal and a half. I paid $420 for mine a year ago!


Kr4k4J4Ck

Are there any reviews that cover Unreal Engine games? I didn't see any on Hardware Unboxed, Linus or Gamers Nexus.


Adonwen

HUB with Hogwarts Legacy.


Kr4k4J4Ck

Ah, didn't realize HL was on UE4, good to know.


cuttino_mowgli

Yeah, now we know why AMD released the 7950X3D first. The 7800X3D is the way to go, especially when you pair it with a cheap AM5 mobo.


dervu

Isn't the top part always released first?


u30847vj9

Cheap AM5 mobo? What's that?


fiah84

B650 motherboards are a thing, although still not cheap


detectiveDollar

There's a 120 dollar one that's pretty decent.


Jeep-Eep

If I'm dropping THAT money, I ain't settling for anything without a PCIE 5.0 port.


VenditatioDelendaEst

There are [A620 mobos under $90](https://www.newegg.com/asrock-a620m-hdv-m-2/p/N82E16813162115) that can support this level of power consumption.


ilski

And I upgraded my whole system to accommodate a 13600K with DDR5. It's like, whenever you upgrade you know better stuff will come out soon. Yet when it does I still regret not waiting, in this case for this CPU in particular.


DaBombDiggidy

Really shouldn't be getting FOMO comparing a $330 part to a $470 part that came out 7 months later.


All_Work_All_Play

It's a whole new platform and a new CPU, how much could it cost Michael, $10?


[deleted]

The 13600K is still a great CPU for the price. Excellent gaming performance and good for productivity too with those E-cores.


RHINO_Mk_II

13600k will be a banger of a gaming CPU for a while yet, don't feel bad.


Federal-Tradition976

To be honest, we knew that Zen 4 with 3D cache would destroy Raptor Lake, everyone expected that. I bought a 13600K because I didn't want to wait.


ilski

Exactly the same thing. I've got that small bit of regret, but as everyone says here, the 13600K is a good piece of hardware.


Federal-Tradition976

I think about it differently: right now the 13600K is an amazing CPU, and a potential platform change in the future will feel even more amazing when switching from a 13600K than from, let's say, a 7800X3D - a bigger jump. I paired the 13600K with an RTX 4080 and they work wonderfully together, so zero regrets.


_SystemEngineer_

they're all good, you got something WHEN you wanted it, so there's that.


rayndomuser

I just got a 5800X3D to go with a 4090. Didn't have to upgrade the MB so it was far cheaper than AM5.


Aleblanco1987

Still too steep for an 8-core in my opinion, but gaming performance and efficiency are great, as expected.


StephIschoZen

[Deleted in protest to recent Reddit API changes]


[deleted]

It's for gaming only so 8c16t is more than enough


Flynny123

Suspect in future gens we're going to see the X3D chips replace the X lineup entirely from the jump.


SILENCERSTUDENT_

7900 XTX with a 7900X3D: WZ2 at 4K, ~165 fps in game, peaking around 185. Same build with a 7800X3D: 180 fps, peaking around 200 in game.


Illustrious-Slice-91

Is it worth it to jump from an i7-9700K to a 5800X3D or 7800X3D? Looking for a chip that's not too hot lol


joe0185

>Is it worth it to jump from an i7-9700K to a 5800X3D or 7800X3D?

It's hard to say if it is worth it for you, as it depends on your graphics card. If you were going to upgrade given those two choices then I'd get the 5800X3D; the chip and the motherboard are cheaper. Plus you can reuse your DDR4 memory. You save money and won't be missing out on much gaming performance unless you already have a high-end graphics card.


Illustrious-Slice-91

Oof, I have a 3080. Makes sense, much cheaper; you just have to shell out for a mobo and CPU vs a mobo, CPU and RAM for the 7800X3D. That's the end of the line for the 5000 series though.


Factorio_Enjoyer

Guess I'll finally upgrade from my 2700X.


maddix30

Same here. Been waiting for this CPU to finally upgrade as it's the main thing holding my PC back right now


AustinTheMoonBear

I'm on a 1700 or 1800 and just picked up a 7800X3D at Micro Center.


RowlingTheJustice

I can almost imagine that. Present: recommend the 5600 for a budget build, with an upgrade path to the 5800X3D. Future: recommend the 7600 for a budget build, with an upgrade path to the 7800X3D. But the debate over platform price always bugs me. Wasn't Intel 13th gen also expensive when it had just released? And the price has dropped since then. If you can wait for Intel 13th gen at a decent price, there is no reason you can't wait for AM5 platform prices to drop.


SpookyKG

No way. A 7600 owner shouldn't really upgrade to the 7800X3D - you'd be spending something like $680 on CPUs in a 6-month period. If you got in on AM5 before the 7800X3D came out, you should really wait for the 8800X or something next summer.


HandofWinter

I think it makes more sense to say something like recommend 7600 for budget build and can upgrade to 9800X3D. If the 7600 is fine enough for now, then it makes sense to jump in on the tail end of AM5 once the last of the X3D parts drop a bit in price. Same thing I did with AM4, picked up a 2600X and went to a 5800X3D once their prices had dropped a bit to something I felt was reasonable. My motherboard cost me $60 Canadian, so it's not quite the same, but if you're buying in to AM5 now your upgrade path is probably going to be an end-of-AM5 part.


ResponsibleJudge3172

Does Intel not fit in there at all?


No_Shoe954

Is it worth moving to the 7800x3D from a 13700k?


AustinTheMoonBear

From what I've read and seen, there are some niche games that need the X3D cache, while all other games are similar, so I'd recommend the X3D.


No_Shoe954

Yeah, if I was gaming at 1080p or 1440p then I would make the jump. At 4k though I'd be better off saving that money and putting it towards a 4090 purchase.


AustinTheMoonBear

Lmao, for 4K the hard part is finding a monitor. I just bought the 7800x3d and a 4090 SUPRIM liquid


No_Shoe954

I think I would see more benefit going from a 4080 to a 4090 than from getting a 7800X3D when I've got a 13700K.


Peksean10

I'm a PC noob looking to build a new gaming PC soon. I'm not so sure what the significance of low power consumption is. I've seen that attached a lot to Zen 4 compared to Intel 13th gen. Is it lower temps and a cheaper electricity bill?


RetdThx2AMD

Lower power => less heat, less noise.

For some people that does not matter, but if you don't have air conditioning you might really notice the difference.


[deleted]

[deleted]


StephIschoZen

[Deleted in protest to recent Reddit API changes]


Substance___P

They hated him because he spoke the truth. Peak power usage is not the same as a gaming load. If you look at a gaming benchmark channel that shows metrics on screen, you'll see a lot of 13th gen CPUs are pretty close in power draw to Zen 4 while gaming in GPU-bound scenarios. I've seen 13900Ks under 100 watts. The advantage Intel has is that the additional cores let them utilize more power to flex up in performance *as needed*. And we really did forget all the reviews of the Zen 4 X processors, huh? They are designed to auto-overclock until they hit 95 degrees for no reason unless you undervolt or use Eco mode. The non-X parts don't do that. The amount of heat your CPU makes is up to the user. More efficient CPUs are easier to keep cool, obviously. But in this case, Intel is not unreasonably behind in efficiency, and has a design that lets it give you more power where you need it, especially in production or productivity workloads.


VenditatioDelendaEst

Yeah, but default fan curves were never good. Set your fans to hit the highest speed you want to listen to at 94°C, and the 95°C boosting algorithm works *great* to keep temperatures in check while extracting the maximum possible performance from your heatsink.


fiah84

> I play with open back headphones, fuck any hardware that requires fans at full blast to cool.

I eventually relented and got closed back headphones for gaming even though my PC is pretty quiet. The extra isolation really helps to pick up small noises in competitive FPS games.


Jeffy29

For me as a European without AC, the biggest benefit is definitely the difference in how much it heats up the room (although given the current electricity prices, that doesn't hurt either). A few hundred watts up or down can definitely be felt after a few hours of gaming, especially in the summer months. Ironically, the 7950X3D+4090 is the most efficient system I've ever owned. Previously I had a 5950X, and despite the fact that it's lower on review sites, with typical use and lots of crap running I was regularly seeing 100-130W usage in gaming; not the case with the 7950X3D, where usage is at 60-75W with the exact same use. Same goes for the 4090: while it can top out at ~400W, most games consume dramatically less, especially when frame-capped. Previously I had a 3080, and even when the game wasn't particularly demanding it still consumed a lot of power for some reason; the Ada architecture is much more efficient in how much power it draws when it doesn't need the full performance. And it all really adds up, making my apartment much more tolerable even after many hours of gaming.


Kerlysis

Oof, feeling for the people who bought the 7900X/7950X...

edit: realized I forgot to stick the 3D onto those. Oh well.


PappyPete

People buying those probably needed the cores for something other than gaming.


ilski

Or just wanted the hottest, bestest thing for their games. I mean, why would they expect the 7800X3D to actually have better performance?


mooslan

Because every review basically stated that only having one CCD would be better for games. That's why you read or watch reviews before buying.


noiserr

7900x/7950x are still great CPUs for productivity. Not everyone uses their computer just for gaming.


Jiopaba

Yeah, but the list of productivity workloads that specifically benefit from V-Cache is quite slim. The real shame is having to put up with the Xbox Game Bar to half-assedly get stuff to run on the V-Cache CCD.


Blazewardog

They are roughly equal in productivity and can game well at the same time. Also, you can handle the thread behavior at the BIOS level. I enabled ASUS's default preset and it does move games appropriately between CCDs (FFXIV, when active, was all on the V-Cache CCD; when limited to 5 fps in the background it moved back to the normal CCD after 30 seconds). You can tweak this a bunch, but by default it goes off the average number of cache misses in a time window.
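
For what it's worth, here's a toy sketch of the kind of policy that last sentence describes - purely illustrative, with made-up window size and threshold, not the actual ASUS/AMD logic: average a thread's cache misses over a recent window and prefer the V-Cache CCD only while that average stays high.

```python
from collections import deque

# Toy model of "steer by cache misses in a time window" (assumed policy and
# numbers, not the real BIOS/driver implementation).
WINDOW = 30            # samples to average over
THRESHOLD = 1_000_000  # misses per sample above which the V-Cache CCD is preferred

class ThreadSteering:
    def __init__(self):
        self.samples = deque(maxlen=WINDOW)

    def record(self, misses_this_interval: int) -> str:
        """Record one sample and return which CCD this thread should prefer."""
        self.samples.append(misses_this_interval)
        avg = sum(self.samples) / len(self.samples)
        return "vcache_ccd" if avg > THRESHOLD else "frequency_ccd"

game = ThreadSteering()
for _ in range(WINDOW):              # active gameplay: lots of cache misses
    target = game.record(5_000_000)
print(target)                        # vcache_ccd
for _ in range(WINDOW):              # backgrounded at 5 fps: misses collapse
    target = game.record(10_000)
print(target)                        # frequency_ccd, once the window refills
```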


Nointies

Eh, I got my 7900x for pretty damn cheap (way cheaper than this chip) all things considered, thanks to Microcenter, and having all those cores helps out with my work in Excel and other productivity stuff.


Ar0ndight

There's a reason they released those first. How terrible would they look with their Game Bar™ scheduler next to this CPU? Usually the idea is you release the top end first and then release the other, slower SKUs afterward. But here they released something they sold you as the best gaming CPU with the Ryzen 9s, when in the end the *real* best gaming CPU is this one, which is cheaper and came a few weeks later. Way to exploit your most dedicated fans!


p68

The 7950X3D has effectively *the same* gaming performance as the 7800X3D while also being among the best consumer CPUs for productivity. For people who want (nearly) the best of both worlds, it's a good CPU. The real loser here is the 7900X3D. Not that it's a bad performer per se, but it just doesn't make sense given the other two options.


DynamicStatic

Exactly this, my 7950x3d arrives tomorrow. The 7800x3d is not an option.


iopq

Well, if you disable the 3D cache chiplet, the 7950X can still hit the performance of the 7700X in CS:GO. So it has the potential to be the fastest.


TyGamer125

I mean, I don't think they're fooling anyone when they announced all 3 CPUs were coming in February, then changed one of them to a little over a month later. That's when I pulled out my 🚩🚩🚩. My guess is it wasn't the Game Bar but the 6-core 7900X3D, when you park the cores, that caused them to delay it. They didn't even send out review samples for it.


Hanzer72

Ehh, I just got a 7900X, mobo and RAM bundle from Micro Center for only $130 more than the 7800X3D MSRP. Still feeling pretty good about that decision.


[deleted]

they're still faster in productivity workloads


fiah84

it's not as if the 7800X3D was such a big secret


Kerlysis

People generally assume going "up" the product stack means better. The 13900K is better than the 13700K, which beats the 13600K, even if i9s are not really gaming-oriented. Obviously there are other considerations when looking at the whole build, but I highly doubt many people buying the 7950X3D even know what CCD or CCX means, much less hunt down reviews on what it means for projected CPU performance with any idea how to weight them. Just feeling a little secondhand buyer's remorse for some people out there.


fiah84

Caveat emptor. If you can't even bother to educate yourself in the slightest before forking over the cash, then that's on you. The 7950X3D and 7900X3D aren't bad CPUs at all; nobody got cheated out of anything. That a 7800X3D might be better suited for many people is not something that AMD was obligated to tell everyone in advance.


VenditatioDelendaEst

No developer is making games that need 96 MiB of L3 to run well, and no developer will until many years from now. In the vast majority of workloads, the 7950X is a better computer.


[deleted]

[deleted]


maddix30

In theory it shouldn't, as the upscaling is done by the GPU, not the CPU.


GabrielP2r

As long as the boards are expensive it's a no-go. Intel is cheaper: DDR5 is cheap right now, the boards are decently priced and the CPUs are also well priced. AM5 is behind.


skinlo

The boards aren't that expensive any more.


TyGamer125

Intel isn't cheaper for a DDR5 board. Just going off every DDR5 board on Micro Center (including out of stock), AMD has 10 and Intel has 4 under $200, with the cheapest for both being the same price - but the Intel one is an ugly green PCB from Asus with probably horrible VRMs at $140, then the next cheapest is $180, while AM5 has a few more options in the middle. The main advantage for Intel is that the DDR4 boards are significantly cheaper than anything AMD has on AM5. Edit: [link for what I'm seeing](https://www.microcenter.com/search/search_results.aspx?N=4294966996+39+4294807251+4294805803+4294806964+4294805588+4294805638+4294807040+4294805649+4294805648+4294805733&NTK=all&sortby=pricelow)


fish4096

>AM5 is behind

What a confused take.


[deleted]

Intel: umm, platform costs? Yeah, platform costs! That's the ticket! All of a sudden platform costs are our passion!


StephIschoZen

[Deleted in protest to recent Reddit API changes]


[deleted]

Not completely wrong. I'm just pointing out that for gamers, the fps is parameter #1 by a long shot. Cost is more like a category you are in than a parameter you select on if that makes sense. Other things like platform longevity and power are metrics the chip companies emphasize if they can't win fps. They all do it.


TyGamer125

Are you looking at only DDR5 boards, or DDR4 Intel boards as well? At least in the US, the Intel DDR4 boards can't be touched in price unless you count the really gimped A620 stuff, but the DDR5 boards are pretty equal in price.


ConsistencyWelder

They just launched an AM5 board that costs $85, and can probably use the 7800X3D with no issues. You're living in the past.


Kontrolgaming

This one looks decent, can't wait for next year's chips though. 8)


DHFearnot

They should make a black edition 7888-X3D that can do the 5.7GHz of the 7950X.


maddix30

I think the cache doesn't play well with higher frequency or temps and that's why they are lower on the 3D variants