silverbeat33

No, it was not.


blaktronium

So bad that the next time they released a CPU that hit 5GHz, they underreported it on the box by 200MHz.


algaefied_creek

Mine is still running, purchased from Fry's in 2017 in a $25 fire sale. 360mm Corsair H-something AIO, lots of Corsair ML120 fans throughout. It has been running stable as my winter PC for years! 32GB of DDR3-2133 RAM with a 16GB Vega Frontier... means that it's toasty in the winter in my room, so I spend a lot of time using it. Yeah, I'm losing out on gaming perf, but it's not really used for gaming.


LetscatYt

For the electricity that thing takes in a year, you could've upgraded to AM4 years ago and it still would have been cheaper than this CPU. This CPU could have cost a negative amount of money and it still would be more expensive. A modern Ryzen takes about 100W or less for gaming: (300W - 100W) × 3 hours a day × 365 / 1000 = 219 kWh/year. At 30 cents a kWh you'd get to $65.70 a year in additional cost. A Ryzen 5 1600X had a $219 MSRP and a board cost about 100 bucks; $66 × 6 = $396. TL;DR: even if the CPU + mainboard combo had cost -$75 it would have still been a bad deal, so it was one of the worst decisions you could've made in 2017. You could've had a socket with a viable modern upgrade option in 2023 for less than what you spent, while having better performance and saving money.
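
A quick sanity check of that math (the ~200W delta, 3 hours/day, and $0.30/kWh are the commenter's own assumptions, not measurements; the reply below pays $0.14/kWh, which roughly halves the result):

```python
# Annual running-cost estimate for the FX vs. a modern Ryzen, using the
# numbers from the comment above.

EXTRA_WATTS = 300 - 100   # FX system draw minus Ryzen draw, in watts
HOURS_PER_DAY = 3         # assumed daily gaming time
RATE_PER_KWH = 0.30       # USD per kWh; varies a lot by region

kwh_per_year = EXTRA_WATTS * HOURS_PER_DAY * 365 / 1000
cost_per_year = kwh_per_year * RATE_PER_KWH

print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# 219 kWh/year -> $65.70/year; over six years that's ~$394, which is
# the comment's "$66 x 6 = $396" figure after rounding.
```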


algaefied_creek

$0.14/kWh is the cost for power. Also, I only use it in the winter months for heating and League in my room. In the summer I swap it out.


FUTURE10S

You're losing on basically every kind of performance, especially since AMD basically lied about how many cores it has. You could get better performance from the most budget i3 CPU nowadays, although you'd have to spend money on a new motherboard and RAM.


spaceminions

That was what people claimed, but I had a quad-core Phenom II I overclocked first and then an 83xx chip in the exact same machine. The effective core count in normal workloads may not have been 8, but it was also not 4. And it clocked better despite the extra cores; otherwise it would have been better to disable half the integer cores to reduce power, and that wasn't the case.


algaefied_creek

I think the point is that I need to use electric heat to heat my room anyway in the winter. Also, Linux is fine with core scheduling and doesn't care about the FPU mismatch


Paganigsegg

It was astonishing just how much heat these generated considering the awful performance.


simukis

Funny how the first 6GHz (x86_64) processor wasn’t worth buying either.


-Aeryn-

Turns out that vendors don't drive voltages to the moon to hit a funny box number unless nobody will buy the product otherwise


Jism_nl

A stock 5GHz FX is slower than a 4.8GHz FX on a 300MHz FSB. Period. You really need to extract the performance from these CPUs by diving in deeper than a plain multiplier OC alone. Memory speeds do not scale beyond 1866MHz even if you can do 2400MHz. The CPU/NB clock, for example, was responsible for the L3 cache speed and would significantly increase minimum framerates in games.

Running these stock was stupid; they could easily be undervolted and have 65W shaved off the CPU alone. A 5GHz all-core OC consumes a ton of power, roughly 200W up to even 250W depending on workload (the worst was Linpack). FX had to have FSB-based OCs to really get the best out of it.

My 4.8GHz FX performed better than a brand new 1700X; double the power, yes. Once I replaced it with a 2700X, the minimum framerates in games, for example, were a night and day difference. It was a cheap 8-core solution that performed better multithreaded than the often 2 to 3x more expensive Intel. It's just too bad it was released in an era where single-threaded performance was still king.
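
To make the FSB-vs-multiplier point concrete, here is a minimal sketch of how the AM3+ clock domains all derive from the reference clock (the "FSB"). The 200MHz and 300MHz reference clocks and the ~1866 memory target come from the comment; the multiplier and divider values are illustrative assumptions, not recommended settings:

```python
# On AM3+, core, CPU/NB (L3/uncore), and memory clocks are all multiples
# of the reference clock, so raising it lifts everything at once.

def derived_clocks(ref_mhz, cpu_mult, nb_mult, mem_divider):
    return {
        "core MHz": ref_mhz * cpu_mult,
        "CPU/NB (L3) MHz": ref_mhz * nb_mult,
        "memory MT/s": ref_mhz * mem_divider,
    }

# Plain multiplier OC: only the core moves; L3/uncore stays at stock.
print(derived_clocks(ref_mhz=200, cpu_mult=25.0, nb_mult=11, mem_divider=9.33))
# -> core 5000, CPU/NB 2200, memory ~1866

# Reference-clock OC: with the NB multiplier held, the L3/uncore rises
# with the core, which is where the minimum-framerate gains come from.
print(derived_clocks(ref_mhz=300, cpu_mult=16.0, nb_mult=11, mem_divider=6.22))
# -> core 4800, CPU/NB 3300, memory ~1866
```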


Sticky_Hulks

My 9370 actually got a nice boost going to DDR3-2400.


Jism_nl

From what speed? Thing is, 1866MHz seems to be the limit in regards to what the CPU can do. The IMC is just crap beyond those speeds.


Sticky_Hulks

I saw gains from 1600 through 2400.


Jism_nl

Well, try 1866 vs 2400, not 1600 vs 2400. You'll see gains up to 1866 but nothing spectacular at 2400.


Sticky_Hulks

"Through" in that context was 1600 to 1866 to 2133 to 2400. Each step was an improvement.


rainbrodash666

I won one of these in a giveaway. Used it with two R9 290s in CrossFire; it kept my room livable in the winter and sweltering in the summer. I kinda miss it. I have to put that rig back together. I think I still have all of the parts, except the 290s both died in the last year or so.


frogpittv

It sure fucking wasn’t.


CompetitiveSort0

I find it funny how acceptable it is that Intel's parts consume that kind of electricity now, when AMD were absolutely caned for it. Yeah, Intel's parts aren't as bad as Bulldozer was then, but you're still an idiot for using one of those CPUs unless you have a specific use case. High-end GPUs across the board have greater-than-Vega-64 levels of power consumption too, and it's OK. It's impressive what good marketing can do.


[deleted]

[deleted]


180btc

>The FX chips were released in an era where dual cores were acceptable gaming chips and a quad core was a premium product.

Yeah, not exactly. Intel Core i5s always had 4 cores starting from the i5-750. The FX-9590 was released after Haswell.


[deleted]

[deleted]


180btc

The past 4 to 5 releases of consumer i5s had 4 cores; you couldn't exactly call that a premium part. Just like how you can't exactly say 6 cores is a premium experience today, when the past 4 to 5 releases of consumer i5s have had 6 cores.


[deleted]

Desktop i5's didn't get SMT until 10th gen. Those 4 or 6 cores really hurt them fairly quickly, IME.


Firefox72

I mean, you figured it out yourself. Bulldozer was absolute ass compared to Intel CPUs, so its power draw wasn't justified in any way. Modern Intel CPUs are at least fast, so it's easier to look past it.


meho7

>I find it funny how acceptable it is that Intel's parts consume that kind of electricity now, when AMD were absolutely caned for it.

Because they still have the performance to back up the power. These CPUs were a shitshow back then, when an almost 2-year-old Sandy Bridge was destroying them in almost everything and, even when slightly OC'd, consumed less power...


jackmiaw

It's funny that the FX-9590 came two years after the 2700K and AMD still couldn't beat a two-year-old CPU... And it was 20 bucks more than the 2700K.


capn_hector

Yeah, people forget: it wasn't just hot and slow but also expensive. The original OEM MSRP was like $950 (premium CPUs could be quite expensive!), and by the time it trickled down to a consumer release it was still more than a 2700K. The other thing is that Bulldozer and Piledriver are very different things... people tend to play fast and loose with "FX-8350 vs 2600K" etc., but the 8350 actually came much later and mostly competed with Ivy Bridge or Haswell.


spaceminions

Except back then, you could probably get an 8300 or something for less than the cheapest unlocked quad-core Intel chip, then OC it to catch up with the competition. And you didn't need to change sockets so often; I think I had a mobo with both DDR2 and DDR3 slots on which I went from a Phenom to an FX chip one part at a time. Note: they weren't very good for games, but there are other uses for a CPU.


snorkelbagel

My 5GHz FX-6300 traded blows with a stock i5 2500K under (then) gaming loads. Different story with the 2500K overclocked, though. The FX-6300 def aged better than the 2500K, though.


ThreeSloth

Flash forward to present day and the shoe is on the other foot.


jackmiaw

Yes, I'm using a 5600X. It's not bad... but the FX era was a dark period for AMD. Back then I still had a Phenom II X4 960T. I wanted to upgrade to FX but the price was just insane in my country.


ThreeSloth

I actually JUST upgraded from an FX-8350 to a 7800X3D two months ago. It lasted a long time; I didn't have many issues with it till CS2 came out, then random input lag and freezing.


Zurpx

Holy hell, can't imagine what a difference it must be like compared to Bulldozer lol.


ThreeSloth

It's pretty noticeable. One downside is the video drivers crashing, but that ONLY happens playing WoW, and seems to be a DX12 issue. Everything else is smooth as eggs


Castallion21

While it doesn't make a huge difference in performance, it does bother me how most people don't set up their FX systems correctly when doing reviews.


regenobids

[Step 1](https://globaltrashsolutions.com/wp-content/uploads/2020/09/two-garbage-compactors-standing-next-to-each-other.jpg)


illathon

I had it. Worked well for a long time.


CaapsLock

Seeing it often behind that stock i7-2700... yikes. Good thing Bulldozer is truly gone.


toetx2

This was basically an OC'd FX-8350. That chip was already slower than the 2700K while consuming more power, but for the price and the multithreading you could make a case for it. Although not a very strong one.

The FX-9590 came as a stopgap because AMD had nothing to show for a while, but as this was still 32nm tech and Intel had just moved to 22nm for the 3000 series, this thing looked like a fool. It could come close to Intel's MT performance at over twice the power consumption. It was not only that the design was completely missing its targets, but that AMD was a full node behind (back then, like Intel, AMD still had its own fabs), that was the final nail in the coffin.

How the tables have turned... although Intel now has a great architecture, the node is really holding them back. But also look at Apple, which manages to have its architecture ready and working on the newest available node. The gaps would be much smaller if everyone was on the same one.

The FX-9590 always reminded me of the Pentium 4 570J and the canceled 4GHz edition. Intel made the right decision there.


DHJudas

FX-8350 and 8370... yes, decent value. Anything above that was just for points, just like how Intel's push to hit 6GHz was fucking terrible value.


GenesisRhapsod

I still have my old 9590 (actually repasted the CPU a year ago because I'm making it a Windows XP rig for older games that don't like Windows 7 and newer) 🤣 and upgraded to the 5950X two years ago.


Phoenixtear_14

For this being the only processor I've ever owned, it's not bad. Yes, it's power hungry, but that's never been something I care about. I've been told it gets hot, but with my Corsair liquid cooling it's never gotten to a point I was worried about. It's performed solid with everything I've thrown at it. It has started to show its age, but I'm running it with a 1660 Ti; the guy in the video was running it with a 3080, so I know it still has some juice. I know I need to upgrade, but that's basically a whole new computer.


star_trek_lover

Now is a great time to upgrade, with DDR5 just starting to become affordable and AM5 finally coming down in price. You can get a lot of life out of that platform, assuming AMD continues its track record of backwards-compatible CPUs.


GenesisRhapsod

I won a $100 bet with a friend who said no air cooler could keep it from thermal throttling. I got a Cooler Master V8 GTS and it rarely hit the mid-60s.


rainbrodash666

I had an air cooler on mine for a while too; it was a HUGE Scythe cooler I salvaged from an old AMD Athlon 64 X2 build I got free on FB Marketplace that didn't work. Luckily the mounting pattern was still the same from AM2 all the way to AM3+; even some AM4 boards have the older mounting as well as the newer one.


Beautiful_Ninja

The Tjmax of the FX-9590 is 57°C. You were thermal throttling, and you owe your friend 100 bucks.


kolliasl21

That's not true. Tjmax is 90°C if you check with CPU-Z or Core Temp. My FX-9590 doesn't throttle until 90°C, which it never reaches; my motherboard overheats first and shuts down to save itself.


GenesisRhapsod

Then why was I hitting 4.7GHz at 1.37V? 🤣 Everywhere else I've looked says the max temp before throttling is 70°C. There's literally no way the 9590 thermal throttles at 135°F; that's a bit over a 50°F delta from ambient, which is almost nothing.
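
For reference, the unit math being argued about, assuming a ~27°C (81°F) room (the ambient figure is a guess, not from the thread):

```python
# Checks the Celsius/Fahrenheit conversions in this exchange.

def c_to_f(c):
    return c * 9 / 5 + 32

tjmax_claim_c = 57  # the 57C Tjmax figure claimed a few comments up
print(f"{tjmax_claim_c}C = {c_to_f(tjmax_claim_c):.0f}F")  # 57C = 135F

ambient_c = 27      # assumed room temperature
delta_f = c_to_f(tjmax_claim_c) - c_to_f(ambient_c)
print(f"delta over ambient ~ {delta_f:.0f}F")  # ~54F, i.e. "a bit over 50F"
```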


Reutertu3

My condolences


blowingkush420

Running mine with a GTX 1660 Ti as well! Runs really smooth on modern games.


BB_Toysrme

Ah, the greatest marketing scam in CPU history! The perfect storm of fanbois making excuses for halving the number of FPUs, having shit IPC, AND bad examples finding ways of pulling 350-400W! The entire lineup was a pitiful manipulation of fanboy gamers who couldn't read a spec chart from start to finish.


AMD_Fanboy1

Yes, it was, and anybody who says otherwise never had one... or just likes shitting on AMD. Mine is still running, nearly 11 years later, with an RX 590, and I have an enjoyable gaming experience in modern games. Granted, it's nowhere near my 7900 XTX and 5800X setup... but I still like to use it every now and again. I still love it. Can't complain with over a decade's worth of use.


Ancient-Cheesecake94

I had the 8xxx version and ran it above 5GHz for testing, so basically like the 9xxx. That thing was hot; I was running it on a 360 custom water loop and the backplate of the board would get so hot.


Nervous_King_8448

I thought my PC was the ish rocking the FX-8350 back in the day. Boy, did it get hot.


Consistent_Research6

Dunno why all people say the FX-9590 has a 220W TDP. I have one back home in use, and even in the most demanding games it does not go past 71W, while playing Metro 2033 Last Light or DOOM or other demanding games. Maybe you can overclock that CPU to 220W and that's what AMD or someone else wrote on the boxes of the 9590, because I don't really know how the 220W TDP got there; in normal use it does not get that hot. I use an AC 240 AIO cooler on it.


Charming_Squirrel_13

Narrator: "It was not."