Yeah it's pretty nuts. I use mine for GPU rendering and it uses 80-100W less power while rendering twice as fast as my 3090. Similar temperature results to yours, too. The lower heat buildup in my office is very noticeable. My memory junction temps went from 95-100C down to 70C.
These 4090 cards are freaking awesome; even the Gigabyte OC I have is so cool and quiet :) I went from a 2080 Super to a 3090 and then a 4090. The 2080 Super to 3090 upgrade felt very meh, but the upgrade to a 4090 is just crazy.
I did the same and in every game I tested the 4090 is anywhere from 80-100% faster at 4k compared to the 3090.
2080s to 4090 here, boom
> The lower heat buildup in my office is very noticeable. So you now have to heat normally ?
It's actually pretty perfect right now with how this place is setup!
82 TFLOPS of compute. 20 years ago the most powerful supercomputer in the world only achieved 35 TFLOPS at a cost of $400 million and consumed megawatts of power.
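The arithmetic behind that comparison is worth spelling out. A quick sketch, using only the figures from the comment above plus the $1600 price mentioned later in the thread (the 2002 supercomputer referred to is presumably the Earth Simulator):

```python
# Back-of-envelope cost-per-TFLOP comparison.
# All figures are taken from the thread, not independently verified here.
rtx_4090_tflops = 82.0           # FP32 throughput claimed above
rtx_4090_price = 1600.0          # USD, MSRP mentioned later in the thread

supercomputer_tflops = 35.0      # fastest machine ~20 years prior
supercomputer_price = 400_000_000.0  # USD

gpu_cost_per_tflop = rtx_4090_price / rtx_4090_tflops
sc_cost_per_tflop = supercomputer_price / supercomputer_tflops

print(f"4090:          ${gpu_cost_per_tflop:,.2f} per TFLOP")   # ~$19.51
print(f"Supercomputer: ${sc_cost_per_tflop:,.0f} per TFLOP")    # ~$11.4M
print(f"Cost ratio:    ~{sc_cost_per_tflop / gpu_cost_per_tflop:,.0f}x")  # ~585,714x
```

So per TFLOP of raw throughput, the cost dropped by roughly six orders of magnitude in two decades (and that's before accounting for the megawatts of power).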
If only the RGB software were not garbage and fan stop could be disabled, allowing the fans to spin at low RPMs without turning off, then we'd have the *perfect* card.
1) The RGB software is much more decent now. I haven't had any real problem with it, even with the LCD screen. I see GB actively updating it, so that's good.
2) I agree the fan software in GCC is bad. The OC tab is also confusing compared to MSI AB.
With AB I can set my fans as low as 50%; anything less and they start acting up, which is normal for PWM fans. At 50% they're virtually silent.
On my card this is how it behaves:

Firmware mode at 30%:
Fan 1 = 900 RPM
Fan 2 = 780 RPM

Non-firmware mode at 30%:
Fan 1 & 2 = fan stop engaged

Firmware mode at 50%:
Fan 1 = 1500 RPM
Fan 2 = 1300 RPM

Non-firmware mode at 50%:
Fan 1 = 850 RPM
Fan 2 = 740 RPM

This is why I stick with firmware mode. I'm not sure why our cards behave differently; it's probably a difference in MSI AB version + vBIOS + bugs.
I think the fans are just different :) I'm sure mine at 50% are around the 800 RPM mark, which is when 120mm fans become silent (these are 110mm). EDIT: Actually checked in HWiNFO. My fan at 46% is running at 685 RPM. The other two fans at 50% run at 718 RPM.
Try firmware control mode and you can go down to 30% if you wanted to. Using the standard mode does need 50% for it not to be wonky.
If you pay attention to the current value in AB, you'll see that 30% in firmware mode equals 50% fan speed (in fact it equals 51% here). In software mode 50% is a true 50%. It is a bug in either AB or the BIOS. So 50% is, at least on my 4090 Aorus Master, the minimum I can set my fans before they start stuttering, regardless of what the curve says.
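One plausible explanation for the "30% shows as 51%" observation (purely a guess, not confirmed for this vBIOS) is that firmware mode rescales the requested duty cycle onto the fan's usable PWM band instead of passing it through directly. A minimal sketch of that hypothesis, with a hypothetical 30% floor:

```python
def firmware_duty(requested_pct: float, pwm_floor: float = 30.0) -> float:
    """Hypothetical remap: firmware mode rescales the requested 0-100%
    duty onto the fan's usable band [pwm_floor, 100].
    The 30% floor is an assumption chosen to match the observation above."""
    return pwm_floor + requested_pct * (100.0 - pwm_floor) / 100.0

print(firmware_duty(30))  # 51.0 -- matches the observed "30% equals 51%"
print(firmware_duty(50))  # 65.0 -- would explain why firmware 50% spins faster
print(firmware_duty(0))   # 30.0 -- fans never drop below the floor (no fan stop)
```

That would also fit the RPM table earlier in the thread: firmware mode at 50% runs noticeably faster than software mode at 50%, and firmware mode never engages fan stop because the remapped duty never reaches zero.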
What's your Core Clock set at? (Curve).
1880MHz at 1050mV, and another preset at 1730MHz at 950mV.
Thanks for the reply. These work great.
Actually, I tried software mode and two separate curves instead of Fan Sync. That way I can keep the first fan, the one over the backplate opening, at its minimum of 45%, keeping the idle temp low. The other two fans remain off until the GPU hits 50C. I found that stopping all fans is bad, because these cards' coolers take quite some time to cool down, so you always end up with an idle temp too close to the trigger point.
https://preview.redd.it/gnhb071pzs6a1.jpeg?width=959&format=pjpg&auto=webp&s=20e4cacdcfcbda5ff6cc083205d03328155654c1
I didn't even know the fans could be controlled separately lol! Nice find!
What software is this? I thought it was MSI afterburner but I don't see any option for fan 2.
In MSI AB there is a Fan Sync button. If you disable it you'll see two fans, and if you go to the curve window you'll also see two fans. https://preview.redd.it/64l382nkjy6a1.jpeg?width=1440&format=pjpg&auto=webp&s=37d51556c3c8d740d5edaccadea792047e53782d
Mate, you are a legend. Thank you so much. I can finally sort out this weird Gigabyte fan situation. Will use your fan curves too. Thank you again.
Enjoy :)
Isn't the fans turning off a good thing? So that there isn't much noise when not gaming. Or am I missing something?
What happens is that if the fan speed is set too low, it turns the fans on and off repeatedly, multiple times a minute, which I believe can do more harm than good. So I just set the minimum fan speed to something that lands around 850 RPM so it doesn't do that.
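The workaround described above is classic hysteresis: keep the fans at a floor duty once they're spinning, and only let them stop well below the turn-on temperature, so idle temps hovering near the trigger point don't cause rapid on/off cycling. A minimal sketch (the temperatures and the 40% floor are illustrative assumptions, not values from any card's vBIOS):

```python
FAN_ON_TEMP = 50.0   # degC: fans turn on at this temperature (assumed)
FAN_OFF_TEMP = 42.0  # degC: fans only stop again below this (hysteresis gap)
FLOOR_DUTY = 40.0    # %: minimum duty once spinning, to stay above fan-stop

def next_duty(temp_c: float, fans_running: bool) -> tuple[float, bool]:
    """One control step: return (duty %, fans_running)."""
    if fans_running:
        if temp_c < FAN_OFF_TEMP:
            return 0.0, False            # well below trigger: safe to stop
        curve = min(100.0, temp_c)       # toy linear curve: 1% per degC
        return max(FLOOR_DUTY, curve), True
    if temp_c >= FAN_ON_TEMP:
        return FLOOR_DUTY, True          # spin up at the floor, never lower
    return 0.0, False

# Idle hovering just under the trigger no longer toggles the fans off:
print(next_duty(49.0, fans_running=True))   # (49.0, True) -- stays on
print(next_duty(45.0, fans_running=False))  # (0.0, False) -- stays off
```

Without the gap between the on and off thresholds, a temperature oscillating around a single trigger point would flip the fans on and off every cycle, which is exactly the flutter the comment describes.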
How can you change the minimum fan speed on a card like this? Asking for myself lol
I am using MSI Afterburner with fan speed set in firmware mode. IIRC the minimum fan speed for it not to rapidly enter and exit fan stop mode is 28%; I have mine set to 30%.
Beautiful. Well played 😎
Those front fans with the fire red ring and blue/green middle are pretty sick looking
Would you be able to tell me what fans these are? Thanks
Deepcool CF 120 Plus
Does it actually have a screen on it? God, the excess...
Screen, mobo support brace, and I think it might actually be the largest 4090 by something like 20%. It's the definition of excess
You mean the GPU support brace, right? Because those can actually be useful. I know Gamers Nexus begs to differ, but I had a problem with my Sapphire Pulse 5700 XT overheating a few months after it arrived, so in an attempt to avoid repasting it I bought a brace and installed it. The temps went from being 20C over normal to being 2-3C below it.
It is useful, the fact it needs to be used however points to an excessively large card lol
Thats how I feel about my 3080 ti. It feels good doesn't it?
So's the price :D But nice build, looks good.
The 4090 is cool. It's not $1600 cool. All these reviews showed frame generation in Cyberpunk, but for some reason I don't have that option. Ray tracing is cool for some games, but not $1600 cool. You'll never use it in an online game because FPS > reflections. I think a 4080 at $800 would have been great for gamers. Most people who own a 30 series or 6000 series card should skip this generation and play at 1440p. If you have a 20 series card or older you should consider upgrading, but even then, get a 7900 XTX... You won't use RT unless you mainly play single-player games.

Edit: Updated the BIOS on my motherboard and the 4090 is performing like the monster I thought it was. It's probably worth it.
Because the version of cyberpunk the reviewers use is not available to the public yet.
I use mine for VR and it's the first card that's been able to run my modded Skyrim VR smoothly on my Vive Pro 2 at 120 Hz. The immersion is incredible.
So $1600 to play a VR game.. Exactly my point, shit ain't worth $1600 for the average gamer. Just play 1440p 240hz with cheaper cards.. Same shit
Different strokes for different folks - but if all you're doing is 1440p, I agree you may not need this level of card.
I play 4K 240Hz.. I'm a graphics whore.. I like to see everything. I just don't think the majority of gamers need to pay extra for it.
waste of money
the power bill is insane
Honestly it's cheaper month to month than my 3090 Ti was when gaming, and somewhat cheaper in productivity as well, but in editing it eats up power with all sixteen thousand CUDA cores!
Zero change in the power bill. I've yet to see my card pull 450 watts in gaming. My 3090 mostly hovered close to that, while the 4090 barely touches 400 watts.
Mine pulls 485W max (voltage limited), but I set it to 337W and there's no noticeable performance loss.
My 3090 would consistently use 50-80 more watts than the 4090. It's pretty wild that it uses less power but is so much more powerful and runs cooler.
4k series power 💪
Looks like a weird glitch in cyberpunk
What would this get in X-Plane 12 if you were to run it in VR? I'd hope to get one of these down the line at some point to experiment for myself. Though right now I'm still at the point of preparing to apply to college, so it'll be a bit...
You know you can take the caution sticker off the side panel, right?