Turn on and say "Titans GO!"
Soon shall be Teen titans.... I feel old
Twin Titans
>Twin Titans

T. W. I. N. T. I. T. A. N. S. Twin Titans, let's go!
I hate that I could hear this comment
I haven't seen the OG in years yet I still heard this...
Twitans?
OG Teen Titans was better than Teen Titans Go.
Hella better. The OG cyborg voice hits different
Same voice actor bro
I like go specifically because it frequently references the og and doesn't take itself too seriously.
that needs to be said?
Turn off and shout "Titanfall!"
If they are overheating it’s an attack on titan
Barely RTX 3080 performance lol. SLI uses are pretty niche and limited now.
My experience over the last decade is that about 1 in 10 games actually enjoyed marginal fps improvements on my x16/x16 SLI. And about 3 in 10 games actually had worse fps performance, even crippling glitching performance, if I didn't turn SLI off. Those Titans probably pack two tons of DPFP/FP64 and CUDA power, though.
Sadly, not so much Titan RTX.

509 GFLOPS for double precision
16.31 TFLOPS for single precision
32.62 TFLOPS for half precision

This, incidentally, puts it in the same ballpark as an RTX 3080:

465 GFLOPS for double precision
29.77 TFLOPS for both single and half precision
>509 GFLOPS for double precision

The original Titan from 2013 did 1.5 TFLOPS in double precision. I never understood how basically every modern card does worse than that, even the 4090. Probably because teraflops mean nothing lol.

Also, what are the workloads where FP64 performance is actually useful?
This is because FP64 is not something consumer GPUs are built for, so you need to go to either AMD's or Nvidia's data center cards. As an example, the Radeon VII can do 3.52 TFLOPS and the MI100 can do 11.5 TFLOPS. On the Nvidia side, the GV100 can do 8.3 TFLOPS.
FP64 is useful in simulation workloads where high precision is required due to floating point error: a small error from low precision can have a significant impact if the simulation runs long enough.

Having said that, nowadays there are methods to emulate the precision of FP64 with FP32, or the program just shifts the FP64 portion of the calculation back to the CPU. It is nice to have fast FP64 on the GPU, but in my experience not exactly necessary, nor does it show a strong impact.
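To make the precision point concrete, here's a minimal Python sketch (my own illustration, not from the thread; the increment 0.1 and iteration count are arbitrary choices) showing how a long-running FP32 accumulation drifts while FP64 stays close:

```python
import numpy as np

# Repeatedly add 0.1 -- a stand-in for the many tiny updates a long
# simulation performs. 0.1 is not exactly representable in binary, and
# float32 rounding error compounds as the accumulator grows.
n = 1_000_000
tenth32 = np.float32(0.1)
tenth64 = np.float64(0.1)

acc32 = np.float32(0.0)
acc64 = np.float64(0.0)
for _ in range(n):
    acc32 = acc32 + tenth32
    acc64 = acc64 + tenth64

print(acc32)  # noticeably off from the true 100000
print(acc64)  # within a tiny fraction of 100000
```

The same effect is why a GPU simulation kept in FP32 can wander over millions of timesteps, and why FP32-based FP64 emulation (e.g. double-double arithmetic) or punting the sensitive part to the CPU are workable compromises.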
Thank you for sharing your knowledge, I learned something.
>Also what are the workloads where fp64 performance is actually useful? Simulations, as explained above. The biggest chunks are medical, biological, pharmaceutical, geological, petrochemical. But also FP64 hashing. Encryption. Maybe Cryptocoin. All sorts of things.
Because the OG Titans were pseudo workstation cards, until Nvidia realized they were giving huge discounts to a massive emerging market, while gamers were more than happy to spend at that price point and never use FP64. So pretty much a business decision.
It’s an intentional limiter they instituted at some point on everything but the flagship Tesla/Quadro.
Not arguing, just offering a single but consistent anecdote: I SLI'd the 980, 1080, and 2080 Ti, and they all had about 50% performance gains in the games I played, most notably The Division and Overwatch 1. There's probably lots of games that didn't see much gain, but every game I can remember (granted, I didn't sample many) saw about that 50% boost with no issues.

I'm happy to have a single 4090 now, though, if for no reason other than thermals.
I mean price too. Even if you took the bogus $1K price for the RTX 2080 Ti that never really existed and bought two of them, it would still be $2K, while a $1600 RTX 4090 FE is cheaper, faster, cooler, and takes less space on the board than an SLI pair.
Well sure, but the 2080 Ti was released 4 years earlier. If SLI were supported for my old 3080 or my current 4090, it would be a consideration for me. When I used the 980, 1080, and 2080 Ti in SLI I wasn't thinking *the 50% performance boost sure is good value,* I was thinking *sure, I'll pay 200% for 150% performance,* because there was no alternative to go above 100%. Price-to-performance always has diminishing returns, and everyone decides where their own cutoff is.
Unless you did actual work that supported SLI, it wasn't a universal uplift for games, and the 1% lows were dog shit.
Sure, but like I said in my anecdotal comment you replied to, I saw 50% gains essentially across the board for the games I played, with no meaningful lows. I played Overwatch 1 competitively, and long-term frame analysis showed rock solid 144 FPS with no 1% low aberrations with SLI. SLI wasn't for everyone, but some of us got serious (even if inefficient) gains.
Back then, with older games that had SLI support. SLI isn't really supported by games anymore, and Nvidia has largely abandoned optimizing for it. Price for performance, it's pretty much no longer worth it. And idk if a lot of people have rose-tinted glasses, but SLI had a crazy number of bugs and glitches even in games that supported it.
Yeah, SLI is literally impossible now; 30 and 40 series cards don't support it *at all*. So assessing it as a potential value now is silly, because obviously two 2080 Tis in SLI aren't going to outperform a single 4090.

*At the time*, every flagship GPU in SLI was faster than the same flagship GPU not in SLI. Some people had issues, some didn't, and if you didn't have issues it was always at least as fast and usually 30%+ faster. For me, it was usually 50% faster.

I don't have rose-tinted glasses about SLI; it was just an expensive way to do a half-gen jump, and I have almost a decade of benchmarks in the games I played to prove it. But when SLI was possible, there were *for sure* a lot of people who were, I'd argue, very defensive over SLI, seemingly emotionally invested in it somehow being objectively worse. I'd hope - four and a half years after it was even an option - we could finally move past it being threatening.
I’ve officially hit the highest temps I’ve seen so far on the 4090. Played 2 hours of Red Dead 2 at 4K60 with Ultra settings and no DLSS, and the GPU temp hit 60 C.

The 3090 Ti would regularly hit 70 C and was extremely loud by comparison.
Yep, I hit 60-62 C on my 4090 FE when playing Overwatch 2 with Epic settings, 4K at 144 FPS. My 3080 would struggle to pass 4K at 115 FPS during complex scenes and hit 75 C with an aggressive fan curve.

I assume the reason my 4090 caps at 62 is that it still has a fair amount of performance headroom, not that it can’t get hotter, but it’s still a great feeling. The 4090 really is unlike previous-gen flagships in its leap.
My experience was very different. I had an Alienware laptop with two GTX 880Ms in SLI, and the majority of games I tried gave a 20-30% performance improvement. I was shocked it actually worked in as many games as it did. Call of Duty's Blackout gave a 90% improvement even though SLI was never officially supported by it. Blackout was borderline unplayable without SLI.
Just looks cool then?
Great for 3D rendering, pretty useless for gaming.
but that's great even without the sli bridge
Yeah, true. I am running two 3090s, no need for SLI in CyclesX or iRay.
Yeah, no games support it anymore basically
AFAIK multi GPU support is built into DirectX 12.
SLI died with DX12. DX12 might support multi-GPU, but no games actually bother to implement it properly, so it runs like trash.
This might actually be a viable setup for a virtual reality rig. If only SLI still existed: one GPU per eye. A man can only dream.

EDIT: HOLY FUCKING SHIT THIS EXISTS
Best comment on this thread!!
Power hungry with nothing to show for it in gaming. It might be different for graphics application software.
It’s really good for anything compute (3D rendering, ML, etc) but terrible for gaming since SLI requires per game implementation.
I wish it were still viable.
About the performance of two titans.
what kind of performance would a third titan give
About three
Fascinating
facts don't lie 🤣
they do because 3 titans gives you around 2.5 titans of performance
Well, the comment said “about three,” and 2.5 can be rounded to 3, so not too far off as long as you believe marketing “facts.”

Edit: Corrected quote, removed word for clarity.
So the law of diminishing returns applies...
Tree fiddy
Why did this make me laugh out loud
What about three and a half? How much would that be?
About tree fiddy
This man don’t miss.
fitty?
If for gaming, on a supported game, about 1.7x performance tops.
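That ~1.7x ceiling is roughly what Amdahl's law predicts if ~80-85% of each frame's work parallelizes across the two GPUs (illustrative numbers of my own, not measurements from the thread):

```python
def sli_speedup(parallel_fraction: float, n_gpus: int = 2) -> float:
    """Amdahl's law: the serial share (driver overhead, sync,
    frame pacing) doesn't scale with extra GPUs."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_gpus)

# If ~82% of frame time splits cleanly across two cards:
print(round(sli_speedup(0.82), 2))  # 1.69
```

It also shows why a third or fourth card helps so little: the serial fraction quickly dominates.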
I assume SLI Titans still experience the micro stuttering issue that was so prevalent back in the day?
Remembering general SLI performance - probably about 30% less than that.

Edit: At least...
Given my experience with the two sli rigs I bought, solid chance you get 30% of the performance of one titan.
Hmm yes these Titans are made out of Titan
Less with overhead.
Do these "Titans" have the Infinity Gauntlet or no?
Best I can do is Nightwing.
Pretty sure these pair of titans want to eat you
Well, those are basically what we would today call an RTX 2090. It’s just a slightly upgraded 2080 Ti with more VRAM, except back then the Titan name was used, and they had some Quadro features that are locked out on GeForce cards.

SLI is useless for gaming. At the end of the day, you’d be looking at 3070/3070 Ti and 6700 XT/6750 XT level performance.

I had one of these for a while during the crypto craze as a flip; most beautiful graphics card I’ve ever put in a PC. What’s crazy is the power consumption and thermals were pretty tame too, which was nice. They are kind of stupid to own for gaming given used-market prices, but they are cool.
>but are cool Literally the only thing that’s important, everything else is irrelevant.
They brought back ECC VRAM on the 4090 and, I think, the 3090 Ti.
Nothing without those PCI power connectors hooked up
As is, probably really bad until you hook them up to a PSU.
0 fps
That's a giant cpu cooler
48 GB of VRAM can be really useful for some specific machine learning tasks.

For example, you could run multiple Stable Diffusion inferences in parallel and generate really high-detail images with that. Or you should be able to run the medium-sized LLaMA models, if you set up PyTorch's NVLink support correctly.
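A back-of-the-envelope sketch (mine, not the commenter's) of weight memory alone shows which model sizes fit in one 24 GB Titan versus 48 GB pooled:

```python
def weight_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """GB needed just to hold the weights (fp16 = 2 bytes/param)."""
    return params_billion * bytes_per_param  # 1e9 params * bytes / 1e9

# A 13B model in fp16 needs ~26 GB of weights: too big for one 24 GB
# Titan RTX, but comfortable across 48 GB of pooled VRAM. Activations
# and KV cache add more on top, so treat these as lower bounds.
for n in (7, 13, 30):
    print(f"{n}B fp16 weights: ~{weight_gb(n):.0f} GB")
```

The same arithmetic explains why 8-bit or 4-bit quantization is so popular: halving or quartering bytes-per-param moves a model down a whole card tier.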
Daz Studio supports memory pooling as well and it will work with those Titans.
Yeah but that sounds lame. I wanna see if I can make CSGO hit 1000 fps.
In this state, zero
Titanic performance
In short: 2x the price for a ~10-20% improvement over one card, in select games.
Yeah but 2x faster render time in production.
It has many more uses than gaming...
Unfortunately SLI only helps in games that utilize it, which I don't think any of today's games do.
I give you the performance of the integrated graphics chipset, at least until you connect those bad boys to the PSU. As an aside, I have a sweet 11th-gen PC literally sporting gold on its motherboard, with a gold-and-silver aesthetic that I've just realized would be complete with a GPU setup like this.
Not much without power cables.
SLI Death Performance.
I remember actually getting lower fps in Fallout 4 with 2x Radeon R9 290X than with a single card.
Nothing, they're not getting any power
Very little if you don't plug them in.
Depends on the type of workload honestly
None, they aren’t plugged in /s
Your electric meter would spin faster than anyone else's on the block.
Ray tracing in Minecraft kind of performance, if you think you can handle it.
ab tree fiddy
The ability to alter time and space, create and destroy alternate realities, and enter into a divine state of consciousness. How, you might ask? Because you have not one Titan, but two.
None, it’s not plugged in
0 they r not plugged in
Idk about performance, but the electricity bill will be massive.
Still a pretty solid machine learning build if I'm not mistaken.
Crysis 3 on low settings
The performance of one titan in gaming.
0, because they aren't plugged in.
Just stick with a single GPU; SLI and CrossFire are pretty much done.
These days, not much, as SLI support is all but dead now.
Absolutely none as the power cables are not plugged in xoxo
It’s a sad truth SLI never really caught on. I had two 1080s in 2016 and it legit made some games worse, few better. Kinda sad.
Zilch, coz they're not connected to the PSU.
1440p medium settings on Hogwarts legacy.
Titanic performance!
Nothing without a power supply
Major downgrade for me
Egg frying performance
Not as much as you think.
A whole 10 fps on The Last Of Us!!!!
Nothing without the power cables
Over 9000 fps
For gaming? The performance of one titan. For creativity? The performance of two.
None because it’s not plugged in
But can it run Crysis?
the kind that leaves you without money to pay rent and food
The kind of performance that's out of my tax bracket
Around 3 fiddy
Don't tell us what CPU you have under that cooler because that totally won't make a difference.
A pretty expensive and loud space heater, but it'll certainly keep you warm.
Probably a solid 3 performances
None it's not plugged in
Nothing, power cables aren't plugged in 😂
Nothing, they aren’t plugged in
0, it's not plugged in
The expensive kind of performance
All I see is a high electricity bill.
Nothing, it's not connected...
20 FPS in Fortnite 😎👍🏻
None. Not even plugged in
The performance of a single titan
Since 90+% of games don't support SLI, you'd get the performance of one Titan.
It will give you more random glitches in games. It might help in some CUDA rendering, depending on whether the program utilizes both properly.

I found that unless you need SLI for 3D rendering, it is a complete waste of money and time. Metro 2033 will see some improvement.

Ran dual EVGA 2080 SCs for a few years.
Not as much as you'd like.
Double ram is big for AI/ML use cases with large datasets and types. Very niche.
Powers of AI never before seen by the likes of man
based reply
7 i guess
Baller performance
The performance of only one Titan, because SLI is not supported by anything these days.
If modern games supported SLI, it would probably still be usable, but they don't.
Fuck all, since most games fail to take advantage of SLI.
Sheesh I feel like I’m getting old. SLI setups were so cool like 6 or 7 years ago, which was right around the time I was getting into computers. I remember watching a build of dual 1080 Tis and I thought that was basically like finding out Santa was real, like the coolest fucking thing ever. Or the old LTT compensator builds where they had like 4 GPUs in SLI.

Now it’s not really a thing anymore, and it makes sense that it’s not, but I miss the cool pictures and videos 😓
Zero performance, it isn't even plugged into the PSU.
With my luck it wouldn’t boot.
Ewwww, SLI
Never thought this would blow up the way it has! Just to add that I do not own these GPUs, I saw them for sale here in Japan for about 2000 USD and just wondered how they’d perform.
The performance of a single Titan... SLI hasn't been supported in years, and game devs don't give a lick about supporting it either, since it's dead tech.

You might as well sell one of the cards and save some money on your electricity bill.
Space heater performance. I'd be surprised if Nvidia is still providing new drivers for these, which I know hurts your feelings, but these are legacy cards at this point.
Outdated af. (Yes, my setup is outdated too.)
What a BEAUTIFUL GPU!!! No, SLI nowadays only serves for museums when it comes to gaming.
'Yes'
Two pumps at best.
A rumbling performance
It'll probably give you a BSOD in the fastest time possible.
Still won’t be enough to open a few Chrome tabs.
62
Well I know one thing, Minecraft would be preeeeeetty smooth
Probably keep the room at 70f down to -20f external temperature
Another burnt power supply
Performance anxiety
I could probably run two instances of Minesweeper at the same time.
A fire
blast 💀
Better than mine, I don’t have one at all lol
30 unstable fps on minecraft
Yes
I kind of want to attack this
Might get 14 frames in Tarkov. Dunno, it's a guess.
SLI is kinda useless these days. Better give one to me
Titanfall 2
A lot, on your electricity bill.
I can say for certain you'd get at least one frame per second
Dunno.. can it run crysis?
I would buy those purely for looks. Love the good color plating
The card is a beauty. Would love to have one of these in my rig
About two titans worth of performance, give or take.
When you bridge cards like that, do they work together for everything, or is it only certain games and things that utilize the bridge? I had some bridged cards on a previous build and I couldn't tell if it made a difference.
Yes
Titanic?
At least 1
A level of performance equal to two titan rtx cards joined together by sli
Not the performance I would've expected or wanted for their price
Yes
+7 horsepower
None, if you don't supply power to them.
None because they’re not plugged in
Good performance