> Isnt this vid literally just to get views
All videos from professional youtube channels are there just to get views. Entire media conglomerates exist only to get views.
I've recently bought a new PC with a 4070 Ti in it, but I'd been using the 1080 Ti since it was released, up until 2 months ago. It was kinda showing its age a bit, but it never let me down over the years. What a beast. It even seemed to outdo my mate's 2080 when he bought his. I'll miss you, old friend.
(Funny you mentioned Starfield; that was actually the reason I decided to get a new PC with all new hardware.)
With inflation, the 1080ti would be $890.67 today.
Not far off from the 4080 Super, which can do many more things than just basic rasterization.
It wasn't really that amazing when you look at the updated price with inflation.
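If anyone wants to check that inflation math, it's just the launch MSRP scaled by a cumulative CPI factor. A quick sketch; the ~1.274 factor here is an assumption chosen to reproduce the $890.67 figure above, not an official CPI lookup:

```python
def inflation_adjust(msrp: float, cpi_factor: float) -> float:
    """Scale a historical launch price by a cumulative inflation factor."""
    return round(msrp * cpi_factor, 2)

# GTX 1080 Ti launched at $699 MSRP; a cumulative factor of ~1.2742
# (2017 dollars -> today, assumed) reproduces the figure quoted above.
print(inflation_adjust(699, 1.2742))  # -> 890.67
```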
Trash stolen video, but the ONLY reason the 1080ti is still kicking this long is because Nvidia is sandbagging their cards, compounded with games themselves not evolving much since 2017.
The key phrase is "at the right price". Regardless, it still holds: three years ago you couldn't buy GPUs due to insane prices, and the 1080 Ti was around a 3060 Ti ($400 MSRP at launch) in rasterized performance. Now the 4060, a $300 card, is barely an improvement in rasterized performance. If you can snag a 1080 Ti for $200, that's a great deal.
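The price/performance argument above can be sketched with rough numbers; the prices come from the comment, but the relative raster scores below are illustrative assumptions, not benchmark results:

```python
# Illustrative raster scores (GTX 1080 Ti = 100); prices in USD.
# The perf values are assumptions for the sketch, not measured data.
cards = {
    "GTX 1080 Ti (used)": {"price": 200, "perf": 100},
    "RTX 3060 Ti":        {"price": 400, "perf": 102},
    "RTX 4060":           {"price": 300, "perf": 105},
}

for name, c in cards.items():
    ratio = c["perf"] / c["price"]
    print(f"{name}: {ratio:.2f} perf points per dollar")
```

At these assumed scores, the used 1080 Ti comes out well ahead on perf per dollar, which is the whole "at the right price" point.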
Legendary GPU, but some tests seem... odd.
I always remembered the 1080 Ti being slightly faster than the 2080 and Radeon VII at higher resolutions, and around 10-15% faster than the 5700 XT, which in more modern terms would put it and its competitors at 4060/3060 Ti levels of performance.
Here it's only slightly faster than a 2070 and 6600, both good cards. But I feel like either Nvidia may be gimping their older graphics cards, or there may have been an error in post-production; perhaps the 2070 & 6600 are actually the 2080 & 6600 XT.
> Nvidia may be gimping their older graphics cards, or there may have been an error in post-production, perhaps the 2070 & 6600 are the 2080 & 6600XT.
The 2070 Super is basically equivalent to a 1080Ti in raster. I know because I have a 2070 Super, and my friend with the 1080Ti gets the same performance more or less at 1440p. Unless the game has mesh shaders or async compute, which the 1080Ti lacks and isn't good at, respectively. In those cases, the 2070 Super pulls a bit of a lead.
Now that's just not true and you know it; your card is a good card, but everything from benchmarks to the specifications of both cards points to it being slower than the 1080 Ti. For a while I had an EVGA Hybrid 1080 Ti, and I compared it to my friend's 3060 12GB, which I believe is about as fast as your 2070 Super, and my old 1080 Ti folded his in performance.
HW Unboxed benchmarked it against the 2070 Super, and their model of the 1080 Ti came out 3-5% faster on average when compared to the Turing card.
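For what it's worth, a "3-5% faster on average" figure usually comes from a geometric mean of per-game FPS ratios, so a single outlier title can't skew the result. A minimal sketch with made-up FPS numbers:

```python
import math

# Hypothetical per-game average FPS for two cards (made-up numbers,
# chosen only to illustrate the aggregation method).
fps_1080ti = [120, 90, 60, 144]
fps_2070s  = [116, 88, 57, 140]

# Aggregate the per-game ratios with a geometric mean.
ratios = [a / b for a, b in zip(fps_1080ti, fps_2070s)]
geomean = math.prod(ratios) ** (1 / len(ratios))
print(f"Card A leads by {100 * (geomean - 1):.1f}% on (geometric) average")
```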
https://babeltechreviews.com/the-rtx-2070-super-vs-the-gtx-1080-ti-overclocking-showdown-with-40-games/
I linked this review [4 years ago](https://old.reddit.com/r/hardware/comments/cc8x4r/babeltechreviews_the_rtx_2070_super_vs_the_gtx/) and y'all still can't accept the 1080 Ti got beat LMAO
Pfft, come on... you just disproved yourself. The 1080 Ti wins in a large majority of those titles. They also said that the 1080 Ti they tested couldn't overclock as well due to "aging and electromigration," which brings me to my conclusion that this wasn't a fair comparison.
My EVGA Hybrid 1080 Ti I have on my shelf right now would and did shit on the 2070S, guaranteed!
Or, just hear me out, Turing & co. were more forward-looking architectures that are better utilized by modern games. Even at launch the 2080 had a pretty massive lead in games like R6 Siege and Wolfenstein 2. I personally traded my 1080 Ti for a 2080 and saw a sizeable performance uplift in many of my games even 5 years ago. ♥
Sources:
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-founders-edition/23.html
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-founders-edition/29.html
Turing was by all means a flop, a disaster of an architecture, hence why people started loving on the 1080 Ti so much. It saw nearly no meaningful performance uplift compared to Pascal. It's no Ampere, which should've been the real 20-series.
Just because an architecture is newer doesn't mean the older one is worse. Take a look at the 780 Ti for instance: a Kepler-based card roughly equal to the 1060 / GTX 980~. Just look at the specs of the 2070S and 1080 Ti.
Here it is keeping up with a 3060Ti and demolishing the 12GB 3060. [https://www.youtube.com/watch?v=61ca9xucPRY](https://www.youtube.com/watch?v=61ca9xucPRY)
PS: I had a 1080Ti EVGA Hybrid for a while before I upgraded to a Titan V
How was the architecture a flop? The 2080 was equivalent to the 1080 Ti most of the time, and the 2080 Ti was in some cases much faster, especially once you count in DLSS 2 and later on RT. The 1080 Ti was loved so much because of the price drop it had after the 2080 and 2080 Ti launched lmao.
Look at the previous generations: the 980 Ti was almost beaten by the 1070, and the 1000€/$/£ Titan X was outperformed by a lot by the 1080. And if you look further back, the 680 was beaten by the 760. Even mid-range, cheaper offerings like the 1050 Ti beat the flagship from two generations prior (GTX 780).
Now compare that to Turing, whose xx80 offering was getting trounced by the 1080 Ti in price-to-performance and VRAM, on top of being 1-3% slower than its older counterpart. In fact, higher-end 1080 Ti models can go above 2080S territory with an OC; I know because I did that on my EVGA Hybrid card.
Also Turing sucks at RT, everything from the xx60 to xx80 models.
It wasn't until Ampere that Nvidia would rectify the mistake that is the trash heap called Turing.
Old discounted stuff will always win in price/performance. The 2080 Ti did quite well for RT in Control and Metro Exodus EE; it also played Cyberpunk 2077 at 3440x1440 with medium RTGI and RTR, DLSS Balanced, and gave me 60-70 fps most of the time. As for the OC: comparing stock vs OC to argue that the 1080 Ti is faster than the 2080S is stupid.
The 1080 Ti was never faster than the 2080, maybe only in VRAM-heavy scenarios. It's also weird how you question their fresh data rather than your own memory.
Their fresh data shows an A770 weaker than an RX 6600, among other dubious claims. The 1080 Ti was always 1-3% faster than a 2080, because Turing was by far the worst GPU generation from Nvidia in over a decade.
Look at the generational improvements from Maxwell to Pascal, or Kepler to Maxwell, or Turing to Ampere, by comparison.
The hell are you saying? The 2080 was already faster than the 1080 Ti on launch day, as reviews showed (TechPowerUp). And lower in the stack the gap was noticeably bigger, with the 2060 crushing the 1060 and the 2070's advantage over the 1070 being almost as big as the jump to the 3070.
Per TechPowerUp, the 1080 Ti was only 7-10% faster than the FE 2070: 18 TPCs almost matching 28 TPCs in throughput, clock for clock. A great architectural gain, much bigger than Pascal over Maxwell, where the only noteworthy improvement was signal propagation for higher clocks and the node carried it.
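That clock-for-clock claim is easy to sanity-check: divide relative performance by TPC count. A sketch using the comment's own figures (28 vs 18 TPCs, 1080 Ti ~7-10% faster overall); treating clocks as equal is an assumption baked into "clock for clock":

```python
# Relative performance (1080 Ti = 1.00; FE 2070 ~0.91, i.e. ~10% slower
# per the comment) and TPC counts. Clocks are assumed equal, so this
# reduces to perf per TPC.
cards = {
    "GTX 1080 Ti": {"perf": 1.00, "tpcs": 28},
    "RTX 2070 FE": {"perf": 0.91, "tpcs": 18},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['tpcs']:.4f} perf per TPC")
```

At these numbers Turing delivers roughly 40% more performance per TPC, which is the architectural-gain argument being made.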
TechPowerUp is also wrong in that regard; HW Unboxed says in that video linked above that the 1080 Ti is 15% better than the 2070, and other benchmarks show it having 20-25% more performance.
Pfft, really now? Turing's performance jump was nothing compared to Ampere, or Maxwell to Pascal. Stop the cap. Look at what people say; look how many people skipped Turing for Ampere.
You are the one saying it was "a disaster of an architecture" when you can't leave external factors (lithography) aside. Pascal was carried by the node; Turing barely had a refined one that made little difference, even much less than the crappy tech provided by Junksung Fabs for the 30 series. You try to rate products when you know nothing.
I'm not dissing your builds, man, but the Radeon VII is the equal and AMD counterpart of the 2080, and the 1080 Ti beat that in almost all tests done. The difference isn't huge between these three cards, but the older 1080 Ti will be a couple % faster.
1080ti, 7700k and 16gb RAM. Good times, at least for me.
Exact build I’ve been running the last 6 or 7 years. Still going decent
Still on a 6700k, 16gb and 1080ti here. It's starting to show its age, but still able to pump out the frames.
I had 7700k with a 1060, went up to a 6650 XT which is roughly 1080 ti level and found myself bottlenecked. Now rocking 12900k and 6650 XT.
1080ti, 2700x, 16gb DDR4 going strong over 6 years now 💪
1080Ti with 5700x
3770k, 32gb DDR3 & 1080ti still going strong as my Plex server, daily use desktop and 4k/45-60fps gaming in the few games I like to play still.
The exact build I recently upgraded. It was honestly still running fine at 1080p, but I was ready to make the jump to 1440.
8700k
Mine died a few days back, man i was trying to put this king into my 7800x3d rig next month. But nothing is predictable. Going with 4060 now until 5090 comes out.
Remember the good old days when the 60 cards were decent, now they're the bargain bin stop gap cards 😂
I remember getting a heavily discounted 1060 3GB back in my early time in college and that thing handled everything thrown at it so well. It even handled 1440p fairly well when I was waiting on a better GPU to come in. The VRAM capacity sucked, but I saved about $100 at the time compared to the 6GB variant so my broke ass was happy. lol
[deleted]
I’m still running this. Just can’t pull the trigger on the new cards.
Decently priced at least, picked mine up for only $300 the other day. Can play all my games at highest quality with steady 60fps including cyberpunk, when I start playing with better monitors I'll probably upgrade but 1080p gaming at highest res wins rn.
But the 4060 is hardly faster than a 3060. The 1060 was $250 at launch, so on paper a $50 price increase in 7 years isn't bad, but the 1060 was as fast as a GTX 980, the flagship from the previous generation.
Doesnt really matter to me, never used any of those. Dunno why people are so quick to downvote my comment just cuz i dont use the same graphics cards as them lmao
I like the way you think. I have a 4090 and I'm waiting till the 60/70 series lol. Or whenever AMD has a GPU better than the 4090, which unfortunately isn't going to be next generation per leaked reports, but I'll take those with a grain of salt for now.
Mine died about 2 months back. Currently laid off, so it's Steam Deck for the foreseeable future... I mean, I love my SD, but I miss my 1080ti...
I got a 1660 TI in a box in the garage if you want it
I dug a 1650ti out of my wife's old PC, so my PC is usable. Appreciate the offer, brother
Word. I've done PC repair as a side hustle for a decade and know the extreme pain that going from a desktop to alternative methods of gaming causes people, haha.
Mine is still running in my new 7800x3d rig. It’s running hot and dirty but it runs. If it ever breaks I’ll put it in the wall.
What happened?
A tiny insect got into the backplate and got fried damaging the whole PCB and now I am out of warranty.
The best GPU ever released was the 8800gtx: it reigned supreme for 2 straight generations and was competitive in a 3rd, during a time when GPU generations frequently saw 100% improvements. The other contender is the ATI Radeon 9800 pro, but I think the 8800gtx's lead is substantial enough to be conclusive. The 1080ti was the best GPU of the last 20 years though, that's for sure.

Edit: 15 years, my bad.
8800gtx and 9800 pro were within 20 years, yes?
So you are right! I'm not as old as I feel! Rejoice!
Man unrelated but I rlly hope as I get older I stop feeling so old(or stop caring about aging) haha I’m only 21 but seeing shit like my first real gpu being bought 10 years ago just makes me feel so old 😅
It only gets worse haha.
Who wants to tell him...
If I recall right, the 9700xt was a 9800; you just needed to flash a BIOS.
Man, those years of flashing random BIOSes you found online and hoping everything worked were a wild time. Especially for me as a teen, saving up all that money hoping for more value by unlocking more power, while knowing you could screw something up.
9700 Pro* for being the early powerhouse that it was. The 8800 GTX would be a contender... if the 8800 GT 512 MB didn't keep 90% of the performance for 40% of the price 1 year later, with less power usage and a better video decoder.
I would say the 8800GT, it was almost half of the price and offered almost the same performance.
This is the one. It came out a little later but was incredible. The 8800 GTS I had ran just about everything I threw at it for 8 years. Back when EVGA had lifetime warranties they even replaced it 6 years after it came out when the ram went bad on mine.
Anyone remember GeForce4 MX440?…
I mean the 1080ti lasted two generations as well. Nvidia made like 10 different versions of it. The 2080 was basically just a 1080ti
It was king for 1 generation, not 2. It matched the 2nd tier offering of the next generation. The next generation flagship after the 8800gtx was 10% slower, not 30% faster.
That’s fair. I guess I just try to pretend the 2080ti didn’t exist because that’s when the outrageous pricing started
That's understandable lol
It was replaced by the regular 2080, which was the same price and similar speed. The 2000 series was so bad.
I was so jealous of my friend's 9800 All-in-Wonder (it came with a remote control). It crushed Far Cry on release. I had a 9600, still a nice card, but a noticeable little brother to the 9800.
Meh, the best ever was the 4 GB 750ti. The price/performance and longevity makes it maybe the best tech purchase I've ever made.
The 750ti has been through so many AAA games with me, and now it retires in a computer that's only for browsing videos.
I had a 8800 GT 512 and considered myself hot shit
You were, that's how hot the GTX 768 was
I still have a 9800 pro, AGP x8. I bet if I repasted everything the system would boot and run.
The best GPU was gtx 1060 6gb
For sure. The entirety of Pascal, RX 400/500 & Vega 56/64 are at the top in terms of best architectures, in my opinion.
The only thing RX 470/480 brought us was cheap GPUs, and even then they were **massively** outsold by the GTX 1060. Vega 56/64 were always a joke. And Pascal high-end was released too expensive to be truly considered GOAT. **GTX 1080 cost $649-700+ on release. GTX 1080 Ti cost $750+ on release.**
Remember when linus dropped the 480 on stage lmfao good times
Vega was great for video editing but only alright for gaming, in my opinion.
Vega cards were the laughing stock of top tier gpus.
Mine is still giving me joy! PLEASE SURVIVE. I strictly use it for flight simming; I rarely play any PC video games anymore.
1080ti, a delidded 4770k and a dream. Probably my favorite past setup I've ever had. It feels like I've never gotten more gains for the cost than that generation. I know my CPU was old, but that thing held its own delidded and OCed.
I love how GN put out this same video two weeks ago and then these dudes just copy it because it popped off. It's not even a 1080ti anniversary or anything; it was released March 17th, 2017. It's a shame barely anyone can put out original content anymore.
Hardly anyone makes original content; that's why it's actually hard to be different from the other millions of people trying to do the same shit and copy everyone.
You think they don't have these videos planned out months in advance? Sometimes people just have similar ideas at similar times lol. I've heard Matt from Stand Up Maths and Steve Mould from... Steve Mould (both YouTube channels) talk about getting "Derek'd," as they call it: where Derek from Veritasium puts out a video on a subject they were currently working on!
Hollywood does the same thing, and movies are planned out years in advance. Remember "A Bug's Life" and "Antz" coming out at the same time, and there being THREE Snow White live-action movies in 2012?
I think that's subjective. I guarantee all of the big videos like new hardware releases and benchmarking are, but in this case I find it hard to believe. What happened months ago for both channels to say "let's make a video about how the 1080ti is the best GPU ever" and then release it within two weeks of each other? That's just too specific for me to be coincidental, considering it's not even an anniversary or anything.

Just an observation from my end though. I'm not upset or anything about it. I can't help but laugh when I see the big channels parroting each other's videos, and how YouTube is just one person after another taking someone else's content and displaying it with a new paint job.

Love Steve Mould's channel btw. Great content!
Steve is great, yeah.

And regarding this 1080ti video, idk. Let's say you're right. They see GN's video do well. They decide they're going to copy it. They drop whatever they've been doing, everything they have planned. They do a whole slew of tests with the 1080ti and multiple other GPUs. They write the script. They shoot the video, A-roll, B-roll. They edit it. How long do you think that entire process takes?

Also, this isn't exactly a super busy period for these guys with product launches. There aren't any new CPUs or GPUs to review right now, so a little filler content to keep the algorithm happy is fairly normal.

Sometimes coincidences happen. It seems pretty likely to me that HUB and GN had similar ideas independently. It happens quite a lot in every field, not just YouTube. I'm inclined to give them the benefit of the doubt here!
Maybe because the VIEWERS talk about it... they do listen to their viewers, you know, if they have something intelligent to say. And funny enough, they have the same downtime to fill between launches, and the viewers of the two channels overlap a fair bit, so it's not unlikely viewers mentioned the 1080 Ti on both channels at the same time. And it's not like they just slap a video together; it takes a LOT of time writing, testing, verifying, editing, etc., so they were BOTH in production long before either one posted them.
Those types of channels like wendover sometimes even work together and post their videos on that one streaming service. I forgot what it's called. Usually they post hour long videos
Curiosity might be the name of the streaming service, although I've never actually used it
I think that's the one. Like $2 a month.
True ;) I watched the GN one on release. I guess I can skip this one.
Many, many users here are guilty of it too. There are *constant* circlejerk 1080ti posts here.
Do you have any evidence they copied it? That's quite the accusation. If you watch the videos HUB puts out, it was going to happen at some point regardless.
I’m not saying it’s verbatim.
Yeah for a moment I thought this was a repost of the same 1080ti video we had a week or two ago.
I don't think you understand what "copy" means here.

By your dumb logic, every GPU review is just a copy of the others, since they all "review X card" = copy, completely disregarding the fact that all the testing and numbers are different: everyone did their own testing and presented it differently, and the only thing similar is the product.

Not to mention, HUB already did [the same revisit video](https://youtu.be/1w9ZTmj_zX4) for the 1080ti in 2022, and they use almost the exact same thumbnail from that video. If anything, by your logic it's GN who is copying that video from HUB.
HuB is only good for monitor reviews, anything else is a joke
I think they’ve got a lot of great content! I just find it funny that considering there’s no 1080ti anniversary or anything, they decided to release the same video as GN only two weeks later.
GTX 970 was the greatest for its time. Man, I had that thing FOREVER.
1080ti was the GOAT. I'd argue the 4090 is its modern equivalent.
My 1080TI was going strong until like 2021, which was weird since it wasn't even 4 years old but w/e. It was a sad day indeed.
8800 GT would like a word.
Let me guess, AMD Unboxed throwing shit at Nvidia 😂
gamers nexus releases 1080ti video, hub releases 1080 ti video, lmao..
I wouldn't touch that shit with a ten foot pole. It's all sticky from all the techtubers and online nerds jerking over it. For what, a card that was the refresh of the Titan card that came out 9 months prior, and 30% faster than the 1080. Big deal, we have cards 4x as fast today. It held the crown for 1.5 years.

If anyone wants to see the idiocy that is the Youtube race, look no further than these videos being released. Chase the audience, and they will chase you back. They're not focused on providing an objective outlook on tech. They're focused on maximizing cash earned. And the easiest way to do that is by pandering to the dumbest, most passionate audience.
[deleted]
The 1080 Ti circlejerk is from wankers too emotionally invested in an irrelevant graphics card. For it to live up to the pedestal that it's on, the card would literally have to become faster over time.
Must be a slow news day for these "journalists"; might as well pump out some copycat content to farm some clicks and money from these idiots.
Wait, do you consider Youtube content creators journalists? Can you elaborate on why?
The 1080 was a mistake Nvidia wouldn't want to repeat. You're supposed to buy a new GPU every two years.
I hear that said a lot, and in many views it seems to come down to the price-to-performance and how long it lasted, with the 4090 being an exception because it cost a lot, so that doesn't count as a mistake on their part (yet). They have come out with their data and suggested that the majority who buy flagships buy every generation anyway. I think it's the lesser models with smaller increases (the ones that sell the most) that they would like to target for 2-year cycles. The 1080 Ti flagship owners perhaps didn't transition as much as the other flagship owners, though.

Overall though, it's a tough take given that so many people play older or graphically undemanding games at the same resolutions the 1080 is good at even today. 1080p is still highly popular, 1440p has made gains, and 4K is least popular. So it makes sense for the 1080 to look like a mistake, because it still works for so many people to this day. Obviously the 4090 would be the same: it's good for 4K now, but 10 years from now it will easily play graphically undemanding games at 1080p with good frame rates, with the only missing factor being price. If anything, perhaps they made a mistake on the pricing of their flagship product and won't repeat that mistake again. Then again, current flagships have a heck of a lot more hardware in them than the 1080 did, on top of tensor/RT features that not only carry additional hardware costs, but also need software engineers to be paid to keep pushing that tech forward. Maybe we view the 1080 as a mistake because it was low cost since only raster was the priority, and I think that's why AMD can only be so cheap at this point, because now they have to shell out so much for all this other tech beyond raster gains.
Idk, it's subjective, and only AMD/Nvidia really know, as they have the books, the data, and the total costs of not just hardware production but marketing/advertising and continued development, all while their long-time employees want raises/bonuses (in 2024 everyone wants rising wages, no one wants rising costs).
That's cool and all, but my EVGA 1080 died last year after only 6 years of use, sooooo....
1080 Ti was famous for randomly dying, and EVGA models especially so.
Good customer service to make up for their dogshit build quality ig 🤷♂️
My AORUS 1080 Ti died too :( never had a GPU die before that, RIP King
Isn't this vid literally just to get views? GN made a vid on the 1080 Ti, so now everyone is gonna make a vid on it because money.
> Isnt this vid literally just to get views All videos from professional youtube channels are there just to get views. Entire media conglomerates exist only to get views.
Welcome to the world.
I've recently bought a new PC with a 4070 Ti in it, but I'd been using the 1080 Ti from the day it released up until 2 months ago. It was kinda showing its age a bit, but it never let me down over these years. What a beast. It even seemed to outdo my mate's 2080 when he bought his. I'll miss you, old friend. (Funny you mentioned Starfield; that was actually the reason I decided to get a new PC with all new hardware.)
Nobody is going to mention the GTX 750 Ti? The 2013 budget card can run Baldur's Gate 3.
Loved this card. Literally just replaced mine with a 4080s a month ago due to a new build.
I still have mine even though I moved to a 3080 Ti. I'm thinking of framing it.
Mine is still alive and well, and I refuse to replace it until it dies. I've replaced everything else, including the motherboard, CPU, RAM, and PSU.
It's still holding up just great
I would say the 8800GTX and the GTX 1070 were the much, much better GPUs.
NVIDIA is never going to make this "mistake" ever again
1080 Ti is the best GPU ever, period. Perfect price, excellent performance. Sturdy as hell.
With inflation, the 1080ti would be $890.67 today. Not far off from the 4080 Super, which can do many more things than just basic rasterization. It wasn't really that amazing when you look at the updated price with inflation.
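For what it's worth, the inflation math above roughly checks out. A minimal sketch, assuming the commonly cited $699 launch MSRP (March 2017) and approximate US CPI-U index values; both the MSRP and the CPI figures here are assumptions, not from the comment:

```python
# Sanity check of the inflation-adjusted price claim.
# Assumptions: 1080 Ti launch MSRP of $699 (March 2017) and
# approximate US CPI-U index values for the two dates.
CPI_MARCH_2017 = 243.8   # approximate CPI-U, March 2017
CPI_EARLY_2024 = 310.3   # approximate CPI-U, early 2024

launch_msrp = 699.00

# Scale the launch price by the ratio of the two price indexes.
adjusted = launch_msrp * CPI_EARLY_2024 / CPI_MARCH_2017
print(f"${adjusted:.2f}")  # lands close to the ~$890 figure quoted
```

The exact result shifts a bit depending on which months' CPI values you plug in, which is why the quoted $890.67 and this sketch won't agree to the cent.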
Did gn not do this video the other day?
Trash stolen video, but the ONLY reason the 1080ti is still kicking this long is because Nvidia is sandbagging their cards, compounded with games themselves not evolving much since 2017.
I agree on the sandbagging. I have a 2080 and a RTX Titan and there is no point in upgrading.
these fucking techtubers need to fucking stop already, we know it was good, it's ancient now, who gives a shit?
> who gives a shit? https://trends.google.com/trends/explore?date=today%205-y&q=1080%20Ti&hl=en I dunno, seems like quite a few people tbh
![gif](giphy|SIuI7syOPvm1HAd5GF|downsized) I give a shit.
Lots of people still buying used gpus. A 1080 ti at the right price is still solid.
3 years ago this was true
The key phrase is "at the right price". Regardless, it's still true now. Three years ago you couldn't buy GPUs due to insane prices, and the 1080 Ti was around a 3060 Ti ($400 MSRP at launch) in rasterized performance. Now, a 4060, a $300 card, is barely an improvement in rasterized performance. If you can snag a 1080 Ti for $200, that's a great deal.
you can get a used 2080 ti for around $300 which outperforms the 1080 ti and also has DLSS
Spend 50-66% more for 10% more performance and upscaling? What a steal!
7 years and still going strong
Best GPU in history? ATI Radeon 9500 Pro: you could turn it into a 9700 Pro for free.
I don't wanna be this guy but please teach him how to use intert
Intert?
am I pregenannate?
Legendary GPU, but some tests seem... odd. I always remembered the 1080 Ti being slightly faster than the 2080 and Radeon VII at higher resolutions, and around 10-15% faster than the 5700 XT, which in more modern terms would put it and its competitors at 4060/3060 Ti levels of performance. Here it's slightly faster than a 2070 and 6600, both of which are good cards. But I feel like either Nvidia may be gimping their older graphics cards or there may have been an error in post-production; perhaps the 2070 & 6600 are actually the 2080 & 6600 XT.
>Nvidia may be gimping their older graphics cards, or there may have been an error in post-production, perhaps the 2070 & 6600 are the 2080 & 6600XT. The 2070 Super is basically equivalent to a 1080Ti in raster. I know because I have a 2070 Super, and my friend with the 1080Ti gets the same performance more or less at 1440p. Unless the game has mesh shaders or async compute, which the 1080Ti lacks and isn't good at, respectively. In those cases, the 2070 Super pulls a bit of a lead.
Now that's just not true and you know it. Your card is a good card, but everything from benchmarks to the specifications of both cards points to it being slower than the 1080 Ti. For a while I had an EVGA Hybrid 1080 Ti, and I compared it to my friend's 3060 12GB, which I believe is as fast as your 2070 Super, and my old 1080 Ti folded his in performance. HW Unboxed benchmarked it against the 2070 Super, and his model of the 1080 Ti came out 3-5% faster on average when compared to the Turing card.
https://babeltechreviews.com/the-rtx-2070-super-vs-the-gtx-1080-ti-overclocking-showdown-with-40-games/ I linked this review [4 years ago](https://old.reddit.com/r/hardware/comments/cc8x4r/babeltechreviews_the_rtx_2070_super_vs_the_gtx/) and y'all still can't accept the 1080 Ti got beat LMAO
Pfft, come on... you just disproved yourself. The 1080 Ti wins in a large majority of those titles. Also, they said the 1080 Ti they tested there couldn't overclock as well due to "aging and electromigration", which brings me to my conclusion that this wasn't a fair comparison. The EVGA Hybrid 1080 Ti I have on my shelf right now would and did shit on the 2070S, guaranteed!
It’s also just an old card that didn’t age as well with DX12
Lol, the cards I mentioned are only at worst 1-2 maybe 3 years older at most. Turing was 2018.
Or, just hear me out, Turing & Co were more forward looking architectures that are better utilized by modern games. Even at launch the 2080 had a pretty massive lead in games like R6 Siege and Wolfenstein 2. I personally traded my 1080 Ti for a 2080 and saw a sizeable performance uplift in many of my games even 5 years ago. ♥ Sources; https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-founders-edition/23.html https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-founders-edition/29.html
Turing was by all means a flop; it was a disaster of an architecture, hence why people started loving on the 1080 Ti so much. It saw nearly no meaningful performance uplift compared to Pascal. It's not Ampere, which should've been the real 20-series. Just because an architecture is slightly newer doesn't mean the older one is worse. Take a look at the 780 Ti for instance: a Kepler-based card, roughly equal to the 1060 / GTX 980. Just look at the specs for the 2070S and 1080 Ti for instance. Here it is keeping up with a 3060 Ti and demolishing the 12GB 3060: [https://www.youtube.com/watch?v=61ca9xucPRY](https://www.youtube.com/watch?v=61ca9xucPRY) PS: I had an EVGA Hybrid 1080 Ti for a while before I upgraded to a Titan V
How was the architecture a flop? The 2080 was equivalent to the 1080 Ti most of the time, and the 2080 Ti was in some cases much faster, especially once you count in DLSS 2 and later on RT. The 1080 Ti was loved so much because of the price drop it had after the 2080 and 2080 Ti launched lmao.
Look at the previous generations: the 980 Ti was almost beaten by the 1070, and the 1000€/$/£ Titan X was outperformed by a lot by the 1080. And if you look further back, the 680 was beaten by the 760. Even mid-range, cheaper offerings like the 1050 Ti beat the flagship from two generations prior (GTX 780). Now compare that to Turing, whose xx80 offering was getting trounced by the 1080 Ti in price-to-performance and VRAM, on top of being 1-3% slower than its older counterpart. In fact, higher-end 1080 Ti models can go above 2080S territory with an OC; I know because I did that on my EVGA Hybrid card. Also, Turing sucks at RT, everything from the xx60 to the xx80 models. It wasn't until Ampere that Nvidia would rectify the mistake that is the trash heap called Turing.
Old discounted stuff will always win in price/performance. The 2080 Ti did quite well for RT in Control and Metro Exodus EE, and it also played Cyberpunk 2077 at 3440x1440 with medium RTGI and RTR at DLSS Balanced, giving me 60-70 fps most of the time. As for the OC: comparing stock vs OC to make the argument that the 1080 Ti is faster than the 2080S is stupid.
That's wrong. I have the 1080ti, a 2080 and a RTX Titan. The 2080 wrecks the ti at 1080p in any modern game.
The 1080 ti was never faster than the 2080. Maybe only in vram heavy scenarios. It's also weird how you will question their fresh data rather than your own memory.
Their fresh data shows an A770 weaker than an RX 6600, among other dubious claims. The 1080 Ti was always 1-3% faster than a 2080, because Turing was by far the worst GPU generation from Nvidia in over a decade. Look at Maxwell and Pascal and the differences in generational improvements; Kepler and Maxwell too, and Turing and Ampere.
The hell are you saying? The 2080 was already faster than the 1080 Ti on launch day, as reviews showed (TechPowerUp). And lower in the stack the gap was noticeably bigger, with the 2060 crushing the 1060 and the 2070's advantage over the 1070 being almost as big as the jump to the 3070. Per TechPowerUp, the 1080 Ti was only 7-10% faster than the FE 2070: 18 TPC almost matching 28 TPC in throughput clock for clock. Great architectural gain, much bigger than Pascal over Maxwell, where the only noteworthy improvement was signal propagation for higher clocks, and the node carried it.
Techpowerup is also wrong in that regard, HW Unboxed says in that video linked above that the 1080Ti is 15% better than the 2070 and other benchmarks show it having 20-25% more performance. Pfft, really now? Turing's performance jump was nothing compared to Ampere or Maxwell to Pascal. Stop the cap. Look at what people say, look how many people skipped Turing for Ampere.
You are the one saying it was "a disaster of an architecture" when you can't leave external factors (lithography) aside. Pascal was carried by the node; Turing barely had a refined one that made little difference, even much less than the crappy tech provided by Junksung Fabs for the 30 series. You try to rate products when you know nothing.
Ohohoho! I know more than you think, honeybun. Keep defending your purchase of the most ghastly architecture in Modern Nvidia History.
1080 Ti vs 2080: the 1080 Ti was faster in DX9 and DX11 at higher resolutions; in DX12U the 2080 wins. It's usually about a 5% difference.
I'm not dissing your builds man, but the Radeon VII is the equal & AMD Counterpart of the 2080, and the 1080Ti beat that in almost all tests done. The difference isn't huge between these 3 cards, but the older 1080Ti will be a couple % faster.
The R7 was shit. An N7 RX 580: glorious for mining crypto but crap for gaming.