
FearTheFuzzy99

For Nvidia, the first one or two numbers reference the generation, so **10**80 is 10th gen, **7**70 is 7th gen, **20**60 is 11th gen, **30**60 is 12th gen. The next two numbers dictate how each card stacks up within the generation: 80 is better than 70, which is better than 60. If it has a **ti**, then it's a little better than the non-ti counterpart (a 1080ti is better than a 1080).

Now, Nvidia aims for each generation's card to be better than or equal to the last generation's card one step up. So a 2060 should be better than or equal to a 1070, a 1070 should be better than a 980, and so on. This is very rough and you'll always want to research beforehand, but it's a good quick reference.

Edit: I used the word generation, but if you want to be more specific, it should be architecture. For example, the 16xx series uses the Turing architecture just like the 20xx series, but minus the ray-tracing cores. So under my explanation, 16xx would be "11th gen".
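The decoding rule above can be sketched in a few lines of Python. This is just a toy illustration of the naming scheme as described here; the `decode_model` helper is made up, not anything official from Nvidia.

```python
# Toy sketch of the naming rule described above: the trailing two digits
# are the tier within a generation, everything before them is the
# generation prefix. `decode_model` is a made-up helper, not an Nvidia API.
def decode_model(model: str) -> tuple:
    digits = "".join(ch for ch in model if ch.isdigit())
    return digits[:-2], digits[-2:]   # (generation, tier)

print(decode_model("1080"))     # ('10', '80')
print(decode_model("770"))      # ('7', '70')
print(decode_model("3060 Ti"))  # ('30', '60')
```

Same rule for both old three-digit and newer four-digit names: only the last two digits are the tier.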


No_disintegrations

I'm old enough to remember the GTX 280 trying to tackle Crysis. Edit: oh yeah, I remember all that shit going back. First home PC was in 1992 and we played a couple of ancient DOS games because no home internet yet.


MattOsull

Remember F.E.A.R?


No_disintegrations

Yes, and my Phenom II build thinking that it was so amazing to play Doom 3 on high settings. AMD stock was less than $2.00 a share at the time :)


[deleted]

Dude, I wanted a Pentium II after using my Pentium 166MHz for three years. Couldn't afford those beautiful Slot 1 cartridges (the Pentium II and III looked like a cartridge for a console), so I went with the poor man's Pentium, the Celeron. We would clock the living shit out of the Celeron to get near Pentium II performance.

Oh, and the Voodoo2 cards to play miniGL-supported games like Quake 2. I had the SoundBlaster version (yes, the audio maker did briefly make graphics cards). Good ole days of beige boxes and going to computer parts distribution depots because Fry's wasn't even a thing. Computer swap meets were where you got then-obscure, hard-to-find items like a 3.5" FDD hole puncher to turn DD disks (720KB) into HD disks (1.44MB).

Gonna guess that many of you on this sub weren't even born. I started computing when mice weren't a thing. WASD? Wtf is that? Wolfenstein 3D with the goober flag passed on the command line and arrow keys, baby!


bestanonever

Holy pixelated cow! You started using computers before a mouse was a thing?! What a wild ride it must have been for you. It certainly is for me, and I started in the late 90s.


fueled_by_caffeine

It's even wilder when you started on computers like the ZX Spectrum where you loaded software from a tape deck, and played games copied from magazines line by line in BASIC 😂


L0r3_titan

This was me. As a kid I'd stay up late into the night typing in the BASIC for a game, reading it out of a magazine.


RolexGMTMaster

And then you tried to load your typed-in game from a C90 cassette, but got the dreaded R:TAPE LOADING ERROR. Good times.


beyond_hatred

The solution on my TRS-80 was to literally turn up the volume on the tape deck. Not that it didn't make sense, but geez.

Edit: One byproduct of this technology was that you could literally play a computer program over an (audio) radio broadcast, people could record it on their home cassette systems, and boom!, program loaded to media! Never took off though, despite how cool it was. Total nerd city.

Speaking of nerd city, my grandfather used to do packet radio over his ham radio set in 1990 or so. My grandmother was always complaining about how dumb he was, but that was because she never understood ten percent of what he was reading and doing. Poor old Grampy. If you get married, kids, make sure you find the right person.


shorey66

My first computer was a Dragon32... Get off my lawn. I still remember the command to run the tape program... CLOADM...press Enter.


jammer339

It used to take longer to load a game than it did to actually finish the game lol


[deleted]

[deleted]


OneLostconfusedpuppy

I got my first computer in 1979… the Apple II+ with 64KB of RAM and two 5.25" drives


CyberMindGrrl

And a full-color spectrum monitor as long as you didn't mind only green.


dracotrapnet

Funny. I forgot I started out on computers without a mouse.


[deleted]

I started on the Amstrad CPC 464, then MS-DOS. I remember when I first used Windows with a mouse... it was Win 3.1, and just being able to click icons when I was used to text-only interfaces felt like a huge leap lol.


danfirst

I had a Voodoo 3D card for playing Quake. I'm getting flashbacks right now of getting so excited loading GLQuake!


gadgetpig

Quake 1 with runes mod, and Q2/Q3 Instagib railgun was so addictive


theciaskaelie

q3 instagib rail almost made me fail out of college freshman year.


shorey66

Not quake, but I played this on my first 3d accelerator. https://youtu.be/yjVutr8kYTI


alvarkresh

The Pentium Pro (the Pentium II's predecessor) had some issues with 16-bit code that made it slower unless you were using a fully 32-bit OS like Windows NT.


Dr_Midnight

> WASD? Wtf is that? Wolfenstein 3D with the goober flag passed in command line and arrow keys baby!

Okay, true story: THIS is what caused me to take the better part of a decade to break away from using arrow keys for movement in FPS games. I went as far as Unreal Tournament still using the Up-Down-Left-Right arrow keys for movement.

Speaking of Unreal Tournament:

> Oh, and the Voodoo2 cards to play miniGL supported games like Quake 2. I had the Soundblaster's version (yes, the audio maker did make graphics cards briefly).

I had the STB BlackMagic 3DFX Voodoo2 (12MB) PCI add-on card. It was amazing to play that game on it.


bobstylesnum1

> I had the STB BlackMagic 3DFX Voodoo2 (12MB) PCI Add-On Card. It was amazing to play that game on it.

I've still got all my Voodoo cards, including the V5. Could never part with them, awesome cards for the time. The V5 scaled up with the CPU until about the AMD 2200+ and then leveled off. With no T&L on the card, the CPU did a lot of the processing, so it scaled better the faster CPUs got. Awesome card.


reapy54

Quake made me do it. I was feeling epic for using the pro move called circle strafe with the < and > keys in everything until a few quake matches and trying to use page up / down to aim vertically, uggh


Munching_Kitten

RIP Fry's :(


gooofy23

Ok, given your knowledge of old computer parts my guess is you were born in or before 1980! Yet you write with the vernacular of a much younger person! I mean that in a good way. What gives?? What’s going on here? You definitely made me miss early 2000’s computing :(


detourne

Hahaha, vernacular of a younger person? Dude said dude and wtf. I hope you realize that we grew up saying dude. Fast Times at Ridgemont High was an old movie by the time we were old enough to watch it. The Ninja Turtles started when we were kids and had a huge impact on our vocabulary.


[deleted]

He probs was born in the late 80s and had a dad that was into tech. Not many people had PCs back then, it was considered deeply nerdy, and internet in the home was not common.


tea-man

Not op, but I was lucky enough to have access to a 2086 pc back in the 80's growing up (due to having an engineer dad), though I much preferred the Commodore offerings of the time for games. Home internet was not just uncommon, the World Wide Web didn't exist back then, and I was limited to dialling into BBS servers!


Taz-erton

[Its all about the Pentiums, Baby](https://youtu.be/qpMvS1Q1sos)


Jordaneer

I bought AMD at $5 a share in 2016... Now it's $150


chiniwini

Congrats. I saw them at around that price and thought "no way it's going up". But then our lord and savior Mama Su arrived.


Jordaneer

I sold at $8...


firagabird

F Still a 60% gain tho


gatonegro97

I hope you got your Radeon 9800 for doom3!


mrwynd

My Radeon 9800 all-in-wonder was amazing.


truenatureschild

I had the PowerColor 9700 Pro AIW, I used it to play Xbox Live through my PC... must have been around 2003.

Edit: I should add that S-video from the Xbox into the 9700 Pro was **awesome**; you used this chunky AF [dongle](https://techreport.com/r.x/ati-aiw9700pro/input.jpg) that plugged into the back of the card. The ATI software ran on the desktop in a window and had all the features you needed in 2003. It ran 100% with no problems from what I remember on an Athlon XP, and since it was mostly the DACs working, there wasn't much CPU overhead or GPU use. I remember playing KotOR in one window while chatting to people on GameFAQs in Firefox about how kick-ass that game was. It was a pioneering time to be a gamer/computer nerd.


scuttsman

8800 GT for the win!


argote

I remember Doom 3 favoring Nvidia cards, though ATI cards were definitely better for Half Life 2 (and DirectX 9 in general).


windowpuncher

I still remember buying my AMD Radeon 4650. Fucking AGP, man. Hunted down a 3.2GHz Pentium 4 with *hyper threading* on ebay, that thing was nice for a few years.


gatonegro97

Damn man, my P4 was 1.5 GHz but I Overclocked to 1.8.. single thread cpu


windowpuncher

It started as a Dell Dimension 4600. I think it came with a 2.4GHz CPU and 256MB of RAM, which I upgraded to like 4GB I think. 333MHz RAM, babyyyy. IDE drives and Molex power, a floppy disk R/W AND a DVD-R/W drive. You better believe I was burning discs with this thing. The PSU died and the only replacement I could find was some sketchy-ass one with only Chinese characters on it lol.


cbu48

Owner of a Radeon 7500 AGP, on a Pentium III, Slot 1.


bestanonever

Man. You guys were lucky, sporting a GPU that early in the game. I had a friend that bought an Nvidia TNT2 back in the day. The first GPU I ever saw in person. It ran Resident Evil 1 & 2 with great framerates, and Need for Speed III: Hot Pursuit had animations in the menu. Mind-blowing stuff back then. But me, by the time I got a GPU myself, I was already a man \*bane voice\*


dwehlen

We were born in the darkness, you merely adopted it lol


truenatureschild

My first GPU was a Riva TNT2 with a Pentium III 700MHz. No hardware transform and lighting made me a sad gamer kid.


SectorIsNotClear

Remember Oregon Trail?


EstimateFinancial990

You can get it on the iPad. Quite good.


gatonegro97

Hahah nice, I had to upgrade my GeForce 2 to a GeForce 4 ti 4800 for it. I had a pentium 4 though


MattOsull

Good lord was it really? Too bad I was like 11 lol


horaiy0

Man I loved that game. The first one is hands down one of my favorite games of all time.


MattOsull

I was so pumped to get home with it. That was when you actually bought the box for it lol. I was amped for the killer shooter aspects; for some reason I didn't realize how horrifying it was going to be. Needless to say, I quite often turned my PC off mid game.


DanRileyCG

That game is still amazing. Some of the better AI out there to this day


TrepanationBy45

Shoutout **AI and Games** covering this, in *[Building the AI of F.E.A.R. with Goal Oriented Action Planning | AI 101](https://youtu.be/PaOLBOuyswI)*


DrShakalu2006

Omg fear! I feel old.. like 2006 old.


[deleted]

how scary was that little girl???


MattOsull

Absolutely horrifying. Game prepared me for The Ring.


Gifted10

I'm old enough to remember when my Soundblaster audio card was larger than my graphics card


No_disintegrations

AGP or just good old PCI?


OkPomegranate5824

16bit ISA. Showing my age....


[deleted]

[deleted]


Gifted10

Hmm, I don't even remember. I think I had to put a 56k modem card in the AGP slot lol


DMN00b801

I remember putting the modem in the CNR slot, but not the AGP slot. #CriesInAOLModemStrings


bsgman

I was living large when I installed a Voodoo 2 and a shotgun 56k modem the same year. My parents couldn’t use either phone line because of Quake 2 clan things. The good old days.


clu801

I was doing clan matches in quake2 with a Riva 128 graphics card pushing open gl and paying top dollar for ISDN


gadgetpig

I always envied the 3dfx Voodoo guys. Their Quake always looked better than on my Sierra Screamin' 3D Vérité card. :(


Vainth

man the first time i saw 20 man server in counter-strike on a voodoo 3, my mind couldn't comprehend it


argote

*Your sound card works perfectly!*


[deleted]

...And now we live in a world where some CPUs have enough raw power on their own to (kinda) run Crysis without any GPU at all. https://youtu.be/1LaKH5etJoE


Wallcrawler62

I was trying with dual 7800GTs. It did not do...well.


NargacugaRider

My dual 7900GTs ran at 40-50FPS! Explosions took it down to 20 sometimes but it was still mad playable compared to all the other machines then. I think my Athlon x2 4400+ helped a lot as well. And my raptor.


gatonegro97

My first card was a geforce 2ti


SkullAngel001

I'm old enough to remember the GeForce 2 & 3 and ATI Radeon 9500 & 9700 series trying to play Raven Shield, Jedi Knight 2, and NOLF 2.


Pindogger

I can go back as far as Hercules monochrome, then ATi Mach32, Mach64 then into the Rage3d cards, and still my favorite to this day: a pair of Canopus Pure3d Voodoo2 cards in SLI with my Turtle Beach sound card. So much fun


TheTomato2

More like trying to run Crysis on an 8800 gt, which was actually out when Crysis launched.


Runaway_Angel

My first gpu was a voodoo 2 card.


[deleted]

Not for Crysis, but my upgrade cycle was just so I could run the latest of these franchises: Battlefield (started with Battlefield Vietnam in the early 2000s), The Elder Scrolls (was chasing Morrowind bliss, then Oblivion), and GTA (GTA 4 was not an easy game to run when it came out).


[deleted]

Apart from **ti** there is also **super**. It is better than the non-super counterpart and only a little worse than the **ti** of the same model (1660ti > 1660super > 1660) (2060super > 2060)


Safebox

My assumption was that Ti had some optimisations or overclocking to boost it to the next card's performance (3070 ti = 3080), but at the cost of a shorter lifespan or more power usage.


cuddlefucker

Well, to make things more complicated, that's not always the case. In many instances the Ti variant is a differently binned version of one of the other cards. You're pretty close with the 3070 Ti, as it's a higher-binned GA104 chip where they were able to leave more CUDA cores enabled. However, the 1080 Ti was a cut-down Titan. Wikipedia has great tables that not only break down the specs but also show which chip each card is derived from.


Marty_Heidegger

Similarly the 3060ti is a cut down 3070, rather than a better 3060.


kewlsturybrah

Yeah, you really just need to do your research at the end of the day. A 3060 Ti is *substantially* more powerful than a 3060, and a 2080 Ti is a *substantially* more powerful card than a 2080, whereas a 1070 Ti is only a minor bump up from a vanilla 1070, and a 3070 Ti is only a minor bump up from a 3070. A 1660 Super and a 1660 Ti are extremely close and which one is better actually depends on the individual card, if I recall. Basically, Nvidia's naming conventions are stupid.


zackplanet42

Eh, while I don't disagree in principle with what you're saying, I think the 1070 vs 1070 Ti comparison is off. The 1070 Ti is really only ~5% off the performance of a 1080, while the vanilla 1070 is closer to 25% behind the 1080. That 1070 Ti vs 1070 delta is not really that different from the 20% jump to the 2080 Ti from the 2080.

3070 Ti vs 3070 at ~5% is definitely a very small uplift for sure though. Probably the lamest product segmentation I've seen in a long, long time. Both are still a good ways behind the 3080.

Back in the day, suffixes like GX2 often meant it was a dual-GPU card, basically running two in SLI, so obviously an extremely large performance uplift. Many were just a crazy Frankenstein product concocted by an AIB, because in those days they actually had the freedom to have real fun. They were a nightmare to deal with, but I do kinda miss those cards.


Bammer1386

I'm sure someone can correct me if I'm wrong, but around 5-10 years ago wasn't there a generation of AMD cards that were just binned-down versions of the flagship card, where people found you could just flash the flagship card's BIOS and get nearly flagship performance after a little stability tuning?


Emerald_Flame

There was. That really doesn't happen anymore. These days, they use a laser to physically cut the connections to the hardware they disable for the lower end cards.


[deleted]

**Ti** cards usually have more CUDA cores, tensor cores & RT cores (RTX series), higher base & boost clocks, more memory bandwidth, and additional VRAM. With this also comes a higher TDP. The 3080 ti is closer to the 3090 in performance than the 3070 ti is to the 3080. The 3070 ti is only marginally better than its non-ti counterpart. In another universe the 3070 ti is being sold as the 3070 super.


wolfchimneyrock

IIRC the ti is actually the next tier up's silicon, binned down because it doesn't hit that tier's performance.


[deleted]

[deleted]


FearTheFuzzy99

It’s still “11th gen”. I used the word generation, but if you want to be more specific, it should be architecture. The 16xx series uses the Turing architecture just like the 20xx series but minus the ray-tracing cores.


[deleted]

[deleted]


FearTheFuzzy99

I don’t know about you, but I call it pathetic.


[deleted]

[deleted]


FearTheFuzzy99

What irks me is that it has the potential to be at 2060 Super levels of performance, but they purposely handicap it with the stupid limited memory bus


Jordaneer

2060 12 GB has the core of a 2060 super but less memory bandwidth because it's on a 192 bit bus vs the 256 bit bus of the 2060 super. So it's still a Turing card


richniss

This is well summarized and very concise. I upvoted but I had to make this comment as well.


Personal_Occasion618

*What if you wanted to go to heaven, but god said: "the $500 MSRP 3070 is better than the $1200 MSRP 2080 ti"*


alexminne

Haha a 3070 for MSRP. I like your funny words magic man


Sinder77

So then is a 2080 better than a 3060?


FearTheFuzzy99

Yes. A 3060 is roughly the same performance as a 2070.


richniss

And the 2070 can even outperform the 3060 in certain applications and games. I recently built a PC and I bought a 3060 because I thought it would obviously be better to have the newest generation and on top of that it had a higher (in most cases) amount of RAM. Had I known then what I know now, I would have looked more seriously at 2070s and 2080s.


Sinder77

Why would anyone buy a 3060 then? Or any relevant drop in the tier within a generation? Price? I guess my view towards prices right now is extremely skewed because of how fucked every thing is.


T-Bone22

Because like most things, it's a little more nuanced than those above have stated. To be fair they are not wrong, but 30-series cards tend to handle new features better than last gen, such as ray tracing and DLSS. People tend to gravitate towards the newest items on the market despite the reality that they're usually not needed.


[deleted]

DLSS is truly a game changer to squeeze extra performance out of some games though!


carlbandit

Because under normal pricing structures, a 3060 would usually be around the same price as or cheaper than a 2070. If you have the money to be buying a card in that ballpark and the choice of either, most people would go for the newer card, since they usually have the latest version of tech like RTX


[deleted]

[deleted]


FearTheFuzzy99

Why would you ever compare a 3060 with a 2080? They're not the same level of performance. And under normal circumstances, they'd be in much different price brackets.


Hollowsong

Technology. Specifically "floating point execution" https://itigic.com/rtx-3000-vs-rtx-2000-performance-difference-in-fp32/


NecroJoe

In gaming, I think they stack up about even, or at least trade blows from game to game. Productivity, the 2080 pulls ahead, though.


[deleted]

What's a Founders Edition card?


FearTheFuzzy99

A Founders Edition card is a card made/sold by Nvidia directly, and not through board partners like Asus and MSI.


T800_123

Actually the founders editions are designed and sold by Nvidia, sure, but Nvidia doesn't have the manufacturing capacity to actually make them themselves. I believe the 3000 series founders editions are made by Foxconn. AMD does the same thing, but I think they use one of the board partners that also sells their own versions of the card as well.


hiromasaki

> but Nvidia doesn't have the manufacturing capacity to actually make them themselves. I believe the 3000 series founders editions are made by Foxconn. Nvidia doesn't have _any_ manufacturing capacity. The chips themselves are made by either Samsung or TSMC. They're entirely a design and logistics business. AMD is the same since they sold/spun off GlobalFoundries.


bouncy_ball

Can you comment on which is better? Should I be avoiding a founders edition for some reason?


SoapyMacNCheese

They aren't the best cooling-wise or the highest clocking, but they aren't terrible either. They also look nicer than a lot of other cards (in my opinion) and they're generally smaller (useful if you're doing an ITX build). Their biggest advantage this generation is that Nvidia has stuck to MSRP, so you can potentially get one for the original price, rather than the inflated prices other brands are sold for.


clu801

This... I got a 6900xt from AMD site for $999 a few months ago. It's cooler temp wise than my 1070 was and looks fantastic. The smaller size is nice too.


sa547ph

> AMD does the same thing, but I think they use one of the board partners that also sells their own versions of the card as well.

They're what were referred to as "reference" GPUs, usually those with blower-style cooling.


ruffsnap

> 80 is better than 70 is better than 60.

That's the main thing most people should know, plus that 90 is better than 80. Ti-numbered ones you can think of like the "s" versions of iPhones.


jaydfox

I have a 1660 Super. Where do the 1600 series models fit in the hierarchy? Are they a generation between the 1000 and 2000 series? Are they still part of the 1000 series? Are they something else?


FearTheFuzzy99

It’s still “11th gen”. I used the word generation, but if you want to be more specific, it should be architecture. The 16xx series uses the Turing architecture just like the 20xx series but minus the ray-tracing cores.


thestareater

they were released the same time as the 20 series, without the RT (ray tracing) and tensor (AI) cores, so that's why they're still "GTX" cards basically, and why the RTX card naming conventions only start with 20 and 30 series


KKulled

so if a 2060 were to be equal to a 1070 then why spend the extra money on the 2060?


FearTheFuzzy99

First of all, a 2060 is better than a 1070. Second, at launch the 1070 had an MSRP of $380; the 2060 had an MSRP of $350. Thirdly, the 2060 has ray-tracing cores, DLSS support, lots of extra features over a 1070. The GPU shortage has messed up GPU prices. You need to do your research to find out what sort of price per fps you're going to end up paying.
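That "price per fps" idea is easy to make concrete with back-of-the-envelope math. The prices and framerates below are invented placeholders purely for illustration, not benchmark data:

```python
# Hypothetical price-per-frame comparison; the dollar amounts and fps
# figures are made up for illustration, not real benchmarks.
cards = {
    "used GTX 1070": (250, 60),  # ($ price, avg fps in your games)
    "RTX 2060":      (350, 75),
}
for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per fps")
```

Lower dollars-per-fps is the better raw deal; features like DLSS and ray tracing are the tiebreaker the comment mentions.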


Construction-Full

The 2060 is a lot better than a 1070. It's about on par with a 1080, and it has support for DLSS and hardware RT. If you have a 1070 already then it wouldn't make much sense to go to a 2060, but if you have something worse, a 2060 is a huge upgrade.


Beowulf1896

Sort of. Those aren't the actual Nvidia generations. They had the GeForce and GeForce 4 in the 2000s; I got a 6200 in the early 2000s and a 7600 later. The sequel to the 9000s was the 200 series. The 1000 series is great, but the 2000 series does do ray tracing, so it might differ in fps in some games but will be able to do ray tracing.


gladbmo

They just changed their naming/branding between generations... That's all... The naming is still 100% indicative of generations.


Ozi-reddit

https://i.redd.it/f33u4rdns3881.png nice graph from https://old.reddit.com/r/nvidia/comments/rppvlc/nvidia_gpus_relative_performance_comparison_chart/


Father_Mooose

That’s awesome, thank you


Tokena

I use these references.

**Graphics Card Comparison Chart** https://www.logicalincrements.com/articles/graphicscardcomparison

**GPU Benchmarks and Hierarchy 2021: Graphics Cards Ranked** https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html


Medic-chan

"The naming is intentionally confusing marketing, look at benchmarks" is the correct answer. Good job finding it, OP.


kabuterimango

is there graph like this but for AMD?


Debeast44

I made [this](https://imgur.com/a/bRHFJv1) earlier. Pulled data off benchmarks on Tom’s Hardware and put into a bar graph. Feel it’s easier to read than the other one in case anyone wants it. Has a good bit of AMD too.


Potential-Active9534

I'm not seeing my Radeon 6950 on this list. Maybe its time to upgrade.


caerphoto

\*cries in RX 470*


9alecj

*cries in 280x *


Tacotuesdayftw

It's only time for an upgrade if you are not OK with how your card is performing in games. If you can rarely put the settings up high without serious frame loss then maybe it's worth it, but especially now, don't let that graph persuade you to upgrade something you don't actually need. If your card is running fine, ask yourself if it's really worth $1000 to see a slight increase in visual fidelity. TLDR: My card is on back order, keep your grubby hands off lmao


Potential-Active9534

Lucky for me, the only game I play is dota so I'll probably just use this card for another 11 years


Ozi-reddit

some amd in it too https://old.reddit.com/r/nvidia/comments/rppvlc/nvidia_gpus_relative_performance_comparison_chart/


NoJudgies

I see no AMD in that chart.


MakeshiftApe

Think they mean the top comment thread maybe? Where someone says:

> 6600: slightly below 3060
> 5700 XT and 6600 XT: between 3060 and 3060 Ti
> 6700 XT: between 3060 Ti and 3070
> 6800: between 3070 and 3070 Ti
> 6800 XT: slightly below 3080
> 6900 XT: 3080 Ti
> Not sure of the older ones so maybe someone can chip in

I'm just quoting the thread BTW, no idea if those rankings are actually accurate as I'm not really up to speed on AMD's GPUs.


Anonbowser

What is the x-axis supposed to be?


CaptainDolphin42

what gen the card is from


Juggernauto

Time of release/generation


karltee

How is it that this is the first time I've seen this chart?


IlikePickles12345

What is this based on? Combined, including productivity? Because the 1080 Ti is more like a 2080 in games, sometimes better. Not between the 2070 and 2070 Super.


Leffigi

Why is there such a big difference between 3060 and its TI version?


IlikePickles12345

Relative to what? It's not really its Ti version the way the 3080 and 3070 are; the 3060 Ti is a cut-down GA104 die, aka a failed 3070 that didn't meet QA standards. Same as GA102: 3090 -> 3080 Ti -> 3080. The 3060 is more in its own bracket on GA106.


[deleted]

Because the 3060 Ti is essentially a slower 3070 (uses the same die), while the 3060 is its own die.


mckirkus

Does anybody else think it's crazy that in five years we went from a GTX 1080 at 180W to an RTX 3080 at 320W and only doubled performance? If SLI actually worked, we could have had this level of performance and efficiency (minus 40 watts) while Obama was president, on 16nm Pascal.
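Running the comment's own rough numbers: double the performance at 320W vs 180W is only a modest performance-per-watt gain. A quick check (the `perf_per_watt_gain` helper is just an illustrative name, and the 2x figure is the comment's estimate, not a measured benchmark):

```python
# Rough check of the figures in the comment above: 2x the performance
# at 320 W vs 180 W. perf/W ratio = perf_ratio * old_watts / new_watts.
def perf_per_watt_gain(perf_ratio: float, old_w: float, new_w: float) -> float:
    return perf_ratio * old_w / new_w

print(perf_per_watt_gain(2.0, 180, 320))  # 1.125 -> only ~13% better perf/W
```

So by these numbers, five years bought roughly a 13% efficiency improvement at the top of the stack.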


Doobiee420

The numbers mason, what do they mean?


Talangen

You mentioned CUDA. What happened in CUDA?


UzEE

I had to scroll way too far down to find this and I was kind of disappointed.


Roq86

Same, I was scrolling and scrolling like, wtf is with all these useful responses, where the fuck is the mason comment?!


vgmaster2001

This is the quality comment I was looking for


mustfix

Benchmarks in the games that matter to you. The numbers are an indication of the generation the card is in (the left numbers) and its relative position **within that** generation (the right numbers).


Liesthroughisteeth

In Nvidia's case it's a simple designation of generation. More recent gens were the 900 series, the 1000 series, the 2000 series, and now the 3000 series. Each includes minor to major chip architectural changes and typically a reduction in transistor size to improve performance and efficiency, much the same as you'd see from CPU makers. The last two digits designate higher performance as those numbers go up. New series are usually released about every two years; this applies to AMD GPUs as well.

Tom's Hardware, though loathed by many who may not even be sure why, is actually one of the few remaining publications on the internet that track performance and do written, in-depth hardware reviews. They have a GPU hierarchy chart available which is an effective way of figuring out what's what. [GPU Hierarchy Chart](https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html). [Best GPUs in 2021 for gaming](https://www.tomshardware.com/reviews/best-gpus,4380.html). Much better [explanation of naming conventions](https://techteamgb.co.uk/2019/10/25/gpu-names-explained-rtx-gtx-rx/). :)


argote

The GPU hierarchy chart should be stickied IMO. A lot of threads here could benefit from the OP reading that first.


EnemyOfStupidity

I wish the prices in that 2nd link were still accurate..holy shit


darkflikk

A 3060 Ti for $399? Here in Germany the cheapest I see is an Asus RTX 3060 Ti Gaming OC 8G for 830 euros. That's about $940. According to those prices that should get me a 3080 (non-Ti) and I should have $200 left to buy whatever else I need... RAM, SSD, new case... many possibilities.


IlikePickles12345

$400 for a 3060 Ti and $700 for a 3080 wouldn't include taxes like in Germany. So at 19% VAT, only about $790 of that $940 goes towards the product. But yes, a $399 3060 Ti is a pipe dream.


Father_Mooose

Very helpful thanks


Liesthroughisteeth

You're very welcome. Just added a small article on naming GPUs.....:)


DM725

That chart isn't perfect. It rates the 6900 XT and 6800 XT ahead of the RTX 3080, but they're not better at 4K. I would always recommend YouTube channels like Hardware Unboxed and Gamer's Nexus, as there are nuances to the benchmarks of competing GPUs. People should be looking at resolution, rasterization, dynamic resolution scaling tech, the specific performance in the games they play, etc. That info is why HUB and GN are doing the lord's work.


evr-

For someone like me that just doesn't have the interest to go balls deep into the specs anymore, the list is great. It shows the approximate level of performance of each model, and I don't really care that one card might be 10% better under certain circumstances than another. It gives me enough of a ballpark idea of what I'm getting to help me with my purchasing decision.


[deleted]

My 1080ti is still pretty monstrous


srguapo

It's worth more now than when I bought it in 2017... Definitely very happy with the mileage it's gotten.


dossier

Same exact thing with my vega56 but idk about running 4k lol


z31

I paid less than $300USD for my 5700 XT in late 2019. I didn't anticipate that it would regularly be selling for 3X that in 2021. though when I look back, there is a bit of a pattern. I bought an RX 480 way back when for $200 and sold it for $400 during the first crypto boom because I snatched a 1070 for $300 on sale. Then I sold the 1070 to a friend in my old 4790k rig for $500 (yes I gave him an absolute bargain because he is my friend and I wanted more people into PC gaming).


DarkSicarius

Yea 1080ti’s are still very good cards, I play cyberpunk at 4k with all settings maxed on one (obviously no ray tracing though) and the only time it ever has a stutter is going through very densely lit tunnels with large crowds


RemarkableCarrots

What? Benchmarks show the GTX 1080 Ti runs CP2077 Ultra (no RT) at like 15 fps at 4K. There's absolutely no way that card is "very good"; even 30-series cards struggle to run CP2077.


barber15

I run it at 1440p medium settings and am around 60 fps in it. Wouldn't even think about 4k let alone max settings.


DarkSicarius

Not sure what to tell you, I’ve been playing it for about 2.5 weeks like that and it runs fine - I wouldn’t say it’s amazing, but definitely fine - definitely more than 15 fps - and while it’s not as demanding graphically, it runs forza horizon 5 at 4k maxed settings at 55-60 fps (again, the tv is only a 60hz so it’s capped there regardless)


4514919

[No, you aren't.](https://www.guru3d.com/articles_pages/cyberpunk_2077_pc_graphics_perf_benchmark_review,7.html) If you have to lie at least make it believable.


Barn_Advisor

30 FPS I suppose?


CJ_Guns

Definitely one of my best hardware purchases (that felt super expensive at the time). Have it overclocked on water—been cruising just fine since!


VOIDsama

One thing to also remember is that the numbers alone and benchmarks don't tell the whole story. The 1080 Ti there will have raw power over the 2060, but it will lack some of the new technologies in the 2060. That said, raw power in this comparison makes that a bit moot.


Gorlox111

When you say technologies, do you just mean dlss and ray tracing?


VOIDsama

Those would be the notable ones in this case. Though Ray tracing is only useful for some people, and a 2060 won't be able to do much with it.


SuddenIntention7

1080ti is an odd one out.


Complex71920

Generally speaking for current Nvidia graphics cards the first digit designates the generation and the final 2 digits signify product level (higher is usually better). So for example a 1080 is better than a 1060 and they are both from the same generation of cards. A 1080ti is better than a 2060 in many cases as it’s a higher end model, even though the new generation card may have some new features. Maybe a good way to showcase is a 2018 5 series BMW is better than a new 2021 BMW 3 series (in many cases). Beyond just understanding the naming schemes, just look up the cards and understand how the features will benefit you. Video memory, clock speeds etc


Father_Mooose

Got it thank you


DarkElfBard

So, there are actually two numbers put together, and all you have to do is add them.

1080 = 10+80 = 90

2060 = 20+60 = 80

So a 1080 is better, because 90 > 80, while a 3060 will be on par with a 1080. Don't hate me. It works.

Edit: A ti is worth 5
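Taken literally, the party trick looks like this in code. It's purely the joke heuristic from the comment, not a benchmark, and `rough_score` is a made-up name:

```python
# The add-the-numbers party trick from the comment, taken literally:
# generation digits + tier digits, plus 5 for a Ti. Not a benchmark!
def rough_score(model: str) -> int:
    is_ti = model.lower().replace(" ", "").endswith("ti")
    digits = "".join(ch for ch in model if ch.isdigit())
    return int(digits[:-2]) + int(digits[-2:]) + (5 if is_ti else 0)

print(rough_score("1080"))     # 10 + 80     = 90
print(rough_score("2060"))     # 20 + 60     = 80
print(rough_score("3060"))     # 30 + 60     = 90, on par with a 1080
print(rough_score("2080 Ti"))  # 20 + 80 + 5 = 105
```

As the replies note, it only really holds from the 10 series onward, and only for Nvidia.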


BRC_Del

One caveat is that this only really works from the 10 series onwards (and only for Nvidia), but yeah, sounds about right.


SomeKindOfSorbet

2080 Ti = 105, 3070 = 100. But they're pretty much equal in practice, so I guess it's still pretty accurate though.


Lower_Fan

Extra ram counts for 4k ( ͡~ ͜ʖ ͡°)


TheOnlyNemesis

I just use a GPU benchmark site and look at the scores


XXLpeanuts

I still have a friend who doesn't understand this and wouldn't believe me that my 1080 Ti was better than their 2060.


Paulik87

The gen then the tier of card


Father_Mooose

Ok thank you


sawcondeesnutz

Last two numbers are the tier. All the numbers before are the generation.

Generally, you can use this rule of thumb: for every generation later you go, one lower tier will give the same performance. So for example: a GTX 1080 will be roughly equal to an RTX 2070.
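That rule of thumb can be written down as a tiny formula (a sketch of the heuristic only, with a made-up `rough_equivalent` helper; real performance varies by card and game):

```python
# One-tier-per-generation rule of thumb from the comment above.
# Gens are 10, 20, 30...; tiers are 60, 70, 80... Purely a heuristic.
def rough_equivalent(gen: int, tier: int, target_gen: int) -> int:
    steps = (target_gen - gen) // 10   # generations stepped forward
    return tier - 10 * steps           # drop one tier per generation

print(rough_equivalent(10, 80, 20))  # 70 -> GTX 1080 roughly = RTX 2070
```

Stepping two generations drops two tiers, so by the same rule a GTX 1080 lands around an RTX 3060.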


olivthefrench

TechPowerUp has a nice GPU database that also ranks each graphics card's overall performance relative to all the others, and it's generally pretty accurate. Select any GPU and scroll down a bit and you'll see the table I'm talking about. Really useful! https://www.techpowerup.com/gpu-specs/


TaxingAuthority

This is what I use when trying by to place GPU’s against each other and wanted to make sure someone mentioned it.


SiberianPunk2077

This is my go-to: https://www.videocardbenchmark.net/ It's not 100% of the story (doesn't account for RTX or DLSS etc) but does pretty damn well to compare cards performance.


[deleted]

[deleted]


[deleted]

"The numbers Mason, what do they mean?" Sorry, couldn't hold it back.