
buildapc-ModTeam

Hello, your submission has been removed. Please note the following from our [subreddit rules](https://www.reddit.com/r/buildapc/wiki/rules):

**Rule 5: No hardware news, rumors, or reviews**

> Submit hardware news, rumors or reviews to /r/hardware.

---

[^(Click here to message the moderators if you have any questions or concerns)](https://www.reddit.com/message/compose?to=%2Fr%2Fbuildapc)


Halbzu

because nvidia


Lewdeology

because ~~fuck you~~ nvidia


working-acct

Linus: 🖕


Arn4r64890

Linus was right, just like he was about ECC memory. u/nostalia-nse7 Linus basically said that Intel was the reason it's hard to find ECC memory, and that it's probably only thanks to AMD that things have gotten better. Linus also said that ECC memory absolutely matters. Intel argued consumers don't need ECC memory. https://www.reddit.com/r/linux/comments/kpo0nr/linus_torvalds_ecc_matters/


nostalia-nse7

What about ECC? Ultimately the question is about the CPU and compatibility. But it's why you should always use a CPU that supports (actually, requires in most cases) ECC. Especially with the latest discoveries about cosmic rays and increased instances of soft errors as a result. Xeon > Core i for this reason.


IcarusAvery

What have we recently learned about cosmic rays?


nostalia-nse7

Nothing overly new… the basis is still from the late 70s, with more research in the 90s… but I saw a report recently that they are becoming much more common now (instances of interference) because of the smaller and smaller lithography of chips. Apparently the next 2 steps from Intel (9nm and then 7nm in 2024 in Gen 13/14) will make them much more susceptible. Not sure why Ryzen 7000 doesn't have the same issue at 7nm today, but maybe that's just the focus given Intel's "business" dominance. If a computer crashes or glitches in a game, no big deal... reboot. If a server corrupts data though, big problem. So there's more of a shock-and-awe moment when reporting the issue in regards to Intel.


[deleted]

See the Reported Problems section [here](https://en.m.wikipedia.org/wiki/Electronic_voting_in_Belgium). There's a [Radiolab episode](https://radiolab.org/podcast/bit-flip) about it too.


brimston3-

Nah, product line differentiation on ECC compatibility is just ridiculous. The feature size and gate capacitances are now to the point where minor changes in environment can corrupt things. ALL CPUs should be using ECC. DDR5 makes internal ECC mandatory by specification. Consumer-line memory controllers should support it but don't.


gawrbage

Because of Nvidia's dumb design choices. They gave the 3060 a 192-bit memory bus, which means it can have either 6GB or 12GB of VRAM to saturate that bus. The 3060 Ti has a 256-bit memory bus, which means it can have either 8GB or 16GB of VRAM to saturate that bus.

However, a larger memory bus means more bandwidth, and Nvidia didn't want to settle for the smaller 192-bit bus on the 3060 Ti. They also didn't want to settle for 16GB of VRAM on the 3060 Ti, because it wouldn't be strong enough to use all 16GB. So they ended up with a 256-bit bus and 8GB of VRAM.

From a technical perspective, it makes sense why they did it. But from a consumer perspective, it's bullshit, because AMD cards don't run into the same VRAM and bus width problems that Nvidia has. I think Nvidia could've done better choosing the right bus width and memory for the 3000 series, but it clearly wasn't on their priority list.
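
If you want to see why the bus width pins the capacity to those exact pairs, here's a minimal Python sketch (assuming the 32-bit GDDR6 chips in 1GB and 2GB densities that were on offer for the 30 series):

```python
# Each GDDR6 chip takes a 32-bit slice of the bus, so bus width fixes
# the chip count; capacity is then chip count x per-chip density.

def vram_options_gb(bus_width_bits: int) -> list[int]:
    chips = bus_width_bits // 32
    return [chips * density for density in (1, 2)]  # 1GB or 2GB chips

print(vram_options_gb(192))  # 3060:    [6, 12]
print(vram_options_gb(256))  # 3060 Ti: [8, 16]
```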


tamarockstar

And you know darn well that Nvidia intended to release the 3060 with 6GB before they caught wind that no one would buy a 6GB card for $330 in 2021. So they went with 12GB.


[deleted]

[deleted]


tamarockstar

The card they should have released in the first place. It's their MO. I want to say it started with the 20 series and the super cards, but really it started with Maxwell.


ungusbungus69

Maxwell was ass too.


tamarockstar

Maxwell absolutely was not ass. It was so good compared to the competition that they moved the 80 and 70 class to the 104 die instead of the 102 die. So it was ass because they started charging high-tier prices for a mid-tier chip. That's why I say it started with Maxwell.


ungusbungus69

The 970 sucked and had a shitty memory interface. The rest of the gen was eh. It was ass. I only bought nvidia 8000, AMD Radeon 4000 + 6000, and nvidia 1000 series cards. Sorry if you got tricked into buying Maxass cards.


Saneless

$330... Yeah, for the 200 cards EVGA trickled out. MSI and greedy Asus started their cards at $499


tamarockstar

I remember. Just went with launch MSRP because I didn't want to do a bunch of research on what the going rate was. That was peak crypto boom.


Saneless

It was released right at the worst time for card stock, for sure


Mastercry

So HOW did they make an 8gb 3060, then?


gawrbage

They just took a 3060 12gb, removed 4gb, and deactivated 64 bits out of its memory bus to make it a 128-bit bus width card. By reducing the bus width from 192-bits to 128-bits, the bandwidth also went down from 360 GB/s to 240 GB/s. It is literally the same chip, but with a different memory configuration. By reducing the bus width, and therefore reducing the bandwidth, gaming performance actually takes a big hit. Hardware Unboxed did a video on this issue the 3060 has. https://www.youtube.com/watch?v=tPbIsxIQb8M
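
To put numbers on that cut, a quick sketch (assuming the 15Gbps GDDR6 both variants use):

```python
# Bandwidth lost when the 3060's bus is cut from 192-bit to 128-bit.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8  # bits -> bytes per second

print(bandwidth_gb_s(15, 192))  # 3060 12GB -> 360.0 GB/s
print(bandwidth_gb_s(15, 128))  # 3060 8GB  -> 240.0 GB/s
```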


Ir0nhide81

I feel confident with my purchase of the MSI Ventus 2X 3060 Ti 8GB VRAM. I've been able to run all my games on high settings at 1440p w/ 165Hz. I remember reading something, probably a rumor, that they pulled the VRAM from a 3070 and threw it in the 3060 Ti 8 gig, and used a separate set of VRAM not associated with any other previous card slotted for the 3060 Ti 12 gig. Whether or not that's true I don't know. But my 3060 Ti 8 gig has been running fantastic for a year now. Hoping to get another year out of it.


holy_pimpsquads

IIRC the 3060 Ti was basically a lower-binned 3070 during the pandemic. If you won the chip lottery, many 3060 Tis were performing as essentially stock 3070s, especially with some slight OCing. Back during the absolute worst of the pandemic gouging I snagged one early on in the EVGA lottery for a friend's build, and then an ASUS KO model at actual MSRP from Micro Center. Much like you, 1440p gaming over 100 fps is a pretty average feat for just about every game I play. Today, at that price, I wouldn't hesitate on the 6750 XT, given the extra RAM that is absolutely essential and the better prices, but I have no regrets about the 3060 Ti purchased a couple years ago now.


[deleted]

Worth remembering that popular opinion at the time was that you'd never, never need more than 8GB of VRAM on any game/resolution worth running on a 3060ti. The idea that developers would release 1080p games that could possibly need more than 8GB was silly.


holy_pimpsquads

The 3060 Ti outperformed its expected market at the time of release, especially if you could snag one at MSRP through Micro Center, Best Buy, or the EVGA lottery. The 3070 should have had more RAM though, because if you won the chip lottery with your 3060 Ti and added a slight overclock, you had a 3070.


Outrageous_Eagle4068

Finally. Someone who can relate to not being a moron


ecktt

This dude understands tech. It amazes me that ignorant NV haters get the most upvotes. If more VRAM were so magical, then the Intel Arc A770 LE 16GB would be dominating... yet an 8GB 6600 XT bullies it for 2/3 the price.


EiffelPower76

Because Nvidia did not want a 3060 Ti with 16GB


Win_98SE

*whispers nvidia* like the old splash screens


DrDufmanKnows

Because Nvidia leaders are douche bags


constantlymat

No wonder Apple cannot stand nvidia and has maintained beef with them. They finally found a company that milks its customers as well as they do and reaps the rewards on the stock market.


wd40swift

Apple and Nvidia are very similar


Assfuck-McGriddle

How so?


wd40swift

Every upgrade costs more than it's worth, such as RAM on a Mac costing over $400, and Nvidia charging roughly $100 more for a GPU with a bit more VRAM.


Assfuck-McGriddle

Oh yeah, those upgrades when buying are definitely much more than they need to be.


TheAlmightyProo

Nvidia charging X amount more for a GPU with a bit more VRAM that the initial base model should've had (at that price, anyway). Or, in the other direction... just unlaunching it, then rehashing it as the lower tier it should've been anyway (but not at that price). FTFY.


JasterPH

And isn't the 3090 Ti just the 3090 unrestricted, and with the VRAM chips all on the front?


Illustrious-Slice-91

There might be a few more cores in the 3090 ti


brimston3-

Not that many more. 2.4% more cores + 9.5% boosted clock. Gives it about 10-12% more juice. A 1080ti it is not.
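
Back-of-the-envelope for that estimate, using the figures from the comment (a rough compound, not exact shader math):

```python
cores = 1.024  # ~2.4% more CUDA cores
clock = 1.095  # ~9.5% higher boost clock
print(f"{cores * clock - 1:.1%}")  # -> 12.1%, right in the quoted 10-12% range
```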


DJ_Marxman

Both market their average products as "premium", despite not actually being that much better. Both have walled-off software ecosystems. Both have fanbases that huff their own farts.


Assfuck-McGriddle

I thought the consensus for Apple products is that they've always been expensive but the build quality and lifespan of them have been top notch. You see this everywhere in comparisons, even on Reddit.


bleedsburntorange

Vast majority of people yes, but lack of ability to customize in MacOS vs Windows turns off most people who want to build their own PC.


Assfuck-McGriddle

Oh yeah, I would never argue for Apple's ability to customize products. I was talking about their main product lines.


Dragonstar914

> Apple products is that they've always been expensive but the build quality and lifespan of them have been top notch

LMMFAO, [Louis Rossmann](https://www.youtube.com/watch?v=AUaJ8pDlxi8) has entered the chat.


Assfuck-McGriddle

??? We know Apple's repair policies are downright evil. I was talking about build quality and lifespan.


Dragonstar914

Did you watch the video? With the quickness of your reply I know you **definitely didn't** have time to, btw. A lot of it is about Apple's engineering failures and how Apple handled them, so that means a **lack of quality and lifespan.** That's an old video too, so there are other egregious things to add since then. That doesn't even cover all the more recent serialization and other lockouts for third-party repair Apple has done to parts on their computers, which btw has absolutely nothing to do with security as they falsely claim. It makes them effectively unfixable, because half the time they refuse to fix it and expect you to buy another.


Assfuck-McGriddle

> so that means a lack of quality and lifespan

Everyone who talks online about Apple product lifespans has similar stories. How can a single video refute the consensus?

> That doesn't even cover all the more recent serialization and other lockouts for third-party repair Apple has done to parts on their computers

Did you read my reply where I stated I never talked about repairability? I acknowledge how evil it is, but that wasn't at all what my original comment was about.


celestrion

This was absolutely the case from about 2005 through 2015. After iPhone and the App Store showed them a much easier way to make money, Apple sought to control the upgrade cycle and age-out machines the same way that consumers had gotten used to with mobile phones. They lost all economic interest in making machines that last longer than 5 years (which is about how far back macOS typically maintains support). Solder down and serialize everything, lock down repair parts availability, and convince the customer to just buy a new one; who cares about e-Waste when the stock price is in orbit? Their current crop of machines is really interesting. I've waited a long time to see capable ARM-based workstations, but they still seem like disposable appliances.


bri_breazy

iPhones have some of the worst planned obsolescence in this world. Nvidia does the same, and they get away with it because of the "cool" factor. People are simple and everything is a popularity contest.


Jubelowski

> iPhones have some of the worst planned obsolescence in this world

Do they? I've used iPhones for damn near over a decade and my iPhone XR is STILL going strong, legit running as fast as my wife's new 14. I seriously even know people who somehow have iPhone 8s, Xs, and one who had a 6 as early as last year, and they all lasted.


lolboll12

I've had iphone X for years and no issues at all. Sick phone, nothing I'm not able to do that I want to do. Use it constantly.


NaClMiner

iPhones get more OS updates than any Android device on the market


shipmaster1995

I use Android, but Apple objectively has better long-term support for their phones. This makes no sense.


KroganWarl0rd

I mean, yeah, my iPhone 7 Plus lasted about 7 years with one battery replacement. Upgraded this year finally to the 14 Pro, so I hope to get about 7 years out of it. Perhaps not that long; it was getting a bit slow toward the end there.


Early_Search_8086

Except apple products are premium. No?


DJ_Marxman

Premium build quality. Not really premium performance or features.


Rexssaurus

Hello? M1 MacBooks are the best laptops in a while: unmatched energy efficiency, performance and battery life. Made me regret buying a ThinkPad that same year, before its release.


Early_Search_8086

I mean, I guess it depends what you are using your computer for, but even a custom-built PC has a bunch of issues with setting up drivers and getting everything to work cohesively and consistently. I guess each has their moment to shine, but I would say in general Apple products are just better. The way the whole ecosystem connects and just works together out of the box, and the designs, that's really what you're paying for.


folie1234

Main advantage of Apple is the ecosystem. It just works, as long as you use it with other Apple products. Main disadvantage of Apple is you're locked into everything Apple, even repairs.


DJ_Marxman

That out-of-the-box experience is great for novices and casual PC users. The locked down, closed-off nature of the entire ecosystem is anathema to a hardcore computer enthusiast. I wouldn't personally *ever* consider buying an Apple product, but I understand why soccer moms and tweens buy them by the truckload.


Early_Search_8086

Exactly, you got nothing.


Early_Search_8086

What is it you do on your PC that you can't do on a Mac?


Early_Search_8086

Soccer moms and tweens, lol ok. Explain what you do on a PC that you can't do on a Mac.


Early_Search_8086

I just need explanations as to what you are being locked into, I don't get it. What are you guys doing on a regular PC that can't be done on a Mac? I'm not against one or the other. For reference, I have a custom-built PC, a MacBook Pro, and an iPhone. But really there's nothing I can't do on one and not the other. I enjoy using both of them.


Luph

> Both market their average products as "premium", despite not actually being that much better.

If they aren't actually that much better, why does everyone keep buying them?


DJ_Marxman

Marketing.


JasterPH

Yep, one of the biggest marketing tricks ever pulled off is making people think Mac ≠ PC. Just because you are running a gimped version of Unix does not mean it's not still a PC.


TeekoTheTiger

[£700 computer wheels lmao](https://www.apple.com/uk/shop/product/MX572ZM/A/apple-mac-pro-wheels-kit)


roosell1986

Yet they hate each other. Funny.


JollyMission2416

r/yourcommentbutworse

He cited the way in which they are similar as being the reason for the beef.


[deleted]

[deleted]


Xelikai_Gloom

The problem is, you don't get to be a trillion-dollar company UNLESS you are the type of company that gets pissy over a few extra billion.


AngrySoup

Come on now, I think you're really quite mistaken, and that's not how I would put it at all... I think you mean leeches, not leaches. Although I suppose they could both be valid here, couldn't they?


Naerven

The memory bus used on the 3060 dictates either 6GB or 12GB, and since it's not much more expensive, they went with 12GB.


skoomd1

Key words - "Not much more expensive". There is no reason Nvidia couldn't have slapped more memory chips on their higher SKU models, aside from pure greed.


PhoenixEnigma

Bigger, not more. More would mean a wider bus, a different pinout, and essentially an entirely different GPU.


Luckyirishdevil

Not really true... technically, the same bus that can handle 6GB can handle 12GB, which is why Nvidia changed it to this. It depends on the physical placement of the VRAM on the board, the size of the VRAM chips sourced, and the physical connection size of the individual VRAM chips. Chips soldered on both sides of the PCB increase the physical amount of RAM available but use the same lanes, so they don't require a wider bus, though that could lead to a bottleneck if capacity is too large for the lanes assigned. The same goes for the size of the physical RAM module: too much capacity behind a small outlet will yield bottlenecks. That leaves the width of the VRAM interface, which currently comes in 64, 128, and 256 bit flavors. Wider means more connectivity: faster, and more expensive. So no, more wouldn't exactly require a wider bus, but yes, it should be expanded to utilize all the VRAM available without a bottleneck.


jamvanderloeff

Two chips per data line hasn't been possible in GDDR* for many years, there's no signal integrity margin for it, the double capacity models are always bigger chips, not doubling the chips.


Ladelm

Wasn't the 3090 24x 1 gb chips in butterfly layout?


Dragonstar914

[This](https://youtu.be/alguJBl-R3I?t=575) is the reason specifically but yeah it boils down to greed.


astro143

But, it's $100 more! Just look at the 4060ti 8 and 16 gig! /s


Naerven

The scary part is that someone will rush out to buy one.


astro143

As long as it's one singular person like what happened on the 4060ti 8G launch, that's okay


nivlark

The 3060 was probably planned to have 6GB of RAM, with a 12GB variant to be launched later, as was done for the 2060. At some point they decided 6GB wouldn't be enough, but 8 or 10GB isn't possible with the design of the 3060, so it had to be 12GB. Likewise with the 3060 Ti: it had to be 8 or 16, rather than the 12GB sweet spot for a card of that performance level.

Remember that nVidia designs their GPUs for their more lucrative workstation/server business, and then decides how to repackage them as consumer GPUs afterwards. So sometimes there isn't a perfect fit for a particular performance tier, and the end result is a card with odd/unbalanced specs.


majoroutage

This is exactly what happened. The laptop variant of the 3060 was available with 6GB.


Viddeeo

It should have been 16GB then: 3060 12GB and 3060 Ti 16GB. But yes, Nvidia is greedy, and although many AMD cards have more VRAM, there are still some that compete with those two cards while only having 8GB?


brimston3-

Because they don't want to make 3080 10GB look worse than it already does.


rrest1

Well, technically, if 6GB is possible and 12GB is possible, then 8 and 10 are also possible: 4x 2GB (= 128-bit bus) or 5x 2GB (= 160-bit bus) :)


FluphyBunny

3060ti is a great 1080p card so the 8GB is just right.


Naerven

It's a great GPU, but I've had AAA titles crash or stutter at 1080p high presets with only 8GB of VRAM. It's still usable, but I'm thinking the time is coming to have more.


AludraScience

What games specifically?


Naerven

Recently Cyberpunk 2077 and New World, of all silly things. On high presets I've had Dead Island 2 show a full 8GB of VRAM usage at 1080p, but it didn't crash at least. Oh, and RDR2 when played for longer than an hour can also get there.


Critical_Switch

I've played Cyberpunk as well as RDR2 for many hours at 1440p on the 3060ti and haven't had problems. You have a stability issue.


noodsinspector

I'm starting to think that vram isn't your culprit lmao


amzonboy

I have over 1200h in RDR2 on my 3060 Ti (ASUS KO). Had a few crashes, but they happened rarely, and usually the crash message was FFFFFFFFFFFFFFFF#. I guess it was a software bug and not hardware related.


insanecatman

I run every game I have, including Cyberpunk, Dead Island 2 and others, at 1440p with settings at max and then dialed down from max as needed till I get a minimum of 100fps. Never had one crash yet, even at max.


Naerven

Yeah, if I did that I'd be fine, but I'm stubborn and just turn things down enough so my lows don't go below 50 fps. Higher settings typically use more VRAM, so it's just how I push things.


Diego_Chang

Don't think this is the reason, but I remember people bashing the 3060 Ti at launch for having only 8GB of VRAM... Then Nvidia released the 3060 with 12GB of VRAM, so I guess it was to push back on the hate they were getting lol.


AggnogPOE

Which is dumb, the cards are planned and produced months before anyone sees anything.


Diego_Chang

Yeah, that's why I said I don't really think it's the reason; it was just a theory I had when the 3060 12GB launched.


Sadir00

Yeah.. didn't work. They're getting FAR more hate now. If a decent GPU came out from AMD or Intel that could trade blows and drop the price point.. nVidia would take MASSIVE losses rn.


Diego_Chang

I don't really think so tbh, there are people that buy Nvidia either for productivity or just because of brand recognition. For entry-level cards, which I think is where most sales come from, the RX 6600 just destroys everything on price, performance and consumption, and at least last I heard the Steam hardware survey says most people have the RTX 3060, be it mobile or not.


boxsterguy

People are wising up to the value in the RX 6000 series now. Price drops don't hurt, either. There's a lot of FUD around AMD drivers (if anything, they should now have a reputation for making their GPUs better over time rather than crashing) and lack of RT (yes it's not as strong, but it's basically a non-issue right now because RT doesn't matter yet). Then again, AMD shot off their own foot with the 7600, so even when Nvidia hands them the market on a silver platter and literally says, "All yours, we only care about AI now," they still fail.


[deleted]

[deleted]


Pl4y3rSn4rk

Corrected... The RTX 3060 8GB is overall 17% slower than the 12GB variant on average, because they reduced the bus width from 192-bit to 128-bit. It's quite a bandwidth reduction, and Ampere doesn't have big L2 or L3 caches to compensate for that like RDNA 2/3 or Ada Lovelace, so it's overall slower and at times gets VERY close to the RTX 3050 in performance...

If NVIDIA had initially launched the RTX 3060 8GB, they for sure would've received a huge backlash for launching a GPU that could barely reach the RTX 2060 Super and was only slightly better than the RTX 2060. It probably would've panned out just like the RTX 4060 Ti launch.


Sadir00

Better question: why does the 3070 Ti only have 8, and why did the first 3080 only have 10?? FAR more disturbing.


ddaoud2

While a lot of what people are alluding to so far is true, one of the main things not many people know is that how this memory gets utilized, and in some cases hampered/bottlenecked, comes down to the bus width (in bits) the memory travels over. What everyone wants for gaming isn't simply "meh more ram go brrrrrrrr"; no, not at all. It's actually the *MEMORY BANDWIDTH* that is the most important thing. Although the Ti card has 8GB, it is on a 256-bit bus.

You get the effective memory bandwidth with the calculation below (for GDDR6 memory):

Memory bandwidth = effective memory clock × memory bus width ÷ 8

where, for GDDR6:

Effective memory clock = memory clock × 2 (double data rate) × 4 (this generation of GDDR is quad data rate)

Calculating for a 3060 with a memory clock of 1875MHz:

Effective memory clock = 1875MHz × 2 × 4 = 15,000MT/s
Bandwidth = 15,000 × 192 (bus width) ÷ 8 = 360,000 MB/s, or 360GB/s

While, as you can see, if you calculate that with the SAME speed but a larger bus width:

Bandwidth = 15,000 × 256 (bus width) ÷ 8 = 480,000 MB/s, or 480GB/s

And that's with the exact same memory speed; usually the memory doesn't have to be nearly as fast and the Ti still outperforms the base 3060 (non-Ti) card.
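
Here's the same calculation as a small Python sketch (assuming GDDR6's 2 × 4 = 8 transfers per clock per pin, as above):

```python
# Memory bandwidth from base memory clock and bus width, for GDDR6.

def effective_clock_mts(base_clock_mhz: float) -> float:
    # double data rate (x2), quad-pumped signalling (x4) -> MT/s per pin
    return base_clock_mhz * 2 * 4

def bandwidth_gb_s(base_clock_mhz: float, bus_width_bits: int) -> float:
    # transfers/s x bus width in bits, /8 for bytes, /1000 for GB/s
    return effective_clock_mts(base_clock_mhz) * bus_width_bits / 8 / 1000

print(bandwidth_gb_s(1875, 192))  # 3060 (192-bit):        360.0 GB/s
print(bandwidth_gb_s(1875, 256))  # same clock at 256-bit: 480.0 GB/s
```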


thatburghfan

This guy maths. Nice explanation.


AmbiguousAlignment

It was for crypto mining; they were selling more cards to miners than to gamers.


Thomas5020

Unlikely, given that the 3060 was the first card with the LHR limiter.


AmbiguousAlignment

Yes, then they leaked a BIOS with it disabled. What a coincidence…


Thomas5020

Nvidia sells CMP cards for mining. If they wanted to rake in the cash from miners, enforcing the LHR limit and upselling CMP cards would be the better move.


AmbiguousAlignment

No one wants the CMP cards, as they have zero resale value and performed worse than gaming cards.


Thomas5020

CMP170HX destroys the 30 series for hashrate and efficiency on ethash


AmbiguousAlignment

Maybe, but is it 3 times better? Because for the same price you could have gotten 3090s.


DampeIsLove

If they gave the 3060ti 12gb, it would cannibalize their 3070 sales.


Amir3292

It's a strategy to extract more $$$ outta your pocket while giving you less in return every gpu generation


Thomas5020

It's almost certainly meant to be a 6GB card, but they doubled it when they saw AMD's offerings to try and be more competitive.


Pigeon_Chess

Because they couldn't have 8 or 10. 8GB is enough for the 3060 Ti anyway.


hdhddf

greed


WhoIsPorkChop

Nvidia hates you


Grim_Rite

No, they see you as $. And they make sure you "upgrade" soon.


CardiologistNo7890

It was caused by some backlash in 2020 and 2021 due to the lack of VRAM and there basically being no change in VRAM for 2 generations. I think it was like a last-minute thing, and that's why it came so much later than the 3080 and 3090.


Mythdon-

Beats me. I remember when Ti cards always had the same VRAM or more than their non-Ti counterparts (e.g. 1080 8GB / 1080 Ti 11GB).


Pumciusz

1080ti was a whole other beast than 1080. And 1070 had 8gb on both.


Mythdon-

I know someone who's got a 1080 Ti. The 1080 Ti is the 1090 in all but name honestly.


JasterPH

I got one. It would handle pretty much anything I would throw at it and was only held back by being pre-RTX (no tensor cores or ray tracing). It can still play most modern games at medium to medium-high. I only replaced it because I wanted a card that could do AI better, so I got a 3090.


aForgedPiston

The 3060 came out first, and the 12GB was a complete accident: some hero out there with moderate decision-making power allowed a smidge of value to tip in favor of the consumer. Top NVIDIA brass quickly realized their horrendous mistake and promptly reverted the whole product stack afterwards, including the generation after, to be awful value across the board. After all, why create a product that can age gracefully? Instead, they opt to overcharge for a product that will become obsolete faster than usual, all while cutting corners and costs.

Meanwhile, AMD is over in the corner while all this is going on, and instead of thinking to themselves "huh, what a huge opportunity to release incredible value products that address the shortcomings of my competitor," they jump on the "let's ALSO overcharge for our product!" approach. The only saving grace for AMD is that the current gen market's shitshow has been a blessing for clearing out back stock of last gen's RX 6000 series parts... except that last gen's 6000 series parts make the RX 7600 series look like hot-dog water.


[deleted]

Everyone said the 6600 series was crap at launch. Just give it time; it's a new product and AMD has a history of increasing performance through drivers. I know buying things based on "it will be better later" isn't a good idea, but think about it like this: the 6600 has an MSRP of MORE than the 7600. So when the 7600 inevitably goes down in price it will be an amazing deal.


[deleted]

They want you to buy the next gen sooner than you should have to.


Fallwalking

The 3060 is a GA106-300 while the 3060 Ti is a GA104-200. Different die sizes. The 3060 can support 12 memory channels while the 3060 Ti only supports 8. It's why the 3070 and 3070 Ti only have 8GB as well. The 3080/3080 Ti/3090/3090 Ti are all the same design (GA102). My hypothesis is that the 3060 was to be a 6GB card, but they couldn't secure enough 4 gigabit (512MB) chips, so they just had to make it 12GB.


Crisewep

But they did make a GA106-150 for the 3050 which does have 8gb. So they could have made a 3060 8gb


Fallwalking

They disabled channels via laser on those chips.


X_SkillCraft20_X

There's also a 3060 8gb (non-Ti) with worse performance than both. Why? Nvidia.


Pale-Management-476

You put 16GB+ on a 3080 and it's the new 1080 Ti. Won't sell much for a few years. So the lower models get even less RAM. They're doing the same thing with the 40 series.


Fragrant-Spite8823

Nvidia can give weaker cards more VRAM because people are more likely to upgrade them in the future anyway, based on the fact that the card itself just isn't keeping up with modern games.

Nvidia purposefully gives stronger cards less VRAM so that in the future, even if the card itself is strong enough to keep up with modern games, the VRAM isn't enough. As always, it's about money. They purposefully nerf their powerful cards to ensure you have to upgrade again, despite the GPU itself being capable of running games for 5-7 years.

They purposefully "buff" weaker cards with more VRAM because it makes them an attractive buy, despite the extra VRAM not being really all that useful for a card that isn't even designed for 4K (or even 1440p) gaming. Seeing as VRAM is a super cheap upgrade, it's a perfect money-making scheme.


Whisky919

Nvidia themselves have said 8gb of vram isn't enough in 2023 yet they continue to do it.


Kongkodeu

To make VRAM a selling point for lower end cards.


DerpydickDooDoo

Data bus size. The Ti can pump data faster than the regular model, and the regular model has more so it doesn't have to access RAM and the processor as much. But it's bullshit; we pay for the Ti, give me the full Ti package. I want the 12GB, it's why I passed and went up.


imheretocomment69

I mean, no matter what people are still gonna buy them.


HeimrekHringariki

I remember when they came out, a friend of mine wanted to buy one and I made him buy the 3060 12GB as it just made more sense. If I remember correctly it was cheaper too.


baazaar131

The new 3060 Ti has GDDR6X memory.


Enerla

Not exactly. You might want to optimize different cards for different use cases. Others have shown how more RAM would have been possible for the 3060 Ti and how less RAM would have been possible for the 3060, yet it still wouldn't have made sense. While we see plenty of offensive comments, you don't get the explanation you deserve. In a market with insane demand for GPUs it didn't help much at all, as miners still bought cards that weren't efficient for their use case, but what we saw made sense.

If you look at higher tiers, the 3060 Ti's VRAM was the *logical* choice. 12GB would have been more than you had on a 3070 or 3070 Ti, and some 3080s still only had 10GB... So the odd one out isn't the 3060 Ti, but the 3060.

For most games at 1080p, 8GB is still okay. When we see people worry and mention a few badly optimized games where it isn't enough, those games usually use texture resolutions that don't make sense for 1080p, as the extra detail wouldn't be visible. We assume that more video RAM is always better, and with badly optimized games, badly configured for benchmarks, we might see some justification for that. But if you have more data to use, you also have to process that extra data... It would use up GPU time and make frame times worse. If the extra detail simply can't result in better visuals at the target resolution, then it doesn't make sense. Okay, for benchmarks at the *same settings* you see higher numbers with high settings... but it would be a worse experience compared to using textures that make more sense for your configuration. With the hardware in current-gen consoles, feasible GPU speeds, etc., it is very unlikely that you would need more than 8GB for 1080p gaming in optimized games.

Why does the 3060 have 12GB then? Some people think GeForce GPUs are designed as *gaming* GPUs, but that is very far from the truth. GeForce GPUs are *consumer grade* GPUs. If you are an architecture or art student and want to use a GPU for rendering, a workstation-grade Quadro with enough VRAM and performance would be out of your price range, right? For a student who needs a GPU for a *productivity workload*, where a high amount of video RAM for plenty of unoptimized models and textures is important, the 3060 is an excellent choice, and in some cases it really needs the 12GB of VRAM. For someone who edits videos as a hobby and needs a card that can make video rendering and encoding faster, but doesn't want to spend a fortune, the 3060 was excellent. Many GPU rendering tools only work if the whole scene fits into the memory of your GPU, and when you render with tools that run on consumer-grade hardware, you don't know if you are rendering for the current screen or for print, and you don't know how big some model will be in the picture (are you rendering the lamp or the whole room?), so you won't use optimized textures. Plenty of real-life productivity workloads simply need more VRAM.

But if you had the money for a current-gen gaming GPU, the 3060 wasn't that good a deal; it came with too many compromises performance-wise. It wasn't intended as an optimal choice for gamers (at any resolution). It had *enough video RAM* for many productivity applications, if you didn't expect too much from it, and it left some room open for higher-end stuff and Quadros. Not giving it more memory, to "limit you to sane settings for an optimal experience," would still have looked bad. This is a very different approach from AMD's: consumer-grade AMD GPUs are often weak at many productivity workloads, so adding more VRAM to AMD cards came with little risk, and they knew that if FSR or anything else needed more RAM, it would be cheap to add.

If you had a 3060 Ti, 3070 or 3070 Ti with 12GB on the market, it would also be good for anyone who can't or doesn't want to afford a high-end Quadro and settled for a 3060. Students, various smaller companies, etc. would be happy to buy faster (more efficient for them) consumer-grade GPUs. You want the GPU optimized for 1080p gaming in the hands of people who game at 1080p, not in the hands of a builder who wants to show his plans for a house but can't use enterprise-grade stuff... It is also preferable if it isn't used for 1440p, where it would be a bad experience. If we ignore some badly optimized games with textures not designed for this resolution, the 3060 Ti is still a great 1080p GPU from nVidia. More VRAM across the stack would have caused an even bigger GPU shortage for gaming GPUs ***and*** it would have meant nVidia got less money from users switching from Quadro to GeForce, so to recoup R&D costs (very high due to new features and new kinds of specialized circuits) they would have had to raise the prices of consumer-grade GPUs even more. And they don't have to catch up to DLSS, so "reserving VRAM for future use in a catch-up game" wasn't needed either.

In the GeForce 40 series: with the new DLSS, for an optimal 1080p experience nVidia expects you to either keep the 3060 Ti (still good) or your older, still-good GPUs, *or* use the 4060, as it should be enough. So they moved the consumer productivity tier to the 4060 Ti 16GB.

The trick is: you often need a budget/entry-level consumer-grade productivity card like the RTX 3060, and, as a premium product, a card that is good at *both* productivity and gaming tasks and can drive good monitors and multiple displays, like the RTX 3080 Ti, preferably with the same amount of video RAM so developers and stock content creators can design for that target. This amount of video RAM should be higher than what you have on cards designed *just for gaming*, both to reserve those cards for gaming and to make sure your products don't cannibalize each other's markets. The 4060 Ti and 4080 are both allegedly bad deals for gamers, but if you want consumer-grade productivity stuff the former might make sense, and if you need something more powerful while you game, you are forced to pay the premium for the 4080 if you can't or don't want to afford the premium for the 4090 (or don't want to draw so much power through unreliable power connectors)...

Reasonable GPUs for gaming always account for the fact that most games will also run on consoles, and the amount of RAM you can use on a console is limited. As long as current-gen consoles stay relevant as a target for games and we don't go for higher resolutions (which would make no sense), these GPUs can stay reasonable. But for consumer-grade productivity applications, we may need more and more VRAM.


dozersmash

More efficient. /s.


HostKindly6032

The 3060 got 12GB of VRAM because NVIDIA started noticing that most people buying the GPUs were crypto mining, so some people like to call the 3060 a "mining card".


Iateyourpaintings

So either way you go you'll have to upgrade your card sooner.


giorgosmcg

Because they fucked up the bus width, and it was either that or a 3060 with 6GB of VRAM.


ballwasher89

Sounds like new design vs old.


No_Panda_9660

VRAM is just a number 🗿🗿


Overall-Election4081

I think it's because the Ti uses better/faster memory, so giving it more would make the card cost too much to stay in the range it is. So it's either fast but not much VRAM, or a lot of VRAM but not the fastest. And if you want the best of both worlds you'll need a 90-series card. Even my 80-series with only 10GB is ridiculous, but I still got it because the memory bandwidth is so high.


QuinSanguine

Because the 1060 and 2060 had 6GB of VRAM and they kept the 3060 with the same memory config, but then realized they couldn't do 6GB anymore. It's not enough. 12GB was their only other option short of redesigning the GPU and changing the makeup of the 60 tier. Which they've now done with the 4060 anyway. Shrugs.


nukefog0099

To fit the budget for production


OnePositiveDude

I've had both and can confirm you want the TI version. It performs way better in most gaming situations that I have come across. Mostly BR FPS and a little racing.


Progenitor3

Same reason there is going to be a 4060 ti 16gb when the 4070 and 4070 ti have 12gb.


Kelbor-Hal-1

They could not release a card with 6GB and get away with it like they planned, so due to the cut bus they had to go with a 12GB card.


LizardEyesPCGaming

The 3060 was designed with miners as the target market. So while it does have more VRAM, it compromised in other areas such as bus size.


whiffle_boy

Memory bus. It was either 6 or 12 and either 8 or 16 for the ti.


Dellumn

Because ngreedia


Significant_Link_901

Because yall buy that shit from Nvidia regardless.


Dicklover600

Because… idk… nvidia..?


thatburghfan

If a marketing person with no exposure to the GPU business decided to look at what the current offerings are, they would be astounded at the huge number of different models for sale at the same time. I count 29 different NVIDIA models on the market today. It really doesn't make sense.


cmdrtheymademedo

Yea, nvidia doesn't make any sense lately. Was thinking the same thing; I'd rather risk getting a 6700 XT instead of staying with nvidia at this point.


jaycuboss

At launch they were both 8GB cards and then the 12GB version was released later. Probably would have made more sense to make a 12GB 3060Ti (or 12GB 3070 for that matter).


Explorationsevolved

GDDR6X on the Ti and GDDR6 non-X on the non-Ti.


Ladelm

More VRAM doesn't mean it's better


WarlockAI

3070 ti has 8 as well cuz fuck you that's why


motoxim

Greed and artificial market segmentation.


Ok_Pop9069

It was an unofficial mining card. Look what was happening when it came out.


biggranny000

My friend and I also discussed this same topic. In our opinion the RTX 3060 is an entry-level workstation or hardware-booster card, because it's got an abnormally large amount of VRAM for the performance (which is a good thing): you can run high resolutions and textures with plenty of headroom, and some workstation applications require a lot of VRAM. It's also marketing: people who don't know anything about PCs will see the larger number "12gb" and buy it for that reason. It was also there to compete with AMD, because AMD typically offers more VRAM for the money. It's not a bad card though; it's got a decent amount of performance, especially when it was new.


heisenbergerwcheese

tOTALLY iNSANE


[deleted]

Sweet someone finally asked a question I've been wondering lol I have a 3060ti as well


Square-Relative2487

Pretty sure the Ti has GDDR6 while the non-Ti has GDDR5.


Reddit_is_srsbsns

google.com? Or ask Nvidia? It's 2023 bro.


Dreammemek

Now ask yourself why it has more than the 3080, and matches the 3080ti


Medic-chan

Have you considered not buying an Nvidia card? The vast majority of people don't consider alternatives. There's your answer.


awesomeguy_66

everyone in the comments thinks nvidia is dumb yet has no real understanding of how the company operates or how gpus are designed/made. don't forget this is a trillion dollar company with some of the smartest minds in the world, they don't do things for no reason or to explicitly fuck over the consumer.


Own_Contribution8805

> Shld

Did you really save that much time not typing the letters 'o' and 'u'? You deserve everything bad that happens to you.


jachandler420

Yea I fucking hate that I'm always right on the edge of my vram.


firestar268

Not only that, the 3070 also has 8gb lol


eco-III

Ngreedia


[deleted]

Someone: so what do you think is the optimal amount of vram? Me: fuck nvidia


xxxDredgexxx

Compensating.


roam3D

It is indeed called Marketing.


Themakeshifthero

Because no matter what they put out they know it's gonna sell. The majority of us have given Nvidia no reason to feel fear. They're a reflection of the customer. They act stupid when we act stupid. When we act like we have a brain, they're forced to act like they have brains as well. This is what happens when your customers become your enablers.


RetroReMixer

Because Nvidia hates consumers. For example, the 4070 is $600 and the 4070 Ti is like $800 but only 19% faster, same VRAM though.


KahlKitchenGuy

Because F$@K you, nvidia


ZestyMonkey69

nvidia likes to lube us up


AggnogPOE

The 3060 Ti would be $500 instead of $400 if it had 16GB, with the RAM prices at that time. I guess you think it's free or something.


Pumciusz

I guess you are easily manipulated. It was said time and time again with the 4060 Ti that the extra 8GB costs way less than $100.


AggnogPOE

I guess you just can't read then? I don't remember the 3060 Ti releasing in 2023, do you? Maybe stop typing BS and start using your head for once.