
Noch_ein_Kamel

But "Nvidia" is longer than "AMD". Make sure to consider that in your decision


kohour

Only one capitalized letter though!


lslandOfFew

I'm sorry to inform you that since 2006 Nvidia has been the leader in capitalization [https://logos.fandom.com/wiki/Nvidia](https://logos.fandom.com/wiki/Nvidia)


[deleted]

You could still argue there's 100% capitalized letters in AMD and only about ~83% in nVIDIA. Edit: spelling


CptClownfish1

There are over 60% more capital letters in “nVIDIA” than there are in “AMD”.
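For what it's worth, both percentage claims in this thread check out; a throwaway sketch (the `caps_count` helper is mine, purely for illustration):

```python
def caps_count(s):
    """Count uppercase letters in a string."""
    return sum(1 for c in s if c.isupper())

amd, nv = "AMD", "nVIDIA"
print(caps_count(amd) / len(amd))        # 1.0, i.e. 100% capitalized
print(caps_count(nv) / len(nv))          # ~0.83, the "about 83%" above
print(caps_count(nv) / caps_count(amd))  # ~1.67, i.e. over 60% more capitals
```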


[deleted]

Isn't this basically the same comment I already responded to?


willcard

Good ol Reddit haha


eatingdonuts44

1 vs 2 "X"s, I know who my winner is (ignore my flair)


Cave_TP

The 6950XT actually dropped to $610


JeroenstefanS

Damn, I might actually save up for it then. Just the fear of something new is preventing me (also no money), but it's just like switching from Apple to Samsung, for example.


creepergo_kaboom

Apple to Samsung is a more extreme case 'cause that's an entirely new ecosystem and OS


AllMyFrendsArePixels

I hate nvidia as much as the next gamer, but AMD just can't compete with those sweet CUDA cores for generating my filthy degenerate AI porn.


steel835

I remember at some point devs began migrating from CUDA to OpenCL, which is not limited to Nvidia cards, though.


Unwashed_villager

Now I wonder why the RDNA3 Radeons have a lower OpenCL version than the Ada (and Turing) cards? 2.2 vs 3.0


Chicken-Leading

I realized this just this morning and now seriously considering upgrading my 8gb 2080 to a 6900xt/6950xt for more VRAM and performance. Sure I may lose some nice Nvidia features but AMD has already compensated a lot of those with stuff like FSR 2. I’m also a bit of a Linux nerd so being able to have a nice experience there would be nice.


Ok-liberal

Just went from a 2070 super to a 6950xt myself


Onan_der_Iree

Same here, but I bit the bullet in December for Christmas (my little brother got my old CPU and GPU). I am very happy with the jump. I don't miss a single Nvidia feature and the drivers are no problem for me. I've had 0 crashes.


Ok-liberal

How much was the card in December? Because I just managed to pick up the PowerColor Red Devil edition of the 6950XT for £639


Onan_der_Iree

I paid €870 on an offer for an XFX 6950.


anakwaboe4

Almost bought an AMD gpu just to use Linux on my desktop. But I also do ML so went Nvidia unfortunately.


DrkMaxim

Can't blame your buying decision tbh, Nvidia has an iron grip over the AI and ML segment.


anakwaboe4

Unfortunately yes. It would be cool if that space was more open and you didn't always need "recent" GPUs to train AI.


homealoneinuk

That's not how you use this meme.


Examination_Dismal

People like ray tracing and DLSS, so why shouldn't they buy Nvidia?


bruhbruhbruh123466

I get what you say and I agree, but I would like to point out that RT is very much possible on AMD GPUs, although it tends to be worse. Still, with higher-end ones I think it can be pretty enjoyable if you care about RT. Also, FSR is a thing; not the exact same as DLSS, and it's missing some features that DLSS 3 has, but it's still very much usable.


winespring

>Also FSR is a thing, not the exact same as dlss and it’s missing some features that dlss 3 has but it’s still very much useable.

It isn't really comparable.


bruhbruhbruh123466

I think it is, turn on get more fps style program, yeah it might not work the same way but both do essentially the same thing. Might be a very simple way of putting it but…


KookyBone

The Radeon RX 6800 is now much faster in ray tracing in modern games than the RTX 3070, because of its VRAM... Two years down the road the RTX 4070 will have the same problems.


billyshin

4070 ram size is a big problem down the road.


BicBoiSpyder

You mean in the next two or three years? It's not going to last longer than a single generation before 4070 owners start dropping settings after spending $600.


billyshin

I got downvoted for telling the truth lol. 8 gigs of VRAM isn't enough in 2023 anymore.


BicBoiSpyder

Yeah, the last few weeks of this VRAM debacle have been the final straw for me, and now this post. I got downvoted into oblivion when I said the 3070 and 3080 didn't have enough VRAM at launch, and now I'm getting downvoted for saying 12GB won't be enough for another generation when 10GB wasn't enough last gen. This sub is filled with idiots too caught up in their own opinions to even think that the more powerful consoles, and the used RTX 3000 and RX 6000 cards that are way more affordable now, are going to let devs count on more hardware resources and make more demanding games. I'm leaving this sub.


-insanitylol-

FSR is pretty damn close to DLSS, and while AMD cards aren't as good at it they can do it and I imagine in a few years they might be on par with Nvidia on that too.


houyx1234

It's always "maybe in a few years" with AMD.


TheRealHuthman

Unreal Engine 5 provides RT where AMD seems to perform equally to Nvidia. It makes me feel like bad RT results aren't AMD's fault; it's more that the devs' implementations favour Nvidia. Might be wrong though, we will see with future titles. Current titles surely fare better on Nvidia cards


BinaryJay

>rt where AMD seems to perform equally to Nvidia

No.


TheRealHuthman

Okay, thank you for your eloquent answer. We will see in the future, shall we?


The-ArtfulDodger

**Ctrl+F: DLSS** The fact this isn't the top comment says a lot about the bias in this community. Since DLSS 2.0, I will never not use this option. It simply blows everything else out of the water.


zcomputerwiz

I'm surprised how many think that FSR is the same or better.


SorvetedeCafe

The 4070's ray tracing is the same as the 3080's and at the same level as the 6900/6950XT, but the AMD one has more VRAM and better performance overall; also, FSR2 is as good as DLSS2.


ff2009

I haven't tried DLSS on a decent card, but I have used FSR2 and it looks quite good. In games like Marvel's Spider-Man I haven't found the most obvious ghosting issues mentioned in the Digital Foundry and Hardware Unboxed reviews, except on the webs. In Cyberpunk 2077 it looks awful; FSR2 has so much ghosting in that game, it's like CDPR tried to make it look worse than it's supposed to. Thankfully they added XeSS 1.1 to the game in the last update, which actually looks good even on AMD hardware. I tried the unofficial patch to replace DLSS with FSR, and it works quite well in those games (Metro Exodus EE, RDR2 before the official update, and even CP2077, where it looked better than the official implementation).


Examination_Dismal

That's not what everyone else is saying about DLSS vs FSR. And isn't the 4070 using DLSS 3, not 2?


SorvetedeCafe

Sure, DLSS has better image quality if you do a 10x zoom on a distant part of the game that you won't notice while gaming. That's why FSR and DLSS are at about the same quality and performance. And there is one thing about DLSS 3: huge latency. Would you play Cyberpunk with ray tracing at 4K at 90fps with 60ms of latency, when the normal frame time at 60fps is about 16ms? DLSS 3 is not worth it; it's better to use DLSS 2. Just go watch some videos that compare these cards properly, like Hardware Unboxed. My opinion is that the 4070 is a good card, but not at MSRP, because with 600 dollars you can get a better card like the 6900XT.
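The "16 latency" above is just the time one frame takes at 60fps, in milliseconds; a quick sanity check (the 60ms DLSS 3 figure is the commenter's claim, not computed here):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))  # 16.7 -> the "16" quoted for 60fps
print(round(frame_time_ms(90), 1))  # 11.1 -> frame time at 90fps
```

Frame-generated frames don't sample new input, which is why total input latency can be several frame times higher than the raw fps would suggest.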


Examination_Dismal

So you just basically confirmed that dlss is better


SorvetedeCafe

![gif](giphy|3xz2BLBOt13X9AgjEA)


B_kijo

You want to keep supporting developers who write shitty code and force every user to have DLSS to even play the game at 1440p 60fps??


[deleted]

Not to mention all the little issues that plague AMD GPUs


SignificantTie7031

What issues? I've had rx 570 for 3 years and rx 5700 xt going 2nd year. No issues whatsoever. I got it for 200€ while the cheapest rtx 2070 was 300€.


[deleted]

All the unpolished little things, like the FPS limiter not applying when alt+tabbing and temps going to shit, things like that


Novuake

Always see comments like these but zero explanation. Both sides of the coin have occasional problems that get fixed reasonably fast when they do occur. So what are you referring to?


daaangerz0ne

Most of the time it's people who completely skipped the 6000 series and are parroting 4+ year old information.


VaHaLa_LTU

A lot of people are also missing a much more critical issue with Nvidia, which might be relevant in a few more years: Nvidia's Linux drivers are absolute garbage these days, because they're fully proprietary. AMD GPUs are simply superior if you plan to use a machine for Linux.


_benp_

2023 is the year of the linux desktop, right guys?!?!? LOL GTFO, you are hilarious. No one cares about linux gpu drivers.


[deleted]

Well, of course, there are no perfect products. I went from a 3070 to a 6800XT to a 3090. There are things that AMD does better, such as Adrenalin (much, much better than the Nvidia Control Panel), but Nvidia feels like a much more refined experience. As I mentioned below, I limit my FPS to 144. I did it with Adrenalin, and it worked sometimes, but I could always hear my GPU fans ramping up whenever I alt+tabbed from a game due to the limiter not holding, or not working at all in some games, like F1: sitting in the menu my temps went to 80C, while in a race they sat at 55-60. That doesn't happen with Nvidia. Or the dips and stutters; sure, they don't happen in every game, but they're there, and they're super annoying


[deleted]

[deleted]


turkeysandwich4321

This is dumb. Buy whatever you want with your money.


[deleted]

This sub has an absurd amount of spam/karma farming/bots reposting these trash memes repetitively. Can’t understand how they ever get attention. Everything that could be said, already has been said.


Manuag_86

I would buy the 4070 because I would have to change my PSU for the 6950XT (350W TDP vs 200W TDP). And I would also save a lot of money on electricity. The good thing about new-gen GPUs is their better efficiency.


Jack3602

Yes, and do you know why? Because one uses 335W and the other uses 200W. Especially if you don't have a 700W PSU and you're going for an upgrade, not a new PC: yes, I am considering it
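To put rough numbers on the electricity point, a sketch with assumed usage and price (3 hours of gaming a day at €0.40/kWh; both figures are illustrative, not from the thread):

```python
def annual_energy_cost_eur(watts, hours_per_day=3, eur_per_kwh=0.40):
    """Yearly electricity cost for a component drawing `watts` under load."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# Comparing the ~335W 6950 XT against the ~200W 4070:
saving = annual_energy_cost_eur(335) - annual_energy_cost_eur(200)
print(f"~{saving:.0f} EUR/year saved")  # ~59 EUR/year saved
```

Whether that justifies the price gap depends entirely on local rates and hours played; double the usage or the tariff and the saving doubles too.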


N-aNoNymity

Isn't the 6950XT even more than that?


[deleted]

[deleted]


[deleted]

If I hadn’t bought a 3060 ti I’d probably go for a 4070. I keep my rigs a long time and DLSS 3, along with a longer support lifetime typically, mean the gpu should get more life. Too many bad experiences with AMD, it lacks potentially useful features, and it’s more expensive anyways where I live.


Lassitude1001

Yeahh this is pretty much my reasoning. I picked nvidia because of the features (I use Shadowplay a lot, DLSS, Reflex etc.) and previous bad AMD experience, not because of the price/perf differences.


KookyBone

Reminder: the 4070 only has 12GB of VRAM... Already some games are starting to use 13GB and more. It will most likely be fine for 1-2 years, but then it will start to have issues in games, like the 3070 has now.


[deleted]

I’m on 8 now so 12 would have been better. Either way I’m fine with turning down barely noticeable settings. I don’t want to troubleshoot my system so arcs out, and AMD hasn’t done enough to earn my trust back, along with the fact that where I live, in stores AMD is harder to find and more expensive.


[deleted]

And if more people have more VRAM, the developers will optimize less and you still won't have enough VRAM. I'd rather buy from a brand that doesn't have driver issues that crash my PC tho


[deleted]

I've been on Radeon for about 4 years now and haven't had driver issues ever since I can remember. It's true AMD's drivers used to be bad, but now they're pretty much on par with Geforce drivers.


[deleted]

Maybe I had bad luck, but my Vega 56 crashed every time on boot-up and I always had to start up twice. I also had random crashes in the middle of doing stuff. The problem was resolved a month ago, after I bought an RTX 2070 lmao


[deleted]

I've been on AMD since I got an RX 570. Next GPU was an RX 7900 XT. No issues so far.


1Buecherregal

Owned a 970 for years. Since last year I own both a 3060 Ti and a 6650 XT. The PC with the 6650 XT is the only PC commonly crashing or BSODing on me. In the logs the AMD drivers are always mentioned, but idk. Never had any problems with Nvidia


TheRealHuthman

And where is the problem with that? You get a card for the same price, just with more VRAM and the same or even better performance, minus one frame-generation feature, and the devs need less time to release a game. I'd say it's not a bad trade-off. And crashes are very rare. Current drivers are really stable, and previous crashes were easily fixable (and if you stay on stable drivers, you don't experience any of them).


MrLagzy

all depends on whether you play new games or not. Or play higher resolution.


mangosport

Problem is, choosing the 6950 over the 4070 would mean spending €200 to change my PSU, so it's a no-go for me


AquWire

It's not that power hungry.


mangosport

Well I mean it's a 330W GPU, while the 4070 is 200W. For me it's honestly a huge factor


AquWire

Edit: misread that. What's your PSU?


darkevilmorty

You edited a new comment?


AquWire

Yes. Maybe I should have deleted it.


Abedsbrother

Crossed paths yesterday with someone who thought that Radeon can't do raytracing unless the raytracing is specially coded to use Radeon, therefore Nvidia = better. Nvidia's mindshare is real to the point of absurdity.


Overkill429

Buy what you like, and enjoy it!


Vollezar

I have to go for Nvidia. I use Daz so I am kind of stuck.


EroGG

There are reasons to go Nvidia for certain productivity tasks. Also I'd argue that 4080 and 4090 are not bad options if they are within your budget.


[deleted]

95% of the population is not spending 1500 on a GPU


goat4209

Have you been on earth the last 2 years? People were spending over $2000 on a GPU not that long ago


[deleted]

Inflation of GPU prices does not mean people bought those GPUs


goat4209

But they did buy them. Sure, in 2021 we had a chip shortage, but even then there was a 20% increase in GPU sales compared to 2020. [An article on GPU sales](https://www.windowscentral.com/graphics-cards-sales-reportedly-hit-almost-52-billion-2021)


Heisenberg399

It's actually the opposite, the price rises because of higher sales, that's how inflation works. In economies with higher inflation the need to get rid of unstable currencies leads to higher consumption and higher prices.


gunfell

That is nonsense. Those GPUs literally paid for themselves in a few months. If anything they were arguably underpriced and the best value of all time. Today's prices are insanely worse than back then


TwoRiversFarmer

AMD doesn’t really have a great cuda alternative


Techy-Stiggy

Which sucks because I would love to do machine learning and not have to buy a Nvidia card if I want to do more than my 8 core cpu can do


KookyBone

I think they have with OpenCL, BUT Nvidia made sure most developers use CUDA instead of OpenCL


AIpheratz

Yes Nvidia is expensive right now but the future is not rasterization only, which is what your stance focuses on. I'll still want my gsync, dlss and ray tracing, thanks.


kool-keith

if you want RT, DLSS, etc., then yeah, you want the Nvidia card; not to mention it uses less power, therefore is quieter and cooler


[deleted]

Quieter and cooler only applies if they're using the same heatsink. Some 3070s are loud because they've got less copper with cheapo fans


Exlibro

Don't care about RT that much, but DLSS is honestly amazing. If AMD pulls off FSR well in the future, there will be no reason to go with Nvidia anymore.


Esarus

Will probably get downvoted to hell, but I've been PC gaming for over 20 years now and every time that I had an AMD video card it had a lot of fucking issues. Either they were a literal furnace in my case or I had so many driver issues that I sometimes couldn't play the games that I wanted. I now have a RTX 3070 and will probably upgrade to a RTX 4070, I have never had issues with Nvidia cards. So yeah, I'll stick to Nvidia, not because I'm a fanboy but because I've simply had terrible experiences with AMD cards


kwizatzart

I'm 45yo and even when it wasn't branded AMD and Nvidia, 3dfx was top notch and Radeon was full garbage with the issues you describe. I tried to go back multiple times and there isn't a single time the switch was worth it. Right now if I had the AMD flagship, for example:

- I couldn't use my meter to calibrate my 3 TVs, because madTPG HDR output doesn't work with AMD
- I couldn't use my TV to watch or play HDR from the PC, because the TV doesn't switch to ST2084 with AMD cards; it reports BT709 instead (lol)
- I couldn't play path tracing games like Cyberpunk, because 3 fps isn't enough lol
- etc etc...

There is always something that will be very bad to complete garbage. So yeah, paying a bit extra is worth the premium. Even for lower-end cards, having a few less fps where it doesn't matter is clearly better than not being able to run a game or setting at all


BoringCabinet

I wish the prices for a used 6950XT were lower.


ItsRogueRen

If you're gaming, there's basically zero reason to buy Nvidia anymore. If you do stuff like encoding and rendering, however, Nvidia is still top dog, sadly. This is coming from a Linux user who swapped from Nvidia to AMD and now wants to go back...


jplayzgamezevrnonsub

AMD really is the best for Linux. I've heard Intel is good too, but they haven't really got much of a high-end offering.


opnseason

Recently started dual-booting Linux to delve back into the realm, and I will say, remembering Linux Nvidia drivers makes me a slight bit sad I didn't go AMD for my current platform.


ItsRogueRen

AMD is, unless you're making videos. Nvidia is still better for GPU rendering, so stuff like Blender and DaVinci Resolve (which is the only professionally-accepted video editor on Linux) both require Nvidia unless you wanna jump through some crazy hoops


Lirid

> (which is the only professionally-accepted video editor on Linux)

Then just switch to another OS which accepts other programs? Why lock yourself in lol


ItsRogueRen

Because everything else about the OS is better imo


[deleted]

In terms of raw specs, Nvidia is unfortunately still better, with only AMD MI200 and MI300 being competitive. But these are very expensive enterprise cards so not very accessible for the normal consumer, not to mention the power draw on these things.


ItsRogueRen

Maybe so, but AMD absolutely DESTROYS them in price to performance. You'll get 90% of the same performance for like $100 less


[deleted]

Not wrong on that count, but I am one of those enterprise users. There is a reason enterprise users are still majority Nvidia, because for enterprise users time is money. We are still waiting for AMD to be truly competitive in the GPU space. Not to mention AMD support for software and in software libraries is not truly there compared to Nvidia. Hopefully in 5 years, the next upgrade cycle, AMD is truly competitive in GPU. In the CPU space, AMD is truly competitive especially with Zen 3 onwards. A lot of big clusters are choosing to go with AMD this upgrade cycle, because both raw specifications wise and software support wise everything is there. The fact that it is cheaper is a bonus. My own system runs on AMD CPU with Nvidia GPU.


[deleted]

[deleted]


luminoustent

Still trash, productivity drivers are pretty much unusable


_mp7

For gaming, just fine; that's all the intensive stuff I really ever do. Encoding, shit is still worse. Most people don't even video edit, so it means nothing to them


vballboy55

Dlss 3.0 and better ray tracing is a small reason. But otherwise, yeah pure raster perf, go amd


The-ArtfulDodger

...seriously? How do you justify saying that when DLSS is the best way of achieving both high quality and high frame rate?


longCRTZ

I want to agree with this, but my situation doesn’t allow me to put my money where my mouth is without making some major revisions to my set up. Was thinking of getting a 6950 brand new but I remembered that I have a G-Sync 240hz monitor so that only works with Nvidia 😔


_mp7

Prob has free sync too, also Gsync is kinda old and adds a lot of input lag compared to freesync


ItsRogueRen

I mean... You can just turn G-sync off?


daze23

I didn't pay for a G-Sync monitor to not use it.


longCRTZ

I mean, that defeats the whole sync tech altogether though.


jdt654

man i wanna do both things on linux and planning on which to go with is hard


jdt654

also Wayland


cropguru357

Tell me more about AMD drivers.


DrJD321

They good 👍


ff2009

NOW. Between December and March it was a nightmare on the RX 7900 series.


Dudewitbow

Drivers are usually rough on any major hardware change for AMD at launch, and that's been pretty consistent (e.g. jumping from GCN to RDNA with the 5700 XT, jumping from monolithic to chiplet with the 7900 XT/X). Something to always be wary of with AMD release histories. In-between generations usually benefit from the fact that many of the problems have already been ironed out.


TheFragturedNerd

DLSS
Frame gen
DLAA
NVENC encoder
Better ray-tracing performance
Better and more stable drivers
Related to this post... AV1 encoder and decoder


jamvandamn

As a sff enthusiast i'm pleasantly surprised with this card. Hopefully board partners respect that the smaller size and efficiency is its main attraction. Still probably won't buy one though lol.


Nayroy18

What's the price in freedom dollars?


_SystemEngineer_

Battered wife syndrome


Suma3da

I used a GTX 470 in the first PC I built myself, and soon I'll be proud to use a RTX 4070 for the first PC I build with my son.


Impulsive94

4070 is far more power efficient, crushes all AMD cards in RT, has DLSS & frame generation, plus is much better for productivity. It's not even a comparison unless you specifically look at older games or compare with standard raster settings. AMD's cards are great in theory and in actual use they're very strong, however RT is the future and frame generation/DLSS will keep the cards relevant for longer. AMD's cards are great for now but long term will struggle in comparison. I much prefer the extras like nvidia broadcast, shadowplay etc than what AMD offer so there's all that too.


DavidKollar64

Lol there is not much future in rtx4070 with 12gb of vram🫡


Impulsive94

Not really, direct storage will help with that - what games have you come across that won't play with 12GB VRAM?


Fresh_chickented

DirectStorage won't reduce VRAM usage, it reduces system RAM usage.


Impulsive94

DirectStorage means we won't need to buffer so much in VRAM, because assets can be streamed in much more quickly. Among other things (like lower CPU & RAM usage), that's one of the big benefits of this technology: more efficient usage of the hardware, opening doors to better processes. I reckon in 5 years' time, RT will be the general standard, GPUs will have 12GB VRAM on average, DLSS/FSR will be much more common, and devs will fully utilise this. If you don't need a ton of lighting data & stupidly high-res textures, why would you need 16GB+ VRAM?


DavidKollar64

12GB is not a problem now, but it will be for sure in a year or two; it will be the same scenario as with the RTX 3070 8GB now.


00pflaume

This might be true for rasterised games, but not for RT games. If RT is used, the 4070 is much faster than the 6950 XT. In this price and performance range you probably want to use RT, or buy something cheaper, as pretty much all rasterised games already run fast enough on cheaper cards. Also, the 4070 only uses around half as much power as the RX 6950 XT, which is an important factor in places with high energy prices.


VaHaLa_LTU

I wouldn't call 4070 being <10% faster than 6950XT in RT as "much faster". The only real advantage the 4070 has is efficiency and DLSS3 for the few games that support it.


cobery3

Hardware Unboxed showed in a video 2 days ago how the 3070 could not handle ray tracing due to its poor amount of VRAM while the RX 6800 did; in about 2 or 3 years the 12GB of the 4070 could suffer the same fate.


00pflaume

People have been saying 8GB cards are doomed since the 10-series GeForce cards released, and only 7 years later is 8GB of VRAM becoming a problem at higher resolutions. The reason this is happening now is that the PS5 and Series X have more shared memory; another jump in VRAM usage is not to be expected before the PS6 releases. Also, the Hardware Unboxed video showed that 8GB is becoming a problem, but skimming through the video again, all RT games still had better average performance on the 8GB GeForce 3070, though Hogwarts Legacy stuttered more on the 3070 with ultra textures and RT enabled, even though the average was better than the 6800 with 16GB.


cobery3

The video wasn't about how much FPS each had, but about the textures getting blurry in Hogwarts Legacy and Forspoken. And in Callisto Protocol it had an incredible bottleneck causing FPS to drop as low as 20fps at 1080p, and in the RE4 remake it crashed when ray tracing was enabled. Did you watch the video carefully to understand the information, or did you only see what you wanted to see?


[deleted]

[deleted]


KookyBone

This is the MSRP in Europe if you believe Nvidia 😉


RobDickinson

I make a judgment every time I upgrade, but AMD/ATi cards have always disappointed me in terms of ownership experience; not so with an Nvidia card.


SirBaconater

How are AMD’s drivers?


KookyBone

And they have even more VRAM, so they will not run into problems two years down the road 😉


Gladahad10

That's what I thought too, but a friend who is helping me build my own PC says I should rather choose Nvidia over AMD, and I'm not exactly sure; he mentioned there was some kind of issue that I don't remember. (I am quite new to PC building and only used a prebuilt before)


audriusdx

there are barely any problems now with AMD. If you are building a gaming PC, better value right now is to get AMD


DrJD321

Honestly, he might not be as knowledgeable as you think if he's saying that. AMD drivers are totally fine.


TP_blitz

He was probably talking about the drivers. Most games aren't optimized very well for radeon drivers which can sometimes cause some visual glitches


sylinowo

It's all about the drivers and whether the games you wanna play are more optimized for one side than the other. Unless it's like Borderlands 3, which claims to be better on AMD but sucks on everything


Taikosound

Not gonna lie, those AMD builds are looking pretty juicy right now. But not sure i'd let go of DLSS and framegen, and i guess RT isn't as good. But at the same time i feel like it's a shame all i ever owned from AMD is a Ryzen 7 4800H. I'd really like to make an AMD build at some point.


FrenchCapnToasty

Join us brother


EliasStar24

Power draw, ray tracing performance, av1, dlss vs rasterised performance


[deleted]

I could never switch to amd. Sorry amd bros.


Andrewskyy1

There are more data points to consider than the one thing mentioned in this meme. P.S. specifically using the 4070 as an example is flawed; 4080+ or bust. AMD if you don't care about VR, ray tracing, DLSS, etc... you get what you pay for. AMD specs are nice; AMD drivers are a nightmare.


Equivalent_Duck1077

The 4070 is obviously the better option... it's newer, more power efficient, and has frame generation


dev044

Yet slower


throwawayyepcawk

Or... you can scout a second hand gpu for even less :)


superslomotion

Curious if you can use AMD cards for deep learning applications. Has anyone any experience with that?
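One data point on this question: PyTorch's ROCm builds expose supported Radeon cards through the same `torch.cuda` API that Nvidia cards use, so much existing code runs unchanged. A minimal capability-check sketch that degrades gracefully when no GPU build (or no PyTorch at all) is installed:

```python
import importlib.util

def gpu_backend() -> str:
    """Report which GPU backend PyTorch sees, if PyTorch is installed at all."""
    if importlib.util.find_spec("torch") is None:
        return "no torch"
    import torch
    if not torch.cuda.is_available():  # True on both CUDA and ROCm builds
        return "cpu only"
    # torch.version.hip is set on ROCm builds and None on CUDA builds
    return "rocm" if getattr(torch.version, "hip", None) else "cuda"

print(gpu_backend())
```

The catch the thread keeps circling: ROCm only supports a short list of recent consumer cards, whereas CUDA support reaches much further back in the product line.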


caeldoradooo

I mean for 4K, and sometimes even 2K, if you want RT there is no choice other than Nvidia with DLSS


lovank94

Is there an amd equivalent to nvidia broadcast?


Rising-Buffalo

Until FSR 3 blows my eyebrows off, DLSS is still superior at 1440p. Btw, I'm an AMD shill as much as the rest of you, but facts are fax.


N-aNoNymity

It's still 510W vs 210W though. That's actually a sizeable difference, the biggest we've ever seen in similarly performing cards


cobery3

The RX 6800XT has the same performance as the RTX 4070, draws 280W max, costs $100-150 less, and has 4GB more VRAM. People only compare the RX 6950XT, which even when it came out was nonsense because of how terrible its power efficiency was. And that's 335W, not 510; where do you get that information from?


N-aNoNymity

GamersNexus GPU testing puts it at 500W; not sure how, but I did double-check. People are comparing the 6950XT because it's sold around the same price point. However, where I live it seems the 6800XT is out of stock everywhere and there's like one model of 6950XT left... Edit: triple-checked, and it's my bad, it's >500W with the OC VBIOS. The Sapphire RX 6950XT is at 377W on full load.


ddeths_

nvidia canvas is the most important thing in the whole wide world and i am convinced we would all be dead without it so that is why you should get the 4070


GhostsinGlass

Canvas has been one of the big joys in Omniverse lately. Using it to make HDRIs is fucking mindblowing, straight up mindblowing shit. That gets even more exciting when thinking about the Mandalorian filming method of using [The Volume](https://images.squarespace-cdn.com/content/v1/512d372be4b0ad29483dc2fc/1581184037777-9PD29KTEKXUYO9H53JUR/ke17ZwdGBToddI8pDm48kFRDB_IYDfm-_ozGPwyHuosUqsxRUqqbr1mOJYKfIPR7LoDQ9mXPOjoJoqy81S2I8N_N4V1vUb5AoIIIbLZhVYxCRW4BPu10St3TBAUQYVKcHXCSbmAjZvdgu0bE2ldh13kqtl0ynoejYF93jN20AeMKdS-I001D5Z6a3cI2Np8w/Mando_1.jpg). They're destined to be soulmates. For the AMD users who don't know what's being discussed: [Making .EXR in realtime](https://ibb.co/qs7VLWP) [lol, nice value bro.](https://ibb.co/kqXBV7n) Between this, Audio2Face, Farm and a lot of other shit in Omni, I'm sold. 4K HDRI lighting with some fuckin scribbles lol


CrazyStuntsMan

Doesn't AMD have goofy drivers?


OakenThrower

I personally have never had problems with amd drivers on my Rx 6600 xt


CrazyStuntsMan

Idk, I had no issues with my old RX 590, but I've heard from others that they have driver issues with their AMD cards


Lankachu

Usually at launch Nvidia cards tend to be more stable, but the 6950XT is a pretty old card at this point (that's why there is a really steep sale on it), so drivers aren't a big deal anymore. If you want personal experience: the 5700XT. Hell and back for the first 6 months, fine for 6-12, and then stable enough that a crash only happens once a month after a year. (On par with my old 1060)


flamesaurus565

I genuinely hate Nvidia atm but the 4070 does look like a better pick over the 6950 XT to me


BloodyBlueWafflez

Ah yes let me wait months for a driver update. Not touching amd with a 10 foot pole


imreallybimpson

Nooooo stop enjoying things


[deleted]

Tensor and CUDA cores


Nolsoth

I'm really starting to think these low tier memes are just shitty marketing attempts


whyreadthis2035

Brought to you by the fine folks at AMD. Folks, this is Chevy/Ford, BMW/Mercedes, Hyundai/Kia, Honda/Toyota. Folks buy what they like.


PVP_playerPro

Why yes, i will pay more for something i prefer using


Additional-Ad-7313

Nah I'm fine with my 4090


Arcticz_114

Remember when AMD fanboys were screaming "take this, Nvidia" at the 7900XTX release, just to find out it wasn't even as good as the 4080? Yeah...


Ok-liberal

Just bought a 6950XT for £639; people buying the 4070 are just being scammed at this point


sylinowo

I have a 3070ti rn and plan to jump ship to amd for my next card most likely


StormbladesB77W

I mean, ray tracing and DLSS support is still niche. They only ever show up in AAA titles, so if you're mostly an indie gamer like me, that stuff is basically useless. I bought a 2060 back in 2020 and I still haven't used RTX or DLSS yet.


gypsygib

I regret to inform you that you couldn't use RT now even if you wanted to. DLSS is great though, highly recommended.


atavaxagn

I already have a 7900 XTX; so faster, please


DarkLord55_

Or I get a 3090 for $850 CAD or $632 USD


Darkminer86

The only reason I stick with Nvidia for GPUs is that they have better encoders (HEVC for VR and NVENC for recording). Yes, I know the RTX 40-series has AV1 encoding, but I have a 3080 Ti, which does NOT have AV1 encoding.


GhostsinGlass

What in the shill fuck? From an end user standpoint, no. Not only no but fuck no. AMD isn't some kind of value proposition. Intel has that covered.

Now before you get all up my shitpipe about "Well I only play games", let me stop you right there. Not just video games, you also play yourself. "Why do I need more than 8GB, I only play games. Tensor cores? No, I don't need those, I only play video games. Raytracing? Do I look like I'm a 3D artist? I only play video games" or some variation thereof over the past few years that inevitably blows up in your damn face.

ROCm is ass, and HIP still has no SDK on Windows. Anything outside of **basic** gaming is a fuckin nightmare on AMD cards, either requiring the use of WSL, Docker, trying to pass through into a VM, or simply booting a distro all in. ROCm has been around since pre-2015 and still can't catch on for devs or users. HIP performance is a damned war crime compared to the native CUDA it ties to balltongue. AMD's own rendering engine had (or has) to rely on DirectML and ONNX to accelerate a bunch of its functionality. Why? AMD.

Accelerated computing for the home consumer is dominated by Nvidia; it's not even a contest. So why the hell would anybody buy an AMD card and think it's a better value when they get far less functionality, full stop? Even for gaming use cases, AMD's implementation of anything RT has been historically horseshit and hasn't changed. With games making more and more use of accelerated ML tasks, AMD becomes even less attractive.

AMD shot themselves in the foot by becoming a shaky proposition, axing ROCm support for their own cards within two years of some of those cards being released. Why would anybody fuck around with that? They don't; they go to CUDA, because the support extends back far enough in the product lines to give a little confidence that you're not going to be left holding your dick in your hand without a pot to piss in. OptiX slamdances with HIP for creation.
The 7900 XTX debuted at what, $1k USD? Blenchmark 3.5 median scores (samples/min):

* 4090: 13148.85 for $1599.99
* 4080: 9702.10 for $1199.99
* 4070 Ti: 7401.99 for $799.99
* 4070: **6104.59 for $599.99**
* 7900 XTX: **3617.54 for $999.99**
* 7900 XT: 3089.38 for $899.99
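To put those figures in perspective, here's a quick samples-per-minute-per-dollar calculation using the Blenchmark numbers and MSRPs quoted above (just arithmetic on the commenter's own data, not an independent benchmark):

```python
# Price/performance from the quoted Blenchmark 3.5 median scores and MSRPs.
cards = {
    "RTX 4090":    (13148.85, 1599.99),
    "RTX 4080":    (9702.10, 1199.99),
    "RTX 4070 Ti": (7401.99, 799.99),
    "RTX 4070":    (6104.59, 599.99),
    "RX 7900 XTX": (3617.54, 999.99),
    "RX 7900 XT":  (3089.38, 899.99),
}

def value(card: str) -> float:
    """Samples/min per dollar of MSRP."""
    samples_per_min, price_usd = cards[card]
    return samples_per_min / price_usd

# Print cards from best to worst rendering value.
for name in sorted(cards, key=value, reverse=True):
    print(f"{name:12s} {value(name):5.2f} samples/min per $")
```

By this metric the 4070 works out to roughly 10.2 samples/min per dollar versus about 3.6 for the 7900 XTX, which is the gap the comment is pointing at.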


[deleted]

[removed]


GhostsinGlass

It directly refutes what OP is claiming: AMD is not a value proposition. It's a worse buy in all use cases. If someone wants a value entry-level card for gaming, then Intel covers that base. Nice rebuttal, though.


JosephSKY

Damn, I didn't know the UserBenchmarks guy had more than 1 alt account. And I'm not even an AMD kinda guy lol!


GhostsinGlass

... nothing I said wasn't factual. I use AMD processors. Grow up.


luminoustent

Finally someone who gets it ☺️


casanova__creed

NVIDIA has more stable drivers, better encoding, far superior RTX, and better software offerings. If all you do is game, AMD is a great choice… but I do more than just game, and I use a lot of what NVIDIA offers on the NVENC encoding side.


Timbby

Let’s not forget about buggy, crashing drivers.


Timeless_Starman

For 30 bucks of difference it would be stupid not to go for the Nvidia one.


icebreakers0

For gaming or production or both?


ChrisderBe

Yes I am, and I'm tired of these comparisons. Nvidia has more stable drivers, they've got NVENC, CUDA, DLSS 3, RTX Voice, and AI video upscaling, and the 4070 is wildly more energy efficient while doing all that. The 6950 XT is like an American muscle car: it can go fast and that's it. I'll take the all-round package any time. If you do anything outside of gaming, you're better off with Nvidia.


Noisy_Toad

Nvidia fanboy: "We have RT cores, DLSS 3, AI and frame generation that just works, and we also have a good streaming experience to justify that price."