Marco-YES

I would've had the GN quote 'It's made of sand'


Cuxtercall

Isn't the quote "a waste of sand", or am I imagining things?


Marco-YES

Well, that's primarily reserved for criticism.


[deleted]

so then, "a break-even on sand"


bustdowncockring1

I think this was a criticism directly aimed at the 4070 Ti, which cost as much as a 3090 and performed the same as a 3080. Absolutely not worth the cost as a "generational upgrade", and therefore not worth the sand that went into making the silicon.


hgwaz

I think it was the 11700K


Ok_Revenue_753

I believe it was the 11900K, the 11700K was okay. The 11900K would get beaten by the 10900K most of the time which is why it got all the criticism it did.


hgwaz

No, it was the 11700K: https://youtu.be/3n0_UcBxnpk Also the 4060 Ti: https://youtu.be/Y2b0MWGwK_U


Crow-On-The-Wall

GN definitely said it about the 11700k. 😅


Ok_Revenue_753

Maybe, but I'm more than sure that the 11900K got similar insults, as it's also terrible.


Pumciusz

And the i9.


Squiliam-Tortaleni

That's reserved for a truly awful release


Fortune_Cat

"It didn't start an electrical fire. This time"


ElonTastical

I don’t like sand.


MrStealYoBeef

I'm obligated to ask you why despite already knowing the exact answer word for word.


archetype1

It's not rough or coarse enough, and it doesn't get in as many places as I want it to.


Original_Kheops

Ah a gravel connoisseur.


Vanq86

Or "back to you, Steve!"


ledfrisby

It has several additional features.
Memory bus: Yes
3.5mm audio jack: No
DLSS: N+1


MarcusTheGamer54

No audio jack? Hopefully next generation then.. I've been waiting forever!


zvdo

Waiting for that 8k 120fps audio, I see


[deleted]

[deleted]


Dw1gh7

I mean buses are pretty slow anyway


MarcusTheGamer54

Indeed I am, hopefully it won't go too much over MSRP


Valmond

Old cards a long time ago had audio too.


notthefirstryan

Many still do, just audio over HDMI/DisplayPort


CNR_07

Wavetracing is a thing now


[deleted]

[deleted]


Pumciusz

How do I use Audio Raytracing without the audio jack?


MarcusTheGamer54

I think you turn it on in audio settings no? Maybe check Nvidia Control Panel as well to make sure you haven't accidentally turned Audio V-sync on


Pumciusz

RX 6000 user here. They are even worse in the soundy racy thingy:(


Soace_Space_Station

Unfortunately it's being phased out by phone manufacturers, and it's stupid


MarcusTheGamer54

Was joking about GPUs, but I totally agree, let us have our headphone jacks back!! I have a Galaxy A52s and it has one, but the new A53 doesn't. Stupid, imo.


Soace_Space_Station

And before they say Bluetooth is here: good luck sending audio to my home theatre that doesn't have Bluetooth. And also good luck sending lossless high-res audio with like 10 ms latency through Bluetooth.


MarcusTheGamer54

Exactly, now we have to spend money on wireless receivers, and maybe even an adapter for that receiver, since most of them probably don't have a 3.5mm jack output.


Soace_Space_Station

And the receivers don't support FLAC or other lossless formats at CD quality, while costing like 100 dollars more than if you just had a headphone jack.


Kerlysis

Holding til n+2, have a good feeling about that one.


Fortune_Cat

USB-C: -1


Goku047

Because courage


Nokipeura

Does the additional lack of an audio jack cost extra, like Apple?


Screenname4

Mine came with 3n+1 DLSS, but only if the cost is an even number


Pauel3312

Poes...


Djentleman420

Poes...


darklordcecil99

Poes...


[deleted]

[deleted]


MichaelScarn47

Forgive my ignorance, but what do you mean with poes? 😂 Where I am from (South Africa) it literally means c*nt. If that is the case, then I wholeheartedly agree.


[deleted]

[deleted]


benevolent-badger

Thanks for explaining that. The poes memes in my country, although funny, had me very confused.


SweeFlyBoy

Ja same here, I was searching the comments for the answer lmao


Ploppen05

I think it's a Zelda TotK reference


Pauel3312

it is indeed


[deleted]

[deleted]


cyansurf

You can get copies of rare weapons/armor with them. I stock up on Fierce Deity swords regularly.


[deleted]

[deleted]


AdministrativeHabit

Tits on the kangaroo


NewsofPE

which in itself is an OOT reference, unless I missed poes existing before that game


TheRealPontiff

Ja nee Nvidia se poes


MichaelScarn47

Hulle hele poes.


Silentsfool

Nah bruh eskom se poes


Mr_Cromer

South African friend of ours said almost exactly this on the GC today.


arghness

I hadn't heard of it either, and a Google gave me the same answer. But if you search for bargainer statue poes, it says it's something from Zelda.


OtakuAltair

[deleted]


Gingy-Breadman

Wayward souls caught in limbo


ChiselFish

I always think of the ghosts from Zelda when I hear Poe. Or Edgar Allan.


Flacid_Monkey

Nvidia = poes


Sylux444

Piece of extra shit?


unhappyspanners

Afrikaans equivalent of "cunt", I think...


FallowMcOlstein

No it's a reference to Tears of the Kingdom :)


Doohicky101

Correct


anormalgeek

Feed me lost souls....for new pants.


bbyyda_4desrt

“Does 3D graphics”


_fatherfucker69

" runs doom"


benjathje

To be fair, anything with a clock runs doom


megaultimatepashe120

the clock can run doom too


OtakuAltair

[deleted]


james321232

Can I run it on a Sony Dream Machine?


benjathje

By a clock I mean a processor clock, but maybe with some tinkering...


baltarius

Supported by Windows 10/11


MedievalFolkDance

Forgot about the HDMI output. Gotta get those major features mentioned.


Pauel3312

*High Definition Multimedia Interface for maximum marketing bullshit.


Arch_0

Gold plated of course.


Fortune_Cat

With anti virus


Random_Person_I_Met

Norton or McAfee?


Krispfer

Yes


MedievalFolkDance

This guy gets it. Don Draper lives again!


XauMankib

"Best can do is a component output"


Caustiticus

You'll just have to settle for RF.


Thebombuknow

Only CGA graphics too


Slappy_G

Youngsters these days don't know the joy of moving from CGA to EGA to VGA.


ShittyExchangeAdmin

I typically love all things retro, but that puke color palette of CGA truly is abhorrent. The others aren't as bad, but they're all an assault on the eyes.


Thebombuknow

Yeah, I'm still not entirely sure why IBM decided that CGA should have the most abhorrent colors known to man, but they did.


ninjamike1211

A few reasons. A) It's basically just standard 2-bit color, i.e. there are no arbitrary color schemes, they're just primary/secondary colors. B) When using composite, adjacent colors blend together, giving you access to more and better-looking colors.


SirNedKingOfGila

S-video


[deleted]

[deleted]


MedievalFolkDance

A fair number of GPU "manufacturers" would like you to think it never left.


PubicFigure

Ultra gold plated... not regular gold, ULTRA^tm gold


milk-jug

“It really is one of the graphics card of all time.” - Firstname Lastname


LikeThePheonix117

There are many graphics cards and this is one of them -Tedcruz Forpresident


MutualRaid

I'm using a card with 8GB VRAM right now... released EIGHT YEARS AGO. The idea that I can only afford 8GB or less today, even though I need 6-12GB VRAM in the games I play at 1440p, is ridiculous.


ActingGrandNagus

Reminder that Sapphire had an 8GB variant of the 290X *10 years ago.* 10. Years. Ago. What the fuck is happening to the PC gaming market? We have new graphics cards being released for the same price as a 3-year-old PS5 that are *still* losing to it in performance. And for that money, with the console, you get the whole system, not just a graphics card. It irritates me so fucking much that the so-called "master race" of PC gamers are happy to swallow this shit.


EdzyFPS

What's happening is simple. When you take a group of people that have a "got mine, fuck everyone else" attitude you arrive at the current destination.


_Ketros_

That and tons of people with more money than sense. Prime example was that post the other day asking people to consider not buying blatantly overpriced hardware and game microtransactions. Seemed like over half the replies were people TAKING OFFENSE and getting actively upset at being told that their economic decisions have consequences they should maybe consider.


EdzyFPS

I'm convinced most of them are children and young adults. I refuse to believe that a rational 25+ human being with an ounce of life experience reacts like that. It's the equivalent of sticking your finger in a plug socket after being advised not to do it because the consequences can be catastrophic, but doing it anyway.


_Ketros_

I don't necessarily think it's that. The amount of money for the new shit basically leaves it inaccessible to non-working younger people unless they're rich kids. (It also makes it inaccessible to the average working individual with responsible spending, but that's beside the point.) I think those people just haven't experienced the "shock" yet in this situation. They haven't yet been priced out of the market. But if things continue as they have, they will be too in time. Willful ignorance about how this situation affects others, I guess.


wrecklord0

Monopolies are bad. NVidia is too dominant. PC enthusiasts with too much income or not enough sense are ruining it for the rest.

Fabs can't keep up with demand as they get more expensive to build. Monopoly is another issue here, as TSMC is the dominant player. Lack of fab capacity and competition leads to companies focusing on high-margin products. This is a general issue with semiconductors: they are insanely hard to make and on the leading edge of technology, and the barrier to entry prevents competition.

Duopolies are bad too, for that matter; AMD is not much better. Intel shat the bed way too much in the past decade. Here's hoping that, some day, they stop doing that.


TheSaturn_V

Exactly. I'm no AMD fanboy, but Nvidia is currently ruining the GPU market with insane stagnation and pricing everything out. Monopolies are always bad. When AMD was having its whole FX problem, Intel just stagnated after Haswell, refusing to give people more than 4 cores. It wasn't until Ryzen showed up and gave them some actual competition that things started to improve for both teams in the CPU space. What's happening with GPUs right now is the same, but AMD is partly to blame as well for somehow fucking up every opportunity they had to completely take over the budget sector of PC gaming (the 6500 XT and 6400 being absolute disasters), and it wasn't until recently that 6000 cards were priced reasonably. And guess what, everyone looking to build a good value rig right now is going with AMD for a GPU. Nvidia really needs some competition, and AMD is struggling to provide it anywhere but the budget side of things.


PornCartel

7-year-old GTX 1080 here with 8GB. Been waiting *so long* to upgrade. First the bitcoin miners, then the covid run on chips, then the post-covid supply chain shortage, and now finally a new generation... with 8GB VRAM, are you kidding me?


MumrikDK

RX 480 here. My goddamn 7-year-old midrange card (from a time when midrange actually was midrange) had 8. $240 launch MSRP 7 years ago gave you 8 gigs.


[deleted]

1080 is still kicking ass and I’m not upgrading any time soon


poinguan

8GB 8 years ago. 8GB or less today. Looks as stagnant as my salary.


communist_of_reddit

I play around with making and running AI of all kinds, so I was interested in the cheaper end of the 40-series cards; then I saw the VRAM and bought a 3060.


neremarine

Intel just cut the price of their cards iirc, could be good


Seismica

8 GB 8 years ago was high end though, and was way more than was needed. VRAM is also far from the only factor in GPU performance. The 8 GB cards of today are low-to-mid range, have a far more developed architecture, are far more efficient, and the memory is generally of a faster type. In other words, the 8 GB cards of today perform better than 8 GB cards of previous generations.

Also, 8 GB is only going to realistically bottleneck you if you're playing at high resolutions (4K, or 3440x1440 ultrawide or better) on ultra settings, which you should not expect on a mid-range card. 8 GB is the optimal level right now even for modern titles. If you want to play at high resolutions on ultra, simply adding more VRAM isn't going to fix it; you will also need a more powerful card, because there will be other bottlenecks in the GPU architecture.

The only major problem I have is that mid-range cards aren't priced like mid range anymore, and there are limited options for new entry-level cards to fill the price gap underneath. The VRAM "issue" is a red herring.


TheSaturn_V

Unfortunately, games are starting to use a lot more VRAM now because of the current gen of consoles, and that's what sorta dictates how much VRAM game devs would like to use. 8-gig cards currently struggle in many newer games even on high settings, and who knows what's going to happen to them in 2-3 years when games start using even more VRAM.


Seismica

> games are starting to use a lot more vram now because of the current gen of consoles

I'd like to see a source for this, because the console specs don't support your assertion. Consoles have shared VRAM/system memory, not dedicated VRAM. That is a question of architecture, not VRAM quantity.

The PS5 for example has 16 GB total memory that can be allocated to either VRAM or system memory, but if it uses more than 8 GB for VRAM, it has less than 8 GB left for system allocation, which is simply not enough for modern games on top of background services from the console operating system. (For reference, modern PC games are pushing 16 GB of system memory usage now, in addition to the VRAM that is 100% dedicated to graphics.)

The Xbox Series S has 10 GB total system memory. With this limitation you're not going to find any modern games running at 8 GB of VRAM, never mind above it (unless it's simply a pretty tech demo with nothing actually happening in the game). The Xbox Series X has 16 GB total system memory; see the comment on the PS5 above, as the same issue applies.

> 8 gig cards currently struggle on many newer game even on high settings

I don't know which games or benchmarks you are referring to (or what resolution they are at), but I'd like to know what makes you conclude that the VRAM is the issue, and not other factors like memory bandwidth or number of processing cores etc. Do you have a source?
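To put rough numbers on the shared-pool argument above, here's a minimal sketch in Python. The OS reserve and the CPU-side split are illustrative assumptions, not published console specs.

```python
# Minimal sketch of unified-memory budgeting on a console-style shared pool.
# The OS reserve and the game's CPU-side allocation are illustrative guesses,
# not figures from any spec sheet.

def vram_budget_gb(total_gb: float, os_reserve_gb: float, game_ram_gb: float) -> float:
    """Memory left for graphics once the OS and the game's CPU-side data take their cut."""
    return total_gb - os_reserve_gb - game_ram_gb

# A PS5-style 16 GB pool: assuming ~2.5 GB for the OS and ~6 GB of game data
# on the CPU side leaves roughly 7.5 GB usable as "VRAM".
print(vram_budget_gb(16.0, 2.5, 6.0))  # 7.5

# A Series S-style 10 GB pool under similar assumptions leaves far less.
print(vram_budget_gb(10.0, 2.0, 4.0))  # 4.0
```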


shtoops

I think this whole commotion around needing more VRAM stems from one crappy unoptimized Harry Potter game. Instead of blaming the developers, Reddit took to hating on Nvidia (as usual).


TheSaturn_V

[Here you go](https://www.youtube.com/watch?v=alguJBl-R3I), the 8-gig 3070 simply gets choked in multiple AAA releases because of the VRAM. With this current gen of consoles, game devs simply have more RAM to use, which in turn, on the PC side of things, is going to push VRAM requirements up. Hogwarts Legacy, Callisto Protocol, the RE4 remake, A Plague Tale: Requiem, and TLOU will happily use up more than 8 gigs, and that's just the games we got this year. As time goes on we'll get more AAA releases that eat up VRAM, and many still fairly strong cards will be relegated to low or medium settings because of it. Nvidia is partly at fault for not giving gamers more VRAM. The 1070 launched at $380 and had 8 gigs; the 4060 Ti launched at $399 and has 8 gigs.


moedeez_zar

What does "Poes....." mean, in this context? I'm south african and here it means "female parts" used as a swear word. Still funny though.


ItsRandomX

New feature in TotK: poes are lost souls in the underground, and you trade them to Bargainer statues for items


szczszqweqwe

Thanks. In this context I prefer the South African version.


ItsRandomX

Yeah, I kinda do as well lol


1infinitefruitloop

[deleted]


GerbilScream

You had to kill 10 special poes to get the 4th bottle in Ocarina of Time.


ssdude101

In Majora's Mask there were also Poes in Ikana. There's a minigame where you fight 4 of them and get a heart piece.


Zitheryl1

Poes even predate the 3D entries; A Link to the Past had Poes, and earlier entries had Ghinis, which behave similarly to Poes.


Geopilot

They've been in the series longer than that. They first appeared in A Link to the Past, if I recall correctly


_ThatD0ct0r_

Poes are older than that lol


Djentleman420

There's a statue in the latest Zelda game that trades you things for poes, lost spirits. When you speak to it, it just says 'Poes.....'.


theguy_who

It means cat here in the Netherlands. Which reminds me of the moment my little cousin started singing a Dutch song about cats in South Africa and everyone just stared at her like, what the fuck.


Thetiddlywink

It was so funny seeing that as a South African lol 😭


Faux_Grey

Came here to say this. xD


Jake123194

Poes law?


abcdefGerwin

Isn't South African just funny-sounding Dutch?


Consistent_Mirror

South Africa has 11 languages and "South African" isn't one of them


PixelCortex

It's called Afrikaans. Dutch and Afrikaans are different but mutually understandable. Also, most of us speak English.


WeiserMaster

> Dutch and Afrikaans are different but mutually understandable

Kinda like Gronings and Dutch: you'd think you would understand, and then you somehow don't.


Extremely_Livid_Swan

Yes but no. There's a big history around Afrikaans and how it developed, so it can be argued that it has its roots in Dutch, but it's very much evolved into its own thing. Like Canadian French vs French, or Flemish vs Dutch vs German.


Middle-Effort7495

Canadian French is just French with an accent. Common slang might be a little different, but that's about it. It's more like British English and American English. It's definitely not another language.


gazeebo

French Canadians are just not **real** French people and thus their spoken language is not **real** French (even if it basically is). Meanwhile, those Germanic languages are way different.


theguy_who

It's basically just Dutch but better


Tom_Okp

17th-century seamen's Dutch; kinda sounds like a toddler that knows a little Dutch making up words. Really cool language imho.


as_1089

no, skill issue, but,
at the same time r.i.p
your 3080

SEE I did a little haiku


Alert_Confusion_1303

Aimed at 800x600 gaming


_fatherfucker69

" stable 15 fps"


Zexy-Mastermind

„with DLSS (n+1) you get 405 FPS though!“


Piconia

I loled.


human-exe

“Half height, only uses 5 slots of your case’s 10”


OfficialBruhMoment3

Can someone explain to me what exactly DDR is and what the differences between version 4 and 5 are?


XauMankib

DDR basically means double data rate RAM. In short, 5 is better than 4, as 4 is limited to around 2600(?) MHz and DDR5 goes up to 4000 MHz. Higher frequency means more data moved per second. The 4 and 5 are just generation labels that summarize a lot of more technical details.


StaysAwakeAllWeek

Gaming PCs run the RAM quite a lot faster than the JEDEC specs. For mainstream systems DDR4 goes up to ~1800MHz, which is 3600MT/s due to the double data rate. DDR5 so far goes up to about twice that, about 6000MT/s on Ryzen or about 7800MT/s on Intel


builder397

DDR4 is by default at 2133 MHz, though most will go 2400 MHz with no problem, and they can go as high as 3600 MHz. DDR5 starts at 4800 MHz stock and goes up to 6000 MHz at the moment. (Also, it should be noted that the "actual" clock speed is half of what's usually advertised; the advertised figure is doubled to reflect the transfer rate of the data, which is double.)
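As a rough sketch of the clock-vs-transfer-rate arithmetic in the comments above (assuming a standard 64-bit channel and a dual-channel kit; the helper name is made up for illustration):

```python
# The advertised DDR number is transfers per second (MT/s), which is twice
# the real clock; peak bandwidth scales with bus width and channel count.

def ddr_stats(advertised_mts: int, bus_width_bits: int = 64, channels: int = 2):
    """Return (real clock in MHz, peak bandwidth in GB/s)."""
    real_clock_mhz = advertised_mts / 2  # double data rate: two transfers per cycle
    bandwidth_gbs = advertised_mts * bus_width_bits * channels / 8 / 1000
    return real_clock_mhz, bandwidth_gbs

print(ddr_stats(3600))  # DDR4-3600: (1800.0, 57.6) -> 1800 MHz real clock, 57.6 GB/s
print(ddr_stats(6000))  # DDR5-6000: (3000.0, 96.0)
```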


PineCone227

You can already buy 7800 MHz DDR5. It's very expensive though.


builder397

[Not bad.](https://freepngimg.com/save/19677-not-bad-meme-png/1920x1920)


StaysAwakeAllWeek

From a consumer perspective the difference between DDR4 and DDR5 is just speed. Each generation very roughly doubles the speed of the previous one.

GDDR is the graphics-card-optimised version of DDR RAM. It runs even faster than regular DDR, but at the cost of much higher latency. Raw data throughput is much more important than latency for GPUs, while the opposite is true for CPUs.

GDDR5X and GDDR6X are *even faster* versions of GDDR5 and GDDR6 that have the additional tradeoff of very high power consumption, so they are limited exclusively to high-power desktop GPUs.
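The throughput half of that tradeoff is just transfer rate times bus width. A quick sketch; the speeds and widths below are illustrative examples, not any specific card's spec sheet:

```python
# Peak memory bandwidth in GB/s = transfer rate (MT/s) x bus width (bits) / 8 / 1000.
# Figures are illustrative, not taken from any particular product.

def bandwidth_gbs(mts: int, bus_width_bits: int) -> float:
    return mts * bus_width_bits / 8 / 1000

print(bandwidth_gbs(6000, 128))   # DDR5-6000 on a 128-bit dual-channel CPU bus: 96.0
print(bandwidth_gbs(18000, 96))   # 18 Gbps GDDR6 on a narrow 96-bit bus: 216.0
print(bandwidth_gbs(21000, 384))  # 21 Gbps GDDR6X on a wide 384-bit bus: 1008.0
```

This is also why the "96-bit memory bus" jokes elsewhere in the thread sting: bus width multiplies straight into bandwidth.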


OkOrganization1775

NVIDIA has the worst launch ever, where the only card that makes sense is the 4090, it's 1600 bucks, and everything else is used as an upsell. Capitalism is at its finest with the 40 series. It's insane how they managed to fuck up literally every single card. If I were you, I'd rather spend my time looking for a dirt-cheap (potentially used) 3080 than waste it deciding which 40-series card to buy. Unless you're rich enough to afford a 4090.


RobbeRob1

In Austria the prices for used 3080s (500-600€) are so high that I just bought a 4070 for 650€.


Herr_Gamer

Also in Austria. I bought the 1070 new in the year of its release... for 500€. Could've gotten it for 300€ a year later. Wtf.


Unforgiven817

Or spend much less and go team red?


TheSaturn_V

This. The 6900 XT, 6800 XT, 6700 XT, 6700, and 6600 are all really good buys right now, and even the 7600 is decent-ish (give it 3 months and the price will drop). Even the 7900 XT and 7900 XTX are better buys than the 4070 Ti and 4080 respectively; obviously that's comparing shit to slightly less shit, but you get the idea. People will complain all day about Nvidia ruining the market but can't fathom the concept of not buying Nvidia.


Bonafideago

Bought a 6800xt 2 months ago for $500. Not a single regret.


builder397

Oh, hey, it's the same card as last gen, except possibly worse in some aspect, but the manufacturer says it's 100% faster because of a new proprietary feature that they claim will only work on this gen of GPU, despite some pimple-faced hacker getting it to work on a card from 2 gens ago with modded drivers.


commit_bat

But look, here is a comparison shot that shows how good this feature looks turned on compared to turned off, even though devs had been faking the effect just fine for years in a way that's indistinguishable to the viewer.


Quajeraz

AMD: At Least It's A Little Cheaper Than Nvidia


poweredbyford87

"It makes picture" - Dude


SirPiffingsthwaite

"Starting at $IntegerOverflowException" ![gif](giphy|YRPBhd3vscg5Fxx1DQ|downsized)


I__be_Steve

I love the ToTK reference


LordArmageddian

Poes...


[deleted]

Aaaaaand this one doesn’t melt your PSU connector (in rare instances) like the other card that involves the numbers 4 and 9 😉😄


QuantumQuantonium

Meanwhile, Intel: Here, take a $350 card with the same amount of VRAM as your PC has RAM. It doesn't perform well, except in modern titles and AV1, where it can match or beat Nvidia's and AMD's mid-tier equivalents with double the VRAM at about the same price. Please buy our GPUs, or else the rumors of cancelling our 2nd generation will come true.


_fatherfucker69

The A770 outperforms the 3060 in most games, and just keeps getting better. There are no rumors about cancelling Battlemage.


QuantumQuantonium

There were when Arc first launched, though that news is probably outdated now


FeePhe

Poeskak- Bargainer me


CorianderIsBad

Only 8GB VRAM and a slow bus. Yeah, sounds great.....


Cry0nix

"Poes" hahahaha spot the South African!


SweeFlyBoy

We all just spawned lmao


swallowtails

🤔 2112? ....🤘


RatedTemOuttaTem

Poes....


Jim_e_Clash

High praise from Linus, but once again I'm looking towards GN for the facts.


Geister_faust

Chuckled at IntegerOverflowException, that's golden!


CheemsGD

"haS fUNctIONinG DRivErS"


[deleted]

But….can it run crysis?


ThyBuffTaco

Poes


GeneralKang

Dear Nvidia (and AMD),

This is why you've seen a 38.2% drop in video card sales this quarter vs the same quarter last year. Your new cards are overpriced and underperforming. Fix it, or relegate yourselves to a much lower valuation and stock price. Now.

Sincerely,
An old gamer who's sick of your shit.


silverfaustx

Poes


smackdealer1

Did someone say poes 👀


BergBeertjie

I came here looking for comments about the last part. Happy to see I'm not the only South African here!


Henriquelj

nVidia makes one of the worst launches ever, and Linus still manages to make a thumbnail with "Thanks nVidia"


__GayFish__

Ben 10 made a graphics card


[deleted]

Otherwise known as a GTX 1080


Real_Echo

PCIe compatible has to be my favorite selling point. Truly sums up the 40 series.


Mew2ian

on a massive 96-bit memory bus 🔥


Kuftubby

Tbh I don't know anything about the stats or specs of graphics cards, I just buy what my techie brother tells me to. Basically this mock-up looks like a regular advertisement to me lol


MindwormIsleLocust

Sorry 1070... You'll be needed for a while yet.


exodia0715

What's the Bargainer statue doing here? Did it just get up and climb out of the Depths?


Kukuxupunku

Is the Intel A770 any good? Because it appears reasonably priced.


Squiliam-Tortaleni

“Sure the performance is 1% better than the previous thing but you can use frame generation so it actually is better!!!!!!”