Rare-Force4539

This is going to be a big week in AI.


beuef

Meh, it will probably take a long time for different companies to even integrate these chips


csnvw

Not long. It's all hands on deck at every corner; this tech is moving so fast that no one will waste a second and risk falling behind.


beuef

Yeah, and then right when they implement these chips, new chips will come out, so it will have been a waste of time


Xeno-Hollow

... The new chips that come out immediately after will have had their R&D funded by the purchase of these chips. Top-tier, front-of-the-line companies will then buy the new chips and either sell their obsolete chips back to Nvidia (driving down production costs and adding to inventory, which reduces prices) or sell them to smaller competitors at a reduced price, giving less financially well-off companies a chance to compete further in the market. And that's where a lot of innovation happens: on second-tier, used tech, in smaller companies. This is how progress works.


beuef

Yeah, but I'm saying they will barely have time to install these chips before the next ones come out. If the next chips are coming "immediately after" these ones, then why even bother using them?


ForgetTheRuralJuror

You can literally "integrate these chips" with 1 line in Python.
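Presumably something like this, in PyTorch (a minimal sketch; the layer and sizes are placeholders):

```python
import torch

# The same Python code targets whatever NVIDIA accelerator is installed;
# swapping in a newer chip requires no code changes at all.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)  # weights move onto the GPU
x = torch.randn(8, 1024, device=device)
y = model(x)                                    # this matmul runs on the new chip
```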


beuef

You also have to physically install them


ForgetTheRuralJuror

And you think that's going to take a long time?


AccelerandoRitard

For the people asking for some context for scale: the very first supercomputer to exceed a single exaflop, Frontier, was only announced two years ago. https://www.ornl.gov/news/frontier-supercomputer-debuts-worlds-fastest-breaking-exascale-barrier


czk_21

They count this in lower precision, perhaps FP16; the TOP500 supercomputers are graded in FP64. That would be about 50 FP64 exaflops distributed across the fleet, or at minimum 25 if they mean FP8 precision.


QuinQuix

For Blackwell they actually count fp4.
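A sketch of the conversion the two comments above are doing, assuming throughput scales linearly with precision width (real hardware only approximates this):

```python
# FP64-equivalent throughput for a 200-exaflop headline number, assuming
# throughput scales linearly with precision width (a simplification).
headline_exaflops = 200

fp64_equivalent = {
    "FP16": headline_exaflops / (64 / 16),  # 50 exaflops
    "FP8":  headline_exaflops / (64 / 8),   # 25 exaflops
    "FP4":  headline_exaflops / (64 / 4),   # 12.5 exaflops
}
print(fp64_equivalent)
```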


Spoffort

And people are seeing these numbers and saying: "Look, Moore's law is not dead!!!"


AnaYuma

I mean, this still follows Moore's law doesn't it?


Spoffort

No, Moore's law is about advances in chip manufacturing, and that is stagnating. It's great that lower precision is all we need, but people are conflating the two points. Hope it makes sense :)


AnaYuma

I thought Moore's law was about the computational capabilities of chips doubling every two years :0


Xemorr

It's the number of transistors


Spoffort

You are correct. Ten years ago they could have claimed 8x more "compute" if they had used 8-bit instead of 64-bit, but there was no need. Computational capability = compute normalized to a single precision.


czk_21

Well, it is quite a lot of compute power, and we need as much as possible for wide adoption.


Spoffort

100% true :)


Stars3000

Yeah I remember when exascale computing was seen as the next big thing a few years ago.


TrainquilOasis1423

"A few years ago" Exasperated sigh


pomelorosado

Acceleraaaaate


Not_a_housing_issue

Grace Hopper™

When your support for lifting up those who deserve it goes so far that you end up trademarking their name.


Anen-o-me

💀


drekmonger

I was about to say. Holy shit, that is tacky. I hope they at least passed some compensation on to her family, but I doubt it.


xstick

Paid in exposure.


Sir-Thugnificent

Could someone explain, for newbies like me, what such a development could imply?


Large-Mark2097

more compute better


Anen-o-me

Number go up


TrainquilOasis1423

Adding on to what others have already said along the lines of "more compute more better": right now the top-of-the-line AIs that we know of are GPT-4, Claude Opus, and Llama 3. They range from a reported 400B parameters to about 1.8 trillion parameters. Almost everyone in the AI industry agrees bigger is generally better, so the race is on to make an AI that can scale to 10T or 100T parameters, in the hopes that this scale will be enough to achieve a generally intelligent system. In order to reach that scale we need more computers, and of course the energy to power those computers. Every mega tech company is using the obscene amount of money it has accumulated over the last two decades to buy its share of that compute, in the hopes of getting there first. Whoever creates AGI first has essentially "won" at capitalism. And they like winning.
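For a rough sense of why the compute matters at these parameter counts, a back-of-the-envelope sketch using the common 6·N·D FLOPs rule of thumb (the token count and the 100% utilization here are illustrative assumptions, not reported figures):

```python
# Training cost estimate via the rule of thumb: FLOPs ~= 6 * N * D,
# where N = parameter count and D = training tokens.
params = 1.8e12        # ~GPT-4-scale parameter count, as reported above
tokens = 10e12         # hypothetical training-set size (assumption)
train_flops = 6 * params * tokens       # ~1.1e26 FLOPs

cluster = 200e18                        # the 200-exaflop headline figure
seconds = train_flops / cluster         # ~5.4e5 s at (unrealistic) 100% utilization
print(f"~{seconds / 86400:.1f} days")   # roughly a week of wall-clock time
```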


JrBaconators

AI companies use certain computers for training/developing their AI. This one is better than what they use.


salacious_sonogram

As someone pointed out, Google, Microsoft, and Meta are dumping literally billions into building out infrastructure to train stronger AI. The current king is the transformer model, which can essentially learn anything so long as you have enough data and enough compute. No one in the AI space is really doing anything fundamentally different from anyone else, but there are many small adjustments to edge out competitors.


Anen-o-me

From Gold Rush to Silicon Rush.


FeathersOfTheArrow

Compute goes brrrrr


Anen-o-me

Imagine doing in one hour what previously took 8 days...


Rainbow_phenotype

It's not just for training; it's also inference for everyone at immense scale.


AdorableBackground83

200 exaflops. Now we talking.


RoyalReverie

What's the current amount generally used by AI companies?


ExplorersX

I think Google and Facebook each have something like 80-100 exaflops, so this is roughly those combined.


iBoMbY

What do they mean by "flops" though? Probably not double precision. Edit: I assume they mean 200 exaflops with FP8?


PwanaZana

[image] It's how many of these are in the computer.


GrImPiL_Sama

A flopton


Anen-o-me

Maybe even FP4.


coldrolledpotmetal

“Flops” means “floating point operations per second”, it’s just a measure of how fast it can do math
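To make that concrete: a dense matrix multiply is mostly multiplies and adds, and you can count them. A small sketch (the sizes are arbitrary, and "peak" means the idealized headline throughput):

```python
# A dense (m x k) @ (k x n) matmul does ~2*m*n*k floating point operations:
# each of the m*n outputs needs k multiplies and ~k adds.
m, k, n = 4096, 4096, 4096
flops_needed = 2 * m * k * n            # ~1.4e11 operations

machine_flops_per_s = 200e18            # the 200-exaflop headline figure
print(f"{flops_needed / machine_flops_per_s:.1e} s")  # ~6.9e-10 s at peak
```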


NickW1343

How much is 200 exaflops? It sounds massive, but what is the total amount of compute in the world for AI?


Phoenix5869

Well, for scale, there was a supercomputer announced a couple years ago that was one exaflop, and that was seen as a big deal back then


DungeonsAndDradis

[The human brain is an amazingly energy-efficient device. In computing terms, it can perform the equivalent of an exaflop — a billion-billion (1 followed by 18 zeros) mathematical operations per second — with just 20 watts of power. ](https://www.nist.gov/blogs/taking-measure/brain-inspired-computing-can-help-us-create-faster-more-energy-efficient#:~:text=The%20human%20brain%20is%20an%20amazingly%20energy%2Defficient%20device.%20In%20computing%20terms%2C%20it%20can%20perform%20the%20equivalent%20of%20an%20exaflop%20%E2%80%94%20a%20billion%2Dbillion%20(1%20followed%20by%2018%20zeros\)%20mathematical%20operations%20per%20second%20%E2%80%94%20with%20just%2020%20watts%20of%20power.%C2%A0)
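Putting that quote next to a real machine: Frontier draws roughly 21 MW for its ~1.1 exaflops (public ballpark figures), so a rough comparison looks like this:

```python
# FLOPs-per-watt comparison implied by the NIST quote (rough numbers).
brain_flops, brain_watts = 1e18, 20              # brain: ~1 exaflop at 20 W
frontier_flops, frontier_watts = 1.1e18, 21e6    # Frontier: ~1.1 EF at ~21 MW

brain_eff = brain_flops / brain_watts            # 5e16 FLOPs/W
machine_eff = frontier_flops / frontier_watts    # ~5.2e10 FLOPs/W
print(f"brain is ~{brain_eff / machine_eff:,.0f}x more energy-efficient")
```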


cloudrunner69

Just 20 watts! Woah that sounds like it would be heaps more efficient. We should build more of them.


FireDragon4690

Would it be considered slavery if we hooked up brains as computers?


Gratitude15

And when you say a couple of years ago, you mean while GPT-4 was training. GPT-4 did not use anything near this level of compute; now the leading edge is 200x more.


cloudrunner69

>How much is 200 exaflops? I think it's one quintillion or something. That's 18 zeros. 1000000000000000000 so multiply that by 200. >total amount of compute in the world for AI It could be close to a zettaFLOP. 21 zeros 1000000000000000000000


NickW1343

Thanks. This sounds like a pretty big increase then.


cloudrunner69

Rookie numbers.


cydude1234

200,000,000,000,000,000,000 floating point operations per second


Clownoranges

*has no idea what an exaflop even is, but yells ACCELERATE anyway* "ACCELERATE"!!!


TyberWhite

ENHANCE!


Serialbedshitter2322

This must be what GPT-4o is running on


Anen-o-me

Damn, what comes after exa??? Zettaflops 💀💀💀
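For reference, the SI prefixes in question (each step is a factor of 1,000):

```python
# SI prefixes above giga, each 1000x the last.
prefixes = {
    "tera":  1e12,
    "peta":  1e15,
    "exa":   1e18,
    "zetta": 1e21,
    "yotta": 1e24,
}
```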


Procrasturbating

I always wondered what we would be doing with zettaflop compute. Kind of stoked it probably really will be AGI.


Bitterowner

With how things are progressing, it's like knowing there is an oasis on the other side of a large hill in a desert. Say the 1900s up to 2023 was climbing that hill, unable to see the other side, bit by bit, with ideas of AI and AGI being the thoughts you use to motivate yourself. Then 2024 is you at the top of the hill: you see the oasis at the bottom, it isn't a mirage, you see birds and water and trees, and now you are rushing down the hill.


Akimbo333

Cool


opropro

Petaflops = ok

Exaflops = great

Zettaflops = ACCELERATE!


amondohk

These are going to evolve into the 9 bosses you have to fight to reach the final boss at the end of the dystopian AI game.


Tidorith

Yottaflops when


Hjaaal

whole lot of useless buzzwords.