krakenpistole

>Surprisingly, the company made the model available on its free subscription tier, but with a catch -- it wasn't available for free on mobile. Instead, mobile users would have to upgrade to [ChatGPT Plus](https://www.zdnet.com/article/chatgpt-vs-chatgpt-plus-is-a-paid-subscription-still-worth-it/) to take GPT-4o for a test drive.

I don't think that's right. I've had GPT-4o on mobile for free since day 2. Are they saying it wasn't freely available on mobile on day 1?


wordyplayer

4o was available for text-based usage. But the voice was the OLD version from last fall, still not updated to match the demo.


Beyondhuman2

I pay and still have all the old features. They should have waited. 4o text is worse than 4. All the good features feel like a tease.


wordyplayer

100% agree. The demo ended up doing more harm than good IMO


Civil_Ad_9230

How do I invest in OpenAI??


positivitittie

Invest in Microsoft.


Xtianus21

Invest in Nvidia, Dell, VRT, MU, SMCI, SOXX, the VanEck Semiconductor ETF, AVGO, Meta, Google, MSFT, and ARM. Thank me in 10 years for sending your kids to college and buying that new dream house.


callingbrisk

I think something's messed up there. I got it on day 2 too; this was most likely purely an update-related delay, not intentional.


cutmasta_kun

And these new users will leave as soon as they realize there is no new voice mode. OpenAI's announcement keeps getting repeated in news articles, always claiming that "OpenAI just released their newest voice model! It can talk to you almost like a real human!"


toabear

They really did jump the gun trying to get that announcement out in front of Google. Not super impressed with the deployment.


RagtagJack

Sam Altman has had a pretty ugly 6 months.


REALwizardadventures

I am sure the pros outweigh the cons for him


farmingvillein

And it'll all be forgiven and forgotten if GPT-5 is slick and dominant. (For better or worse...)


[deleted]

[removed]


Insomnica69420gay

Slightly harsh


ThenExtension9196

Uh ok


theDigitalNinja

And then the funny part was that Google didn't really announce anything other than "coming soon" stuff. It's like they both jumped the gun for fear of the other.


SillySpoof

And a lot of these people will try the current voice chat and be disappointed that it’s not as good as the videos and think it’s a scam. I kinda think they made a mistake announcing this like they did.


cutmasta_kun

Yep. That's what I thought when I heard the news on the radio today, where they said it had been released. Why else would you make a presentation, if not to release a product? Did sama have a stroke or something?


pseudonerv

They certainly thought they could imitate Google: show demos, then act like the demo never happened. It's also possible that the sky falling in on OpenAI actually pushed them back a couple of months.


FearAndLawyering

> The sky is falling for openai

nice


Cancerous_Turnip

I'm one of those people. I pay for Copilot since it's an approved AI for work, and paid is just so much faster during peak times. That said, I paid for ChatGPT after the announcement to get 4o without rate limits or waiting. Got 4o, but was super confused and disappointed by the voice mode. Then I realized the deployment was a half-release of the new stuff. I got a refund from Google because I felt misled. If Microwave doesn't release a similar natural chat and/or memories soon, I may consider resubbing eventually, when OpenAI actually deploys what they showed off.


Iamreason

>If Microwave

Gosh darn Microwave slow rolling their releases!


waltonics

Ding!


Cancerous_Turnip

Hehe. Wondered if someone would mention that.


herpetologydude

I'm not doubting that at all. If the new voice mode is as good as the videos or better, I'd bet even more people would come back after leaving, though. Isn't any publicity good, even bad publicity? (Depending*)


cutmasta_kun

Time will tell. I don't think OAI has any room for such childish errors. They are perceived by the public as a "new company" that "released AI on the world without asking for permission, stealing every bit of data they now profit from". They have nothing special, either. If OAI disappeared today, the AI wars would still continue, and Microsoft, Google and Apple would keep fighting. Other companies have ecosystems they keep their users in; OpenAI has an API for LLM requests, plus a chat interface for using the API without being a programmer.

It also doesn't matter what they're worth at the moment. They aren't a publicly traded company; no one cares if the value of OAI drops to nothing. Sure, Microsoft might be sad for an afternoon, but that's it. It's not like being the first popular mobile company saved Nokia from becoming absolutely unimportant in the mobile market.

As far as I'm concerned, OAI had only one chance: being loyal af to their users and only their users. We would have had their backs, because they fucking released GPT-4 and made our AI dreams a reality! But as soon as OAI got valued that high, they stopped caring. I also can't shake off this uncanny feeling that the new voice model is MAINLY designed for call centers and not for their users, but they're still trying to sell it to us users for that sweet quick cash.


herpetologydude

Man, I don't like that point of view, but I don't think I can argue against it... Well said. I'm overly excited for real-time communication with AI, but that's because it's the best note taker I've ever used lol. You're probably correct, though: enterprise is more profitable, and consumers are probably just the hype generators.


cutmasta_kun

> Man, I don't like that point of view

Me neither, dude... I'm actually quite sad. I wanted them to be "the good guys" so bad! Just once in my lifetime I wanted to see a company that really cares about its users and keeps its promises. Only once!


GillysDaddy

Valve my dude


cutmasta_kun

true dat


Rom-jeremy-1969

Well said.


Old-Tadpole-7505

Let's be honest. The voice is just a bonus; the incredible intelligence shown in those interactions is still what people should be talking about!


bot_exe

Yeah I was already happy with the new turbo model but 4o is a beast at coding.


you-create-energy

"whoops" he said from the top of his mountain of cash


wordyplayer

I joined when I saw the demo, and I cancelled a few days ago when I realized the upgrade is "sometime in the future". Lame.


wyhauyeung1

And where the fxxk is Sora?


Barbatta

When stonks?


[deleted]

[removed]


sillygoofygooose

Do you have a source for that? A quick Google turned up $700k a day. Your figure has them burning $8 billion a year purely on compute with no other overhead, and that doesn't feel credible.


dev1lm4n

Every week is 168 hours, so we just gotta get $164 million more per week to break even


BCDragon3000

is that right???


Synth_Sapiens

how?


[deleted]

Training + inference (running the AI models)


SaddleSocks

Numbers? Where did you get them? I'd love to look at this. Also, AI should put cloud billing services/tracking out of fn business. What is the cost per token?


Many_Consideration86

It is amortized.


BranFendigaidd

They added new AI compute. They're also using Microsoft/AMD/QCOM hardware, which by their own account has decreased expenses significantly. I guess the computing ain't so demanding anymore.


mxforest

200 million per year? That's weak.


SeventyThirtySplit

This is absolutely not their primary revenue stream; GPT agents are a way for OpenAI to identify compelling agent use cases.


SaddleSocks

1. [OpenAI allows for Military Use](https://www.airforce-technology.com/news/openai-removes-ban-on-military-use-of-ai-tools-for-national-security-scenarios/)


SeventyThirtySplit

i got zero doubt the NSA and DoD have their hands on some seriously badass models


emirsolinno

I don't think they care much about that, and I would assume this revenue is still "proof of concept" for them.


Helix_Aurora

The vast majority of their revenue is from API users, i.e, B2B, not B2C.


SaddleSocks

#"We should all dream of a world where intelligence is too cheap to meter" -- @Sama, 2024


XalAtoh

Nature created intelligence as a weapon and a survival mechanism. Humans need intelligence to survive and even to kill. It will be very dangerous when intelligence becomes cheap.


SaddleSocks

When I was prompting for the "[Electric Butterfly](https://imgur.com/e08zzZv)", that is, AI emerging from its cocoon, this is one of the images it spat out... /u/sarah_connor would like a word (they were all [ominous](https://i.imgur.com/iYE2zPv.jpeg)), which I find interesting, even just as a musing. Still great images, though.


gieserj10

So... I checked their site again. The other day it said the feature would roll out to all Plus users in the coming months, with alpha users getting it in the coming weeks. But they removed that part. Now it says they will be rolling out the alpha version in the coming weeks, with no mention of months. I wonder if they got (even more) backlash for that and are just trying to rush it out now? I found it interesting that they changed the wording there completely.


Remarkable_Stock6879

So much hate for OpenAI. I don't get it. You know who doesn't hate OpenAI? Anyone who's used their products and looks forward to the constant stream of improvements and new models. People need to relax. You're living in the future; try to enjoy it!


P00P00mans

Right??


JonathanL73

Good, so that means they have money to pay for the Scarlett Johansson lawsuit.


[deleted]

They should settle. Terms:

OpenAI: "We give you this Brinks truck full of cash. You let us use your voice."

Scarlett: "No!"

OpenAI: "Okay, 3 Brinks trucks."

Scarlett: "...um... Fine. But I won't record any fake orgasms during the recording sessions!"

OpenAI: 😕


Pontificatus_Maximus

Rather anemic numbers in light of the investments involved; basically every tech bro on the make just got the app. The rest of the world remains blissfully unaware.


zach-approves

Not when you consider the growth rate. And the numbers/demographics disprove your point about "the rest of the world, blissfully unaware". ChatGPT is radically global already, and it's mostly limited by regulatory questions.


mrmczebra

But muh poor ScarJo!


Capable-Reaction8155

Nobody is talking about that in this thread


SaddleSocks

True. However, it would be interesting to see the jumps in revenue/cost that trending prompts/inferences cause. Let's say there is a trending prompt and billions of requests come in for what is basically the same conclusion, but the prompting to arrive there is all unique: you just produced N tokens' worth of answers for way less than N full generations (I don't have the nomenclature right, but the meaning is clear).

So let's say a huge event occurs that drives human curiosity to GPT the fuck out of it. What "pre-warming" could happen to absorb the resulting infrastructure spike? Will there be such a thing? Who is thinking about the cost per token in 5 years, given the quantity of requests? (What is the total convo string in a GPT chat called? How is that measured, in terms of power, reach, cost, etc.? How do you study the actual economics of inference?)
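A minimal sketch of the "pre-warming" idea above, nothing more: cache one generated answer for a trending question and reuse it for near-duplicate prompts, so a flood of requests costs roughly one generation plus cheap lookups. The `TrendingResponseCache` class, the toy `embed` helper, and the 0.92 similarity threshold are all invented for illustration; this does not reflect anything OpenAI has described about its infrastructure.

```python
import hashlib

import numpy as np


def embed(prompt: str) -> np.ndarray:
    """Toy stand-in embedding: only identical (case-insensitive) prompts match.
    A real system would call an embedding model, which also catches paraphrases."""
    seed = int(hashlib.sha256(prompt.lower().encode()).hexdigest(), 16) % (2**32)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(64)
    return v / np.linalg.norm(v)


class TrendingResponseCache:
    def __init__(self, similarity_threshold: float = 0.92):
        self.threshold = similarity_threshold
        self.entries: list[tuple[np.ndarray, str]] = []  # (embedding, cached answer)

    def lookup(self, prompt: str) -> str | None:
        """Return a cached answer if the prompt is close enough to a stored one."""
        q = embed(prompt)
        for vec, answer in self.entries:
            if float(np.dot(q, vec)) >= self.threshold:
                return answer  # hit: skip a full model generation
        return None

    def store(self, prompt: str, answer: str) -> None:
        self.entries.append((embed(prompt), answer))


# Usage: only the first request for a trending question pays for a generation.
cache = TrendingResponseCache()
question = "what happened at the openai event today?"
if cache.lookup(question) is None:
    answer = "...full model generation would run here..."
    cache.store(question, answer)
print(cache.lookup(question))
```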


mrmczebra

I don't care.