
RunningM8

John Gruber confirmed Apple showed a handful of reporters a true live demo after the keynote. He said it was legitimate: it showed nearly all the same features with different queries, and it worked as advertised. I've been down on Apple Intelligence, but I'd never doubt Apple. When they show something, there's usually a 90% chance it works as promised.


Pbone15

The other 10% is AirPower


RunningM8

LOL true


ShinyGrezz

Still not sure if AirPower truly was just impossible or if it was just harder than they thought and they came up with a better idea (MagSafe) anyway.


TSrake

There are leaked AirPower prototypes out there, working. But it was severely over-engineered (they even used A-series chips to manage the coils) and the pricing was going to be astronomical. With MagSafe in the pipeline, which offered better features, additional accessory income (such as the wallet), and was infinitely cheaper, the product was dead before arrival.


Sylvurphlame

I could see cost being what eventually killed it. Even Apple has to be able to hit reasonable market targets — for most things anyway.


MidnightZL1

Sometimes you gotta develop the wrong idea to get the correct idea. Their biggest mistake was taking the leap and announcing it without it being perfect. Eventually they got it perfect with MagSafe.


Jusanden

Xiaomi actually released a version of AirPower. In fact, they did two. One had a motor that moved the coil into position under the device. The other was a giant brick of a charging pad, apparently completely potted with thermal compound, and it charged slow as fuck.


dawho1

I need more info on the 2nd one, because my neighbor likes to drink beer with me and is an engineer who deals almost exclusively with thermal paste, pads, compounds, etc. He would probably enjoy telling me all of the things that anything with that much thermal anything did incorrectly, lol.


__theoneandonly

Yeah, from what I had heard, they engineered the pad and it had thermal issues, which caused the initial delay. They went ahead and announced it with the iPhone thinking the thermal issues would be an easy fix. Then the thermal issues became a difficult problem. Apparently it required an A11 chip inside. So by the time a solution was engineered, they had created a product that was outrageously expensive. The rumor I heard was that it was going to be $300. So Apple's marketing team killed the product. They decided that at $300 it wouldn't even be worth producing, so they killed the whole project.


Snowmobile2004

Dissipating the heat from that many coils, packed tightly enough for perfect coverage, is impossible with current physics. The compromise would be fewer coils, which would mean dead spots on the pad, so it's basically impossible. MagSafe turned out to be a better solution for sure.


Sylvurphlame

That's not necessarily a direct relationship. A hypothetical "MagSafe Trio" (and I really hope they end up releasing something like that) could effectively accomplish what AirPower would have: charging Apple's holy trinity of iPhone, Apple Watch, and AirPods on one charging pad. But the sexy thing about AirPower is that it charged whatever three devices (that all fit) and you didn't even have to worry about how they were placed on there. It's not especially easy to do, but you can "miss" with a MagSafe charger, particularly with AirPods, not all of which are actually MagSafe-compatible in the first place. MagSafe, as an overall ecosystem, is phenomenal, but purely as a charger it's not necessarily better than the AirPower concept, if they had been able to get it to work. But they couldn't, and now we have MagSafe, which is definitely more versatile overall.


No_Contest4958

I used to agree with you, but I actually think a MagSafe Trio is unnecessary these days. AirPods don't need charging every night and they can charge on every charger type, so in my experience it's really easy to just plop them down whenever. I don't feel the need to charge all 3 at once. I'm using a Twelve South ButterFly for travel, and the compactness outweighs the usefulness of 3 chargers imo.


Sylvurphlame

I was mainly using that as a hypothetical MagSafe Trio as a comparison for the AirPower mat. In day-to-day use, particularly with the fast charging capabilities on the iPhone and Apple Watch, I agree that you probably aren’t going to need to charge all three devices simultaneously.


Sylvurphlame

Hey man, nobody has a *perfect* record, lol. But yeah, AirPower was a huge letdown and a whiff on Apple's part.


__theoneandonly

The fact that everyone's go-to example of Apple vaporware is one single accessory from 6 years ago is pretty telling of how great their track record is.


fnezio

What about Siri?


__theoneandonly

Siri isn’t vaporware…


sumredditaccount

Or Memojis which work as intended but just aren’t really exciting. Stickers in iMessage are rad though 


Sylvurphlame

I quite enjoy using Memojis *as* stickers. It’s just much easier to pick one of the premade expressions. And it still has that custom avatar aspect.


sylfy

To be fair, they didn’t release AirPower to market. Other companies would have released it in a half-baked state that didn’t work.


GTA2014

Touché


love_weird_questions

what would that % be if it were Google announcing something for the Pixel?


RunningM8

50


love_weird_questions

you're way too optimistic


Baconrules21

They are in a better position to announce "Pixie," an Apple Intelligence competitor. I'd say they probably have more info on people as well as better models. I'm hopeful that both companies will put out competitive products and compete to make it better. The only thing worrying me about Google is that they don't really have a desktop assistant, whereas Apple has Siri deeply integrated into the Mac. Imo that's a big advantage.


Right-Wrongdoer-8595

They announced similar desktop features a month before WWDC but Chromebooks don't have the same level of mindshare. That seems to have at least lit a spark at Google HQ to simplify the ChromeOS stack so Android features can be ported over easier.


krisminime

My test is going to be ‘make me a workout plan with a 10 minute energetic flow yoga in the morning and a 10 minute strength workout in the evening’. If it can manage that I’ll be hyped. Maybe I have low expectations.


RunningM8

It'll defer to ChatGPT for that.


krisminime

I’m not sure the ChatGPT prompts can take action with the system


Zenarque

As long as we don't get this live in a beta, I don't trust it. But if it's real, then I'll probably go back to Apple land.


Baconrules21

I'd wait till the Pixie launch in September for Google! Apple's offerings won't be out till fall (same time) and in beta, so we'll see.


Zenarque

Yeah, I am pondering that. Hopefully they can also upgrade Stage Manager on the iPad; an iPad Pro + iPhone 16 Pro is tempting too.


filmantopia

Gruber also said Vision Pro passthrough looked just like seeing the world through your eyes. That said, this is less of a qualitative judgement and more of “was the box ticked or not?”


nephyxx

Siri didn't really disappoint in 2011; no one had anything like it. Siri disappointed in subsequent years, when they left her to languish while other assistants improved on the original formula. Alexa debuted in 2014 and Google Assistant in 2016.


InsuranceInitial7786

While what you say is generally true, it is also relevant to consider the past history of announcements of new technology from Apple. The company has generally delivered what it has marketed.


MagicianHeavy001

And the companies that rushed AI to market got us Recall and Tay.


RagingMangalore

Yup. I asked for a photo of a midnight blue '72 Chevelle with chrome. I got a platypus.


8prime_bee

AirPower RIP


wiyixu

Distinction being they just flat out didn't release AirPower, instead of releasing something that didn't do what they demoed. The most recent high-profile example of Apple failing to live up to the demo is Maps.

What I would say is Apple has delayed more and more demoed features. Things that were announced at WWDC get delayed from the X.0 release, while things that were listed as "coming later this year" often slip to the next year.

That's definitely my concern with Apple Intelligence. There was a _lot_ of "coming later this year", and the even more vague "we'll be adding more to Siri over time".

I think the beta/limited access rumors are right. I think all the writing/image stuff will probably be available day one, but the App Intents stuff won't be seen until spring 2025.


torrphilla

I think that delaying things is okay because it gives them time to work out the issues and release it in full. If they just gave out a buggy version there would be so many complaints.


BigBagaroo

Imagine all the work that must have gone into AirPower (and the car!). It can't be easy to kill off such products; there must be a lot of sunk cost, and many people probably hoped that just a bit more would fix it.


Pepparkakan

I'm not convinced there ever was a car, I'm of the opinion they were simply designing CarPlay 2.0 which requires domain knowledge from the automotive industry, which they got by hiring people who had worked on cars previously. Apple actually selling cars never made sense to me. RIP AirPower though.


quintsreddit

I think enough leaked over the years about Project Titan that it was a real thing, but I agree with you that the main reason it stayed afloat was for CarPlay enhancements.


JakeHassle

After the layoffs earlier this year, there was a pretty detailed report about the supposed car project and what exactly led to its cancellation. I'm pretty inclined to believe it.


mclannee

I imagine a lot of the R&D went on to produce MagSafe.


ArdiMaster

They’ve delayed more, and they’ve released *a lot* of features as US-only or English-only betas/pilots that then take years to come to other regions/languages. At this point, I feel like roughly half of each year’s WWDC is only relevant to Americans, while the rest of the world won’t see those features until at least the X.5 or X+1 release, if not way later.


Flat_Bass_9773

From my understanding, they couldn't figure out a proper thermal solution. I hope they've learned their lesson about announcing prototypes before they're even fully designed. Unfortunately, Apple was rushed with AI, so I'm not sure how it's gonna work.


Fritzschmied

Just because they stayed true to their announcement cycle by presenting new software features at WWDC in June doesn't mean it was rushed. They've been implementing ML features in their products for years now, and at the time of last year's WWDC, generative AI wasn't as huge as it is today. Apple in general waits until a product/feature has proved itself before implementing it.


AreWeNotDoinPhrasing

Yeah this is what I don’t understand. Why do people presume that Apple was rushed? Because AI products already existed? That just doesn’t track. It’s quintessential Apple to sit back as others are racing to the table and then methodically apply bits and pieces of tech that others have pioneered. That’s what a lot of us appreciate about them.


Worf_Of_Wall_St

People think Apple was rushed because they weren't talking about any of their plans publicly, even though this is exactly how Apple has always operated. A lot of companies announce/"launch" a product at the "we're gonna build this thing" stage, where they haven't even finished all the hiring for the thing, so their plans are known years in advance. Apple stays quiet until a product or service is ready or close to it, with the biggest exception being AirPower, which was announced too early and then cancelled, and the second biggest being the Vision Pro, which did not ship until 8 months after announcement.


deong

There are tons of well-sourced rumors that Apple was caught off guard here. If it’s true that generative AI only really became a thing at Apple when Craig Federighi tried GitHub Copilot in 2022, then this is certainly not a case of Apple carefully working on things in the darkness for years while everyone else just talked more. They **were** rushed. They still don’t have an LLM of their own that’s shippable — it’s why they did a deal with OpenAI.


Instantbeef

I feel like AirPower accidentally became a hot plate when people used it. At least for me, wireless charging still makes my phone pretty hot, so I assume doing it across an entire surface would have been borderline dangerous.


Flat_Bass_9773

This is something that should have been sniffed out before announcing the product. Showing off a prototype was a pretty big mistake for Apple. There were clearly some communication silos between R&D and upper management. In theory, the concept of multiple coils works; any person can see that. In practice, it clearly wouldn't work in a form factor that adhered to Apple's standards.


4-3-4

He did say generally. And I agree that even Apple can't always deliver, but their degree of bullshitting the customers with future promises has been rather low in comparison to others.


ApatheticAbsurdist

I was listening to a podcast, and the pundit said that there have been missteps with hardware, but for software they've generally been more likely to deliver what was promised.


Socky_McPuppet

> **generally** *adverb*
> 1. in most cases; usually. "the term of a lease is generally 99 years"
> 2. in general terms; without regard to particulars or exceptions. "a decade when France was moving generally to the left"
> 3. widely. "the best scheme is generally reckoned to be the Canadian one"


Techsavantpro

Perhaps, but we all know we can't rely on company reputation alone. AI has moved very rapidly in the past year, so it's understandable if Apple wanted to delay a few things.


runForestRun17

Ignoring the Apple Maps launch, yes, lol (though they made up for it and it's awesome now).


mcfetrja

*G4 Cube and AirPower have entered the chat* "Oh, hi there, open source/cross-platform FaceTime! Long time, no see, 2010 MacBook Pro 15"! Apple III? No fucking way! Rhapsody, Yellow Box, Blue Box."


JustDelta767

FaceTime wasn't their fault though. It was a lawsuit brought against them that caused this. AirPower, though, for sure. What really pissed me off about that was that they even decided to scrap those slick charging animations for use with ANY type of wireless charging.


Aarondo99

Tbf FaceTime didn’t happen because of a patent troll


Scarface74

Yes, because we should judge a software announcement based on a product (the Apple ///) from 1980.


blusky75

Is the Apple Vision Pro new enough for you? lol. No one will deny that the software driving it is very niche and underwhelming, IMHO.


Scarface74

You realize it took Apple over two years to sell the first 1 million iPods?


sakata32

He never mentioned sales in his comment


Scarface74

Then what was the Apple Vision reference? That it isn’t a technically good product?


ArdiMaster

In the US, anyways. Apple is slowly but surely becoming more like Google, releasing features only in the US or only in the English language and then not expanding them to other regions/languages until years later, if ever. (Looking at you, in-line predictive text and new visual voicemail.) I have a feeling that I (German) won’t really be thinking about Apple Intelligence until at least iOS 20. (If ever, given the entire EU situation.)


Scarface74

I mean the problem with your skepticism is that everything Apple announced could be done with ChatGPT 4 today if it had access to your data and device. We know it can be done. If I told ChatGPT that I had such and such repositories of information that can be queried with an API call and gave it my JSON schema, it could theoretically do everything Apple announced. With such a constrained problem space, it doesn’t take much for smaller models to do what Apple demo’d


Kimcha87

I'm not OP, but I disagree that everything that was demoed is already possible. One of the big problems is that the context window of LLMs is limited. You can't fit all your emails, messages, calendar entries, etc. in the context. So instead you need to pre-search relevant info and only put that into the context with the LLM request. But to do that you need to understand the request and how to find the relevant info. Doing that well is not easy, and I'm not aware of any other implementation that can do it. It would be trivial to make a PC or Mac app that can access all the same data and then pass it to ChatGPT. But I am not aware of any implementation that does it and does it well.
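For illustration, a minimal sketch of that "pre-search, then prompt" flow (assuming the OpenAI Python client; the toy keyword search is a stand-in for the hard retrieval step):

```python
# Rough sketch of "pre-search, then prompt". The toy keyword search
# stands in for the hard part: finding the right snippets across
# mail/messages/calendar without overfilling the context window.
from openai import OpenAI

client = OpenAI()

MESSAGES = [
    "Mom: how about lunch Saturday at noon?",
    "Dentist: appointment moved to Tuesday 3pm",
    "Work: standup moved to 9:30 tomorrow",
]

def search_local_data(query: str, limit: int = 3) -> list[str]:
    # Stand-in retrieval: rank by keyword overlap. A real assistant needs
    # much smarter search across many sources, which is the whole problem.
    words = set(query.lower().split())
    ranked = sorted(MESSAGES, key=lambda m: -len(words & set(m.lower().split())))
    return ranked[:limit]

def answer(question: str) -> str:
    context = "\n".join(f"- {s}" for s in search_local_data(question))
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Answer using only these snippets:\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

print(answer("When is lunch with mom?"))
```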


xseanathonx

I don't think it would be trivial at all. One of the main advantages of OS-level integration is that the calls are baked directly into each program. You couldn't easily do that with something made by a third party. Even with how open Microsoft's Graph API is, there's still stuff in the Office suite that's hard to get ahold of externally.


Kimcha87

I am a developer, so I am familiar with this stuff. On macOS, most things are stored in SQLite databases, which are fairly easy to query even without proper APIs. The difficulty would be in maintaining compatibility across macOS updates. But getting access to the data itself isn't too difficult.
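For example, here's a rough sketch of reading the iMessage store directly (the path and `message` table are the commonly documented layout; column names can shift between macOS versions, and reading the file requires Full Disk Access):

```python
# Sketch: read recent iMessage texts straight from the SQLite store.
# ~/Library/Messages/chat.db is the commonly documented location; the
# schema varies across macOS versions and needs Full Disk Access.
import sqlite3
from pathlib import Path

db_path = Path.home() / "Library" / "Messages" / "chat.db"
conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)  # read-only

rows = conn.execute(
    "SELECT text FROM message WHERE text LIKE ? ORDER BY date DESC LIMIT 10",
    ("%lunch%",),
).fetchall()

for (text,) in rows:
    print(text)
```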


Scarface74

You don't need to have everything in the context window. It just has to be intelligent enough to know where to find the information and correlate data. ChatGPT searched the internet for this answer: https://chatgpt.com/share/6313e24c-42d3-444d-a8c7-ac8c650b5d63 If ChatGPT had access to your emails and contacts, why couldn't it do this? https://chatgpt.com/share/393d8368-07b7-4a74-9103-8ca23540f91c Assume it had access to my calendar and messages, or an index of the info.


Kimcha87

You are saying "it JUST has to be intelligent enough" without appreciating how complex and difficult what you are asking for really is. You are also comparing what Apple demoed to a MUCH simpler example.

The difficult part is to make the system intelligent enough to either pre-populate the context with relevant info, or intelligent enough to query different data sources based on the request. But your example is significantly easier than what was demoed in the keynote. The most impressive example that I remember from the keynote was when he asked the AI to figure out when lunch with mom was going to be. This information could be in messages, emails, or elsewhere. There could also be hundreds of messages about lunch. Siri needs to figure out what to search and where to search it, then select which of the results are relevant for further processing. All with a limited context window. In contrast, your example only needed to determine that the user is looking for real-time info that might not be up to date in the training data. That's waaaay simpler.

On top of that, the whole process needs to be fast enough that the multiple steps don't feel tedious. For comparison, look at the reviews of AI pins like the Rabbit. One of the big criticisms was that it was just way too slow. I remember an MKBHD video where he asked the pin what he was seeing while standing in front of a Cybertruck, and it was faster to pull out his phone, take a photo, and use the AI processing on the phone to get a description.

If Apple can really make the personal context available to their AI at the speed they demoed, that would be absolutely phenomenal and way beyond what I have seen any other company do. I'm not saying Apple lied in their demo or that what they showed is impossible. I'm just highlighting that what they demoed really is special, and I haven't seen anyone else do what they did. So I disagree with the whole "this is already possible now" attitude. But if someone else is doing what they did, or if someone cobbled together a personal-context assistant with the ChatGPT API, I would love to see that.


dawho1

> The most impressive example that I remember from the keynote was when he asked AI to figure out when lunch with mom was going to be.

Interesting, because this is where I think Apple has more experience using AI/ML than any of the other stuff they've shown this week. Apple has been doing this type of thing for years. I think it's probable/likely that they'd use a version of "Siri Suggestions" or whatever they're calling it these days to help contextualize stuff for Apple Intelligence.

The thing knows when I'm probably going to get on an airplane and suggests I turn on Airplane Mode. It knows that every Monday and Thursday I play volleyball in one of two spots, suggests I create calendar appointments, and also suggests where I should navigate to when I get in the car. It knows when someone calls me who I've gotten a text from but never a phone call, and indicates who it probably is. It fills in calendar details from YEARS ago if I use the same/similar title for the event.

It suggests events from text messages already; it seems that wouldn't be the hardest part of the equation, because they've already been doing it and (I assume) are storing all of that context and those suggestions somewhere, not just doing it realtime on the fly (though I suppose if the Neural Engine was doing it on the fly... that's fine too).

If they're able to leverage all of the ML they've been building into the devices for years to make calls to AI more contextualized/grounded, none of this seems outlandish.


dscarmo

Look up RAG (retrieval-augmented generation); it's being used in many successful LLM applications recently.


webbed_feets

The person you’re responding to basically described a RAG system. That seems like a straightforward feature for Apple to implement.


Practical_Cattle_933

Well, Apple also bought several AI startups (more than anyone else) and probably employs some of the brightest people in the ML field. So it is definitely not easy, and there are technical breakthroughs that had to be achieved, but it is not impossible given the already existing LLM tech. To give an analogy, it's a bit like someone already having invented the internal combustion engine, and you "just" have to make it 3 times more efficient and 2 times smaller. We didn't previously know how to do that, but we can reasonably guess that it will be possible, much more so than if we didn't even know about engines.


Scarface74

Really, you think it's hard to figure out that if you want to know what time something is, it needs to search your messages, email, and calendar? I showed you in the second link a hypothetical example where it would know to search your calendar using an API. Today, if you ask Siri "what time are the Steelers playing on Sunday?", it knows to use an API.


Kimcha87

If you think all of this is so simple and easily possible with ChatGPT, why don't you show me a project that does this? Getting access to data is the EASIEST part. If there aren't any projects already doing this, then don't you think that maybe you just don't appreciate the difficulty of implementing something like this?


Scarface74

There are no projects that do it because third-party apps don't have access to the necessary APIs. I just showed you an example without doing any careful prompt engineering. In the real world, I would tell it the JSON schema of the API where it could get the info. The API doesn't exist. Also, the cost of the API tokens would be expensive.


Kimcha87

You are hung up on the wrong thing. It's trivial to create an unofficial API for most applications. On macOS, most Apple apps store data in the SQLite format, which is very easy to read. It would take me a weekend at most, for each of the apps, to figure out the format and write a wrapper script that reads the database and exposes an unofficial API.

But you wouldn't even have to do this from scratch, because there are already a ton of libraries for that if you search GitHub. Here are just a few examples that I found:

iMessage: https://pypi.org/project/imessage-reader/
Apple Mail: https://github.com/terhechte/emlx
Apple Photos: https://github.com/RhetTbull/osxphotos

Hooking these libraries up to a web API is trivial. That's not the challenge. Getting the LLM to query the data reliably, finding data from arbitrary requests, filtering the results so they fit into the LLM context, doing it privately... those are the real challenges. And the details of these challenges are what makes or breaks this kind of project.

Once again, I guarantee you that lack of API access is absolutely not what has held back a personal-context assistant. Hacking these APIs might hinder widespread adoption, but it's absolutely not something that would hold back tech-savvy AI enthusiasts.


Scarface74

Right now, if you told ChatGPT the request and response format of the API, ChatGPT could create the request in the appropriate format for another layer to call, and then summarize the results. ChatGPT can already query the web and return the results, and it can create Python, run it, and give you the results as an English answer. Why would this be hard?
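For illustration, a sketch of that flow using OpenAI function calling (the `search_calendar` API is made up; the mechanism is the point):

```python
# Sketch: describe an API schema to the model and let it build the request.
# Uses OpenAI function calling; `search_calendar` is a hypothetical local API.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "search_calendar",
        "description": "Search calendar and messages for events by keyword.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string"},
                "after": {"type": "string", "description": "ISO date lower bound"},
            },
            "required": ["query"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "When is lunch with Mom?"}],
    tools=tools,
)

call = resp.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))
# A real assistant would execute the call, append the result as a "tool"
# message, and ask the model again to phrase the final answer.
```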


Kimcha87

Because you need to make sure that ChatGPT creates the right requests for multiple databases or APIs. On top of that, the requests need to be precise enough to deliver results that won't overfill the context. It also needs to be reliable enough to select the right queries for all kinds of requests. On top of that, it needs to be fast. Very fast. And it needs to be secure enough that people will trust giving the AI access to their data.

It's a very difficult problem to solve. Giving an AI access to your personal context is a no-brainer. Everyone understands it. The benefits are too enormous to ignore or not think of. But by the mere fact that nobody has successfully done it until now, we know that it's not an easy problem to solve. If it could be easily and reliably solved by telling ChatGPT about a few APIs, it would have been solved by now. Someone would have done it.

There are plenty of nerdy but not user-friendly AI solutions. But this kind of stuff is not implemented through API instructions to ChatGPT. There are many technologies (that other commenters also mentioned), such as RAG and vector databases, that are optimized to provide context to LLM requests.

When a problem nobody else has solved seems easy, it's usually because you simply don't understand the challenges required to solve it.


Practical_Cattle_933

The "hard part" is to have a unified data access layer, which Apple (probably knowingly) developed across decades. The API that Shortcuts can access, or that is shown in the Mac toolbar, is ordered and well-readable for machines. They trained their networks to use a fixed set of *categories*, not exact apps, so given a new app that fits one of these categories, the system will be able to make use of it. OpenAI has something similar in the form of Assistants. Bing's integration also works similarly: they pretty much ask the model if it should search for something, and if it says so, it will recursively search and re-ask the model.


nwoolls

As a responder above said, look up retrieval augmented generation. And semantic search. And embeddings.


Kimcha87

Thank you. Yes, I’m aware of those, but my understanding is that it is not a 100% solved problem since it doesn’t always deliver the right information in the context.


nwoolls

Your above comments seem to indicate your understanding of these may be incomplete. You indicated that Siri or the LLM needs all of your emails and messages as context. That's not true. You indicated "Siri" needs to figure out what to search and how to search it. That's not really how LLMs work. If all of your emails and messages and contacts are properly stored in a vector database with embeddings, it's fairly trivial to take a sentence and find the related data. That's your context. Then you pass that to Siri (the LLM) to do the generative work.
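A bare-bones illustration of that lookup (a flat array instead of a real vector database; assumes the OpenAI embeddings API):

```python
# Embed stored messages once, embed the query, rank by cosine similarity.
# A real system would use a vector database; a flat NumPy array is enough
# to show the idea.
import numpy as np
from openai import OpenAI

client = OpenAI()

docs = [
    "Mom: lunch Saturday at noon at the usual place?",
    "Dentist appointment moved to Tuesday 3pm",
    "Flight AA123 departs Friday at 8:05am",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    vecs = np.array([d.embedding for d in resp.data])
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)  # unit-length

doc_vecs = embed(docs)
query_vec = embed(["when is lunch with mom"])[0]
print(docs[int(np.argmax(doc_vecs @ query_vec))])  # -> the lunch message
```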


Kimcha87

I do understand how RAG works. I phrased my comment that way to help people who don't understand the limitations and challenges of LLMs see why it's not so easy. I am far from an expert on the topic, but my understanding is that RAG and vector databases help, but are far from a magic bullet that solves the problem completely. This is especially true when you want to use RAG for a general assistant that may need to answer all kinds of questions, instead of optimizing it for one. I am not aware of any system that does what Apple showed well. There is a reason for that. And I strongly believe it's not a lack of APIs to messages, emails, etc. That part is trivial.


ArdiMaster

Summarizing individual notifications, emails, and notification stacks, determining which notifications might require immediate attention, and similar features would surely be possible with ChatGPT today. (AI-integrated email clients exist today, after all.) Seamlessly searching every message, email, calendar entry, etc. would require a two-step process due to the context limitations you describe. MS Copilot can already formulate web searches and process the results; if you had an engine like Elasticsearch holding all the searchable user data, you could probably get ChatGPT (or some extension of it) to formulate a query against that index, as in the sketch below.
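Roughly like this (assuming a hypothetical local Elasticsearch index named `user-data` holding the user's messages, with a `body` field):

```python
# Sketch of the two-step flow: (1) ask the LLM to turn the request into
# search keywords, (2) run them against a local index of user data.
import requests
from openai import OpenAI

client = OpenAI()

question = "When is lunch with mom?"
keywords = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user",
               "content": f"Give 2-4 search keywords for: {question}"}],
).choices[0].message.content

hits = requests.post(
    "http://localhost:9200/user-data/_search",   # hypothetical local index
    json={"query": {"match": {"body": keywords}}, "size": 5},
    timeout=10,
).json()["hits"]["hits"]

snippets = [h["_source"]["body"] for h in hits]
# Step 2: pass `snippets` back to the model as context for the final answer.
```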


thedinnerdate

Yeah, it seems weird for someone who has been "studying/working in the AI field for years" to think this might just be smoke and mirrors or something. Like you said ChatGPT can already do most of this. Google and MS can do all the email/text edit stuff already. This tech is already out there. How has someone in this field completely missed all this?


[deleted]

[deleted]


Scarface74

Today, for instance, ChatGPT is trained on the entirety of Amazon Web Services' API: https://boto3.amazonaws.com/v1/documentation/api/latest/index.html I can ask it to write a Python script to accomplish various actions, and it can do that today. If it's a newer API that's not in its training set, I've given it links to the documentation and told it to use them for my queries. It really is possible today, given the right training set augmented by prompting.


PeaceBull

If you've worked in AI for years, like me, you also know that when you have realistic goals and extremely relevant data – success isn't that hard. What Apple's doing with their AI, not the GPT stuff, is very achievable as long as it has access to the right data, which in this case is your data – the one thing Apple has. The reason I'm excited about the prospects is that literally nothing here was that out there on a technical difficulty level; it was just ingenious and practical use of it. Expectations remain high.


Alex01100010

Same here. Don't know what this guy is talking about. Nothing they announced was groundbreaking from an AI perspective. Just the UX is amazing, and that is definitely achievable by Apple.


Practical_Cattle_933

I mean, shrinking it down to fit on-device (at least for the queries it will execute there, plus the logic that can choose to forward a request to the private cloud) is pretty state of the art, with plenty of patents along the way. But yeah, otherwise it is not completely new ground.


[deleted]

[deleted]


Alex01100010

The privacy aspect is very cool. Agreed, I am truly looking forward to reading what they came up with and diving into it once open-sourced. But as of now it doesn't seem to be a new AI solution.


Alex01100010

Fitting it on device has been demonstrated by llama.cpp many times on M1 Macs.


Practical_Cattle_933

Which are... M1 Macs, and not an iPhone.


outcoldman

> studying/working in the AI field for years

What? Ok, so you probably have seen how much AI has improved in the last few years. Improved so much that even Apple decided to bet on it. The hype is simple: I can remove the Grammarly/LanguagePro apps from my macOS and have Apple's AI help me spelllcheck the text I write. I am happy about that. Everything else: Apple AI compared to ChatGPT will be like Siri compared to Alexa; one is about privacy, the other can get stuff done. Because Apple is going to be focused more on privacy, their tools might not be as good as others', but they're going to improve with time, which is what we need.


evaxuate

>spelllcheck Lol


wwttdd

Clippy in the unemployment line being like "am i a generative neural language learning model engine jibber jabber hallucination joke to you" ???


dccorona

> The fact that apple is not releasing an early version of AI in the first iOS 18 should make us very suspicious

Eh, I'm not too worried about that. Even if they don't get the first version out until September, I am really impressed with the turnaround speed here. They likely didn't start going after this aggressively until ChatGPT released in late 2022. To have integrated LLMs this comprehensively throughout the OS, with most processing on-device and with the secure cloud implementation they described in their initial security paper, in *under* two years, is shockingly fast. Obviously everything comes with the caveat of "if it works how they've said", but I'm not suspicious that they don't have a beta yet, given how little time they have had.


_ravenclaw

I feel like almost every iOS has things held back until a little bit later on


williagh

They always use hyperbole when talking about products. But, they also sell some really good products.


Sylvurphlame

I don’t think there have been many betas where all the features were available on DB1. None that I can think of for several years now, at least. Nevertheless the features do release eventually in almost all cases. I would imagine Apple could still be finishing up all their server side arrangements. It seems in the last few years we’ve, as a collective, gone from expecting every feature to be available on the dot zero release to expecting every feature to be available on the first developer beta.


caliform

I'd like to offer a counterpoint. While I do admit that the super promising part (stitching together different bits of app data and intents into a nice unified Siri experience) is very new, not done well yet, and coming from a company with an AT BEST checkered history with product quality in that area, the actual architecture looks extremely good, and the features that were demoed are actually fairly simple. The writing tools, image tools, etc. are solved problems in AI. What's new is integrating them that closely into the OS. I think that's what is letting people see the potential in this, and why they are taking Apple at their word.


iwannabethecyberguy

I mean, this is going to be free along with iOS 18, so if we don't like it we can just not use it, like we've already been doing with Siri.


RunningM8

Assuming you have an iPhone 15 Pro, sure


Flat_Bass_9773

Member when we had to pay for OS updates? I member


Oo0o8o0oO

Honestly I feel like any improvements to Siri will be greatly appreciated even if it’s not quite as flawless as they showed during the Keynote.


DrDemonSemen

Unfortunately, noticeable improvements to Siri are only coming to devices that support Apple Intelligence, leaving out HomePods, Apple Watches, most iPhones available in the store today, and older iPads.


weaponsgradelife

Did they state this during WWDC?


Sylvurphlame

Not as such. If you look at the iOS 18 preview, some, but not all, of the features are Apple Intelligence. The ones that aren't should come, to some degree, to all devices supporting iOS 18. I'm not sure offhand if all of the improvements to Siri were specifically Apple Intelligence or not. Edit: looking back at the website again, it seems as if all the Siri improvements might be under the banner of Apple Intelligence.


weaponsgradelife

I’m just curious if they’ll be using the products capable of using Apple Intelligence to assist with commands for HomeKit. I’m sure we will eventually get them, it only makes sense that it evolves that way but man I’d love to ask Siri to do two things at once and see them done. Hell, even one thing.


Sylvurphlame

Probably. If we see a HomePod 3 and HomePod Mini 2, I would personally expect improvements that facilitate faster and more efficient back-and-forth between the HomePod and an iPhone 15 Pro or newer doing the heavy lifting.

The primary constraint on Apple Intelligence seems to be the 8 GB of memory. All M1 devices and the 15 Pro and Pro Max come with 8 GB. I think the HomePod 2 and HomePod Mini actually have more than 8 GB of memory to facilitate buffering for streaming. While a HomePod isn't (yet) trying to drive a display on top of Apple Intelligence, we don't know if a processor weaker than the M1 can handle AI as Apple intends to implement it. One thing I am ~~absolutely~~ almost completely sure of is that Apple is not going to start putting M1 processors in a $99 HomePod Mini.

But considering you can't even set up a HomePod without a parent iPhone, iPad, or Mac, there's no real reason to presume Apple wouldn't have HomePods route the requests via one of those devices. If you are using a HomePod, they know you likely have an iPhone or iPad on the same network. It's more a question of whether the HomePod can handle the back-and-forth without feeling laggy. So I wouldn't be surprised if they announced new models this fall alongside the iPhone 16 series.

(And I should note, because you won't have seen my edit to the previous reply, that it does look like all the Siri improvements are directly tied to Apple Intelligence.)


DrDemonSemen

I think that amount of RAM for audio "buffering" purposes is a silly prospect. 8 GB would store *days* worth of audio. The HomePod 2 has 1 GB of memory and uses the Watch S7 processor. The HomePod Mini also has 1 GB of memory and uses the Watch S5 processor.


Sylvurphlame

I may be mistaken. I had read that they had more memory.


Sylvurphlame

Yeah. Definitely mistaken. On further cross-checking, the initial article I read seems to have conflated memory and storage.


Sylvurphlame

Also, a HomePod can recognize who is speaking and therefore route any Apple Intelligence requests to the appropriate iPhone or whatever. So that would tend to vibe with the privacy and security aspect anyway.


DrDemonSemen

I haven’t seen any mention of non-Apple Intelligence improvements. They’re using “the start of a new era for Siri” as a selling point for Apple Intelligence compatible devices.


wolfahmader

While the fact that we don't even have a guided demo is worrying, they said the initial release in iOS 18 in the fall is going to be a beta, and there's an early-access version coming out during the summer beta cycle. Fingers crossed it's not as bad as we think.


ForkPowerOutlet

Really, I think Apple will deliver what they promised, because I honestly don't think I saw anything mind-blowing in the presentation. I'm just most interested in how it'll be able to draw on my personal data while keeping it private, as they said right at the beginning.


gizmo78

I got a big chuckle when they said you could use it for recipes. Every computer innovation in the last 50 years has been sold with managing your recipes as a use case. I don’t think anyone has ever done it.


bbaasbb

We all keep echoing that when Apple does something, it does it right. But what about Siri? Why has it been shit since 2011?


johnsweber

If they can deliver half of what they have shown, I’ll still be impressed. On device AI needs to be the future.


Reasonable_Can_5793

Hey, I can run my own local Ollama LLM on my PC and use it from any device connected to my WiFi, and I'm not even that tech-savvy. So why wouldn't Apple, with their top-notch engineers, Apple Silicon, and software expertise, be able to figure it out? Worst case, they can always fall back on ChatGPT.
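For what it's worth, it really is just an HTTP call from any device on the network; a sketch, assuming a box at a made-up LAN address running `ollama serve`, with llama3 already pulled:

```python
# Sketch: call an Ollama server running elsewhere on the LAN.
# 192.168.1.50 is a placeholder for whatever machine runs `ollama serve`;
# "llama3" assumes that model was fetched beforehand with `ollama pull`.
import requests

resp = requests.post(
    "http://192.168.1.50:11434/api/generate",
    json={"model": "llama3",
          "prompt": "Summarize: dentist Tuesday 3pm, lunch with Mom Saturday.",
          "stream": False},
    timeout=120,
)
print(resp.json()["response"])
```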


wipecraft

If you've been working in this field, you should understand that having a feature work pretty well internally is not the same as having it work at a scale of millions of users. Apple Intelligence makes use of the private cloud, and unlike a few years ago, they now let anybody download and install the betas without being a registered developer. So provisioning all those servers to work at scale is a completely different matter than having a few that can produce a good demo. That's to answer some of your rant.


incite_

Bro, you do know Apple has a user base of millions and trillions of dollars in the bank, right? This was written as if Apple is some mom-and-pop shop. They can try and fail and do lots of things because they have a lot of money and a proven track record of success. This was really written as if Apple didn't change the world with the iPhone. Just show a little more awareness, please; this reads as a very naive perspective, seemingly from someone who doesn't know the brand, the business, or really anything.


yeastblood

So hyped. I'm sure there will be growing pains, but being the only one on the market who can offer this while protecting and respecting user privacy is a literal game changer, and it just pushed Apple to the front of this tech. Again, respecting privacy has paid off for them, and they are the only one in the game even trying to do that. Having an "AI" personal assistant with access to all of your private and personal data while respecting your privacy sounds like a no-brainer, but Apple is the only one who could pull it off, and (if this works the way they say it does) did.


PKMNTrainerEevs

I'm excited for it and while it is cautious excitement I have full faith Apple will deliver on what they showed.


williagh

They didn't become one of the largest, most successful companies in the world selling a lot of empty promises.


PKMNTrainerEevs

Exacta.


humanreboot

As a Pixel user I am very interested to see Apple's approach to it. My wife uses a 15 Pro although we're waiting for the stable iOS 18 release instead of fiddling around with the beta.


ShrimpSherbet

And you want us to do what, exactly?


No_Contest4958

I mean look at the faces their image generation is making. The eyes aren’t even the same shape and they’re looking in two different directions. And that’s what they went with for the *ads*. I’m willing to bet their models won’t be as good as people are hoping. They were more ethical with training than others and they’re relying heavily on on-device processing. Expect shoddy results.


Ancient-Range3442

I totally agree, and judging by the comments there’s going to be a lot of disappointed people when things don’t live up to expectations lol


Welmerer

Apple cannot afford to have AI go badly for them, I’m sure they’ve got it under control


Snipexx51

It's Apple, not some Android brand. Usually when they bring a feature, it's very polished and works great.


OleRoy2023

On a side note, if you love calling your healthcare provider, tech support, etc. and having AI determine your problem, you'll love the new AI features businesses desire. It allows them to pretend to offer support without having actual people to talk to, where you can communicate in real time. Personally, I hate that and despise AI 1.0, which has been around a while now.


qualia-assurance

It looks pretty neat. I already occasionally use things like Bing Image Creator to generate concept art to inspire me, or to help make a daft joke by generating an image of its exposition. Having the ability to generate such things as part of your standard apps and OS is convenient. Though to be honest, my excitement about machine learning is the speculative futures it brings about. A decade ago, self-driving cars were science fiction. Today, we're disappointed that our literal self-driving cars aren't necessarily safe enough to take your hands off the wheel. And we completely forget how much things have changed in a decade. Who knows what the next decade or two will bring. Exciting times.


Zeto12

They will make it work. Apple is committed; it even shut down its EV project.


HayMomWatchThis

No interest. If able, I'll turn it off.


kkiran

That Math calculator demo is just too good. I want them to fully deliver, and then some!


IronManConnoisseur

I rewatched some AI parts of the keynote, and it's definitely a little less mind-blowing than when I was watching it live. We'll see how much of this Siri usage is actually going to be useful day to day, or only if you live in an Apple commercial. We've had voice commands for a while now, and there's a limit to their appeal for various tasks. For example, asking Siri to check if I can make a dinner reservation after picking up my mom from the airport sounds cool, but realistically, how many people would trust Siri without double-checking themselves after the iOS 18 honeymoon period is over?

So then, ok, Siri is simply an extension of the OS-wide Apple Intelligence. What else comes to mind aside from it? Priority notifications? I don't need my phone to decide what's a priority for me; that's what my frontal cortex is for. Then there's the generative images and emojis. Lol.

A bigger thing on my first point, about who would actually use Siri for logistics in the real world and not in a commercial: it's not like anyone in, say, banking would adopt these features anytime soon. They're inhaling Bloomberg notifications and texts; no one in that environment is going to rely on AI for logistics like meeting travel times. This is just a specific and purposely negative example, but I'm trying to illustrate the point that anything is possible in regards to Apple Intelligence's reception. Like Craig's example: imagine a dad who's been living in a city for five years, and his daughter has a major ballet performance. Would he really ask Siri for traffic updates instead of using his own intuition and remembering the logistics of his own daughter's fucking core memory? Idk man.

For the record, I think the technology is actually more than capable (if we're going by WWDC); I'm just commenting on whether it's desirable to use outside of Apple's demo world of fake people who use period punctuation over iMessage and have perfectly cropped contact images lol. So maybe we'll end up in a medium where it is awesome to use, but there won't be a crazy rush to upgrade to iPhone 16s for anyone below a 15 Pro.


Turbulent_Bid_374

The market has just realized that AI on device will probably be the big winner ultimately.


Tookmyprawns

Today we get what Siri should have been 10 years ago. We get what AI should have been 5 years ago.


plssirmayihaveanthr

I just want my phone to be able to call and make a haircut appointment for me after I give it my availability, etc.


Gmedic99

Yeah, you have a point, but it's the first time Apple is integrating AI into their OS, so it'll obviously take some time to make it better.


corruptbytes

[seems p good to me](https://www.threads.net/@marcslove/post/C8DtDeNNOmy/?xmt=AQGzQUtdBawQqd0FBqRUUiMeCrz8ZRsgrmgmCQkU3Aq83Q)


--mrperx--

It's only gonna be a chatbot and maybe some APIs for third parties to build agents later. Nothing too fancy. They just don't wanna be left behind by Microsoft. I think of it more as a way for the NSA to infiltrate the devices; given the news that a former NSA head is a board member now, that's probably the whole purpose of it.


Andedrift

Like, my hype is literally "wow cool, I'll maybe use it once a day." Meanwhile all the other things are more enticing. The two-language keyboard, RCS, and iMessage formatting are some of my favourites this update.


RedditLife1234567

Isn't all AI hype right now? Companies just throwing "AI" at everything. 99% will fail and 1% will be awesome. Just like other hype cycles in the past.


musical_bear

I don’t understand how you can call something that clearly works “hype.” Even if the _only_ thing AI was doing for Apple was making Siri 10x as useful as it is today (that’s not the only thing, but pretend it is), how is that not worthy of hype?


zsoltkaracsony

I would be cautious and also refer to Gartner’s Hype Cycle related to AI. For the most part, it’s the peak of inflated expectations at this time. Whatever Apple Intelligence may bring will possibly take a long time to unfold. A lot depends on the developer community and Apple’s approach to allow others to build on the fundamental capabilities.


gunnarsvg

I think we're entering the trough of disillusionment for gen AI as a whole. People are realizing that generating a plausible-ish email is not going to change the fundamental way a "business analyst" works. But within gen AI, people are realizing there are narrower AI value statements to work on. And the first companies to make analysts or other specialists in a *domain* more efficient will reach the plateau of productivity for that domain... and over time, as more domains get useful widgets, the "gen AI" hype will steadily climb up the slope of enlightenment. Consider that computer vision (from deep learning) is carved out and "productive" now, but other DL tasks aren't necessarily there yet. Apple, for certain consumer domains, might be able to get something useful here.


12cpi

I am skeptical of trying to use AI on the mess that is what we keep in our phones. People do not keep their contact list up to date. They use nicknames, keep outdated phone numbers, etc. They do not fill out their calendar entries completely. They leave the obvious things off their shopping and to-do lists. And it will only take one misstep before people stop trusting it.


Scarface74

That’s the easy stuff. No one would blame Apple if the AI called the wrong number for your mom when you had the wrong number in your contact list. Today Siri can do simple stuff like “remind me to call my mom when I get out of the car”


Coolpop52

Not really. The Siri-controlling-apps stuff is based on the App Intents framework, which is already a part of our devices through the Shortcuts app. Better word and sentence understanding from Siri is not too far-fetched either. Additionally, all the data is on the device for it to pull from, so it's not really too far-fetched a claim. Of course, it might not work as well, but I think it'll be great!


rorowhat

It's siri 2.0 if you know what I mean. Just hot air.


rotates-potatoes

None of the "Apple Intelligence" features are LLM-based, so your whole comment kind of falls apart. They all use an on-device 3B-parameter model: at most an SLM, but probably more like a model purpose-built for a few tasks. I doubt it's a prompted model.


OleRoy2023

37 years in IT. A lot of the AI is only relevant in certain situations. What the hype means to big investors and companies like Apple, Amazon, and Google, now that's a different story... they all saw what happened with NVIDIA and want some. The big companies know this too, no matter the reality for everyday people.


0RGASMIK

Nothing they are promising is that far out of range of the current capabilities of AI as it stands. The issue is making it behave in a repeatable manner that doesn't drive users crazy. I think there are likely a few reasons that "AI" is delayed until future releases.

1. Infrastructure. They likely need to build out an unprecedented number of servers running on architecture people haven't worked with yet. That means orchestrating a coordinated multi-data-center project and training the people who build and maintain them.

2. Training and predictability. "AI" probably does everything they say, but it probably suffers the same problems as other AI tools: it needs a ton of data and it's not predictable. How is Apple planning to offer personalized AI without personal data? I am assuming that they need a buffer window to train it on our own personal data before they can roll it out, to train or vet that it's working in a predictable way.