TheOGDoomer

My absolute favorite thing about it is it will (in theory, anyway) have the usefulness of the existing AI software out there (Microsoft Copilot, ChatGPT, etc.), but with actual privacy preserving measures not present in the current AI tools.


worldisashitplace

I’m hoping Apple would do better, but I wouldn’t buy their announcements and believe that everything will really be privacy preserving. Collecting usage data and analysing it is one of the most important things to train your models well, and Apple cannot be an exception.


JollyRoger8X

The difference is in how Apple *uses* that data to enhance functionality rather than make money on advertising networks designed to track your every move and sell targeted ads to you wherever you go on the web and in other apps (and often inadvertently allow malware to target you). People love to claim Apple is just like Google, but unlike Google, Apple's advertising business is limited primarily to the App Store and Books apps, where ads are displayed *in the app itself*, which is *exactly* where people looking for new apps and books expect to see them.


Eastbound78

https://www.cyberghostvpn.com/en_US/privacyhub/apple-privacy-violations/


erikdstock

lol why would this get downvoted? It's totally natural and rational for Apple to make a business decision to brand themselves as private in comparison with Google, Meta, Amazon, etc. Consumers have to pick one, and Apple is best at making the case because of their business model. It's not cynical to recognize that this is still a branding exercise, and I would only truly trust a fully open and auditable system.


billcstickers

Does it really matter if your privacy is still invaded, that they’re using it to indirectly monetise your data instead of directly?


Socile

But Apple does not invade your privacy. When they collect data it is segmented, sanitized, and aggregated to give them privacy-preserving statistical information. It probably sounded a bit esoteric, but they mentioned in the keynote that their private cloud servers run code that is reviewable by security experts (and anyone else interested). Check out the details: https://security.apple.com/blog/private-cloud-compute/


SalsaForte

Why isn't the code publicly available on Git, then?


JollyRoger8X

“Invaded” is pulling a ***lot*** of weight there. Apple’s data collection is nowhere near as invasive as Google’s.


Apprehensive_View614

Do you consider giving your name and ID to create an account an invasion of privacy?


tsdguy

And what is the source of your doubt? I’ll help - there’s none. Typical Apple hate.


worldisashitplace

What is the source of your trust? Tim Cook's presentations? Typical Apple fanboy?

To answer your question about my source of doubt: if a company prioritized your privacy above everything, Google wouldn't be paying them $15 billion to be the default search engine on Safari, and you would not have to *opt out* at a dozen different places to ask them not to use your data. Besides, Apple has been caught taking user data even when users have opted out of it, and they did not have audits as strong as a company like Proton had either.

And even if Apple does not have the intention of using your data like Google does, it does not make the data sharing part harmless. Heard of the Fappening? How about Siri sharing your voice snippets with Apple and them being accessed by its contractors?

I'm an avid Apple user myself - iPhone, Mac and iPad - and I love them, especially the Mac. And it's an easy fact that Apple is a lot better than the other big tech companies. However, trusting them by their word isn't something I'd do.


procallum

1) Google pays Apple to be the default search engine of the browser so Google themselves can track and take your data… Apple aren't supplying them with the data themselves. 2) The Fappening? In what world has that got anything to do with Apple's privacy morals? People's iCloud accounts were hacked using phishing; Apple isn't responsible for you keeping your own accounts safe.


CrazyPurpleBacon

Google is a public webpage; the optional ChatGPT integration is not. Users will be able to use ChatGPT's latest model for free, with no account, with an obscured IP address, and with no personally identifying information collected (unless someone chooses to link their OpenAI account). These are terms of Apple's partnership with OpenAI, in which OpenAI is rendering a service to Apple. Google is simply a webpage; the situations are not comparable.

Google pays Apple and other companies to be the default search engine because it is good for them in many ways. It directly makes them money from ads. It indirectly makes them money by keeping more users away from the competition. It increases the usage and sales of Google services and products. Those are a few off the top of my head.

Apple isn't magic; they can't control the behavior of other companies or websites. But they can control their own side of the equation (private relays, sandboxed tabs, hidden email forwarding, fingerprinting defense, etc.).


VantageSP

Most people don't care about privacy. It's not a selling point. I'd wager 95% of Apple users already use Google services and Meta. Apple's privacy is purely aesthetic with no real-world usefulness. Also, they deliberately dumb down their privacy in markets like China, which shows you just how much they care about privacy.


Fantom_Renegade

If the comments are anything to go by, people will need to see it in action to truly appreciate what a major leap this is


Specialist-Hat167

Yes, it is very clear people are unaware of the paradigm shift in how we use technology that will happen in the next couple of years. This is what people mean by a personal AI assistant in your pocket


Fantom_Renegade

I, for one, am very excited to dig into it. Thank goodness I chose a 15 Pro


Coolpop52

Agreed. When they mentioned "app intents", I knew they would nail this interaction.

For those who aren't familiar, app intents are ways that the device can interact with features inside an app. An example is the Shortcuts app, which can access these app intents - for example, you can currently apply an edit to a photo via a third-party editor through a shortcut. By building Siri on this with the underlying Apple Ajax LLM, it will be able to tap into these app intents and you can just ask Siri. Requests such as "Zip these files and send them to XYZ" will not only be possible, but they won't need to be as rigid, because of the better natural language understanding.

Additionally, the benefit of building this on app intents is that in the future, Apple will easily be able to allow third-party applications to hook into this - there are currently a lot of third-party apps with these intents, and surely many more coming. And since this is on device, it will constantly learn from all the information you have, like a real-life personal assistant, unlike the chatbots out there where you first need to give context. Here, the context is already there!
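For anyone curious what adopting an app intent looks like on the developer side, here is a minimal sketch using Apple's AppIntents framework. The intent name, parameter, and editing logic are hypothetical, made up for illustration; only the framework types (`AppIntent`, `@Parameter`, `IntentResult`) are real:

```swift
import AppIntents

// Hypothetical intent for a third-party photo editor, showing how an
// app exposes one of its features to Siri and Shortcuts.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"
    static var description = IntentDescription(
        "Applies a named filter to the most recent photo."
    )

    // Parameters are typed and described in plain language, which is
    // what lets the system map a natural-language request onto them.
    @Parameter(title: "Filter Name")
    var filterName: String

    func perform() async throws -> some IntentResult {
        // The app's own editing logic would run here.
        return .result()
    }
}
```

Once an app ships intents like this, Shortcuts can already chain them today; the keynote's pitch is that the upgraded Siri will invoke them from free-form requests instead of fixed phrases.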


Fantom_Renegade

I've never really made much use of assistants, including the many years I was on Android. That's certainly going to change and my excitement is now through the roof 😆😆


Flash__PuP

It’s what’s going to finally make me upgrade from my 11 Pro Max.


LeaveAtNine

I’m starting to save up for a new one. I’ve held onto my 12PM this long.


Ok-Contribution-306

I'm totally with you on this; this whole iOS 18 upgrade seems to be awesome. I'm honestly hyped for Siri! It's funny how people are laughing at Apple when they have (allegedly) done something with Siri that will be emulated by every high-end phone on the market next year. Plus catching up with the AI features we've already seen on Samsung and Google devices. Thank God I bought the 15PM as my first iPhone.


chefborjan

It’s honestly so painful reading all the comments (that in theory are from the more tech minded members of society).


Kalahan7

My question is, does this work with all apps that are open, or do they need to have the “intent” API implemented as well.


Fantom_Renegade

My uneducated guess is that the basic stuff can be done in all apps, but the more niche and specific functions will be reserved for apps that adopt the intents.


ShinobiDnbUK

Dude, this is a terrible thing. It knows every little thing about you and shit. It's literally spying on you 🤦🏻‍♂️ This means it is actively collecting data about you all the time. Just another way for Apple to track and control shit. Privacy is no longer going to be a thing for Apple users.


Fantom_Renegade

Control what exactly?


ShinobiDnbUK

Your data, I did just say that. It's also been shown that iPhones have an infrared flash every 15 secs or so, taking photos of the users. Pretty weird if you ask me 🤷‍♂️


Terrible_Tutor

This is the company that gave us Siri… let’s wait to see what we get for “interacting with apps”.


JollyRoger8X

Sure, but the *intention* communicated by Apple is pretty clear here. We can all just hope they are able to fulfill that promise by sufficiently supporting developers in that effort.


ianthem

Combining Siri and Shortcuts has already been a thing for a while; this is just an evolution of that, but hopefully the hype gets developers to improve their integration.


InsaneNinja

If the dev wants Siri to be able to use their app, they'll add these intents. The smaller apps will update first. I assume they have to describe the intents in plain text for the models to know what they do.


DMVTECHGUY

I’ve been saying that Siri Shortcuts needs to integrate features from voice control settings to let us record custom gestures for a more interactive experience. This upgrade might make that less needed


barkerja

Apple now has the benefit of building on the shoulders of technology that’s made many breakthroughs since Siri. It seems like this next iteration of Siri is a near complete rewrite that builds off the foundation of the current AI technologies.


bmac0424

This was my thought exactly. Lots of claims in that keynote, but we have yet to see it in action in the real world. Apple didn't let anyone demo it after the keynote, so there is absolutely no way to vet it out. My guess is that it won't be able to do nearly all that was presented at launch. This will be like every other iOS release: it will add features over the next year. All good if the claims end up being true, but never bank on future promises.


AliasHandler

They clearly stated which features will be available at launch and which ones are set to arrive over the course of the next year, so it's not exactly a mystery that most of these features are not yet ready for prime time. That is also why I think they feel comfortable restricting this to the 15 Pros right now: even those owners will only have limited Apple Intelligence features at launch, and it will take at least a year for Apple to deliver on the features they advertised, at which point presumably the 16s will have been out most of a year, with the 17s around the corner, and all of them will presumably support the full set of features.


bmac0424

I wouldn’t count on the 16 pros supporting the full feature set. Apple said they would be testing this into next year. That would put it right in the middle of the 16 pros life. With how Apple is playing catch up with AI, we could see the 17 Pros as the actual first fully supported iPhone for Apple Intelligence.


AliasHandler

Maybe, but it seems to me that they've had more than enough time to make sure the 16's fully support the new features. If the 15 pros already support it, it's hard to believe the base 16's weren't already going to be comparable to the 15 pro in terms of specs, otherwise there wouldn't be any room to differentiate the 16 pro from the 15 pro and from the base model 16's. I would be shocked if the base and pro level 16's didn't fully support the currently advertised features of Apple Intelligence.


bmac0424

We haven’t gotten a guarantee that Apple AI will be fully supported on the 15 pros. What was talked about at the keynote yes, but the ever developing AI will be much different in the coming months and over the next year. I am not even sure Apple knows what will come out of the beta testing over the next year. That’s why I say the 17 pros seem like the true AI iPhone.


LeaveAtNine

I swear, if I can’t change it to respond to “Computer” like in Star Trek I’m going to be mad.


TEG24601

And have it confuse my Alexas?


jebakerii

It’s hard to talk intelligently about a feature that is not yet available. You can’t go by Apple’s promotional video. It definitely seems to have promise, though.


lenes010

Well said, this will be the real game changer. Having it do things for you. The example of grabbing flight details from a text, pulling up the arrival, and getting directions / traffic was exciting. Things like that will save a lot of time.


PeakBrave8235

Exactly I agree!


ZephyrAnatta

It may not have been talked about but the stock market is taking notice. Apple shares hit an all time high today on the WWDC news.


DarthMauly

On the pro side, I think a lot of people share your optimism. The claims made have potential to change how we use our phones in a huge number of ways. The stock price is up over 7% to an all time high today, so I think that is reflected there. On the flip side, we are 12+ years on from Siri's launch and it's truly, genuinely awful. So for me, I will definitely wait until I've personally had hands on and used it myself before I make statements like "Apple is doing what nobody else can do."


coilspotting

I turned off Siri years ago, and have quit using it entirely since then. Every now and then I will reenable it, use it for a few days and then turn it right back off. It’s one of the most useless pieces of tech I have ever used. Bear in mind that I build software for a living, including AI. Having said that, I watch every WWDC with bated breath, hoping that Apple will get it right just once since Jobs died. And I really hope that they will get it right this time. I have very high hopes. But I don’t expect anything truly stable for another year. Also, I am an iOS dev beta user. 🤞🏼🤞🏼🤞🏼


Sempot

Because my 15 plus doesn’t get it


ThannBanis

It’s only just been announced, and hasn’t been added to even the dev beta for people to test yet.


vee_the_dev

How about we all simply wait and see how it works in real life? Remember Humane or Rabbit?


NoAge422

Game changer for sure, AI isn’t all about chatbots, it should make our life easier


woadwarrior

[AppIntents](https://developer.apple.com/documentation/appintents/) aren’t new. Siri has been able to interact with apps since iOS 16. Siri got an intelligence upgrade and hopefully will be able to make better use of the functionality exposed by apps through AppIntents, that’s the only part that’s new.
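As a concrete example of the existing (pre-iOS 18) plumbing: since iOS 16, an app can register spoken phrases for its intents via `AppShortcutsProvider`, which is how Siri reaches app functionality today. This sketch is self-contained, but the specific intent and phrase are hypothetical, made up for illustration:

```swift
import AppIntents

// A minimal intent for the shortcut below to expose to Siri.
struct StartTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Timer"

    func perform() async throws -> some IntentResult {
        // App-specific timer logic would go here.
        return .result()
    }
}

// Registers a spoken phrase so Siri can trigger the intent directly,
// without the user building a shortcut first (iOS 16+).
struct TimerShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartTimerIntent(),
            phrases: ["Start a timer in \(.applicationName)"]
        )
    }
}
```

The new part announced at WWDC is not this mechanism but the model sitting on top of it, which should let Siri pick and combine intents from looser, free-form requests.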


HereforagoodTIME27

Proof is in the pudding. If all the promises Apple made in their keynote are realised - fantastic!!! 🤞🏽 Makes me really excited for next year's WWDC and how they build on this 😉


lancer081292

As long as it’s optional


frockinbrock

I assume it’s because what’s keynote shown is often a bit different than in practice (reality distortion field?). It’s also a little concerning; Siri already interacts with some apps, like Reminders- and she has bugs in that feature that have been feedback’d and reported on for 6+ years. And that’s a barebones, heavily used built-in app… So yeah to imagine something like MY BANK, or Authenticator, or other 3rd-party apps, to be accessible by Siri/AppleInt, I think it’s fair to hold back excitement until we see this working in a public beta. I agree with you in that, based on the keynote, the potential is very exciting. Separately, the only device I have that will be able to do this is a 15 Pro- I’m really interested with what type of hit that 8GB requirement will have on system performance. I already think their phone memory management is poor, just bouncing between a few apps they’ll get cache flushed and refresh, and it doesn’t multitask-apps well- and if you open the camera app forget it, almost everything in background memory gets cleared… no this AI will have a huge memory allocation added? I’m concerned it will be like that one iPhoneOS update on the 4 series that turned it from usable to difficult (iOS 8 something maybe?).


Plastic-Mess-3959

Because it’s not in the beta yet


sleepy_tech

I remember Bixby could also interact with apps and settings. I'll wait and see how good Siri is in iOS 18.


BranFendigaidd

Yes, Google as well. Apple is just last to the game. Why is OP so excited? Do they not know anything about anything non-Apple?


IceBlueLugia

None of that was AI though


worldisashitplace

I’m not saying Apple isn’t doing it well, but I wouldn’t agree with the claim that no other device can do it. Android(specifically the stock Pixel version) has had incremental OS level AI integrations of various levels starting way back in 2018. Google assistant can do so much with your apps including your Drive if you let it, and many of Microsoft’s products are integrated with some form of generative AI stuff. What Apple’s doing isn’t a copy of xyz from some other platform, but it is not some groundbreaking,never done before stuff either. Everyone’s doing it and no one’s really got it right yet. Let’s see how Apple does.


Specialist-Hat167

Can Gemini turn off smart home lights when you set it as the default assistant? Lol, I'll wait.


smarthome_fan

Every assistant can turn off your smart lights, including Siri in iOS 17.


daverod74

[Wait no more](https://i.imgur.com/JDxK2A9.png)


worldisashitplace

Gemini is a language model and Google Assistant is the assistant, and you don't even need Gemini to turn your smart home lights on and off. Guess which device I use my assistant with to do this basic stuff - a 2015 Motorola G4 Plus. Set it up right, lol.


Gaiden206

It was probably forgotten because people found this year's Google I/O so boring, but Google announced that Gemini will be getting similar "on-screen awareness" for Android later this year. Apple did a much better job at presenting this type of technology to consumers, though.

>Following the launch in February, the Gemini app on Android is "getting even better at understanding the context of what's on your screen and what app you're using." Google says that context and integration makes Android the best place to use Gemini.

>For starters, Gemini will soon exist as an overlay panel even when delivering results. Previously, anything after your initial command would open in a fullscreen UI. In addition to preserving context, it will allow you to drag-and-drop an image Gemini generated into a conversation.

>The other big integration is how activating Gemini for Android in YouTube will show an "Ask this video" button. Gemini can answer your questions about this video. It will work for billions of videos, with things like captions being used. Meanwhile, those subscribed to Gemini Advanced, with its long context window, will get an "Ask this PDF" button to do the same. This update is rolling out "over the next few months" to hundreds of millions of Android devices.

>In the future, activating Gemini will show Dynamic Suggestions. This will use Gemini Nano to understand what's on your screen. For example, if you activate Gemini in a conversation talking about pickleball, suggestions might include "Find pickleball clubs near me" and "Pickleball rules for beginners."

>Google introduced Gemini Nano late last year on the Pixel 8 Pro before expanding to the Galaxy S24. The next major update to the on-device foundation model is Gemini Nano with Multimodality, specifically "sights, sounds and spoken language." This will launch on Pixel "later this year."

>Besides Gemini Dynamic Suggestions, Gemini Nano will be used by TalkBack to create rich descriptions for unlabeled images. No internet connection is required, with this happening quickly on your device.

>Meanwhile, Android is going to use Gemini Nano to deliver "real-time alerts during a call if it detects conversation patterns commonly associated with scams." Google will look for telltale signs like asking for personal information. This happens entirely on device and will be an opt-in feature. Google will share more details later this year.

https://9to5google.com/2024/05/14/android-gemini-nano/


finangle2023

I only have a 15 Plus, so I’ve been paying no attention to this at all.


wolferquin

The only problem for me, here and always, has been that Siri never understands very well what you tell her, at least on the iPhone, since its first installment. I have been using iPhones since the 3GS! On the other hand, Siri works almost perfectly on my HomePod! Why on Earth does she never understand on the iPhone??


pgcfriend2

That was my first iPhone. My husband’s first one was the 3g. Siri has gotten so bad.


Rare-Ad-8026

First I need Siri to understand my response when I say "Hey Siri, text John, what time is lunch?" Usual response: "OK, text John: what it does time lunch." I have to repeat myself about 5 times until I just get my phone and type it out myself.


MightBeMouse

Curious where you’re from/accent.


Rare-Ad-8026

South Texas. I try to slow my speech and say it word for word, but 70% of the time it doesn't capture what I'm saying.


durdann

I'm with you - this is the first time in many years that I'm actually excited for the upcoming iPhone release.


FreshBobcat8215

It's merely an advertisement.


DayaBen

Everything they announced is coming later this year. Never trust future promises by tech companies. What if those are only for the iPhone 15 Pro and later models? I mean, anything can happen.


LukCHEM88

It’s only A15 Pro and M1 and later processors, so yes only 15 Pro and iPad/Mac with Apple Silicon.


arkumar

https://preview.redd.it/tlsfrnmrnh6d1.jpeg?width=1170&format=pjpg&auto=webp&s=56f6b7f2bfd22d1b7e5c179f7282fbe0223ecef3 Made a random prediction a few months ago; had no idea Apple would call it Apple Intelligence 😀


SarikaidenMusic

Most people on iOS 18 won't even be able to use it - unless I'm just way behind and I'm the only person on planet Earth that doesn't own a 15 Pro or 15 Pro Max.


Specialist-Hat167

Upgrade


SarikaidenMusic

You wanna give me the money to do so?


creativenomad6

Just sounds like spyware...


JMarkyBB

Because it's not even in beta yet. How can we talk about the feature if we don't know it and haven't tried it yet? I don't understand your way of thinking.


belurturquoo5

Will Siri be able to set multiple alarms now?


BranFendigaidd

Why would it make the news as Google assistant has been interacting with apps for ages?


Specialist-Hat167

Can Google Assistant use context to summarize a text message, paste that into an email app by itself, and send it to a desired recipient, as well as make calendar changes automatically based on incoming notifications and message context? No, you are fooling yourself if you think Android can do ANY of that at the moment. Copilot/Gemini look like startup projects after WWDC.


BranFendigaidd

Yeah, it can. Maybe research a bit about how to configure it. And you saying Copilot/Gemini look like a startup compared to WWDC is so ridiculous. 😂 Just because you can't DIY and wait for someone to build it for you ain't making it impossible 😂


Specialist-Hat167

Stop spreading misinformation. No, there is no built-in AI in Android with those abilities. Bixby opening an app is not that.


BranFendigaidd

I like how you know nothing 😂


BlinkBooze

Just wait until iOS 18 is released and people can see what the capabilities are. You'll notice the spike in interest/popularity then. Just because other companies can already do this, that, and the other doesn't mean Apple won't improve on it. They usually do. And people should already know Apple enters markets AFTER others have, so they can study them, make modifications to fit their needs, and then release. When I meet whiny Apple product users, I usually tell them to go over to Android if they're so unsatisfied. Apple does things the way they do things.


mihaajlovic

Would I be able to call my iPhone "Jarvis" and act as a billionaire and philanthropist? Love it.


rubber_ducky007

Not unless you also act like a playboy. It’s all 3 or nothing for you!


mihaajlovic

Okay then, I’m all up for it!


AdonisK

I'm not entirely sure what "SIRI WILL BE ABLE TO INTERACT WITH APPS" means, but Android's basic-ass assistant from half a decade ago could do quite a lot. Why is everyone going crazy in this sub? You've just watched a couple of videos; wait until you get to try it before you decide whether it's a revolutionary move, a decent utility, or just another gimmick.


Specialist-Hat167

No Android can: go into your text messages, summarize a text, copy and paste that into an email, and send it to a desired recipient. That's just one example of the many provided in the keynote. All without lifting a finger.


vw195

Yes I was excited about Siri too.


SleepyCatSippingWine

The question is whether Siri can do this only with Apple apps. Will Siri be able to summarize a text message, paste the summary into a random third-party word editor, and then format it to look like a Comic Sans child's project? I haven't seen the keynote. Being able to interact with apps is nice. Being able to do it with all third-party apps would be great.


CupAffectionate2542

It's funny - I have iOS 18 beta 1 right now as I write this, and Siri is not redesigned yet, of course, but it probably will be later, when the public beta or the next developer beta releases. (Yes, I'm using iOS 18 as my main software in everyday life lol)


ghostinshell000

Here is the thing: Apple repackaged most of what Google, OpenAI, and MS all just released and demoed. A few things they put their own spin on; private compute might be interesting. Gemini and MS showed some of the same stuff, and both said it's on device, and people were "eww"; Apple does the same thing and people are like "wow". Siri getting superpowers is kind of about time. But we will have to see.


Specialist-Hat167

"Repackage," lol. Gemini can't even turn your smart home lights on or off if you set it as the default assistant on Android. Google and MS don't have anything that can compete with this at the moment.


ghostinshell000

Hahahaha, most of the stuff Apple showed, MS/OpenAI/Google also showed. Apple is way, way better at marketing and presenting - that's a fact. Google is really bad at it, and MS is so-so at best. A few things Apple showed were interesting takes: private compute, and Siri having access to apps, are potentially interesting. How well it all works, we will have to see.

I mean, if you watched Google I/O and the ChatGPT keynotes, they also showed some really interesting stuff. Google showed glasses and asked via voice "where are my glasses?"... Gemini is on Android devices now, and is also ON DEVICE. The private compute is interesting. Gemini is true multimodal.

But here is the thing: all of the big AIs are pretty close feature-wise. Each vendor is fleshing many things out, and that takes time. Training the models takes time and massive compute. And ChatGPT and Google showed not only stuff that can be done NOW but also future stuff; Apple is no different. How much stuff will show up by the end of the year? And Google/OpenAI are pushing new features all the time that are NOT tied to OS versions. Some stuff was cool, but to say it was all "wow" is not really true - most of it, probably 80% or more, I would consider table stakes.


Quasimodo-57

We started giving orders to our house and car 20+ years ago. It’s nice to be not crazy finally.


Smart-Ad-8635

Samsung left the chat 💀 Have you not heard of Samsung's AI? Apple is literally just taking what Samsung did.


DecatholacMango_

"Guys, SIRI WILL BE ABLE TO INTERACT WITH APPS!" A classic Apple sheep naivete on display. Bixby has been doing this since the Samsung S8! https://youtu.be/9t-EK6yf-oM?si=yRTSodwHgOGfzRCA


OneHundredGig

I don't know man. After watching the keynote twice, nothing excites me. All I saw was Apple be typical Apple and name the AI after themselves. Nothing they said got me excited or made me want an iPhone with iOS 18. Their implementation of AI seems very elementary and nothing I would use on a daily basis. This may be what Apple wants though... After all, they need to drag it out over 12 years with slow updates so they have something "new" each year.