
It changes nothing about the impressiveness (or lack thereof) of the feature.

Detecting an appointment from an email doesn't even require AI.
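
For example, Foundation's NSDataDetector has done rule-based, on-device date extraction for well over a decade; presumably it's the same data-detector machinery behind Mail's existing behavior. A minimal sketch (the example message and the printout are my own, not anything Apple documents):

    import Foundation

    // Rule-based, on-device date detection -- no ML model involved.
    // NSDataDetector resolves relative phrases like "tomorrow at 12:30"
    // against the current date.
    let body = "Yo let's meet tomorrow at 12:30?"
    let detector = try! NSDataDetector(
        types: NSTextCheckingResult.CheckingType.date.rawValue)
    let range = NSRange(body.startIndex..., in: body)
    for match in detector.matches(in: body, range: range) {
        if let date = match.date {
            // A mail or messages client could offer "create event" here.
            print("Detected date:", date)
        }
    }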

You're also over-indexing on the fact that some processing will be done on device. The rest will go to Apple's servers just the same as with Google, and you will never know how much does or doesn't.



Apple Mail has been able to detect appointments and reservations from email for years, just like Gmail -- and at least in my experience, Apple Mail pulls more useful information out of the mail when it creates the calendar entry. What they showed today is, in theory, something different. (I presume the difference is integrating it into the Siri assistant, not the mail application.)


The "AI" bit (a word they didn't mention during the keynote BTW) is the processing of a natural language user command to something the existing ML model can understand.

Most of the things shown during the keynote can already be done on older iPhones, on device, but they need to be "talked to" like a computer, in precise commands, not in natural language that's less than perfect.


> Most of the things shown during the keynote can already be done on older iPhones, on device, but they need to be "talked to" like a computer, in precise commands, not in natural language that's less than perfect.

That's only half true. If you get a text saying "Yo let's meet tomorrow at lunch?", the current software will already offer to create an event from it, so even now it handles less-than-perfect phrasing.

Now the real question is: does getting the next 5% that wasn't possible before justify potentially sending all your data to Apple's servers? I think the answer is a pretty resounding "fuck no".

Overall, the announcement is an extremely low-value proposition (does anyone really use their stupid Bitmoji thing?) but asks for a LOT from the user (a vague "hey, some stuff will be sent to our servers").



