The "AI" bit (a word they didn't mention during the keynote, BTW) is translating a natural-language user command into something the existing ML model can understand.
Most of the things shown during the keynote can already be done on-device with older iPhones, but they need to be "talked to" like a computer, not addressed in natural language that's less than perfect.
> Most of the things shown during the keynote can already be done on-device with older iPhones, but they need to be "talked to" like a computer, not addressed in natural language that's less than perfect.
That's only half true. If you get a text saying "Yo let's meet tomorrow at lunch?", it will offer an option to create an event from it, so even now it works with imperfect phrasing.
Now the real question is: does getting the extra 5% that wasn't possible before justify potentially sending all your data to Apple's servers? I think the answer is a pretty resounding "fuck no".
Overall the announcement is an extremely low-value proposition (does anyone really use their stupid Bitmoji thing?) but asks a LOT of the user (a vague "hey, some stuff will be sent to our servers").