My understanding is that it will have the full context of the data on your phone in order to be "useful".
We have yet to see whether that means only the data you've invoked AI features on, or the totality of your emails, notes, messages, transcripts of your audio, etc.
From the presentation, it sounds like the on-device model determines what portion of the local index is sent to the cloud as context, but the system is designed so that none of that index is stored in the cloud.
So (as I understand it) a query like "What time does my Mom's flight arrive?" could read your email and contacts to find the flight on-device, and if the cloud model is needed it would send the flight information, and only the flight information, to answer the arrival-time question.
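To make that concrete, here's a rough sketch in Swift of what that flow might look like. Every type and function name here is hypothetical, invented for illustration (this is not Apple's actual API), and the naive keyword match stands in for whatever relevance ranking the on-device model really does:

```swift
import Foundation

// Hypothetical sketch: none of these names are real Apple APIs.
struct IndexedItem {
    let source: String   // e.g. "Mail", "Contacts"
    let text: String
}

// On-device step: search the local index and keep only the snippets
// relevant to the query. Nothing else is considered for upload.
func relevantContext(for query: String, in index: [IndexedItem]) -> [IndexedItem] {
    let keywords = query.lowercased().split(separator: " ").map(String.init)
    return index.filter { item in
        keywords.contains { item.text.lowercased().contains($0) }
    }
}

// The cloud request carries the query plus only the selected snippets.
struct CloudRequest {
    let query: String
    let context: [String]
}

let localIndex = [
    IndexedItem(source: "Mail", text: "Mom's flight UA 312 arrives 6:40 PM"),
    IndexedItem(source: "Notes", text: "Grocery list: eggs, milk"),
]

let query = "What time does my Mom's flight arrive?"
let selected = relevantContext(for: query, in: localIndex)
let request = CloudRequest(query: query, context: selected.map(\.text))
print(request.context)   // ["Mom's flight UA 312 arrives 6:40 PM"]
```

The point being that the filtering happens before anything touches the network: the grocery note never appears in the request.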
If not, and if you don't want practically every word you type to end up on someone else's computer (since that's all the cloud is), you'll have to drop iOS.
For me, that leaves a choice between a dumbphone and GrapheneOS. I'm just thrilled with these choices. :/