
It was a very quick mention, but Siri will now have a text button directly on the lock screen.

If we assume AI will get even 3-4x better, then at a certain point I can't help but think this is the future of computing.

Most users on mobile won't even need to open other apps.

We really are headed for agents doing mostly everything for us.



Except the Intent API is completely crippled. Maybe the next big OS will just let the AI parse existing menus and figure out all the potential actions an app can take. Some actions need complex objects, so we need a new general mechanism for AIs to connect to 'exported functions'.
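
For context, and assuming the "Intent API" here means Apple's App Intents framework: exposing an action today looks roughly like the sketch below. The intent, parameter, and dialog are made-up names for illustration, not anything Apple announced.

    import AppIntents

    // Hypothetical example: exporting an "order coffee" action so Siri /
    // Apple Intelligence could discover and invoke it.
    struct OrderCoffeeIntent: AppIntent {
        static var title: LocalizedStringResource = "Order Coffee"
        static var description = IntentDescription("Places a coffee order.")

        // Structured input; richer custom types have to be modeled up front
        // as AppEntity/AppEnum conformances.
        @Parameter(title: "Size")
        var size: String

        func perform() async throws -> some IntentResult & ProvidesDialog {
            // Real app logic would go here.
            return .result(dialog: "Ordered a \(size) coffee.")
        }
    }

Which is exactly the limitation above: every action and every complex object has to be declared ahead of time, rather than the AI parsing whatever the app already exposes.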

Some general OS rethinking is overdue. Or maybe Android is ready for this? Haven't looked into it since they made development impossible via Gradle.

Despite this negativity, the announcements were better than expected; rebranding AI as "Apple Intelligence" is bold and funny. But the future will belong to general agents, not the hardcoded one presented here.


Android theoretically has a pretty rich intent API, but like anything on Android, adoption is a big meh.


Siri already has an optional text button on the lock screen. They changed the shortcut, though. For me in iOS 17 it's a long press on the side button.


And with its direct integration into Siri, ChatGPT will be available for free to anyone using iOS, without an account. Interesting.



