A few weeks ago, while listening to music through my AirPods as I developed an app, I noticed their spatial audio feature. I started wondering what else reverse engineering could unlock, including the possibility of using AirPods as a motion controller. So I built the world's first AirPods-controlled game, where head movements drive the motion. In fact, it's not just about AirPods: it's a game that uses a headset as a motion controller. And today it was approved on the App Store. Who knows, maybe the AirPods Arcade era has just started? :)
This is an amazing idea! Even more surprising is the fact that Apple approved it; usually they're not fans of features being used in ways they weren't intended for...
We made a cooking thermometer that plugged into the iPhone’s headphone jack before Bluetooth LE took off. Did not run into trouble with the app review team.
This reminds me of that ripe, beautiful period after the iPhone was released, when all the apps were these weird, idiosyncratic, almost game-like offerings that weren't tied to platforms and either just brought amusement to a user or solved a simple problem. They were often free or cost $0.99 to $1.99, and there were no subscriptions.
At any rate you earned your upvotes today tanis. Keep at her!
Yeah it was really a magical moment. I think the recipe for such moments is:
1. One piece of technology becomes fairly widely accessible all at once
2. The technology MUST be physical (a device, etc.)
GenAI is certainly having its incredible moment now, but without a physical layer to connect people through it doesn’t have that same edge.
Compare with something as niche as the Kinect: for the engineering crowd, at least in my experience, the excitement around it was maybe less intense (no promise of money), but more crystalline.
Yeah man, I'm with you on the Kinect, and the WiiMote too. I remember how geeked out my family was when we finally got it to work as a mouse on our computer. It was, once again, "a hell of a time to be alive."
Without being simple-minded, there might be fertility in "can't we get the Kinect/WiiMote to control an AI?" Couldn't I swing my device around (giggidy) and have the AI analyze the movements to do something "novel"? Food for thought. Thanks for commenting, wellthisisgreat.
This is the kind of creative, out-of-the-box thinking that official technology is lacking right now. Many times we don't need new technologies; what we need are new ways to use what we already have.
First off, kudos and congrats on the launch; it seems like a fun idea! I'm curious, since you mentioned reverse engineering: how difficult was it to retrieve the raw gyroscope data from the AirPods? AFAIK there is no API to access this information, right?
Put a sine wave emitter (or multiple) on the scene. Enable head tracking. Analyze stereo sound at the output. Mute output. There you go: you now can track user’s head without direct access to gyroscope data.
Apple does not secretly analyze sine waves to infer head motion. Instead, AirPods Pro, AirPods Max, and third-generation AirPods include actual IMUs (inertial measurement units), and iOS exposes their readings through Core Motion.
The technique you mention is a known research approach called acoustic motion tracking (some labs use inaudible chirps to locate phones or headsets), but it's not how AirPods head tracking works.
I think they're more so talking about measuring the attenuation Apple applies for the "spatial audio" effect (after Apple does all of the fancy IMU tracking for you). With a known signal amplitude going in, and the ability to programmatically monitor the signal coming out after the effect, you can reverse engineer a crude estimated angle from the delta between the two.
I don't think that's how this app works though, after installing it I got a permission prompt for motion tracking.
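For what it's worth, the delta idea above can be sketched as a toy model. This assumes a simple constant-power pan law standing in for Apple's real HRTF-based spatial rendering (which is far more complex and not modeled here); `render_pan` and `estimate_azimuth` are hypothetical helper names, not any real API:

```python
import math

def render_pan(theta_deg: float) -> tuple[float, float]:
    """Forward model: pan a unit-amplitude source to stereo using a
    constant-power pan law (L = cos(theta), R = sin(theta))."""
    t = math.radians(theta_deg)
    return math.cos(t), math.sin(t)

def estimate_azimuth(left_amp: float, right_amp: float) -> float:
    """Invert the pan law: recover a crude angle from the measured
    left/right output amplitudes. Only meaningful under the same
    constant-power assumption as render_pan."""
    return math.degrees(math.atan2(right_amp, left_amp))

# Round-trip: pan a test tone toward 30 degrees, then invert.
l, r = render_pan(30.0)
print(round(estimate_azimuth(l, r), 1))  # 30.0
```

A real attempt would have to contend with the actual HRTF filters, frequency-dependent attenuation, and whatever processing sits between the app and the output buffer, so treat this purely as an illustration of the inversion idea.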
Since the author of the app mentioned reverse engineering, analyzing audio is a way that immediately came to mind. It should be quite precise, too, only at the expense of extra CPU cycles.
I did not imply that there is no API to get head tracking data (even though Google search overview straight up says that). It’s mostly a thought experiment. Kudos for digging up CMHeadphoneMotionManager.
> Apple does not secretly analyze sine waves to infer head motion.
Duh. The mechanism I described hinges on Apple being able to track head movements in the first place in order to convert that virtual 3D scene to stereo sound.
Reminds me of when the first iPhones came out and developers were very creative for the time with the available features: flashlight apps, bubble level apps, Asphalt 4.
Okay so there is something called TrackIR and it has an open source implementation called OpenTrack. It’s used for simulators like flight or driving sims.
I’ve seen iPhone apps that use a neural net to determine the direction your head is facing using the camera. I think it networks with OpenTrack somehow, but I’m not sure about the details.
I wonder if you could use the AirPods to track your head and also direct audio from the PC through them with some networking!
I don’t know how to do Apple development but this world be a very cool application.
On my iPhone 16 Pro + AirPods Pro, when I start the game (tapping the screen when it says "Tap to start"), I get a message saying "AirPods Disconnected", even though Control Center on the phone reports them as connected.
I tried restarting the app, and disconnecting and reconnecting the AirPods, with no luck.
I don’t have the latest AirPods so I haven’t downloaded it, but I’ll put in a feature request to enable the iPhone to be used as a tilt controller. It doesn’t have to change the marketing that AirPods are the intended controller.
But even though I can’t play it, great job on doing something new and creative.
Can I ask about the tech stack: what did you use to build it? Just plain Swift and SceneKit? I did notice the app download is over 100 MB, which seems a bit excessive for this gameplay.
Theoretically, yes. There are apps for the blind that do this[1]; you set a "beacon" at the location you want to navigate to, and the apps use head tracking and 3D (HRTF) audio to indicate which direction the beacon is in.
Most of these are based on Microsoft's discontinued Soundscape app[2], which MS open sourced after its discontinuation.
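The core geometry those beacon apps rely on boils down to a signed relative bearing: the angle between where your head is pointing and where the beacon is, which then drives the audio panning. A minimal sketch of that calculation (hypothetical helper name, not code from any of these apps):

```python
def relative_bearing(beacon_bearing_deg: float, head_yaw_deg: float) -> float:
    """Signed angle from the listener's facing direction to the beacon,
    wrapped to (-180, 180]. Negative means the beacon is to the left;
    this value would drive the HRTF/stereo panning."""
    diff = (beacon_bearing_deg - head_yaw_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

# Beacon due east (90), listener facing north (0): 90 degrees to the right.
print(relative_bearing(90.0, 0.0))    # 90.0
# Same beacon, listener facing south (180): now 90 degrees to the left.
print(relative_bearing(90.0, 180.0))  # -90.0
```

The beacon bearing itself would come from GPS coordinates and the head yaw from the headphone IMU; the wrap-around handling is the part that's easy to get wrong.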