
Today during Connect, Meta announced the Wearables Device Access Toolkit, which represents the company’s first steps toward allowing third-party experiences on its smart glasses.
If the name “Wearables Device Access Toolkit” sounds a little strange, it’s for good reason. Unlike a plain old SDK, which generally lets developers build apps that run on a specific device, apps made for Meta’s smart glasses don’t actually run on the glasses themselves.
The “Device Access” part of the name is the key: developers will be able to access sensors on the smart glasses (like the microphone or camera) and pipe that data to their own app running on an Android or iOS device. After processing the sensor data, the app can send information back to the glasses for output.
For instance, a cooking app running on Android (like Epicurious) could be triggered by the user saying “Hey Epicurious” to the smart glasses. Then, when the user says “show me the top-rated recipe I can make with these ingredients,” the Android app could access the camera on the Meta smart glasses to take a photo of what the user is looking at, process that photo on the user’s phone, and send its recommendation back to the smart glasses as spoken audio.
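Here’s a minimal sketch of that flow. Meta hasn’t published the toolkit’s API, so every name below (GlassesSession, capturePhoto, speak, and the helper functions) is a hypothetical stand-in used only to illustrate the companion-app model: the glasses supply sensor data, the phone does the processing, and the result goes back to the glasses as audio.

```kotlin
import kotlinx.coroutines.runBlocking

// Hypothetical stand-in for whatever handle the real toolkit gives a phone app.
interface GlassesSession {
    suspend fun capturePhoto(): ByteArray   // read a camera frame from the glasses
    suspend fun speak(text: String)         // play spoken audio on the glasses
}

// Dummy on-phone processing steps so the sketch runs end to end.
fun recognizeIngredients(photo: ByteArray): List<String> =
    listOf("tomatoes", "basil", "pasta")    // pretend vision-model output

fun topRatedRecipe(ingredients: List<String>): String =
    "pasta al pomodoro"                     // pretend recipe-ranking output

// The phone app does the heavy lifting; nothing here runs on the glasses themselves.
suspend fun handleRecipeRequest(glasses: GlassesSession) {
    val photo = glasses.capturePhoto()              // sensor access over the link
    val ingredients = recognizeIngredients(photo)   // processed on the phone
    val recipe = topRatedRecipe(ingredients)        // also on the phone
    glasses.speak("The top-rated recipe you can make is $recipe")  // output back to the glasses
}

fun main() = runBlocking {
    // Fake session that simulates the glasses, just to exercise the flow.
    val fakeGlasses = object : GlassesSession {
        override suspend fun capturePhoto() = ByteArray(0)
        override suspend fun speak(text: String) = println("Glasses audio: $text")
    }
    handleRecipeRequest(fakeGlasses)
}
```

However the real API ends up being shaped, the division of labor is the point: the glasses act as a remote microphone, camera, and speaker, while the app logic and any heavy processing stay on the phone.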
In this way, developers will be able to extend apps from smartphones to smart glasses, but not run apps directly on the smart glasses.
The likely reason for this approach is that Meta’s smart glasses have strict limits on compute, thermals, and battery life. And the audio-only interface on most of the company’s smart glasses doesn’t allow for the kind of navigation and interaction users expect from a smartphone app.
Developers interested in building for Meta’s smart glasses can now sign up for access to the forthcoming preview of the Wearables Device Access Toolkit.
As for what can be done with the toolkit, Meta showed a few examples from partners who are experimenting with the devices.
Disney, for instance, built an app that combines knowledge of its parks with contextual awareness of the user’s situation, using the glasses’ camera to see what they’re looking at.
Golf app 18Birdies showed an example of surfacing contextually aware information about a specific golf course.
For now, Meta says only select partners will be able to bring their smart glasses integrations to the public, but it expects to open access more broadly starting in 2026.
The examples shown so far used only voice output as the means of interacting with the user. While Meta says developers can also extend apps to the Ray-Ban Display glasses, it’s unclear at this point whether apps will be able to send text, photos, or video back to the glasses, or integrate with the device’s own UI.