VRSUN

Hot Virtual Reality News

XR Design & Development

New Meta Developer Tool Enables Third-parties to Bring Apps to its Smart Glasses for the First Time

September 19, 2025 From roadtovr

Today during Connect, Meta announced the Wearables Device Access Toolkit, which represents the company’s first steps toward allowing third-party experiences on its smart glasses.

If the name “Wearables Device Access Toolkit” sounds a little strange, there’s a good reason: unlike a traditional SDK, which lets developers build apps that run on a specific device, apps made for Meta’s smart glasses don’t actually run on the glasses themselves.

The “Device Access” part of the name is the key; developers will be able to access sensors (like the microphone or camera) on the smart glasses, and then pipe that info back to their own app running on an Android or iOS device. After processing the sensor data, the app can then send information back to the glasses for output.

For instance, a cooking app running on Android (like Epicurious) could be triggered by the user saying “Hey Epicurious” to the smart glasses. Then, when the user says “show me the top-rated recipe I can make with these ingredients,” the Android app could access the camera on the Meta smart glasses to take a photo of what the user is looking at, process that photo on the user’s phone, and send back its recommendation as spoken audio to the smart glasses.

In this way, developers will be able to extend apps from smartphones to smart glasses, but not run apps directly on the smart glasses.
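To make that round trip concrete, here is a minimal sketch of the flow described above. The toolkit is still in preview and its API hasn’t been published, so every type and method name below (WearableSession, onWakePhrase, captureCameraFrame, speak) is a hypothetical stand-in; only the shape of the flow, wake phrase heard on the glasses, capture, on-phone processing, and audio sent back, comes from Meta’s description.

```typescript
// Hypothetical sketch only: the Wearables Device Access Toolkit preview is
// not yet public, so WearableSession and its methods are invented names.
// The flow matches Meta's description: sensors on the glasses, processing
// on the phone, output piped back to the glasses.

interface PhotoFrame {
  jpegBytes: Uint8Array;
}

// Assumed handle to a paired set of smart glasses, exposed to the phone app.
interface WearableSession {
  onWakePhrase(phrase: string, handler: () => void): void; // mic on glasses
  captureCameraFrame(): Promise<PhotoFrame>;               // camera on glasses
  speak(text: string): Promise<void>;                      // audio on glasses
}

function registerRecipeAssistant(session: WearableSession) {
  // 1. The glasses hear the wake phrase and notify the phone app.
  session.onWakePhrase("Hey Epicurious", async () => {
    // 2. The phone app requests a camera frame from the glasses.
    const frame = await session.captureCameraFrame();

    // 3. All heavy processing happens on the phone, not on the glasses.
    const ingredients = await detectIngredients(frame);
    const recipe = await lookUpTopRecipe(ingredients);

    // 4. Only the result goes back, as spoken audio on the glasses.
    await session.speak(`Top-rated match: ${recipe}`);
  });
}

// Stand-ins for the app's own logic; not part of the toolkit.
async function detectIngredients(_frame: PhotoFrame): Promise<string[]> {
  return ["tomato", "basil"]; // placeholder for an on-phone vision model
}

async function lookUpTopRecipe(ingredients: string[]): Promise<string> {
  return `Pasta with ${ingredients.join(" and ")}`; // placeholder search
}
```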

The likely reason for this approach is that Meta’s smart glasses have strict limits on compute, thermals, and battery life. And the audio-only interface on most of the company’s smart glasses doesn’t allow for the kind of navigation and interaction that users are used to with a smartphone app.

Developers interested in building for Meta’s smart glasses can now sign up for access to the forthcoming preview of the Wearables Device Access Toolkit.

As for what can be done with the toolkit, Meta showed a few examples from partners who are experimenting with the devices.

Disney, for instance, made an app that combines knowledge about its parks with contextual awareness of the visitor’s surroundings, accessing the camera to see what they’re looking at.

Golf app 18Birdies showed an example of contextually aware information on a specific golf course.

For now, Meta says only select partners will be able to bring their smart glasses app integrations to the public, but it expects to open access more broadly starting in 2026.

The examples shown so far used only voice output as the means of interacting with the user. While Meta says developers can also extend apps to the Ray-Ban Display glasses, it’s unclear at this point if apps will be able to send text, photo, or video back to the glasses, or integrate with the device’s own UI.

Filed Under: News, XR Design & Development, XR Industry News

Snap Spectacles Offer a Peek into All-day AR with New Geo-location Platform Update

March 17, 2025 From roadtovr

Snap, the company behind Snapchat, introduced its fifth-gen Spectacles AR glasses six months ago, and now the company is releasing a number of new features that aim to improve geo-located AR experiences.

Released in September 2024, Spectacles are still very much a developer kit—the AR glasses only have 45 minutes of standalone battery power—although Snap is one of the few companies out there actively engaging developers to build the sort of mainstay content you might find on the all-day consumer AR glasses of the near future.

While we’re not there yet, Snap announced that developers can start building Lenses (apps) integrating data from GPS, GNSS, compass heading, and custom locations, essentially giving devs access to geo-location data for better outdoor AR experiences.
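As a rough illustration of what a Lens can do with those inputs, here is a minimal TypeScript sketch (Lens Studio scripting uses JavaScript/TypeScript). The types and callback below are hypothetical stand-ins rather than Snap’s actual Spectacles location APIs; only the inputs themselves, GPS/GNSS fixes, compass heading, and developer-defined custom locations, come from the announcement.

```typescript
// Hypothetical sketch: GeoFix, CustomLocation, and onLocationUpdate are
// invented names for illustration, not Snap's actual Spectacles APIs.

interface GeoFix {
  latitude: number;   // degrees, from GPS/GNSS
  longitude: number;  // degrees
  headingDeg: number; // compass heading, 0 = north, clockwise
}

// A developer-registered custom location, e.g. a golf tee or a tour waypoint.
interface CustomLocation {
  name: string;
  latitude: number;
  longitude: number;
}

const toRad = (d: number) => (d * Math.PI) / 180;

// Great-circle distance in meters (haversine): plenty accurate for deciding
// when to surface an AR marker.
function distanceMeters(a: GeoFix, b: CustomLocation): number {
  const R = 6371000; // mean Earth radius in meters
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) *
      Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Initial bearing from wearer to waypoint, degrees clockwise from north.
function bearingDeg(a: GeoFix, b: CustomLocation): number {
  const dLon = toRad(b.longitude - a.longitude);
  const y = Math.sin(dLon) * Math.cos(toRad(b.latitude));
  const x =
    Math.cos(toRad(a.latitude)) * Math.sin(toRad(b.latitude)) -
    Math.sin(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) * Math.cos(dLon);
  return ((Math.atan2(y, x) * 180) / Math.PI + 360) % 360;
}

// Show a waypoint marker once the wearer is close and roughly facing it.
function onLocationUpdate(fix: GeoFix, waypoint: CustomLocation) {
  const close = distanceMeters(fix, waypoint) < 30;
  const offAxis =
    Math.abs(((bearingDeg(fix, waypoint) - fix.headingDeg + 540) % 360) - 180);
  if (close && offAxis < 45) {
    console.log(`Showing AR marker for ${waypoint.name}`);
  }
}
```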

Snap has highlighted a few sample Lenses to show off the integration, including Utopia Labs’ NavigatAR, which guides users with Snap Map Tiles, and Path Pioneer, which lets users create AR walking courses.

Geo-location data also helped Niantic bring multiplayer to Peridot Beyond, its AR pet simulator exclusively for Spectacles. The recent update also connects Spectacles with the mobile version of Peridot, allowing progression within the AR glasses experience to carry over to mobile.

Similarly, Snap has also worked with Wabisabi to integrate SnapML, Snap’s machine learning framework, into Doggo Quest, the gamified dog-walking AR app, letting you overlay digital effects on your pooch while the app tracks metrics such as routes and step counts.

Today’s update comes with a few more platform features too, including the ability to easily add leaderboards to Lenses, an AR keyboard for hand-tracked text input, and improved ability to open Lens links from messaging threads.

The update also features three new hand-tracking capabilities: a phone detector that identifies when a user is holding their phone, a grab gesture, and refinements to targeting intent that reduce false positives while typing.
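A sketch of how a Lens might consume those signals is below. The event names and payloads are invented for illustration and may not match Snap’s actual Spectacles hand-tracking interfaces.

```typescript
// Hypothetical sketch: these event names and payloads are invented for
// illustration, not Snap's actual Spectacles hand-tracking APIs.

type HandEvent =
  | { kind: "phoneDetected"; holdingPhone: boolean } // phone detector
  | { kind: "grab"; strength: number }               // 0..1, how closed the hand is
  | { kind: "targetingIntent"; confidence: number }; // intent score for targeting

function handleHandEvent(e: HandEvent) {
  switch (e.kind) {
    case "phoneDetected":
      // Suspend gesture-driven UI while the user is holding their phone.
      if (e.holdingPhone) console.log("Pausing gesture input");
      break;
    case "grab":
      // Treat a mostly closed hand as picking up the focused object.
      if (e.strength > 0.8) console.log("Grabbed focused object");
      break;
    case "targetingIntent":
      // A higher confidence threshold cuts false positives while typing
      // on the new AR keyboard.
      if (e.confidence > 0.9) console.log("Committing key press");
      break;
  }
}

// Example: a grab event strong enough to count as a pick-up.
handleHandEvent({ kind: "grab", strength: 0.92 });
```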

Additionally, Snap is kicking off its ‘Spectacles Community Challenges’ on April 1st, letting teams win cash prizes for submitting new Lenses or updating existing ones, judged on engagement, technical excellence, and Lens quality. The company says that each month it will give out over $20,000 across the top five new Lenses, the top five updated Lenses, and the top open-source Lens.

This follows Snap’s recent bid to bring Spectacles to more than just developers. In January, Snap announced it was making the fifth-gen device more affordable for students and teachers, bringing the price down to $594 for 12 months of subscription-free access, then $49.50 per month afterward for continued use of the glasses.

While Snap’s Spectacles remain a developer-focused device, these updates signal the company’s long-term ambition for mainstream AR adoption, where it will notably be competing with companies like Meta, Apple and Google. Better geo-located experiences are undoubtedly a vital piece of the puzzle to making AR glasses a daily necessity rather than a niche tool.

Filed Under: AR apps, AR Design, AR News, News, XR Design & Development
