
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY



Meta Aims to Double, Possibly Even Triple Smart Glasses Production This Year

January 15, 2026 From roadtovr

Meta and EssilorLuxottica are potentially set to double the expected production target for their smart glasses, according to a recent Bloomberg report.

Citing people familiar with the matter, the report says Meta has suggested increasing annual capacity to 20 million units by the end of 2026, as the company hopes to capitalize on growing consumer interest in smart glasses.

The report also maintains that, provided demand is strong, capacity could exceed 30 million units. Talks are still ongoing, according to Bloomberg.

Ray-Ban creator EssilorLuxottica noted in February 2025 that it was ramping up production capacity to 10 million annual units by the end of 2026.

Meta Ray-Ban Display & Neural Band | Photo by Road to VR

The 10 million figure already represented a significant step up from the two million units sold following the 2023 release of the first-gen Ray-Ban Meta smart glasses.

Currently, Meta and EssilorLuxottica offer two basic types of smart glasses: audio-only, AI-centric frames available in both Oakley and Ray-Ban styles, and Meta Ray-Ban Display, which includes a single full-color display embedded in the right lens.

This comes amid news that Meta is pausing the international rollout of the $800 Meta Ray-Ban Display smart glasses, which were set to arrive in the UK, France, Italy, and Canada early this year. The company maintains the pause was due to “unprecedented demand and limited inventory.”

Meanwhile, Meta is laying off around 10 percent of staff at its Reality Labs XR division, according to a New York Times report. The move is seen as a strategic shift in focus from VR and the company’s metaverse ambitions to AI and smart glasses.

Filed Under: AR News, Meta Quest 3 News & Reviews, News

Google Teases Next Android XR Device: XREAL’s Upcoming AR Glasses ‘Project Aura’

May 21, 2025 From roadtovr

When it launches later this year, Android XR is coming first to Samsung’s mixed reality headset, Project Moohan. Now, Google has tapped AR glasses creator XREAL to be the second with its newly unveiled Project Aura.

Google announced at its I/O developer event that China-based XREAL will be the second device officially slated to run Android XR, the company’s forthcoming XR operating system currently in developer preview.

The companies describe the optical see-through (OST) device, codenamed Project Aura, as “a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR.”

Information is still thin, though XREAL says Project Aura was created in collaboration with Google and chipmaker Qualcomm, and will be made available to developers “soon after” the launch of Project Moohan, which was recently confirmed to arrive later this year.

Image courtesy XREAL

XREAL hasn’t released specs, although the company has a track record of pairing micro-OLED displays with birdbath optics, which differ from the more expensive waveguide optics seen in devices such as Microsoft HoloLens, Magic Leap One, or Meta’s Orion AR glasses prototype.

Birdbath optics use a curved-mirror system to deliver brighter, wider field-of-view (FOV), lower-cost AR displays, although this typically results in bulkier designs. Waveguides are often thinner but more expensive to manufacture, allowing more wearable form factors with better transparency; they also typically offer a lower FOV, although prototypes like Meta’s Orion are bucking that trend.

Like the Android XR glasses seen on stage at Google I/O, which are coming from eyewear companies Warby Parker and Gentle Monster, XREAL’s Project Aura is expected to feature built-in Gemini AI, allowing it to do things like real-time translation, AI assistant chats, web searches, object recognition, and contextual info display.

Choosing XREAL as its next Android XR hardware partner makes a good deal of sense. Founded in 2017, XREAL (previously Nreal) has developed several generations of AR glasses over the years, along with its own custom Android launcher, Nebula, which handles native AR experiences on Android devices.

Like previous XREAL devices, Project Aura is meant to be tethered rather than standalone. It’s not yet clear which external device Project Aura will rely on to run Android XR, be it a standard smartphone or a dedicated compute ‘puck’ like the XREAL Beam.

That said, XREAL says they’ll be talking more about Project Aura at the Augmented World Expo (AWE) next month, which takes place June 10th – 12th in Long Beach, California. We’re going to present at AWE this year, so check back soon for more on all things XR to come from the event.

Filed Under: AR News, News

Snap Spectacles Offer a Peek into All-day AR with New Geo-location Platform Update

March 17, 2025 From roadtovr

Snap, the company behind Snapchat, introduced its fifth-gen Spectacles AR glasses six months ago, and now the company is releasing a number of new features that aim to improve geo-located AR experiences.

Released in September 2024, Spectacles are still very much a developer kit—the AR glasses only have 45 minutes of standalone battery power—although Snap is one of the few companies out there actively engaging developers to build the sort of mainstay content you might find on the all-day consumer AR glasses of the near future.

While we’re not there yet, Snap announced that developers can start building Lenses (apps) integrating data from GPS, GNSS, compass heading, and custom locations, essentially giving devs access to geo-location data for better outdoor AR experiences.
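To give a rough sense of what geo-located Lens logic involves, here is a minimal TypeScript sketch of the underlying math, assuming only a GPS fix and a compass heading as inputs. The names and structure are hypothetical illustrations, not Snap’s Lens Studio API: the sketch computes the distance and relative bearing from the wearer to a real-world point of interest, which is the kind of calculation a navigation Lens would use to decide where to draw an AR marker.

```typescript
// Hypothetical sketch (not the Lens Studio API): turn a GPS fix and compass
// heading into the distance and relative bearing to a point of interest.

interface GeoFix {
  latitude: number;   // degrees
  longitude: number;  // degrees
}

const EARTH_RADIUS_M = 6_371_000;
const toRad = (deg: number) => (deg * Math.PI) / 180;
const toDeg = (rad: number) => (rad * 180) / Math.PI;

// Great-circle distance between two fixes (haversine formula), in meters.
function distanceMeters(a: GeoFix, b: GeoFix): number {
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// Initial bearing from `a` to `b`, in degrees clockwise from true north.
function bearingDegrees(a: GeoFix, b: GeoFix): number {
  const dLon = toRad(b.longitude - a.longitude);
  const y = Math.sin(dLon) * Math.cos(toRad(b.latitude));
  const x =
    Math.cos(toRad(a.latitude)) * Math.sin(toRad(b.latitude)) -
    Math.sin(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) * Math.cos(dLon);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// How far the user must turn (degrees, -180..180) to face the point of interest.
function relativeBearing(user: GeoFix, headingDeg: number, poi: GeoFix): number {
  const delta = bearingDegrees(user, poi) - headingDeg;
  return ((delta + 540) % 360) - 180;
}

// Example: user in Long Beach facing due north, point of interest to the northeast.
const user: GeoFix = { latitude: 33.7701, longitude: -118.1937 };
const poi: GeoFix = { latitude: 33.7750, longitude: -118.1890 };
console.log(distanceMeters(user, poi).toFixed(0), "m away");
console.log(relativeBearing(user, 0, poi).toFixed(1), "deg to the right");
```

In practice, a Lens would refresh values like these each frame as new position and heading samples arrive, and hand them to the rendering layer to keep geo-anchored content pinned to the right spot.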

Snap has highlighted a few sample Lenses to show off the integration, including Utopia Labs’ NavigatAR, which guides users with Snap Map Tiles, and Path Pioneer, which lets users create AR walking courses.

Geo-location data also helped Niantic bring multiplayer to Peridot Beyond, its AR pet simulator exclusively for Spectacles. The recent update also connects Spectacles with the mobile version of Peridot, allowing progression within the AR glasses experience to carry over to mobile.

Similarly, Snap has worked with Wabisabi to integrate Snap’s machine learning framework, SnapML, into Doggo Quest, the gamified dog-walking AR app, letting you overlay digital effects on your pooch while the app tracks metrics such as routes and step counts.

Today’s update comes with a few more platform features too, including the ability to easily add leaderboards to Lenses, an AR keyboard for hand-tracked text input, and improved ability to open Lens links from messaging threads.

The update also adds three new hand-tracking capabilities: a phone detector that identifies when a user is holding a phone, a grab gesture, and refinements to targeting intent that reduce false positives while typing.

Additionally, Snap is kicking off its ‘Spectacles Community Challenges’ on April 1st, letting teams win cash prizes for submitting new Lenses or updating existing ones, judged on engagement, technical excellence, and Lens quality. The company says that each month it will give out over $20,000 across the top five new Lenses, the top five updated Lenses, and the top open-source Lens.

This follows Snap’s recent bid to bring Spectacles to more than just developers. In January, Snap announced it was making the fifth-gen device more affordable for students and teachers, bringing the price down to $594 for 12 months of subscription-free access, followed by $49.50 per month for continued use of the device.

While Snap’s Spectacles remain a developer-focused device, these updates signal the company’s long-term ambition for mainstream AR adoption, where it will notably be competing with companies like Meta, Apple and Google. Better geo-located experiences are undoubtedly a vital piece of the puzzle to making AR glasses a daily necessity rather than a niche tool.

Filed Under: AR apps, AR Design, AR News, News, XR Design & Development
