
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


Apple Reportedly Preparing Several Styles of Smart Glasses with Distinct Camera Lens Shape

April 13, 2026 From roadtovr

Apple is reportedly gearing up to release its first pair of smart glasses next year. A new report from Bloomberg’s Mark Gurman, however, maintains the Cupertino tech giant is now evaluating several styles and color combinations, as well as a distinct camera lens shape.

The report largely echoes previous rumors, which maintain the device (internally codenamed N50) could be unveiled at the end of 2026 or early 2027, with launch slated for sometime in 2027.

Like Meta’s audio-only smart glasses from Ray-Ban and Oakley, Apple’s smart glasses are reported to omit any sort of display. Instead, they’ll primarily focus on capturing photos and videos, making phone calls, listening to notifications and music, and interacting with an AI voice assistant—a better version of Siri coming to iOS 27, the report maintains.

According to Bloomberg’s sources, Apple’s design team now has at least four different styles in play, with plans to launch some or all of them. Styles are said to include:

  • A large rectangular frame, reminiscent of Ray-Ban Wayfarers
  • A slimmer rectangular design, similar to the glasses worn by Apple Chief Executive Officer Tim Cook
  • Larger oval or circular frames
  • A smaller, more refined oval or circular option

The glasses are said to be made from acetate, a higher-quality plastic than the thermoplastic used in Ray-Ban Meta smart glasses. Additionally, Apple is reportedly planning “many” color options in addition to exploring “a range of finishes, including black, ocean blue and light brown.”

Unlike Ray-Ban Meta, Apple’s first smart glasses are said to include a new camera lens shape, which Bloomberg maintains will be “vertically oriented oval lenses with surrounding lights,” a break from Meta’s circular camera lens design.

Facebook Ray-Ban Stories (2021) | Image courtesy Meta, EssilorLuxottica

Apple’s forthcoming glasses are reportedly a part of a broader, “three-pronged AI wearables strategy,” which is slated to include new AirPods and a camera-equipped pendant. Like with all of its accessories, Apple hopes to achieve an “instantly recognizable” design, which the company refers to as “icon” internally.

Notably, Apple’s smart glasses plans mark a stark departure from its initial XR strategy when it first formed the product division around a decade ago. Back then, the company reportedly hoped to develop three distinct categories: an iPhone-tethered AR headset with wireless controller, a high-end mixed reality headset, and standalone AR glasses.

The company has only released two iterations of Vision Pro though—ostensibly a different product from the high-end MR headset the company envisioned. Meanwhile, industry insiders suggest Apple is years away from the release of standalone AR glasses, making its audio-only smart glasses a first step of many.


Confused about the difference between smart glasses and AR glasses? Check out our handy primer on the key differences here.

Filed Under: AR Development, News, XR Industry News

Snap & Qualcomm Announce Long-term Partnership, Affirm 2026 Launch for ‘Specs’ Consumer AR Glasses

April 10, 2026 From roadtovr

Snap’s new XR subsidiary Specs Inc and Qualcomm announced a multi-year partnership for Snap’s upcoming AR glasses, with Qualcomm pledging Snapdragon chips for future iterations. The companies also reaffirmed that Snap’s next-gen Specs are coming “later this year.”

Specs Inc, which was formed by Snap in January to handle its XR efforts, is working with Qualcomm on what the companies call a “long-term strategic roadmap,” which aims to rapidly bring “on-device AI, cutting-edge graphics, and advanced multiuser digital experiences” to the glasses, the companies said in a joint press statement.

“Snap Inc. and Qualcomm Technologies have a strong track record of powering advanced immersive technology. This agreement builds on more than five years of innovation and collaboration, as Snapdragon platforms have powered multiple previous generations of Snap’s Spectacles,” the companies said.

Image courtesy Snap Inc, Niantic

Snap’s sixth-gen Specs are looking to appeal to consumers while also potentially frontrunning its largest competitors, including Meta, Samsung, Google, and Apple.

While Snap hasn’t shown off its next-gen Specs yet, the company seems to be leaning heavily into the device’s built-in AI, something that “uses its understanding of you and your world to help get things done on your behalf while protecting and respecting your privacy,” Snap said earlier this year.

“The next era of computing will be defined by devices that understand what you see, hear and say as well as context, and respond instantly to the world around you,” said Qualcomm President and CEO Cristiano Amon. “Our work on future generations of Specs will enable power-efficient interactive AR devices that deliver agentic experiences that feel natural, intuitive and integrate seamlessly into daily life.”

Aiming for release sometime this year, the next iteration of Specs will technically be the company’s second pair of AR glasses, following its fifth-gen release in 2024.

Besides Snap noting the new AR device will be smaller and lighter, feature see-through AR optics, and be powered by some form of Snapdragon XR SoC, its specs are largely still a mystery.



Filed Under: AR Development, ar industry, News, XR Industry News

Android XR Update Adds Deep Enterprise Support, Auto-3D Conversion & More

April 9, 2026 From roadtovr

Samsung and Google announced a major software update to Galaxy XR, the Android XR-based headset, which includes deep enterprise support, automatic 3D conversion of photos and videos, and more.

One of the biggest additions in the update, now available on Samsung Galaxy XR, is Android Enterprise, the Google-led initiative that provides built-in security, management, and app deployment tools for workplace-focused Android devices.

This includes support for fully managed deployments with flexible enrollment options like zero-touch, QR setup, and DPC provisioning, aimed at making large-scale rollouts easier.

It also includes enterprise app management, robust device controls, and hardware-level security to protect sensitive data and meet compliance standards, the company says in a press statement.

Photo by Road to VR

In addition to enterprise features, the latest Android XR update also includes improvements targeting everyday usability, such as customizable virtual keyboard positioning, desktop session restore for up to three apps after reboot, expanded accessibility tools such as single-eye tracking, and improved spatial alignment of on-screen content.

What’s more, the Android XR update now brings auto spatialization to Chrome and YouTube, which automatically converts 2D media into immersive 3D experiences. This comes in addition to the previously available auto spatialization feature for Google Photos.

Notably, as a part of the new update, Android XR is now set to receive regular software updates, including security patches, for up to five years.

Released in October 2025 for $1,800, Samsung Galaxy XR looks to fill a middle ground of price and features between Meta Quest 3 ($500) and Apple Vision Pro ($3,500).

Much like how Vision Pro taps into the library of iOS apps in addition to native visionOS apps, the Android XR-based Galaxy XR makes use of Android’s vast app ecosystem. Still, Galaxy XR is only the first full-featured XR headset to adopt Android XR, and it appears to still be filling out its software feature set as we speak.

Filed Under: Android XR News & Reviews, News, VR Development, XR Industry News

‘Black Mirror’ to Bring Show’s Tech Dystopia to Life in New Location-based VR Experience

April 2, 2026 From roadtovr

Black Mirror, the hit sci-fi anthology series on Netflix, is getting its own VR experience soon—set to debut at VR destination Infinity Experience in Montreal, Canada next month ahead of wider rollout.

Developed by VR studio Univrse and Banijay Live Studio, THE BLACK MIRROR EXPERIENCE is slated to mash up physical environments with VR headsets, drawing on themes of the award-winning television series.

According to a press statement, The Black Mirror Experience creates a scenario that will “force visitors to make [the] same choices” as seen in the show, which explores society’s complex relationship with technology and the moral quandaries often faced by characters.

Image courtesy Banijay

“Groups are invited to the exclusive opening of Phaethon’s showroom – a tech giant about to unveil its most ambitious creation yet: LifeAgent, a robot designed to simplify your life, understand your desires, and help you become your best self. At first, everything feels seamless. Reassuring. Almost perfect. Until it doesn’t,” the studio explains.

The experience supports up to six players (ages 12+), and will be offered in French, English, and Spanish. It’s slated to launch first at the Infinity Experience location in Montreal on May 21st, though more locations will be announced “soon,” the studio says.

Notably, Infinity Experience operates in seven cities across North America. In Canada: Montreal, Edmonton, Calgary, Quebec City, and Mississauga. In the US: Chicago and Atlanta.

Created by Charlie Brooker, Black Mirror returned for its seventh series on Netflix in April 2025, produced by Broke & Bones, with Brooker, Jessica Rhoades, and Annabel Jones as executive producers. Black Mirror is primarily owned by Banijay Entertainment.

Filed Under: Location-based VR, News

Meta Inches Into Health Wearables with New Food Logging Feature for Ray-Ban Smart Glasses

April 1, 2026 From roadtovr

Meta announced it’s pushing an update to Ray-Ban and Oakley Meta smart glasses that’s slated to make nutrition tracking easier by letting Meta AI visually suss out food before you eat it.

The News

Over time, the company says that a user’s food log will inform “increasingly personalized insights that get more useful, helping you make healthier, more informed choices.”

Meta says it will be somewhat of a manual process though, as users need to prompt Meta AI to log their food in addition to inputting specific nutrition goals.

Ray-Ban Meta (Gen 2) | Image courtesy Meta

While we’re not there yet, Meta says that in the future the glasses will be able to understand what you’re eating and automatically log your food, which in turn opens up even more personalized nutrition insights, since you won’t have to remember to log every meal.

For now though, the company envisions users asking Meta AI questions like “What should I eat to increase my energy?” which will output a suggestion based on your food log and fitness goals.

Meta says the new feature will be available to users aged 18+ in the US “soon” across all Ray-Ban Meta and Oakley Meta smart glasses, with its Meta Ray-Ban Display glasses getting the update sometime later this summer.

My Take

Meta doesn’t do health tracking; its smart glasses don’t track your heart rate, steps, activity, sleep (of course not), calories burned, O₂ levels—nothing.

Granted, they can link with Garmin smartwatches, which can do those things, although the glasses themselves essentially act as a sort of audio relay, repeating the info sensed and stored by the Garmin app, meaning Meta can’t really do anything truly useful with the bulk of your health data. Notably, Meta smart glasses don’t tie into Samsung Health or Apple Health either, putting the majority of users’ health data out of Meta’s reach.

Meta Ray-Ban Display Glasses & Neural Band | Image courtesy Meta

But it probably won’t always be that way. Meta seems to be leveraging what it can feasibly (and cheaply) do right now without having to cut any expensive licensing deals with dominant players in the smart watch segment.

The company does have a vector to get all of that data one day though. Meta Ray-Ban Display comes with a wrist-worn Neural Band controller that uses surface electromyography (sEMG) which lets users quietly write out messages and manipulate UI. I can imagine a near future where Neural Band has a packet of sensors similar to a smart watch, albeit without the display.

Provided Meta goes that specific route, the company wouldn’t need to integrate with existing health ecosystems at all for its future smart glasses. It will already have everything it needs to close the loop on what you’re eating and how you’re burning it off.

Filed Under: AR Development, News, XR Industry News

Meta is Releasing 2 New Ray-Ban Smart Glasses for Prescription Wearers, Starting at $500

March 31, 2026 From roadtovr

Meta and eyewear partner EssilorLuxottica announced two new “optical-forward” pairs of Ray-Ban Meta glasses, which are said to support nearly all prescriptions.

Ray-Ban Meta and Oakley Meta smart glasses can already be paired with prescription lenses, although the latest pairs of Ray-Ban Meta smart glasses are coming with new ergonomic features: overextension hinges, interchangeable nose-pads, and optician-adjustable temple tips, things designed to give users a more custom fit.

Ray-Ban Meta ‘BLAYZER’ model | Image courtesy Meta, EssilorLuxottica

In a blog post, Meta announced it’s offering two new frame styles: a rectangular ‘Blayzer Optics’ design available in two sizes (Standard and Large) and a more rounded ‘Scriber Optics’ frame. Both come with a Dark Brown charging carrying case, with pricing starting at $500.

Colors include Matte Black, Transparent Black, and Transparent Dark Olive, although Meta is also releasing seasonal colors, such as Transparent Matte Ice Grey and Transparent Stone Beige.

Ray-Ban Meta ‘Scriber’ model | Image courtesy Meta, EssilorLuxottica

Both new Blayzer and Scriber frames will be available for pre-order in the US starting today from Meta.com and Ray-Ban.com, as well as at optical retailers in the US and select international markets starting April 14th.

Meta also announced it’s releasing new lens and color options for Ray-Ban Meta (Gen 2) and Oakley Meta lines. New options include:

  • Vanguard Black with Prizm Black Lenses
  • Vanguard White with Prizm Rose Gold Lenses
  • Vanguard Black with Prizm Transitions® Ember Lenses (arriving later this Spring)
  • Vanguard Prizm Transitions Cobalt Lenses (arriving later this Spring)
  • HSTN Black with Prizm Dark Golf Lenses
  • HSTN Light Curry with Clear to Brown Transitions Lenses

Coming this spring and summer, Meta is also releasing three new limited-time seasonal colors for Ray-Ban Meta (Gen 2).

For the Skyler style: Shiny Transparent Peach with Transitions Brown Lenses. For Headliner: Matte Transparent Peach with Transitions Grey Lenses. For Wayfarer: Shiny Transparent Grey with Transitions Sapphire Lenses.

Filed Under: AR Development, News, XR Industry News

Meta Slated to Launch Two New Ray-Ban Smart Glasses, According to FCC Filing

March 27, 2026 From roadtovr

Meta and EssilorLuxottica appear to be preparing two new pairs of Ray-Ban smart glasses for launch, according to US Federal Communications Commission (FCC) filings from earlier this month.

As first reported by Janko Roettgers in his Lowpass newsletter, Meta hardware partner EssilorLuxottica has filed two new devices with the FCC which appear to be the next generation of their Ray-Ban smart glasses.

The FCC filing in question contains the names ‘Ray-Ban Meta Scriber’ and ‘Ray-Ban Meta Blazer’, describing them as “production units”, which could mean launch is fairly close.

Array of Meta smart glasses | Image courtesy Brad Lynch

Typically, FCC filings are one of the last stops before launch, as we saw with the last three hardware generations of Ray-Ban smart glasses, all three of which released less than a month after their respective FCC filings.

The company’s second-gen smart glasses released in 2025 include hardware refreshes of its popular Ray-Ban glasses, as well as Oakley Meta HSTN, Oakley Meta Vanguard, and the $800 Meta Ray-Ban Display glasses—the company’s first smart glasses to include a heads-up display.

As Roettgers points out, the new FCC filings are largely devoid of details or images of the devices; however, a charging case was mentioned in testing—an accessory provided with all generations of the companies’ smart glasses.

Model numbers for Blazer and Scriber, respectively RW7001 and RW7002, are also new, which suggests we’re dealing with a new hardware generation.

Additionally, the devices were tested using the Wi-Fi 6 U-NII-4 band (5.9 GHz). Notably, the company’s latest smart glasses use Wi-Fi 6E (6 GHz), which does not operate in the 5.9 GHz U-NII-4 band.

This comes amid a significant shift in priorities at Reality Labs, Meta’s XR division. In January, the company laid off at least 10 percent of staff at Reality Labs, as the company has doubled down on AI and smart glasses, and reduced spending on first-party VR content.

Meanwhile, the smart glasses segment has been very successful for the Meta-EssilorLuxottica partnership, with the Franco-Italian eyewear maker revealing earlier this year that it sold over seven million smart glasses in 2025 alone, effectively tripling its sales from all prior years combined.

Filed Under: News, XR Industry News

Meta Shows Confidence in EMG Input for Wearables by Funding Six External Studies

March 20, 2026 From roadtovr

Meta announced it has tapped six external teams to receive research grants to advance work on its surface electromyography (sEMG) based wristband controller.

Meta revealed in a blog post it’s launched a research funding initiative focused on improving how users learn and interact with sEMG systems, having chosen six universities out of 70 global submissions.

Each research group is set to receive $150,000 in funding, which includes teams at the University of Central Florida, University of South Florida, University of California, Davis, Newcastle University, University of British Columbia, and Northwestern University.

Meta’s wrist-worn neural interface relies on sEMG, which detects electrical activity in the wrist and hand and translates it into digital commands. The wristband serves as Meta Ray-Ban Display’s main input device, and the company hopes to answer a few questions with the studies, namely: how do people learn new sEMG-based controls, and how can onboarding be streamlined?

Wrist-worn XR Controller seen with Orion | Image courtesy Meta

The funded projects explore a range of challenges. Some focus on improving learning methods, such as comparing gamified training with step-by-step instruction, or developing systems that adapt to individual users over time.

Others aim to expand what sEMG can do, like enabling silent speech generation by translating muscle signals into synthesized voice, or increasing the ‘bandwidth’ of communication so users can issue more complex commands without disrupting natural hand movement.

A number of the proposed research topics include assistive applications, such as helping stroke survivors regain muscle control, or improving prosthetic limb operation through co-adaptive systems that learn alongside the user. You can see more about each study here.

This follows the release of Meta Ray-Ban Display last September, the company’s first pair of smart glasses with a built-in display. Priced at $800 and only available in the US for now, the smart glasses make use of the same input scheme first paired with Meta’s Orion AR prototype, which was revealed in late 2024.

This ostensibly shows Meta is pretty confident in the control scheme, viewing it as reliable enough for future (likely AR) devices. We’re looking forward to learning more as the research projects progress. Typically, we see papers either highlighted or released during SIGGRAPH, which is taking place in Los Angeles, California this year on July 19th – 23rd.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

XR Startup Lynx Appears to Enter Liquidation Proceedings Ahead of R2 Headset Launch

March 19, 2026 From roadtovr

The company behind Lynx has entered liquidation proceedings ahead of the launch of its upcoming Lynx-R2 XR headset, which is targeted at both consumers and enterprise.

According to French court documents, SL Process, the company behind Lynx, has officially entered judicial liquidation following a ruling by the Economic Activities Court of Nanterre, France.

The legal notice was published on the Official Bulletin of Civil and Commercial Advertisements (BODACC), the country’s public bulletin wherein binding legal status changes are published.

Under French insolvency law, judicial liquidation essentially means restructuring efforts have failed and survival is no longer viable, as assets and IP are typically sold off to cover debts.

Lynx R2 | Image courtesy Lynx Mixed Reality

Road to VR initially reached out to Lynx when a similar posting was made last week, but has yet to receive a comment. We’ll update when/if leadership responds to our request.

Notably, SL Process is what Lynx founder and CEO Stan Larroque calls in his personal blog a “shell company” which acts as a parent company to Lynx Mixed Reality.

While the exact reasoning behind the filing remains unclear, it may have something to do with Google reportedly pulling its support for Lynx-R2, which was initially supposed to launch running the Android XR operating system.

Lynx-R2 was slated to launch sometime later this year, featuring 126° horizontal FOV with unique aspheric pancake lenses, paired with a Snapdragon XR2 Gen 2 chipset, 16GB RAM, and full-color pass-through.

As noted by UploadVR in November though, Lynx revealed that Google “terminated Lynx’s agreement to use Android XR,” something the XR hardware maker called a “surprising turn of events” at the time.

If confirmed, the liquidation of SL Process could effectively mark the end of Lynx as an independent XR hardware maker, capping off one of the few European attempts to bring a standalone XR headset to market amid what Larroque characterized in 2024 as an “excruciating” fundraising environment.

Although the company managed to attract additional funding beyond R-1’s successful Kickstarter campaign in late 2021, which brought in $800,000 in crowdfunding, Crunchbase data indicates the French startup raised only $6.8 million to date.

Filed Under: News, VR Development, vr industry, VR Investment, XR Industry News

Meta Faces Lawsuit Claiming Ray-Ban Smart Glasses Sent Private Footage to Overseas Reviewers

March 10, 2026 From roadtovr

Meta is facing a class action lawsuit in the US over privacy concerns tied to its Ray-Ban smart glasses. The company is accused of sending private camera footage to a Kenya-based subcontractor for manual review to train its AI models.

Allegations stem from an investigative report from Sweden’s Svenska Dagbladet and Göteborgs-Posten, which is said to have uncovered a subcontractor in Kenya tasked with reviewing and labeling images and videos uploaded from the glasses.

Sources within the subcontractor report seeing videos of everything from sexual activity and the handling of financial information to a host of other private activities inside homes.

“In some videos you can see someone going to the toilet, or getting undressed. I don’t think they know, because if they knew they wouldn’t be recording,” a facility worker told Svenska Dagbladet.

Array of Meta smart glasses | Image courtesy Brad Lynch

These so-called ‘data annotators’ are said to manually process and tag images: “draw boxes around flower pots and traffic signs, follow contours, register pixels and name objects: cars, lamps, people. Every image must be described, labelled and quality assured,” the report maintains.

Following these revelations, a class-action lawsuit (via TechCrunch) was filed in a US federal court accusing Meta of misleading consumers about the product’s privacy protections.

“Meta chose to make privacy the centerpiece of its pervasive marketing campaign while concealing the facts that reveal those promises to be false,” the lawsuit states, further noting that Meta’s own “face anonymization” layer does not work to obscure the private nature of the transmitted videos.

Meta did not offer TechCrunch a comment on the litigation itself; however, spokesperson Christopher Sgro provided the following statement:

“Ray-Ban Meta glasses help you use AI, hands-free, to answer questions about the world around you. Unless users choose to share media they’ve captured with Meta or others, that media stays on the user’s device. When people share content with Meta AI, we sometimes use contractors to review this data for the purpose of improving people’s experience, as many other companies do. We take steps to filter this data to protect people’s privacy and to help prevent identifying information from being reviewed.”

While many use Meta’s smart glasses as AI-assisted sunglasses, the Ray-Ban smart glasses line can be specifically fitted with a variety of prescription lens types, allowing users to wear them all day as corrective glasses.

Filed Under: ar industry, News, XR Industry News
