
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


Meta Inches Into Health Wearables with New Food Logging Feature for Ray-Ban Smart Glasses

April 1, 2026 From roadtovr

Meta announced it’s pushing an update to Ray-Ban and Oakley Meta smart glasses that’s slated to make nutrition tracking easier by letting Meta AI visually suss out food before you eat it.

The News

Over time, the company says that a user’s food log will inform “increasingly personalized insights that get more useful, helping you make healthier, more informed choices.”

Meta says it will be somewhat of a manual process though, as users need to prompt Meta AI to log their food in addition to inputting specific nutrition goals.

Ray-Ban Meta (Gen 2) | Image courtesy Meta

While we’re not there yet, Meta says in the future glasses will be able to understand what you’re eating and automatically log your food, which in turn opens up even more personalized nutrition insights since you don’t have to remember to log every meal.

For now though, the company envisions users asking Meta AI questions like “What should I eat to increase my energy?” which will output a suggestion based on your food log and fitness goals.

Meta says the new feature will be available to users aged 18+ in the US “soon” across all Ray-Ban Meta and Oakley Meta smart glasses, with its Meta Ray-Ban Display glasses getting the update sometime later this summer.

My Take

Meta doesn’t do health tracking; its smart glasses don’t track your heart rate, steps, activity, sleep (of course not), calories burned, O₂ levels—nothing.

Granted, they can link with Garmin smart watches which can do those things, although the glasses themselves essentially only act as a sort of audio relay, repeating the info sensed and stored by the Garmin app, meaning Meta can’t really do anything truly useful with the bulk of your health data. Notably, Meta smart glasses don’t tie into Samsung Health or Apple Health either, putting a majority of users’ health data out of Meta’s reach.

Meta Ray-Ban Display Glasses & Neural Band | Image courtesy Meta

But it probably won’t always be that way. Meta seems to be leveraging what it can feasibly (and cheaply) do right now without having to cut any expensive licensing deals with dominant players in the smart watch segment.

The company does have a vector to get all of that data one day though. Meta Ray-Ban Display comes with a wrist-worn Neural Band controller that uses surface electromyography (sEMG) which lets users quietly write out messages and manipulate UI. I can imagine a near future where Neural Band has a packet of sensors similar to a smart watch, albeit without the display.

Provided Meta goes that specific route, the company wouldn’t need to integrate with existing health ecosystems at all for its future smart glasses. It will already have everything it needs to close the loop on what you’re eating and how you’re burning it off.

Filed Under: AR Development, News, XR Industry News

Meta is Releasing 2 New Ray-Ban Smart Glasses for Prescription Wearers, Starting at $500

March 31, 2026 From roadtovr

Meta and eyewear partner EssilorLuxottica announced two new “optical-forward” pairs of Ray-Ban Meta glasses, which are said to support nearly all prescriptions.

Ray-Ban Meta and Oakley Meta smart glasses can already be paired with prescription lenses, although the latest pairs of Ray-Ban Meta smart glasses are coming with new ergonomic features: overextension hinges, interchangeable nose-pads, and optician-adjustable temple tips, things designed to give users a more custom fit.

Ray-Ban Meta ‘BLAYZER’ model | Image courtesy Meta, EssilorLuxottica

In a blog post, Meta announced it’s offering two new frame styles: a rectangular ‘Blayzer Optics’ design available in two sizes (Standard and Large) and a more rounded ‘Scriber Optics’ frame. Both come with a Dark Brown charging carrying case, with pricing starting at $500.

Colors include Matte Black, Transparent Black, and Transparent Dark Olive, although Meta is also releasing seasonal colors, such as Transparent Matte Ice Grey and Transparent Stone Beige.

Ray-Ban Meta ‘Scriber’ model | Image courtesy Meta, EssilorLuxottica

Both new Blayzer and Scriber frames will be available for pre-order in the US starting today from Meta.com and Ray-Ban.com, as well as at optical retailers in the US and select international markets starting April 14th.

Meta also announced it’s releasing new lens and color options for Ray-Ban Meta (Gen 2) and Oakley Meta lines. New options include:

  • Vanguard Black with Prizm Black Lenses
  • Vanguard White with Prizm Rose Gold Lenses
  • Vanguard Black with Prizm Transitions® Ember Lenses (arriving later this Spring)
  • Vanguard Prizm Transitions Cobalt Lenses (arriving later this Spring)
  • HSTN Black with Prizm Dark Golf Lenses
  • HSTN Light Curry with Clear to Brown Transitions Lenses

Coming this spring and summer, Meta is also releasing three new limited-time seasonal colors for Ray-Ban Meta (Gen 2).

For the Skyler style: Shiny Transparent Peach with Transitions Brown Lenses. For Headliner: Matte Transparent Peach with Transitions Grey Lenses. For Wayfarer: Shiny Transparent Grey with Transitions Sapphire Lenses.

Filed Under: AR Development, News, XR Industry News

Meta Shows Confidence in EMG Input for Wearables by Funding Six External Studies

March 20, 2026 From roadtovr

Meta announced it’s tapped six external teams to receive a research grant in order to advance work on its surface electromyography (sEMG) based wristband controller.

Meta revealed in a blog post it’s launched a research funding initiative focused on improving how users learn and interact with sEMG systems, having chosen six universities out of 70 global submissions.

Each research group is set to receive $150,000 in funding; the chosen teams hail from the University of Central Florida, University of South Florida, University of California, Davis, Newcastle University, University of British Columbia, and Northwestern University.

Meta’s wrist-worn neural interface relies on sEMG, which detects electrical activity in the wrist and hand and translates it into digital commands. With sEMG serving as Meta Ray-Ban Display’s main input method, the company hopes to answer a few questions with the studies, namely: how do people learn new sEMG-based controls, and how can onboarding be streamlined?
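Meta hasn’t detailed its decoding stack, but a generic sEMG pipeline is straightforward to sketch: filter and summarize the raw electrode voltages per channel, then map the activation pattern to a discrete command. The sketch below is purely illustrative; the windowing, threshold, and channel-to-gesture table are all assumptions for demonstration, not Meta’s implementation.

```python
import numpy as np

def rms_envelope(signal, window=50):
    """Root-mean-square envelope: a standard coarse measure of muscle activation."""
    padded = np.pad(signal, (window // 2, window - window // 2 - 1), mode="edge")
    kernel = np.ones(window) / window
    # Moving average of the squared signal, then square root -> RMS per sample
    return np.sqrt(np.convolve(padded ** 2, kernel, mode="valid"))

def decode_gesture(channels, threshold=0.5):
    """Map per-channel activation to a command.

    The channel-to-gesture table below is hypothetical, purely for illustration.
    """
    activation = np.array([rms_envelope(ch).mean() for ch in channels])
    gestures = ["pinch", "swipe", "tap"]  # hypothetical command set
    if activation.max() < threshold:
        return "rest"  # no channel active enough to count as input
    return gestures[int(np.argmax(activation))]
```

Real systems replace the lookup table with a learned classifier that adapts to each wearer, which is exactly the kind of onboarding question the funded studies are probing.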

Wrist-worn XR Controller seen with Orion | Image courtesy Meta

The funded projects explore a range of challenges. Some focus on improving learning methods, such as comparing gamified training with step-by-step instruction, or developing systems that adapt to individual users over time.

Others aim to expand what sEMG can do, like enabling silent speech generation by translating muscle signals into synthesized voice, or increasing the ‘bandwidth’ of communication so users can issue more complex commands without disrupting natural hand movement.

A number of the proposed research topics include assistive applications, such as helping stroke survivors regain muscle control, or improving prosthetic limb operation through co-adaptive systems that learn alongside the user. You can see more about each study here.

This follows the release of Meta Ray-Ban Display last September, the company’s first pair of smart glasses with a built-in display. Priced at $800 and only available in the US for now, the smart glasses make use of the same input scheme first paired with Meta’s Orion AR prototype, which was revealed in late 2024.

This ostensibly shows Meta is pretty confident in the control scheme, viewing it as reliable enough for future (likely AR) devices. We’re looking forward to learning more as the research projects progress. Typically, we see papers either highlighted or released during SIGGRAPH, which is taking place in Los Angeles, California this year on July 19th – 23rd.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

Snap’s Top AR Exec Quits Ahead of Specs Consumer Debut

February 20, 2026 From roadtovr

Scott Myers, Snap’s top executive in charge of Specs, has left the company ahead of the planned release of its consumer AR glasses.

The News

Myers reportedly ended his six-year tenure at the company following a dispute with Snap CEO Evan Spiegel, according to tech outlet Sources, which characterized the dispute as a “blow-up” centered on the company’s strategy.

A Snap spokesperson confirmed Myers’ departure on Reddit, noting that Specs are still on track for release this year:

“Scott Myers has decided to step down from his role at Snap. We thank him for his contributions and wish him the best in his next chapter. We can’t wait to bring Specs to the world later this year. We remain focused on disciplined execution and long term value creation for our developer partners, community and shareholders.”

Myers came to Snap in 2020 to oversee all aspects of Specs, including hardware, software, product and operations. He previously held senior positions at SpaceX, Apple, and Nokia, according to his LinkedIn profile.

Snap Spectacles (gen 5) | Image courtesy Snap Inc

This comes at a critical moment for Snap. In September 2025, Spiegel noted in an open letter that the company is heading into a make-or-break “crucible moment” in 2026, positioning Specs as an integral part of the company’s future.

“This moment isn’t just about survival. It’s about proving that a different way of building technology, one that deepens friendships and inspires creativity, can succeed in a world that often rewards the opposite,” Spiegel said.

The consumer version of Specs is set to be the company’s sixth generation of glasses, following the release of its fifth-gen hardware in 2024. As ‘true’ AR glasses (read: not smart glasses like Meta Ray-Ban Display), the device is ostensibly set to frontrun some of Snap’s largest competitors.

My Take

It’s uncertain why Myers left Snap; the company even disputed the “blow-up” narrative with TechCrunch, providing no other reasoning, which makes Myers’ departure an even greater mystery—especially on the eve of the company’s big consumer AR glasses launch.

Speculatively speaking, there is at least one recent sign that could point to trouble brewing in the background. Myers’ departure follows a recent move by the company to form a wholly-owned subsidiary dedicated to Specs.

Snap says the so-called ‘Specs Inc’ subsidiary will primarily allow for “new partnerships and capital flexibility,” including the potential for minority investment. More concretely, Specs Inc also insulates Snap from any potential failure.

Whether that betrays a lack of confidence is unclear, although losing the top executive who oversaw the release of the fourth and fifth-gen versions—notably the only two with displays and AR capabilities—doesn’t exactly smack of confidence.

Filed Under: AR Development, ar industry, News, XR Industry News

Apple Reportedly Accelerates Smart Glasses Development Amid Wider Push for AI Hardware

February 18, 2026 From roadtovr

Apple is reportedly accelerating the development of smart glasses, as the company is ostensibly making a shift toward AI-centric hardware.

According to a report from Bloomberg’s Mark Gurman, Apple is ramping up development of its forthcoming smart glasses, which are slated to head into production as early as December 2026, with public release expected sometime in 2027.

Apple’s smart glasses are being positioned to compete with Meta and EssilorLuxottica’s most recent smart glasses, the report maintains.

While this mostly echoes previous reports from last October, Apple appears to be accelerating development, having recently distributed a broader set of glasses prototypes within its hardware engineering division.

According to an all-hands meeting with employees earlier this month, CEO Tim Cook supposedly also hinted that Apple would be pushing hard into AI devices, noting that the company was working on new “categories of products” centered around AI.

“We’re extremely excited about that,” Cook said in the internal meeting, saying “[t]he world is changing fast.”

According to people familiar with Apple’s plans, the smart glasses (allegedly codenamed ‘N50’) are said to include two cameras: a high-resolution camera for photos and video, and another dedicated to computer vision tasks. The high-quality onboard cameras and overall build quality are expected to set the device apart from competing products, the report maintains.

Array of Meta smart glasses | Image courtesy Brad Lynch

Similar to Meta’s audio-only smart glasses though, Apple’s N50 hardware isn’t expected to include a display of any kind, instead relying on cameras, speakers and microphones for things such as phone calls, AI queries, listening to music, and capturing images.

While Apple allegedly floated the idea of partnering with eyewear brands—similar to Meta’s partnership with EssilorLuxottica or Google’s partnerships with Warby Parker and Gentle Monster—the company seems to have more recently decided on developing in-house designs, which are said to arrive in a variety of sizes and colors.

“Early prototypes of the glasses connect via a cable to a standalone battery pack and an iPhone, but newer versions have the components embedded in the frame,” Bloomberg reports. “The design uses high-end materials, including acrylic elements intended to give the glasses a premium feel. Apple is already discussing launching the device in additional styles over time.”

This comes as Apple is investing more heavily in AI in an effort to better compete with Google and OpenAI, a push that goes hand in hand with a critical redesign of Siri. The report also maintains Apple is working on an AI-powered pendant and AirPods with expanded AI capabilities—all three of which will rely on visual input.

Notably, the report maintains that all three will rely on connection to iPhone. Apple did not respond to Bloomberg’s request for comment.

Filed Under: AR Development, News, XR Industry News

Meta Sold Over 7 Million Smart Glasses Last Year, Effectively Tripling Sales in 2025 Alone

February 12, 2026 From roadtovr

EssilorLuxottica reported its Q4 2025 financial results, revealing the company sold over seven million smart glasses last year.

The French-Italian eyewear conglomerate has been making smart glasses in partnership with Meta since the launch of the original Ray-Ban Stories in 2021.

Now, in its fourth-quarter results, EssilorLuxottica revealed it sold over seven million smart glasses last year—more than tripling sales since last reported.

In February 2025, the company announced it had sold two million Ray-Ban (Gen 1) smart glasses since release in late 2023.

Image courtesy Brad Lynch

It’s no wonder 2025 was a landmark year for the company though. Alongside Meta, EssilorLuxottica not only released a hardware refresh of its popular Ray-Ban Meta glasses, but also Oakley Meta HSTN, Oakley Meta Vanguard, and the $800 Meta Ray-Ban Display glasses—the company’s first smart glasses to include a heads-up display.

In addition to its smart glasses efforts, EssilorLuxottica maintains that 2025 marked a further acceleration in its “evolution from an optical company into a leading medtech and big-data group,” owing to growth across both its Nuance Audio hearing-aid glasses and AI-driven healthcare platform.

While Meta and EssilorLuxottica are the current market leaders in smart glasses, the XR wearables race has really only just begun. As it appears today, companies largely see smart glasses as a first step toward creating the all-day AR glasses of the near future, with potential contenders including Google, Samsung and reportedly also Apple.

Filed Under: AR Development, News, XR Industry News

Ray-Ban Smart Glasses Get Massive Utility Boost with Cool (but risky) ClawdBot Hack

February 10, 2026 From roadtovr

If you’re comfortable mucking around with a new open source project, you could be shopping on Amazon just by looking at an object with your Ray-Ban Meta smart glasses.

Ray-Ban Meta smart glasses are pretty useful out of the box, offering photo & video capture, calls, music playback, and your standard assortment of AI chatbot stuff. They don’t have an app store though, which means you’re basically stuck with a handful of curated services.

Now, indie developer Sean Liu released an open-source project called VisionClaw that links Ray-Ban Meta smart glasses with OpenClaw (aka ClawdBot), essentially giving the autonomous AI agent eyes and ears.

Check out VisionClaw in action below, courtesy Liu:

now my clawdbot lives in my ray-ban meta glasses so i can just buy whatever i’m looking at pic.twitter.com/gWrijyTRhE

— xiaoan (@_seanliu) February 6, 2026

OpenClaw isn’t an AI model like ChatGPT or Google Gemini though. It’s an agentic layer—essentially a complex messaging layer built on top of an AI model that interacts with services on your behalf, like sending emails, managing shopping lists, or controlling smart home devices—just three of the 56+ tools OpenClaw can integrate with right now.

Basically, it works like this: VisionClaw uses Gemini Live for real-time voice and computer vision, which can do things like describe what you’re seeing and answer questions—basically the same sort of tasks you can do with the glasses’ native Meta AI.

Image courtesy Sean Liu

But once you want to actually interact with an app or service—like when you want to send a message over email or your favorite non-Meta messaging app like Signal or Telegram—Gemini Live hands off the request to OpenClaw, which takes action.
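The handoff described above boils down to a simple routing decision: descriptive questions stay with the vision model, while action intents (plus what the camera sees) get forwarded to the agent layer, which holds the service integrations. The sketch below illustrates the pattern only; the keyword trigger and handler names are hypothetical and not taken from Liu’s codebase.

```python
# Illustrative router for a vision-assistant / agent-layer split.
# All names here are hypothetical, for demonstration only.

ACTION_KEYWORDS = ("buy", "order", "send", "message", "add to")

def route_request(transcript: str, scene_description: str) -> dict:
    """Decide whether a voice request stays with the vision assistant
    or gets handed off to the agent layer."""
    wants_action = any(kw in transcript.lower() for kw in ACTION_KEYWORDS)
    if wants_action:
        # Hand off: the agent receives both the intent and the scene context,
        # so a request like "buy this" resolves against the described object.
        return {"handler": "agent", "intent": transcript, "context": scene_description}
    # Stay local: descriptive Q&A is answered by the vision model directly.
    return {"handler": "vision", "intent": transcript, "context": scene_description}
```

Passing the scene description along with the intent is the key design choice: it is what lets an agent with no camera access of its own act on “whatever I’m looking at.”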

Users looking to run VisionClaw will need an iPhone, as Liu’s codebase is written as an Xcode/Swift app that specifically uses Meta’s Wearables Device Access Toolkit (DAT) for iOS to connect the phone to Ray-Ban Meta glasses.

Beyond that, you’ll also need a fair understanding of the risks involved with running OpenClaw on your personal hardware.

While it can do some pretty amazing things, it’s a third-party bit of software that could require you to input passwords, API keys, and personal information, which can open the user up to malicious actors. Notably, OpenClaw’s skill integrations could be written by anyone, so users need to be especially vigilant.

Filed Under: AR Development, News

Distance Technologies Reveals Military AR Goggles for Battlefield Awareness

January 21, 2026 From roadtovr

Distance Technologies has unveiled the Field Operator HUD (FOH), an AI-enhanced AR system designed for military vehicles ranging from light utility platforms to main battle tanks.

FOH is said to combine Distance’s own optics with AI-assisted data processing, which the company says improves situational awareness, survivability, and visual workload management in land combat environments.

Having undergone field trials with UK and Finnish forces, FOH integrates command-and-control functions with its AR optics by fusing multiple sensor inputs—ostensibly similar to Anduril’s EagleEye project, revealed in October 2025.

Image courtesy Distance Technologies

Distance says FOH is designed to present only the most critical information by using AI-driven sensor fusion, automated detection, and by integrating everything from thermal and night vision to data sourced from a wide range of vehicle-mounted sensors—something the company hopes will translate into more effective decision-making both in and outside of military vehicles.

The precise specs of the company’s various FOH configurations, which are said to include models for on-the-ground soldiers, pilots, and various types of ground vehicle operators, are predictably under wraps for now.

Image courtesy Distance Technologies

On the company website however, Distance says FOH includes technology that “creates an independent lightfield for each eye, allowing us to control the perceived distance of the content on a per-pixel level. This makes it possible to match virtual elements 1-to-1 with reality for a completely natural XR experience.”

It’s also said to allow for visualizations that “appear on top of reality across the entire field of view, perfectly matching the observable world people see and experience around them.”

FOH is expected to be available for NATO and allied field trials by the end of Q1 2026, with broader deployment planned from 2027 through defense prime contractors.

Founded in 2024, the Helsinki, Finland-based company is building what it calls “the first true glasses-free XR solution.” It was founded by a host of XR veterans, among them a cadre of alums from fellow Finnish XR startup Varjo, including Urho Konttori, Jussi Mäkinen, Mikko Strandborg, Thomas M. Carlsson, and Petteri Timonen.

Filed Under: AR Development, ar industry, News, XR Industry News

Meta Pauses International Release of Meta Ray-Ban Display Glasses

January 7, 2026 From roadtovr

Meta Ray-Ban Display glasses seem to be selling too well, as the company announced it’s delaying the international rollout of its first display-clad smart glasses.

The News

Initially released in the US back in September, the $800 smart glasses, which include a single full-color display embedded in the right lens, were slated to reach a number of additional regions in early 2026.

Now, the company says in a blog post it’s decided to “pause” the planned expansion to the UK, France, Italy and Canada, citing “unprecedented demand and limited inventory.”

Meta Ray-Ban Display Glasses & Neural Band | Image courtesy Meta

The company characterizes stock as “extremely limited,” noting that it’s seen an “overwhelming amount of interest, and as a result, product waitlists now extend well into 2026.”

Meta says it will continue to focus on fulfilling orders in the US while they “re-evaluate [the] approach to international availability.”

My Take

I was looking forward to getting my hands on a pair of Meta Ray-Ban Display glasses here in Italy, one of the regions currently on “pause”—which my Corpo-to-English translator says means I probably shouldn’t hold my breath.

While Meta Ray-Ban Display can’t do everything promised just yet—and doesn’t actually have an app store—the device can do a fair number of things I was hoping to test out to see whether it fits into my daily life.

After all, it can do everything the audio-only Ray-Ban Meta glasses can do in addition to serving up a viewfinder for taking photos and video, the ability to see and respond to messages via WhatsApp, Facebook Messenger, and Instagram, and give you turn-by-turn walking directions in supported cities.

Turn-by-turn Directions in Meta Ray-Ban Display | Image courtesy Meta

Months after launch, Meta says it’s also now pushed an update that includes a teleprompter, the previously teased EMG handwriting, as well as more cities for pedestrian navigation.

Still, the pause makes a lot of sense from a manufacturing perspective. Meta needs to go slow and deliberate with Meta Ray-Ban Display, if only because the device has likely been heavily subsidized so it isn’t eye-wateringly expensive out of the gate; the company is no doubt eating a fairly high bill of materials, judging by waveguide wastage rates alone. No app store also means no app revenue, making the first gen decidedly more of a large beta test than anything.

So, right now it seems like Meta is deliberately going slow to make sure use cases, distribution, and supply chain are all in place before really cashing in on the second gen—maybe following Quest’s playbook. In 2019, the company released the original Quest only to toss out Quest 2 a year later, which became the company’s best-selling XR device to date—and also left everyone who bought the first gen to upgrade only a year later.

Filed Under: AR Development, ar industry, News, XR Industry News

Alibaba Launches Smart Glasses to Rival Meta Ray-Ban Display

December 2, 2025 From roadtovr

Alibaba released a pair of display-clad smart glasses, ostensibly looking to go toe-to-toe with Meta Ray-Ban Display, which launched in the US for $800 back in September.

The News

China’s Alibaba, one of the world’s largest retailers and e-commerce companies, just released its first smart glasses, called Quark AI Glasses, which run the company’s own Qwen AI model.

Image courtesy Reuters

Seemingly China-only for now, Quark AI Glasses are being offered in two fundamental versions across Chinese online and brick-and-mortar retailers:

  • Quark AI Glasses S1: starting at ¥3,799 (~$540 USD), includes dual monochrome green displays
  • Quark AI Glasses G1: starting at ¥1,899 (~$270 USD), no displays, sharing the S1’s core technology

Quark AI Glasses S1 is equipped with a Qualcomm Snapdragon AR1 chipset and a low-power co-processor which drive dual monochrome green micro-OLED displays, boasting a brightness of up to 4,000 nits, according to South China Morning Post.

It also features a five-microphone array with bone conduction, 3K video recording which can be automatically upscaled to 4K, as well as low-light enhancement tech said to bring mobile phone-level imaging to smart glasses. Additionally, Quark AI Glasses S1 include hot-swappable batteries, which plug into the glasses’ stem piece.

You can see the English dubbed version of the Chinese language announcement below:

My Take

At least when it comes to on-paper specs, Quark AI Glasses S1 aren’t exactly a 1:1 rival with Meta Ray-Ban Display, even though both technically include display(s), onboard AI, and the ability to take photos and video.

While Meta Ray-Ban Display features only a single full-color display, Quark S1’s dual displays offer only monochrome green output, which limits the sort of information they can show.

Meta Ray-Ban Display & Neural Band | Photo by Road to VR

Quark S1 also doesn’t come with an input device like Meta Ray-Ban Display’s Neural Band, limiting it to voice and touch input. That means Quark S1 users won’t be scrolling social media, pinching and zooming content, or pulling off other nifty UI manipulation.

Still, that might be just enough—at least one of the world’s largest e-commerce, cloud infrastructure, and FinTech companies thinks so. Also worth noting is Quark S1’s unique benefit of being tightly integrated into the Qwen AI ecosystem, as well as into Chinese payment infrastructure for fast and easy QR code-based payments with Alipay; that last bit is something most Chinese smart glasses are trying to hook into, including Xiaomi’s own Ray-Ban Meta competitors.

Although the company’s Qwen AI model is available globally, I find it pretty unlikely that Alibaba will ever bring its first-gen models of Quark AI Glasses S1/G1 outside of its usual sphere of influence, or meaningfully intersect with Meta’s supported regions.

Filed Under: AR Development, News, XR Industry News
