
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


Apple Reportedly Accelerates Smart Glasses Development Amid Wider Push for AI Hardware

February 18, 2026 From roadtovr

Apple is reportedly accelerating the development of smart glasses, as the company is ostensibly making a shift toward AI-centric hardware.

According to a report from Bloomberg’s Mark Gurman, Apple is ramping up development of its forthcoming smart glasses, which are slated to head into production as early as December 2026, with public release expected sometime in 2027.

Apple’s smart glasses are being positioned to compete with Meta and EssilorLuxottica’s most recent smart glasses, the report maintains.

While this mostly echoes previous reports from last October, Apple appears to be accelerating development, having recently distributed a broader set of glasses prototypes within its hardware engineering division.

At an all-hands meeting with employees earlier this month, CEO Tim Cook also reportedly hinted that Apple would be pushing hard into AI devices, noting that the company was working on new “categories of products” centered around AI.

“We’re extremely excited about that,” Cook said in the internal meeting, saying “[t]he world is changing fast.”

Citing people familiar with Apple’s plans, the report says the smart glasses (allegedly codenamed ‘N50’) will include two cameras: a high-resolution camera for photos and video, and another dedicated to computer vision tasks. The high-quality onboard cameras and overall build quality are expected to set the device apart from competing products, the report maintains.

Array of Meta smart glasses | Image courtesy Brad Lynch

Like Meta’s audio-only smart glasses, though, Apple’s N50 hardware isn’t expected to include a display of any kind, instead relying on cameras, speakers, and microphones for things such as phone calls, AI queries, listening to music, and capturing images.

While Apple allegedly floated the idea of partnering with eyewear brands—similar to Meta’s partnership with EssilorLuxottica or Google’s partnerships with Warby Parker and Gentle Monster—the company seems to have more recently decided on developing in-house designs, which are said to arrive in a variety of sizes and colors.

“Early prototypes of the glasses connect via a cable to a standalone battery pack and an iPhone, but newer versions have the components embedded in the frame,” Bloomberg reports. “The design uses high-end materials, including acrylic elements intended to give the glasses a premium feel. Apple is already discussing launching the device in additional styles over time.”

This comes as Apple invests more heavily in AI in an effort to better compete with Google and OpenAI, part and parcel of which is a critical redesign of Siri. The report also maintains Apple is working on an AI-powered pendant and AirPods with expanded AI capabilities—all three of which will rely on visual input.

Notably, the report maintains that all three devices will rely on a connection to an iPhone. Apple did not respond to Bloomberg’s request for comment.

Filed Under: AR Development, News, XR Industry News

SlimeVR Launches Crowdfunding Campaign for Thinner & Lighter Full-Body Trackers

February 17, 2026 From roadtovr

SlimeVR has launched its next crowdfunding campaign, this time looking to get backers excited about its next-gen ‘Butterfly’ body trackers, which promise to be thinner, lighter, and offer a longer-lasting battery.

SlimeVR’s crowdfunding campaign for the new Butterfly Trackers quickly crossed its $180,000 funding goal when it went live on February 9th, now sitting at over $347,000 from nearly 760 backers.

Much like the original SlimeVR Full-Body Tracker, which attracted more than $9 million in 2021, the IMU-based body tracking solution lets VR users better articulate their avatars without the need for base stations or external sensors of any kind. It’s also handy for things like motion capture and VTubing.

Image courtesy SlimeVR

That said, Butterfly Trackers are built on the same tracking technology and TDK ICM-45686 IMU chip as the original, which SlimeVR notes means they retain the same long drift-reset times and tracking quality.
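To see why drift-reset intervals matter for IMU-only trackers in the first place, here’s a toy illustration (not SlimeVR’s actual sensor-fusion algorithm, and the bias figure is made up): orientation comes from integrating gyroscope readings, so even a tiny constant sensor bias accumulates into heading error until the user performs a reset.

```python
# Toy illustration: integrating gyro readings accumulates sensor bias
# into orientation drift. Real trackers run full sensor-fusion filters;
# this only shows why yaw drifts and periodically needs a reset.

def integrate_yaw(bias_deg_s: float, dt: float, seconds: float) -> float:
    """Integrate a stationary gyro's yaw reading (true rotation rate = 0)."""
    yaw = 0.0
    for _ in range(int(seconds / dt)):
        measured_rate = 0.0 + bias_deg_s  # true rotation + constant bias
        yaw += measured_rate * dt         # naive integration
    return yaw

# A hypothetical 0.01 deg/s bias grows into visible avatar error over time:
for minutes in (1, 10, 60):
    drift = integrate_yaw(bias_deg_s=0.01, dt=0.01, seconds=minutes * 60)
    print(f"{minutes:3d} min -> {drift:.1f} deg of yaw drift")
```

The longer a tracker can hold acceptable error between resets, the better the user experience—hence drift-reset time being a headline spec.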

The key innovation, however, is a custom 2.4 GHz dongle in place of Wi-Fi, which allows the trackers to be smaller and lighter, and to have a longer battery life. SlimeVR estimates each tracker can last up to 48 hours on a single charge—more than double that of its original Full-Body Trackers.

And because they’re so thin and light, the trackers support a few new methods of attachment, including straps, clips, and even iron-on patches.

Image courtesy SlimeVR

Notably, all SlimeVR trackers are compatible with standalone headsets, including Meta Quest, Pico, and Steam Frame, as well as any headset that uses SteamVR.

The campaign’s lowest backer tier is the ‘Core Set Bundle’, which includes six Butterfly Trackers for $279, estimated to ship by August 31st, 2026. SlimeVR says the six-unit bundle is enough to track the position and rotation of your hip, knees, and chest, as well as the position (though not rotation) of your feet.

You can check out the full specs list and additional funding tiers over on the SlimeVR Butterfly Tracker Crowd Supply campaign, which ends March 19th.

Filed Under: Meta Quest 3 News & Reviews, News, PC VR News & Reviews

Vision Pro Finally Gets Native ‘YouTube’ App with Full Immersive Video Library

February 13, 2026 From roadtovr

Vision Pro users have been waiting over two years for a native YouTube app. Now, it’s finally here—thankfully also including support for immersive videos.

The News

Google first announced that a YouTube app was “on the road map” shortly after Vision Pro’s February 2024 launch, although it never gave a specific release window, leaving users searching for alternatives beyond simply opening YouTube in Safari, which notably didn’t include native support for spatial video.

The official YouTube app, now available on the App Store, gives Vision Pro users access to every YouTube video and Short, along with all of the regular YouTube features, such as subscriptions, playlists, and watch history.

Image courtesy Apple, Google

What’s more, the official YouTube app also supports viewing spatial videos, including all 3D, VR180, and 360 videos on the platform. Vision Pro users can find them by navigating to the app’s dedicated ‘Spatial’ tab.

Additionally, Google maintains YouTube for Vision Pro includes video playback at up to 8K on the M5 version of the headset, which was released last October.

My Take

There’s no official explanation out there (yet), although there are probably a few reasons why YouTube didn’t come to Vision Pro until now.

The most obvious to me: Apple’s $3,500 XR headset likely presented a limited return on investment for Google, which may or may not have been influenced by the companies’ historical platform rivalry. Notably, there is still no Gmail, Chrome, Docs, Drive, Photos, Maps—no Google-owned app on Vision Pro except YouTube right now.

That said, YouTube did make a spatial version of its app for Android XR, which was released with Samsung Galaxy XR last October. The relative timing makes me think the release on Vision Pro was more of a knock-on effect of having already built the app than leadership at YouTube specifically determining that now would finally be profitable, as I don’t suspect the M5 hardware refresh has significantly driven additional consumer interest.

Whatever the case, YouTube has now found itself on the hook for maintaining the app across three distinct XR platforms: Android XR, visionOS and Horizon OS.

Filed Under: Apple Vision Pro News & Reviews, News

Meta Sold Over 7 Million Smart Glasses Last Year, Effectively Tripling Sales in 2025 Alone

February 12, 2026 From roadtovr

EssilorLuxottica reported its Q4 2025 financial results, revealing the company sold over seven million smart glasses last year.

The French-Italian eyewear conglomerate has been making smart glasses in partnership with Meta since the launch of the original Ray-Ban Stories in 2021.

Now, in its fourth-quarter results, EssilorLuxottica revealed it sold over seven million smart glasses last year—more than tripling sales since last reported.

In February 2025, the company announced it had sold two million Ray-Ban (Gen 1) smart glasses since release in late 2023.

Image courtesy Brad Lynch

It’s no wonder 2025 was a landmark year for the company though. Alongside Meta, EssilorLuxottica not only released a hardware refresh of its popular Ray-Ban Meta glasses, but also Oakley Meta HSTN, Oakley Meta Vanguard, and the $800 Meta Ray-Ban Display glasses—the company’s first smart glasses to include a heads-up display.

In addition to its smart glasses efforts, EssilorLuxottica maintains that 2025 marked a further acceleration in its “evolution from an optical company into a leading medtech and big-data group,” owing to growth across both its Nuance Audio hearing-aid glasses and AI-driven healthcare platform.

While Meta and EssilorLuxottica are the current market leaders in smart glasses, the XR wearables race has really only just begun. As it stands today, companies largely see smart glasses as a first step toward the all-day AR glasses of the near future, with potential contenders including Google, Samsung, and reportedly Apple.

Filed Under: AR Development, News, XR Industry News

‘Battlefield’-like VR Shooter ‘Forefront’ is Coming to PSVR 2 with Cross-Play

February 12, 2026 From roadtovr

Triangle Factory announced that its Battlefield-inspired VR shooter Forefront is finally coming to PSVR 2.

There’s no release date yet beyond the studio’s initial announcement; however, Triangle Factory has confirmed that when the game does arrive, the 32-player shooter will “support cross play with other VR platforms.”

Created by the same studio behind Breachers (2023) and Hyper Dash (2021), Forefront serves up an experience that should be pretty familiar to fans of the Battlefield series.

Boasting expansive, semi-destructible environments, its 16v16 battles let you pilot everything from helicopters and tanks to humvees and boats as you push objectives.

Currently, Forefront is available in early access on all other major VR platforms, including Quest, SteamVR, and Pico headsets—and it’s done very well for itself in the three months since launching into early access.

We’ll be keeping an eye out for official release dates, but in the meantime you can wishlist Forefront over on the PlayStation Store for PSVR 2.

Filed Under: Meta Quest 3 News & Reviews, News, PC VR News & Reviews, PSVR 2 News & Reviews

Ray-Ban Smart Glasses Get Massive Utility Boost with Cool (but risky) ClawdBot Hack

February 10, 2026 From roadtovr

If you’re comfortable mucking around with a new open source project, you could be shopping on Amazon just by looking at an object with your Ray-Ban Meta smart glasses.

Ray-Ban Meta smart glasses are pretty useful out of the box, offering photo & video capture, calls, music playback, and your standard assortment of AI chatbot stuff. They don’t have an app store though, which means you’re basically stuck with a handful of curated services.

Now, indie developer Sean Liu has released an open-source project called VisionClaw that links Ray-Ban Meta smart glasses with OpenClaw (aka ClawdBot), essentially giving the autonomous AI agent eyes and ears.

Check out VisionClaw in action below, courtesy Liu:

now my clawdbot lives in my ray-ban meta glasses so i can just buy whatever i’m looking at pic.twitter.com/gWrijyTRhE

— xiaoan (@_seanliu) February 6, 2026

OpenClaw isn’t an AI model like ChatGPT or Google Gemini though. It’s an agentic layer—essentially a complex messaging layer built on top of an AI model that interacts with services on your behalf, like sending emails, managing shopping lists, or controlling smart home devices—just three of the 56+ tools OpenClaw can integrate with right now.

Basically, it works like this: VisionClaw uses Gemini Live for real-time voice and computer vision, which can do things like describe what you’re seeing and answer questions—much the same sort of tasks the glasses’ native Meta AI can handle.

Image courtesy Sean Liu

But once you want to actually interact with an app or service—like when you want to send a message over email or your favorite non-Meta messaging app like Signal or Telegram—Gemini Live hands off the request to OpenClaw, which takes action.
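The routing pattern described above—perception handled in-place by the realtime model, actions handed off to an agent layer that owns the tool integrations—can be sketched roughly like this. Everything below (the class, function names, and the intent-based routing rule) is a hypothetical simplification for illustration, not VisionClaw’s or OpenClaw’s actual API:

```python
# Hypothetical sketch of the VisionClaw routing pattern: a realtime
# voice/vision model answers perception queries directly, but hands
# actionable requests off to an agent layer with registered tools.

from dataclasses import dataclass, field

@dataclass
class AgentLayer:
    """Stand-in for OpenClaw: maps action intents to tool integrations."""
    tools: dict = field(default_factory=dict)

    def register(self, intent: str, handler):
        self.tools[intent] = handler

    def act(self, intent: str, payload: str) -> str:
        handler = self.tools.get(intent)
        if handler is None:
            return f"no tool registered for '{intent}'"
        return handler(payload)

# Intents that require taking action in an external service.
ACTION_INTENTS = {"send_message", "buy_item", "set_light"}

def route(intent: str, payload: str, agent: AgentLayer) -> str:
    """Perception stays with the realtime model; actions go to the agent."""
    if intent in ACTION_INTENTS:
        return agent.act(intent, payload)           # OpenClaw-style handoff
    return f"[vision model] describing: {payload}"  # answered in-place

agent = AgentLayer()
agent.register("send_message", lambda p: f"[agent] sent via Signal: {p}")

print(route("describe_scene", "a red bicycle", agent))  # stays with model
print(route("send_message", "running late!", agent))    # handed to agent
```

The design choice worth noting is that the perception model never touches credentials or service APIs itself—only the agent layer does, which is also why the security caveats below matter so much.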

Users looking to run VisionClaw will need an iPhone, as Liu’s codebase is written as an Xcode/Swift app that specifically uses Meta’s Wearables Device Access Toolkit (DAT) for iOS to connect the phone to Ray-Ban Meta glasses.

Beyond that, you’ll also need a fair understanding of the risks involved with running OpenClaw on your personal hardware.

While it can do some pretty amazing things, it’s a third-party piece of software that may require you to input passwords, API keys, and personal information, which can open users up to malicious actors. Notably, OpenClaw’s skill integrations can be written by anyone, so users need to be especially vigilant.

Filed Under: AR Development, News

Snap Forms ‘Specs Inc’ to Insulate AR Business Ahead of AR Glasses Launch

January 29, 2026 From roadtovr

Snapchat maker Snap announced it’s formed a new business dedicated to its upcoming AR glasses.

The News

Called Specs Inc, the wholly-owned subsidiary within Snap is said to allow for “greater operational focus and alignment” ahead of the public launch of its latest AR glasses coming later this year.

In addition to operating its AR efforts directly under the new brand, Snap says Specs Inc will also allow for “new partnerships and capital flexibility,” including the potential for minority investment.

Snap Spectacles Gen 5 (2024) | Image courtesy Snap Inc

In September, Snap CEO Evan Spiegel noted in an open letter that the company is heading into a make-or-break “crucible moment” in 2026, characterizing Specs as an integral part of the company’s future.

“This moment isn’t just about survival. It’s about proving that a different way of building technology, one that deepens friendships and inspires creativity, can succeed in a world that often rewards the opposite,” Spiegel said.

While the company hasn’t shown off its next-gen Specs yet, it touts the device’s built-in AI, something that “uses its understanding of you and your world to help get things done on your behalf while protecting and respecting your privacy.”

Snap further notes that it’s “building a computer that we hope you’ll use less, because it does more for you.”

My Take

Snap (or rather, Specs) is set to release its sixth-gen Spectacles this year, although this is the first pair of AR glasses the company is ostensibly hoping to pitch directly to the public, and not just developers and educational institutions.

Info is still thin surrounding Specs Inc’s launch plans for the devices, although forming a new legal entity for its AR business right beforehand could mean a few things.

For now, it doesn’t appear Snap is “spinning out” Spectacles proper; Snap hasn’t announced new leadership, leading me to believe it’s more of a play to not only attract more targeted investment in its AR efforts, but also insulate the company from potential failure.

Snap Spectacles Gen 5 (2024) | Image courtesy Snap Inc, Niantic

It’s all fairly opaque at this point, although the move does allow investors to more clearly choose between supporting the company’s traditional ad business or investing in the future of AR.

However you slice it though, AR hardware development is capital intensive, and Snap’s pockets aren’t as deep as its direct competitors, including Meta, Apple, Google, and Microsoft.

While Snap confirmed it spent $3 billion over the course of 11 years creating its AR platform, that’s notably less than what Meta typically spends on its Reality Labs XR division in a single quarter.

It’s also risky. The very real flipside is that Specs Inc could go bankrupt. Maybe it’s too early. Maybe it underdelivers in comparison to competitors. Maybe it’s too expensive out of the gate for consumers, and really only appeals to enterprise. Maybe it isn’t too expensive, but the world heads into its sixth once-in-a-generation economic meltdown.

Simply put, there are a lot of ‘maybes’ right now. And given the new legal separation, Snap still has the option to survive relatively unscathed if Specs Inc goes belly up, living to find another existential pivot.

Filed Under: ar industry, AR Investment, News, XR Industry News

Meta CTO: Metaverse Efforts Led to a “lack of focus” on Quest “at expense of user experience”

January 26, 2026 From roadtovr

Meta CTO Andrew Bosworth offered the first bit of insight into the company’s recent Reality Labs shakeup, publicly acknowledging that Meta’s metaverse efforts suffered from a “lack of focus” that ultimately hurt the user experience on Quest.

Speaking at Axios House in Davos, Switzerland alongside the World Economic Forum last week, Bosworth discussed several issues that led Meta to refocus its metaverse and VR strategy—something that also included layoffs affecting 10 percent of its Reality Labs XR team.

Meta is refocusing its approach, doubling down on AI and smart glasses while narrowing and reorganizing its VR and metaverse efforts. Bosworth, who is also head of Reality Labs, framed the pivot as a three-part problem: poor communication around the metaverse vision, high development costs, and an over-integration of Horizon Worlds with Meta’s VR strategy.

Horizon World teases (2022) | Based on images courtesy Mark Zuckerberg

Horizon Worlds wasn’t the company’s first social VR platform, although it did represent the first real concerted effort to give Quest users a ‘default’ shared VR space when it initially released in 2021. Bosworth notes that Meta’s metaverse ambitions were to build a “rich version” of the mental “transportation” people already experience when socializing through smartphones.

“We still plan on doing that,” Bosworth told Axios’ Ina Fried, referring to Horizon Worlds. “But it’s like any investment. You’re going to look at how you do over the course of years and you’re going to reinvest in some areas and trim your losses in others. For us, we’re seeing tremendous growth of our metaverse on mobile.”

Image courtesy Meta

While the launch across Android and iOS mobile devices in 2023 pushed Horizon Worlds’ reach beyond Quest for the first time, it eventually led to higher costs and a more difficult development process.

“Having to build everything twice—once for mobile and once for VR—is a tremendous tax on the team. You’d rather grow a giant audience and then work from a position of strength.”

A second issue was Meta’s decision to tightly bind Horizon Worlds to the Quest platform—something Bosworth admits wasn’t for everyone.

“When you put the headset on, you’re immediately in this kind of co-present accessible space. That is a real challenging piece of work to land from a standpoint of there’s lots of people who put this headset on for lots of different reasons. You want to support all those different use cases, [but] the lack of focus comes at an expense of user experience and a great expense in terms of development cost.”

Bosworth says that while the company now has “two much more focused bets,” those essentially come down to supporting third-party VR content and Horizon Worlds on mobile.

“To do this, of course, it’s tragic anytime your plans change and there’s a human cost; we found a bunch of roles that we just didn’t need anymore,” Bosworth said, referring to layoffs. “So, we did end up downsizing the effort on the metaverse specifically. Though on net, Reality Labs isn’t downsizing. We’re basically taking all of those [positions] and taking the investment on wearables, which is growing so rapidly for us.”

This follows the closure of three first-party VR studios, representing a concerted pullback from developing and funding content for the Quest platform.

Notably, Reality Labs’ operating costs have consistently exceeded $4 billion per quarter since late 2021. Q4 is the XR division’s strongest quarter in terms of revenue; however, Reality Labs typically generates a maximum of around $1 billion, with Q1–Q3 bringing in significantly less. We’re sure to learn more about Q4 2025 when the company reports its earnings after market close on Wednesday, January 28th.

You can watch the full interview below. Thanks go to Reddit user ‘gogodboss’ for pointing us to the news.

Filed Under: News

Lynx-R2 Headset Revealed With Surprisingly Wide Field-of-View in a Tiny Package

January 21, 2026 From roadtovr

Lynx has unveiled the Lynx-R2, a significant upgrade over its original R1 mixed reality standalone, which aims to capture the enterprise and prosumer markets.

The France-based startup considers R2 a significant step forward, featuring new aspheric pancake lenses from Hypervision said to deliver a 126° horizontal field-of-view (FOV)—notably larger than R1’s 90° or Quest 3’s 110° horizontal FOV.

Paired with dual 2.3K LCD displays delivering more than 24 pixels per degree (PPD) at the center, R2 is said to deliver “crisp text and image rendering for industrial and medical use cases.”
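As a quick sanity check on those claims, average angular resolution can be estimated as horizontal pixels divided by horizontal FOV; the “>24 PPD at the center” figure is plausible because pancake optics concentrate pixels toward the middle of the lens, well above the average. The back-of-the-envelope below assumes “2.3K” means roughly 2304 horizontal pixels per eye, and uses Quest 3’s commonly cited 2064-pixel panel width for comparison—both assumptions, not confirmed specs:

```python
# Back-of-the-envelope: average pixels-per-degree from panel width and FOV.
# Assumes "2.3K" ~= 2304 horizontal pixels per eye; lens distortion makes
# the center denser than this average, hence Lynx's ">24 PPD (center)".

def avg_ppd(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    return horizontal_pixels / horizontal_fov_deg

lynx_r2 = avg_ppd(2304, 126)  # wider FOV spreads pixels thinner
quest3  = avg_ppd(2064, 110)  # narrower FOV, similar average density

print(f"Lynx-R2 average: {lynx_r2:.1f} PPD")
print(f"Quest 3 average: {quest3:.1f} PPD")
```

The takeaway: R2’s wider FOV doesn’t come at the cost of lower average pixel density than Quest 3, which is the trade-off wide-FOV optics usually force.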

Image courtesy Lynx Mixed Reality

While the new standalone headset features the same flip-up design as its predecessor, R2 is powered by Qualcomm’s Snapdragon XR2 Gen 2, offering substantial gains in GPU and AI performance over R1, which was introduced in 2021 with the older Snapdragon XR2 Gen 1.

Other features include 6DOF head tracking, hand-tracking, controller and ring tracking, plus a full-color four-sensor Sony camera array that also includes depth sensing for advanced computer vision.

Originally planned to ship with Android XR, Lynx-R2 is now set to launch with Lynx OS following Google’s decision to withdraw support. Lynx OS is, however, based on Android 14, meaning it can sideload APKs in addition to supporting OpenXR 1.1.

Image courtesy Lynx Mixed Reality

Additionally, Lynx says it will release “all the electronic schematics of the headset motherboard and the mechanical design blueprints,” which is said to allow academics and hobbyists to freely mod the device.

This will also include raw sensor access so developers can enable their own computer vision applications, as well as full offline functionality for sectors such as defense, healthcare, and industry, Lynx says.

“With the R1, we proved that a small, independent team could build a world-class mixed reality device,” said Stan Larroque, founder and CEO of Lynx Mixed Reality. “With the R2, we are proving that an open ecosystem is not just a philosophy, but provides a superior way to approach these devices. We have listened to 3rd party developers and enterprise users. They didn’t just want more pixels; they wanted a wider field of view, faster processing, and total ownership of their sensors. The R2 delivers just that. I believe the Lynx-R2 is a great VR headset, and will provide the best MR experience.”

There’s no official launch date yet. Lynx says R2 will be available for order “starting this summer” via the official Lynx portal as well as authorized enterprise resellers.

In the meantime, we’re still learning about specs, but this is what Lynx has indicated so far:

Lynx-R2 Specs

Display: 2.3K per eye LCD
Lens Type: Hypervision aspheric pancake
Pixels Per Degree (PPD): >24 PPD (center)
Field-of-View: 126° horizontal, 133° diagonal
Refresh Rate: Not specified
IPD Adjustment: Yes
Eye Relief Adjustment: Yes
Glasses Support: Yes
Processor (SoC): Qualcomm Snapdragon XR2 Gen 2
Cooling System: Active (dual silent fans)
Operating System: Lynx OS (Android 14–based)
OpenXR Support: Yes (OpenXR 1.1)
Passthrough Type: Full-color video passthrough (Sony RGB)
Passthrough Resolution: 3K × 3K per eye
Tracking Cameras: 4 (hand, ring, controller & head tracking)
Depth Camera: Yes
IR LEDs: Yes
Supported Engines: Unity, Unreal, StereoKit
Battery Placement: Rear-mounted
Battery Access: User-replaceable
Strap Type: Rigid
Weight: Not specified

Filed Under: News, VR Development, XR Industry News

Distance Technologies Reveals Military AR Goggles for Battlefield Awareness

January 21, 2026 From roadtovr

Distance Technologies has unveiled the Field Operator HUD (FOH), an AI-enhanced AR system designed for military vehicles ranging from light utility platforms to main battle tanks.

FOH is said to combine Distance’s own optics with AI-assisted data processing, which the company says improves situational awareness, survivability, and visual workload management in land combat environments.

Having undergone field trials with UK and Finnish forces, FOH integrates command-and-control functions with its AR optics by fusing multiple sensor inputs—ostensibly similar to Anduril’s EagleEye project, revealed in October 2025.

Image courtesy Distance Technologies

Distance says FOH is designed to present only the most critical information by using AI-driven sensor fusion, automated detection, and by integrating everything from thermal and night vision to data sourced from a wide range of vehicle-mounted sensors—something the company hopes will translate into more effective decision-making both in and outside of military vehicles.

The precise specs of the company’s various FOH configurations are predictably under wraps, though they are said to include models for on-the-ground soldiers, pilots, and various types of ground vehicle operators.

Image courtesy Distance Technologies

On the company website however, Distance says FOH includes technology that “creates an independent lightfield for each eye, allowing us to control the perceived distance of the content on a per-pixel level. This makes it possible to match virtual elements 1-to-1 with reality for a completely natural XR experience.”

It’s also said to allow for visualizations that “appear on top of reality across the entire field of view, perfectly matching the observable world people see and experience around them.”

FOH is expected to be available for NATO and allied field trials by the end of Q1 2026, with broader deployment planned from 2027 through defense prime contractors.

Founded in 2024, the Helsinki-based company is building what it calls “the first true glasses-free XR solution.” It was founded by a host of XR veterans, including a cadre of alums from fellow Finnish XR startup Varjo: Urho Konttori, Jussi Mäkinen, Mikko Strandborg, Thomas M. Carlsson, and Petteri Timonen.

Filed Under: AR Development, ar industry, News, XR Industry News
