
VRSUN

Hot Virtual Reality News


AR Development

Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year

June 10, 2025 From roadtovr

Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today at AWE 2025, Snap announced it’s launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will be “a much smaller form factor, at a fraction of the weight, with a ton more capability.”

There’s no pricing or availability yet beyond the 2026 launch window. To boot, we haven’t even seen the device in question, although we’re betting it won’t be as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that Snap’s four-million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.
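
For a rough sense of what those numbers imply, angular resolution multiplied by field of view approximates the pixel count across the display. The sketch below is a back-of-the-envelope check using only the figures reported above, not an official Snap spec:

```typescript
// Back-of-the-envelope estimate, not an official Snap spec: pixels per degree times
// the field of view gives the approximate pixel count across that 46° FOV.
const fieldOfViewDeg = 46;   // Spectacles '24 field of view as reported
const pixelsPerDegree = 37;  // reported angular resolution

console.log(fieldOfViewDeg * pixelsPerDegree); // ≈ 1,700 pixels across the FOV
```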

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change in the next version, although we’re certainly hoping for more than the original’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

As the company gears up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including Depth Module API, which anchors AR content in 3D space, and Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.

For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.
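
Details of the Specs browser aren’t public, but WebXR itself is a standard web API, so here’s a minimal sketch of what “expanded access to WebXR content” means in practice: requesting an immersive AR session and reading the viewer pose each frame. This is generic WebXR, not Snap-specific code, and the feature names are the standard ones rather than anything confirmed for Specs.

```typescript
// Minimal, generic WebXR sketch: request an immersive AR session and read the viewer
// pose each frame. Nothing here is Specs-specific; it's the standard WebXR Device API.
async function startArSession(): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar is not supported here");
    return;
  }

  // "hit-test" lets content be anchored to real-world surfaces, similar in spirit
  // to what Snap's Depth Module API is described as doing for Lenses.
  const session = await navigator.xr.requestSession("immersive-ar", {
    requiredFeatures: ["local-floor"],
    optionalFeatures: ["hit-test"],
  });

  const refSpace = await session.requestReferenceSpace("local-floor");

  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      for (const view of pose.views) {
        // view.transform and view.projectionMatrix would drive per-eye rendering
        // (via WebGL/WebGPU) in a real application.
      }
    }
    session.requestAnimationFrame(onFrame);
  });
}
```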

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is that these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.

Filed Under: AR Development, News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details; now, however, the company is gearing up to release the device to third-party researchers sometime next year, and in the process is showing what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs is leading the way for AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.
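
The reason stereo overlap matters for depth: wherever both cameras see the same point, its distance can be recovered from the pixel disparity between the two views. Below is a minimal sketch of that standard pinhole-stereo relation; the numbers are purely illustrative, not Aria Gen 2’s actual calibration.

```typescript
// Standard pinhole-stereo relation: depth Z = f * B / d, where f is the focal length
// in pixels, B the baseline between the two cameras, and d the disparity in pixels.
// The values below are illustrative, not Aria Gen 2's actual calibration.
function depthFromDisparity(
  focalLengthPx: number,
  baselineMeters: number,
  disparityPx: number
): number {
  return (focalLengthPx * baselineMeters) / disparityPx;
}

// Example: f = 600 px, baseline = 0.12 m, disparity = 9 px -> a point roughly 8 m away.
console.log(depthFromDisparity(600, 0.12, 9).toFixed(2));
```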

To boot, Meta showed off the glasses in action inside a room as they performed simultaneous localization and mapping (SLAM).

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking (a simplified sketch follows this list)
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • SubGHz radio tech enabling sub-millisecond time alignment between devices, crucial for multi-device setups
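
To illustrate what the VIO item means, here’s a heavily simplified sketch of the inertial “predict” half of visual-inertial odometry. A real pipeline (such as whatever runs on Meta’s coprocessor) also estimates orientation, rotates accelerometer data into the world frame, removes gravity, and corrects drift using camera feature tracks; the types and function below are illustrative, not Meta’s implementation.

```typescript
// Heavily simplified, illustrative sketch of the inertial "predict" step in VIO.
// Not Meta's implementation: a real pipeline also estimates orientation, subtracts
// gravity, and fuses in camera-based corrections.

type Vec3 = [number, number, number];

interface InertialState {
  position: Vec3; // meters, world frame
  velocity: Vec3; // meters per second, world frame
}

// Integrate one IMU sample. `accelWorld` is assumed to already be gravity-compensated
// and expressed in the world frame (the hard part a real VIO front end handles).
function propagate(state: InertialState, accelWorld: Vec3, dt: number): InertialState {
  const velocity: Vec3 = [
    state.velocity[0] + accelWorld[0] * dt,
    state.velocity[1] + accelWorld[1] * dt,
    state.velocity[2] + accelWorld[2] * dt,
  ];
  const position: Vec3 = [
    state.position[0] + velocity[0] * dt,
    state.position[1] + velocity[1] * dt,
    state.position[2] + velocity[2] * dt,
  ];
  return { position, velocity };
}
```

Pure inertial integration like this drifts within seconds, which is why the camera-based correction step, fed by the CV cameras described above, is essential to usable 6DOF tracking.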

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but the glasses are also remarkably light at just 74-76g. For reference, a pair of typical eyeglasses can weigh anywhere from 20-50g, depending on materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although the glasses do feature a USB-C port on the right arm, which could possibly be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 tracks and analyzes not only the user’s environment, but also the user’s physical perception of that environment, like the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimuli—it also captures the user’s relative position and movement through the environment, informed by its CV cameras, magnetometer, two inertial measurement units (IMUs), and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

While some early consumer AR glasses, such as XREAL One Pro, are already out there, packing thin displays, powerful processors, and enough battery to run it all day into a pair of glasses isn’t a trivial feat—something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more to come from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville, TN.

Filed Under: AR Development, ar industry, News, XR Industry News

Spacetop Launches Windows App to Turn Laptops into Large AR Workspaces

May 2, 2025 From roadtovr

Late last year, Sightful announced it was cancelling its unique laptop with built-in AR glasses, instead pivoting to build a version of its AR workspace software for Windows. Now the company has released Spacetop for Windows, which lets you transform your environment into a private virtual display for productivity on the go.

Like its previous hardware, Spacetop works with XREAL AR glasses; however, the new subscription-based app targets a much broader set of AI PCs, including the latest hardware from Dell, HP, Lenovo, Asus, Acer, and Microsoft.

Previously, the company was working on its own ‘headless’ laptop of sorts, which ran an Android-based operating system called SpaceOS. However, Sightful announced in October 2024 that it was cancelling and refunding customers for its Spacetop G1 AR workspace device, which was slated to cost $1,900.

At the time, Sightful said the pivot came down to just how much neural processing units (NPUs) could improve processing power and battery efficiency when running AR applications.

Image courtesy Sightful

Now, Sightful has released its own Spacetop Bundle at $899, which includes XREAL Air 2 Ultra AR glasses (regularly priced at $699) and a 12-month Spacetop subscription (renews annually at $200).

Additionally, Sightful is selling optional optical lenses at an added cost, including prescription single-vision lens inserts for $50, and prescription progressive-vision lens inserts for $150.

Recommended laptops include Dell XPS Core Ultra 7 (32GB), HP Elitebook, Lenovo Yoga Slim, ASUS Zenbook, Acer Swift Go 14, and Microsoft Surface Pro for Business (Ultra 7), however Sightful notes this list isn’t exhaustive, as the cadre of devices which integrate Intel Core Ultra 7/9 processors with Meteor Lake architecture (or newer) is continuously growing.

Key features include:

  • Seamless access to popular apps: Spacetop works with consumer and business apps that power productivity every day for Windows users
  • Push, slide, and rotate your workspace with intuitive keystrokes
  • Travel mode that keeps your workspace with you on the go, whether on a plane or train, in a coffee shop, in an Uber, or on your sofa
  • Bright, crystal-clear display that adjusts to lighting for use indoors and out
  • Natural OS experience, designed to feel familiar yet unlock the potential of spatial computing vs. a simple screen extension
  • All-day comfort with lightweight glasses (83g)
  • Massive 100” display for a multi-monitor / multi-window expansive workspace
  • Ergonomic benefits help avoid neck strain, hunching, and squinting at a small display

Backed by over $61M in funding, Sightful was founded in 2020 by veterans from PrimeSense, Magic Leap, and Broadcom. It is headquartered in Tel Aviv with offices in Palo Alto, New York, and Taiwan. You can learn more about Spacetop for Windows here.

Filed Under: AR Development, ar industry, News, XR Industry News

Google Reportedly Set to Acquire Eye-tracking Startup to Bolster Android XR Hardware Efforts

March 13, 2025 From roadtovr

Google is reportedly set to acquire Canada-based eye-tracking startup AdHawk Microsystems Inc., something that would strengthen the company’s ongoing foray into XR headsets and glasses.

As reported by Bloomberg’s Mark Gurman, Google is allegedly acquiring AdHawk for $115 million, according to people with knowledge of the matter.

The deal is said to include $15 million in future payments based on the eye-tracking company reaching performance targets. While the acquisition is purportedly slated to conclude this week, a deal still hasn’t been signed, leaving some room for doubt. Furthermore, should the deal go through, the report maintains AdHawk’s staff will join Google’s Android XR team.

This isn’t the first time AdHawk has flirted with an acquisition by a key XR player. In 2022, Bloomberg reported the company was in the final stages of an acquisition by Meta.

Notably, AdHawk is best known for its eye-tracking innovations, which replace traditional cameras with micro-electromechanical systems (MEMS), an approach said to result in faster processing and reduced power consumption—two things highly prized by AR and smart glasses creators today.

Image courtesy AdHawk Microsystems Inc.

Its flagship product, the MindLink glasses, is a research-focused device that is meant to connect eye movements with neurological and ocular health, human behavior, and state of mind, the company says on its website. Additionally, the company offers its camera-free eye-tracking modules for researchers working with VR devices, such as Meta Quest.

While neither Google nor AdHawk has commented on the report, Google is ramping up its XR division to compete with the likes of Meta and Apple.

In December, Google announced Android XR, marking a decisive shift for the company’s XR efforts, as it brings a ‘full fat’ version of Android to headsets for the first time, one that includes not only XR-specific apps but also the full slate of Android content. Android XR is ostensibly set to debut on Samsung’s Project Moohan mixed reality headset, which still has no release date or price.

Then, in January, Google announced the acquisition of a number of HTC’s XR engineers, a deal amounting to $250 million. At the time, Google said HTC veterans would “accelerate the development of the Android XR platform across the headsets and glasses ecosystem.”

In addition to supporting its Android XR software efforts, the acquisition of a novel eye-tracking startup would also prove valuable to the company’s internal XR hardware efforts, which have been nothing short of fragmented over the years.

Google has summarily cancelled a number of XR projects in the past, including its Daydream VR platform in 2019, Google Glass for Enterprise in 2023, and its Iris AR glasses project in 2024.

Filed Under: AR Development, News, VR Development, XR Industry News

Meta & Plessey Announce Super Bright, High-efficiency Red MicroLED: an Important Piece in All-day AR

January 16, 2025 From roadtovr

Meta announced in 2020 it was working with UK-based AR display maker Plessey, which was tapped to provide Meta with AR displays over the course of multiple years. Now the companies have announced they’ve developed what they’re deeming “the world’s brightest” red microLED display for AR glasses.

Plessey and Meta say the new red microLED display offers brightness of up to 6,000,000 nits at high resolution (a pixel pitch of less than 5μm).

Blue GaN microLEDs are traditionally more efficient and brighter, while green GaN microLEDs are slightly less efficient than blue, but typically much more efficient than red. All three should be balanced to create a full-color, high-performance AR display, making red color output a limiting factor.

“With the world’s brightest red microLED display, we are one major step closer to making AR glasses a mainstream reality,” says Dr. Keith Strickland, CEO of Plessey, who calls it “a major breakthrough in the development of AR technology.”

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

“We are building the future of human connection and the technology that makes it possible,” says Jason Hartlove, Vice President of Display and Optics at Meta’s Reality Labs. “These types of breakthroughs are crucial to build AR glasses that help people stay more present and empowered in the world with a form factor people actually feel comfortable wearing. Our work with Plessey has pushed the boundaries of what’s previously been possible, and it’s only the beginning–the future is starting to look up.”

As part of its long-term commercial agreement, Plessey says it’s continuing to work with Meta by dedicating its manufacturing operations to support the development of prototypes and new technologies for potential use in the XR category.

This follows the unveiling of Meta’s AR glasses prototype Orion last September, which includes a purported 70-degree field-of-view, silicon carbide waveguides, custom silicon, microLED projectors, a wrist-worn electromyography (EMG) band used for hand-tracking, and an external wireless compute unit that slips into your pocket.

Although Meta isn’t commercializing Orion, following its unveiling at Connect 2024, Meta CTO and Reality Labs chief Andrew Bosworth said the company will make its AR consumer tech available sometime before 2030, noting that the company aims to make them “affordable and accessible at least in the space of phone, laptop territory.”

Filed Under: AR Development, AR News, News, XR Industry News
