VRSUN

Hot Virtual Reality News

AR Development

Xiaomi Unveils China’s Answer to Ray-Ban Meta Smart Glasses with a Few Killer Features

June 26, 2025 From roadtovr

Today at Xiaomi’s ‘Human x Car x Home’ event, the Chinese tech giant revealed its answer to Meta and EssilorLuxottica’s series of smart glasses: Xiaomi AI Glasses.

Reports from late last year alleged Xiaomi was partnering with China-based ODM Goertek to produce a new generation of AI-assisted smart glasses, which was rumored to “fully benchmark” against Ray-Ban Meta—notably not officially available in China.

Now, Xiaomi has unveiled its first Xiaomi AI Glasses, which include an onboard Hyper XiaoAI voice assistant, a 12MP camera with electronic image stabilization (EIS), five mics, and two speakers—all of it driven by a low-power BES2700 Bluetooth audio chip and Qualcomm’s Snapdragon AR1. So far, that’s pretty toe-to-toe with Ray-Ban Meta and the recently unveiled Oakley Meta HSTN glasses.

Image courtesy Xiaomi

And like Meta’s smart glasses, Xiaomi AI Glasses don’t include displays of any kind, instead relying on voice and touch input to interact with Hyper XiaoAI. They also offer foreign language text translation in addition to photo and video capture, which can be toggled either with a voice prompt or a tap on the frames.

Xiaomi rates the glasses’ 263mAh silicon-carbon battery at 8.6 hours of mixed use, which the company says can include things like 15 one-minute video recordings, 50 photos, 90 minutes of Bluetooth calls, and 20 minutes of XiaoAI voice conversations. Measured individually, Xiaomi says the battery can last up to 21 hours in standby, 7 hours of music listening, or 45 minutes of continuous video capture.
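
For a rough sense of what those runtime claims imply, here’s a back-of-envelope sketch that converts the quoted figures into average current and power draw. The 263mAh capacity and runtime figures come from Xiaomi’s presentation; the nominal cell voltage (~3.85 V) is our own assumption, since Xiaomi hasn’t published one.

    # Back-of-envelope check of Xiaomi's battery claims (illustrative only).
    # Capacity and runtimes are from Xiaomi's presentation; the nominal cell
    # voltage is an assumption, not a published spec.

    CAPACITY_MAH = 263
    NOMINAL_V = 3.85  # assumed nominal voltage for the silicon-carbon cell
    energy_wh = CAPACITY_MAH / 1000 * NOMINAL_V  # ~1.0 Wh

    claims_hours = {
        "mixed use": 8.6,
        "music playback": 7,
        "standby": 21,
        "continuous video": 45 / 60,
    }

    for mode, hours in claims_hours.items():
        avg_ma = CAPACITY_MAH / hours        # implied average current draw
        avg_mw = energy_wh / hours * 1000    # implied average power draw
        print(f"{mode:>16}: ~{avg_ma:.0f} mA / ~{avg_mw:.0f} mW average")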

One of the most interesting native features, though, is the ability to simply look at one of the Alipay QR codes that are ubiquitous across China and pay for goods and services with a voice prompt. Xiaomi says the feature is expected to arrive as an OTA update in September 2025.

The device is set to launch today in China, although global availability is still in question at this time. Xiaomi says the glasses were “optimized for Asian face shapes,” which may rule out a broader global launch for this particular version.

Image courtesy Xiaomi

While there’s only a single frame shape to choose from, it will be offered in three colors (black, semi-transparent tortoiseshell brown, and semi-transparent parrot green) as well as three lens options, which aim to beat Ray-Ban Meta and Oakley Meta in cool factor.

The base model with clear lenses is priced at ¥1,999 RMB (~$280 USD), while customers can also choose electrochromic shaded lenses at ¥2,699 RMB (~$380 USD) and colored electrochromic shaded lenses at ¥2,999 RMB (~$420 USD).

Xiaomi’s electrochromic lenses allow for gradual shading to suit the user’s comfort, letting you change the intensity of the tint by touching the right frame. Notably, the company says the base model can optionally be fitted with prescription lenses through its online and offline partners.

Image courtesy Xiaomi

This makes Xiaomi AI Glasses the company’s first mass-produced smart glasses with cameras marketed under the Xiaomi brand.

Many of Xiaomi’s earlier glasses—such as the Mijia Smart Audio Glasses 2—were only sold in China and lacked camera sensors entirely, save for the limited-release Mijia Glasses Camera from 2022, which featured a 50MP primary camera, an 8MP periscope camera, and a micro-OLED heads-up display.

Here are the specs we’ve gathered so far from Xiaomi’s presentation. We’ll be filling in more as information comes in:

Camera: 12MP
Lens: ƒ/2.2 large aperture | 105° wide-angle lens
Photo & video capture: 4,032 × 3,024 photos | 2K/30FPS video recording | EIS video stabilization
Video length: 45-minute continuous recording cap
Weight: 40 g
Charging: USB Type-C
Charging time: 45 minutes
Battery: 263mAh silicon-carbon battery
Battery life: 8.6 hours of mixed use
Audio: two frame-mounted speakers
Mics: 4 mics + 1 bone conduction mic
Design: foldable

–

According to Chinese-language outlet Vrtuoluo, the device has already seen strong initial interest on e-commerce platform JD.com, totaling over 25,000 reservations as of 9:30 AM local time (CST).

Filed Under: AR Development, News, XR Industry News

Meta Reveals Oakley Smart Glasses, Promising Better Video Capture & Longer Battery Life at $400

June 20, 2025 From roadtovr

Meta today revealed its next smart glasses built in collaboration with EssilorLuxottica— Oakley Meta Glasses.

As a part of the extended collaboration, Meta and EssilorLuxottica today unveiled Oakley Meta HSTN (pronounced HOW-stuhn), the companies’ next smart glasses following the release of Ray-Ban Meta in 2023.

Pre-orders are slated to open on July 11th for the debut version, the Limited Edition Oakley Meta HSTN, which is priced at $499 and features gold accents and 24K PRIZM polarized lenses.

Image courtesy Meta, EssilorLuxottica

Meanwhile, the rest of the collection will be available “later this summer,” Meta says, starting at $399 and including the following frame and lens color combos:

  • Oakley Meta HSTN Desert with PRIZM Ruby Lenses
  • Oakley Meta HSTN Black with PRIZM Polar Black Lenses
  • Oakley Meta HSTN Shiny Brown with PRIZM Polar Deep-Water Lenses
  • Oakley Meta HSTN Black with Transitions Amethyst Lenses
  • Oakley Meta HSTN Clear with Transitions Grey Lenses

Image courtesy Meta, EssilorLuxottica

It’s not just a style change though, as the next-gen promises better battery life and higher resolution video capture over Ray-Ban Meta.

In comparison to Ray-Ban Meta glasses, the new Oakley Meta HSTN are said to offer up to “3K video” from the device’s ultra-wide 12MP camera. Ray-Ban’s current second-gen glasses are capped at 1,376 × 1,824 pixels at 30 fps from their 12MP sensor, with both glasses offering up to three minutes of video capture.

What’s more, Oakley Meta HSTN is said to allow for up to eight hours of “typical use” and up to 19 hours on standby mode, effectively representing a doubling of battery life over Ray-Ban Meta.

Image courtesy Meta, EssilorLuxottica

And like Ray-Ban Meta, Oakley Meta HSTN comes with a charging case, which extends total battery life to 48 hours, up from Ray-Ban Meta’s estimated 32 hours.

They also pack in five mics for things like taking calls and talking to Meta AI, as well as off-ear speakers for listening to music while on the go.

Notably, Oakley Meta glasses are said to be water-resistant up to an IPX4 rating—meaning they can take splashes, rain, and sweat, but not submersion or extended exposure to water or other liquids.

The companies say Oakley Meta HSTN will be available across a number of regions, including the US, Canada, UK, Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, and Denmark. The device is also expected to arrive in Mexico, India, and the United Arab Emirates later this year.

In the meantime, you can sign up for pre-order updates either through Meta or Oakley for more information.

Filed Under: AR Development, ar industry, News, XR Industry News

Vuzix Secures $5M Investment as Veteran Smart Glasses Maker Sets Sights on Consumers

June 17, 2025 From roadtovr

Vuzix, the veteran smart glasses maker, announced it’s secured a $5 million investment from Quanta Computer, the Taiwan-based ODM and major Apple assembler.

The latest investment was the second tranche following an initial $10 million investment made by Quanta in September 2024, which included the purchase of Vuzix common stock at $1.30 per share. At the time, Vuzix anticipated a total of $20 million from Quanta.

Paul Travers, President and CEO of Vuzix, notes the funding will be used to enhance Vuzix’s waveguide manufacturing capabilities, something he says will help Vuzix deliver “the world’s most affordable, lightweight, and performance-driven AI smart glasses for mass-market adoption.”

Additionally, Travers says the investment “marks another important milestone in strengthening our partnership with Quanta and expanding the capabilities of our cutting-edge waveguide production facility.”

Vuzix Z100 Smart Glasses | Image courtesy Vuzix

Founded in 1997, Vuzix has largely served enterprise customers with its evolving slate of smart glasses, which have typically targeted industrial roles in fields such as healthcare, manufacturing, and warehousing.

The company also produces its own waveguides for both in-house use and licensing. In the past, Vuzix has worked to integrate its waveguide tech with Garmin, Avegant, an unnamed US Fortune 50 tech company, and an unnamed U.S. defense supplier.

The company made a few early consumer devices in the 2010s, including the V920 video eyewear and the STAR 1200 AR headset. More recently, in November 2024, Vuzix introduced the Z100 smart glasses, its first pair of sleek, AI-assisted smart glasses, priced at $500.

The Z100 smart glasses include a 640 × 480 monochrome green microLED waveguide display, and were designed to pair with smartphones to show notifications, fitness metrics, and maps, targeting everyday consumers and enterprise customers alike.

Notably, the investment also coincides with greater market interest in smart glasses on the whole. Google announced last month it’s partnering with eyewear companies Warby Parker and Gentle Monster to release a line of fashionable smart glasses running Android XR.

Meta also recently confirmed it’s expanding its partnership with Ray-Ban Meta-maker EssilorLuxottica to create Oakley-branded smart glasses, expected to launch on June 20th, 2025.

Meanwhile, rumors suggest that both Samsung and Apple are aiming to release their own smart glasses in the near future, with reports maintaining that Samsung could release a device this year, and Apple as soon as next year.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year

June 10, 2025 From roadtovr

Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today, at AWE 2025, Snap announced it’s launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will be “a much smaller form factor, at a fraction of the weight, with a ton more capability.”

There’s no pricing or availability yet beyond the 2026 launch window. To boot, we haven’t even seen the device in question, although we’re betting they aren’t as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that Snap’s four million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change on the next version, although we’re certainly hoping for more than the original’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

And as the company is gearing up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including Depth Module API, which anchors AR content in 3D space, and Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.

For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.

Filed Under: AR Development, News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details; now, however, the company is gearing up to release the device to third-party researchers sometime next year, and in the process is showing what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs is leading the way for AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.

To boot, Meta showed off the glasses in action inside of a room as it performed simultaneous localization and mapping (SLAM):

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • SubGHz radio tech for sub-millisecond time alignment between devices, crucial for multi-device setups (see the sketch below for how these streams might be combined)
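
To make that concrete, here’s a minimal sketch of how a researcher might combine two of those streams: a 6DOF device pose from VIO and an eye-tracking gaze direction, transformed into a world-space gaze ray. This is illustrative only; the function and variable names are ours, not the Aria research SDK, and the pose and gaze values are made up.

    # Illustrative sketch (not the Aria SDK): combine a 6DOF device pose from
    # VIO with an eye-tracking gaze direction to get a gaze ray in world space.
    import numpy as np

    def quat_to_rot(q):
        """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
        w, x, y, z = q
        return np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])

    # Hypothetical VIO output: device position (meters) and orientation in the world frame.
    position_world = np.array([0.4, 1.6, -2.0])
    orientation_wxyz = np.array([0.966, 0.0, 0.259, 0.0])  # ~30 deg yaw, unit quaternion

    # Hypothetical eye-tracking output: gaze direction in the device frame.
    gaze_device = np.array([0.0, -0.1, 1.0])
    gaze_device /= np.linalg.norm(gaze_device)

    # Transform the gaze direction into world coordinates and build a gaze ray.
    R_world_from_device = quat_to_rot(orientation_wxyz)
    gaze_world = R_world_from_device @ gaze_device
    print("gaze ray origin:", position_world, "direction:", gaze_world)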

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but the glasses are also remarkably light at just 74-76 g. For reference, a typical pair of eyeglasses can weigh anywhere from 20-50 g, depending on the materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although the glasses do feature a USB-C port on the right arm, which could possibly be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 not only tracks and analyzes the user’s environment, but also the user’s physical perception of that environment, such as the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimulus—it also captures the user’s position and movement through the environment, informed by its CV cameras, magnetometer, two inertial measurement units (IMUs), and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

Despite some early consumer AR glasses already on the market, such as XREAL One Pro, packing thin displays, powerful processors, and enough battery to run them all day into a glasses form factor isn’t a trivial feat—something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville, TN.

Filed Under: AR Development, ar industry, News, XR Industry News

Spacetop Launches Windows App to Turn Laptops into Large AR Workspaces

May 2, 2025 From roadtovr

Late last year, Sightful announced it was cancelling its unique laptop with built-in AR glasses, instead pivoting to build a version of its AR workspace software for Windows. Now the company has released Spacetop for Windows, which lets you transform your environment into a private virtual display for productivity on the go.

Like its previous hardware, Spacetop works with XREAL AR glasses; however, the new subscription-based app targets a much broader set of AI PCs, including the latest hardware from Dell, HP, Lenovo, Asus, Acer, and Microsoft.

Previously, the company was working on its own ‘headless’ laptop of sorts, which ran an Android-based operating system called SpaceOS. However, Sightful announced in October 2024 that it was cancelling and refunding customers for its Spacetop G1 AR workspace device, which was slated to cost $1,900.

At the time, Sightful said the pivot came down to just how much neural processing units (NPUs) could improve processing power and battery efficiency when running AR applications.

Image courtesy Sightful

Now, Sightful has released its own Spacetop Bundle at $899, which includes XREAL Air 2 Ultra AR glasses (regularly priced at $699) and a 12-month Spacetop subscription (renews annually at $200).

Additionally, Sightful is selling optional optical lenses at an added cost, including prescription single-vision lens inserts for $50, and prescription progressive-vision lens inserts for $150.

Recommended laptops include the Dell XPS Core Ultra 7 (32GB), HP Elitebook, Lenovo Yoga Slim, ASUS Zenbook, Acer Swift Go 14, and Microsoft Surface Pro for Business (Ultra 7); however, Sightful notes this list isn’t exhaustive, as the range of devices integrating Intel Core Ultra 7/9 processors with Meteor Lake architecture (or newer) is continuously growing.

Key features include:

  • Seamless access to popular apps: Spacetop works with consumer and business apps
    that power productivity every day for Windows users
  • Push, slide, and rotate your workspace with intuitive keystrokes
  • Travel mode that keeps your workspace with you on the go, whether on a plane, a train, in a coffee shop, in an Uber, or on your sofa
  • Bright, crystal-clear display that adjusts to lighting for use indoors and out
  • Natural OS experience, designed to feel familiar yet unlock the potential of spatial computing vs. a simple screen extension
  • All-day comfort with lightweight glasses (83g)
  • Massive 100” display for a multi-monitor / multi-window expansive workspace
  • Ergonomic benefits help avoid neck strain, hunching, and squinting at a small display

Backed by over $61M in funding, Sightful was founded in 2020 by veterans from PrimeSense, Magic Leap, and Broadcom. It is headquartered in Tel Aviv with offices in Palo Alto, New York, and Taiwan. You can learn more about Spacetop for Windows here.

Filed Under: AR Development, ar industry, News, XR Industry News

Google Reportedly Set to Acquire Eye-tracking Startup to Bolster Android XR Hardware Efforts

March 13, 2025 From roadtovr

Google is reportedly set to acquire Canada-based eye-tracking startup AdHawk Microsystems Inc., something that would strengthen the company’s ongoing foray into XR headsets and glasses.

As reported by Bloomberg’s Mark Gurman, Google is allegedly acquiring AdHawk for $115 million, according to people with knowledge of the matter.

The deal is said to include $15 million in future payments based on the eye-tracking company reaching performance targets. While the acquisition is purportedly slated to conclude this week, a deal still hasn’t been signed, leaving some room for doubt. Furthermore, should the deal go through, the report maintains AdHawk’s staff will join Google’s Android XR team.

This isn’t the first time AdHawk has flirted with an acquisition by a key XR player. In 2022, Bloomberg reported the company was in the final stages of an acquisition by Meta.

Notably, AdHawk is best known for its innovations in eye-tracking, replacing traditional cameras with micro-electromechanical systems (MEMS), an approach said to result in faster processing and reduced power consumption—two things highly prized by AR and smart glasses creators today.

Image courtesy AdHawk Microsystems Inc.

Its flagship product, the MindLink glasses, is a research-focused device that is meant to connect eye movements with neurological and ocular health, human behavior, and state of mind, the company says on its website. Additionally, the company offers its camera-free eye-tracking modules for researchers working with VR devices, such as Meta Quest.

While neither Google nor AdHawk has commented on the report, Google is ramping up its XR division to compete with the likes of Meta and Apple.

In December, Google announced Android XR, marking a decisive shift for the company’s XR efforts, as the company is bringing a ‘full fat’ version of Android to headsets for the first time, which not only includes XR-specific apps but also the full slate of Android content. Android XR is ostensibly set to debut on Samsung’s Project Moohan mixed reality headset, which still has no release date or price.

Then, in January, Google announced the acquisition of a number of HTC’s XR engineers, a deal amounting to $250 million. At the time, Google said HTC veterans would “accelerate the development of the Android XR platform across the headsets and glasses ecosystem.”

In addition to supporting its Android XR software efforts, the acquisition of a novel eye-tracking startup would also prove valuable for the company’s internal XR hardware efforts, which have been nothing short of fragmented over the years.

Google has summarily cancelled a number of XR projects in the past, including its Daydream VR platform in 2019, Google Glass for Enterprise in 2023, and its Iris AR glasses project in 2024.

Filed Under: AR Development, News, VR Development, XR Industry News

Meta & Plessey Announce Super Bright, High-efficiency Red MicroLED: an Important Piece in All-day AR

January 16, 2025 From roadtovr

Meta announced in 2020 it was working with UK-based AR display maker Plessey, which was tapped to provide Meta with AR displays over the course of multiple years. Now the companies have announced they’ve developed what they’re deeming “the world’s brightest” red microLED display for AR glasses.

Plessey and Meta say the new red microLED display offers brightness of up to 6,000,000 nits at high resolution (a pixel pitch under 5μm).

Blue GaN microLEDs are traditionally the most efficient and brightest, while green GaN microLEDs are slightly less efficient than blue but typically much more efficient than red. All three must be balanced to create a full-color, high-performance AR display, which makes red output the limiting factor.
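
To illustrate why the weakest primary sets the ceiling, here’s a small sketch that balances three hypothetical monochrome panels against a D65-style white point. The peak-luminance figures are made-up placeholders, not Plessey or Meta specs; only the relative luminance shares (roughly 21% red, 72% green, 7% blue for sRGB-like primaries) are standard colorimetry.

    # Illustrative calculation (assumed numbers, not Plessey/Meta specs): why the
    # dimmest primary caps the brightness of a white-balanced RGB microLED display.
    # Relative luminance contributions for a D65 white with sRGB-like primaries:
    white_share = {"red": 0.2126, "green": 0.7152, "blue": 0.0722}

    # Hypothetical peak luminance of each monochrome panel, in nits.
    peak_nits = {"red": 500_000, "green": 4_000_000, "blue": 3_000_000}

    # Each primary can support a white luminance of at most peak / share;
    # the smallest of these values is the display's white-balance ceiling.
    ceiling_per_primary = {c: peak_nits[c] / white_share[c] for c in white_share}
    limiting = min(ceiling_per_primary, key=ceiling_per_primary.get)

    for c, v in ceiling_per_primary.items():
        print(f"{c:>5}: supports white up to ~{v:,.0f} nits")
    print(f"limiting primary: {limiting}")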

“With the world’s brightest red microLED display, we are one major step closer to making AR glasses a mainstream reality,” says Dr. Keith Strickland, CEO of Plessey, who calls it “a major breakthrough in the development of AR technology.”

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

“We are building the future of human connection and the technology that makes it possible,” says Jason Hartlove, Vice President of Display and Optics at Meta’s Reality Labs. “These types of breakthroughs are crucial to build AR glasses that help people stay more present and empowered in the world with a form factor people actually feel comfortable wearing. Our work with Plessey has pushed the boundaries of what’s previously been possible, and it’s only the beginning–the future is starting to look up.”

As part of its long-term commercial agreement, Plessey says it’s continuing to work with Meta by dedicating its manufacturing operations to support the development of prototypes and new technologies for potential use in the XR category.

This follows the unveiling of Meta’s AR glasses prototype Orion last September, which includes a purported 70-degree field of view, silicon carbide waveguides, custom silicon, microLED projectors, a wrist-worn electromyography (EMG) band used for hand-tracking, and an external wireless compute unit that slips into your pocket.

Although Meta isn’t commercializing Orion, following its unveiling at Connect 2024 Meta CTO and Reality Labs chief Andrew Bosworth said the company will make its AR consumer tech available sometime before 2030, noting that the company aims to make them “affordable and accessible at least in the space of phone, laptop territory.”

Filed Under: AR Development, AR News, News, XR Industry News
