
VRSUN


Meta & Stanford Reveal Ultra-Thin Holographic XR Display the Size of Glasses

July 30, 2025 From roadtovr

Researchers at Meta Reality Labs and Stanford University have unveiled a new holographic display that could deliver virtual and mixed reality experiences in a form factor the size of standard glasses.

In a paper published in Nature Photonics, Stanford electrical engineering professor Gordon Wetzstein and colleagues from Meta and Stanford outline a prototype device that combines ultra-thin custom waveguide holography with AI-driven algorithms to render highly realistic 3D visuals.

Although based on waveguides, the device’s optics aren’t transparent like those you might find on HoloLens 2 or Magic Leap One, which is why it’s referred to as a mixed reality display and not an augmented reality display.

At just 3 millimeters thick, its optical stack integrates a custom-designed waveguide and a Spatial Light Modulator (SLM), which modulates light on a pixel-by-pixel basis to create “full-resolution holographic light field rendering” projected to the eye.

Image courtesy Nature Photonics

Unlike traditional XR headsets that simulate depth using flat stereoscopic images, this system produces true holograms by reconstructing the full light field, resulting in more realistic and naturally viewable 3D visuals.

“Holography offers capabilities we can’t get with any other type of display in a package that is much smaller than anything on the market today,” Wetzstein tells Stanford Report.

The idea is also to deliver realistic, immersive 3D visuals not only across a wide field of view (FOV), but also a wide eyebox, allowing you to move your eye relative to the glasses without losing focus or image quality, one of the “keys to the realism and immersion of the system,” Wetzstein says.

The reason we haven’t seen digital holographic displays in headsets up until now is due to the “limited space–bandwidth product, or étendue, offered by current spatial light modulators (SLMs),” the team says.

In practice, a small étendue fundamentally limits how large a field of view and how wide a range of possible pupil positions (that is, the eyebox) can be achieved simultaneously.

While the field of view is crucial for providing a visually effective and immersive experience, the eyebox size is important to make this technology accessible to a diversity of users, covering a wide range of facial anatomies as well as making the visual experience robust to eye movement and device slippage on the user’s head.
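As a rough illustrative sketch of that tradeoff (our notation, not from the paper): étendue couples the eyebox area to the field-of-view solid angle, while an SLM’s pixel count caps the total étendue it can support.

```latex
% Illustrative sketch of the étendue tradeoff (assumed notation, not from the paper).
% Étendue G couples the eyebox area A and the FOV solid angle \Omega:
G \approx A_{\mathrm{eyebox}} \cdot \Omega_{\mathrm{FOV}}
% An SLM with N pixels per side and pixel pitch p diffracts light over a
% half-angle of roughly \lambda / (2p), so the étendue it can support
% scales with pixel count and wavelength:
G_{\mathrm{SLM}} \propto (N \lambda)^2
% With G fixed by the SLM, widening the FOV (\Omega) necessarily
% shrinks the eyebox (A), and vice versa.
```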

The project is considered the second in an ongoing trilogy. Last year, Wetzstein’s lab introduced the enabling waveguide. This year, they’ve built a functioning prototype. The final stage—a commercial product—may still be years away, but Wetzstein is optimistic.

The team describes it as a “significant step” toward passing what many in the field refer to as a “Visual Turing Test”—essentially the ability to no longer “distinguish between a physical, real thing as seen through the glasses and a digitally created image being projected on the display surface,” said Suyeon Choi, the paper’s lead author.

This follows a recent reveal from researchers at Meta’s Reality Labs featuring ultra-wide field-of-view VR & MR headsets that use novel optics to maintain a compact, goggles-style form factor. Those headsets, by comparison, use “high-curvature reflective polarizers” rather than waveguides.

Filed Under: AR Development, News, VR Development, XR Industry News

Brilliant Labs to Launch Next-gen Smart Glasses on July 31st

July 25, 2025 From roadtovr

Brilliant Labs announced it’s getting ready to launch its next generation of smart glasses at the end of the month, making it the company’s third device since it was founded in 2019.

In 2023, Brilliant Labs released Monocle, a developer kit which included a single heads-up display that was meant to be clipped onto existing eyewear.

A year later, the company released Frame, which evolved Monocle’s monoscopic display and housed it in a glasses-like form factor, including a single camera sensor—making for an impressively slim and light package weighing in at less than 40g.

Image courtesy Brilliant Labs

Frame was “designed to be your AI driven personal assistant,” the company says, emphasizing its access to AI models like Perplexity, ChatGPT, and Whisper, so you can get answers to questions about what you’re currently looking at, experience live translation from either speech or text, and search the Internet in real time.

Now, Brilliant Labs says its next device is coming on July 31st. Information is thin on the ground; however, company co-founder and CEO Bobak Tavangar is taking part in a launch-day Q&A via the augmented reality subreddit.

Image courtesy Brilliant Labs

There, we also got a side glimpse of the device in question, which appears to have ditched the round, old-school spectacle vibe for a more modern frame shape. Whatever the case, we’re sure to learn more come July 31st. We’ll be keeping an eye on the augmented reality subreddit and the company’s website then.

Meanwhile, the smart glasses segment is heating up. Meta and EssilorLuxottica announced their next-gen Oakley Meta HSTN smart glasses last month; shortly afterwards, Chinese tech giant Xiaomi announced it was releasing its own AI Glasses. On the horizon is Google’s Android XR-based smart glasses, built in collaboration with Warby Parker and Gentle Monster.

Although Brilliant Labs is currently one of the few companies actually offering a pair of smart glasses with a built-in display, it won’t be that way for long. Google says it’s going to offer a model of its Android XR smart glasses with some sort of display. Leaks also maintain that Meta’s next pair of smart glasses may include a display and a wrist-worn controller for input.

Filed Under: AR Development, News, XR Industry News

CREAL Secures $8.9M Funding to Miniaturize Light Field Display for AR Glasses

July 11, 2025 From roadtovr

Switzerland-based light field display startup CREAL announced it closed an $8.9 million equity funding round, which the company says will accelerate the miniaturization of its light field module for AR glasses.

The equity funding round was led by ZEISS, the Germany-based optical systems and optoelectronics company, with participation from new and existing investors, including members of the UBS private investor network.

This brings the company’s overall funding to $32 million, with previous investors including Swisscom Ventures, Verve Ventures, and DAA Capital Partners.

In a press statement, CREAL says the funds will accelerate its mission to deliver “natural, comfortable, and healthy visual digital experiences by advancing its proprietary light field display.”

Image courtesy CREAL

Integrated into AR glasses, light field displays can recreate the way light naturally enters our eyes, enabling more realistic depth perception and reducing eye strain by allowing proper focus cues at different distances. You can learn more about light fields in our explainer below:

Light fields are significant to AR and VR because they’re a genuine representation of how light exists in the real world, and how we perceive it. Unfortunately they’re difficult to capture or generate, and arguably even harder to display.

Every AR and VR headset on the market today uses some tricks to try to make our eyes interpret what we’re seeing as if it’s actually there in front of us. Most headsets are using basic stereoscopy and that’s about it—the 3D effect gives a sense of depth to what’s otherwise a scene projected onto a flat plane at a fixed focal length.

Such headsets support vergence (the movement of both eyes to fuse two images into one image with depth), but not accommodation (the dynamic focus of each individual eye). That means that while your eyes are constantly changing their vergence, the accommodation is stuck in one place. Normally these two eye functions work unconsciously in sync, hence the so-called ‘vergence-accommodation conflict’ when they don’t.

Some headsets include ‘varifocal’ approaches, dynamically shifting the focal length based on where you’re looking (with eye-tracking), such as Magic Leap One as well as older Meta prototype VR headsets—supporting a larger number of focal lengths. Even so, these varifocal approaches still have some inherent issues that arise because they aren’t actually displaying light fields.

“As AI reshapes how we work and create, AR is poised to become the killer interface to this new era,” says Tomas Sluka, CEO and co-founder of CREAL. “But if we’re going to wear AR glasses all day, they must imperatively be healthy, comfortable, and natural to use. That’s why we’re focused on delivering AR glasses that uniquely project digital imagery with real-world depth — fully supporting the natural focusing mechanism of the human eye. This is one of the key foundations for immersive spatial computing.”

CREAL says the fresh funding round will help the Écublens, Switzerland-based company continue R&D on its AR light field module, which the company aims to integrate into lightweight, fashionable AR glasses, first for enterprise and later for consumers.

This also includes ongoing support of a licensing agreement with Zeiss, kicked off in late 2024, to bring CREAL’s light field-based vision care platform to Zeiss, where it will be used in Zeiss’ next-gen diagnostic and treatment devices.



Filed Under: AR Development, AR Investment, News, XR Industry News

Meta Reportedly Invests $3.5 Billion in EssilorLuxottica, Further Strengthening Smart Glasses Partnership

July 10, 2025 From roadtovr

It was rumored last year that Meta was seeking a minority stake in French-Italian eyewear conglomerate EssilorLuxottica, not only the largest eyewear manufacturer in the world, but also Meta’s partner behind its growing line of smart glasses. Now, a new report suggests the deal has gone through.

Citing people familiar with the matter, a Bloomberg report maintains Meta has acquired just under 3% of EssilorLuxottica, a stake suspected to be worth €3 billion (~$3.5 billion).

Meta is reportedly considering additional investment in the eyewear maker over time, which could bring its overall stake to around 5%, said the people, who asked not to be named due to ongoing deliberations.

In June 2024, a Wall Street Journal report maintained Meta was considering a stake of about 5% in the eyewear group, although at the time talks were reportedly still in early phases.

Then, three months later, Meta announced it was expanding its smart glasses partnership with EssilorLuxottica into 2030. At the time, Meta CEO Mark Zuckerberg described its long-term roadmap as giving the companies “the opportunity to turn glasses into the next major technology platform, and make it fashionable in the process.”

Image courtesy Meta, EssilorLuxottica

This comes as the companies prepare to release their latest smart glasses collaboration: Oakley Meta HSTN. Meta and EssilorLuxottica are releasing a limited edition version of the device on July 11th, priced at $499, with multiple lens and frame colorways slated to go on sale later this summer.

Oakley Meta HSTN comes with a modest feature bump over the companies’ second-gen Ray-Ban Meta glasses, which launched in 2023. In addition to serving up music, photo and video capture, and AI chats, Oakley Meta HSTN also promises better battery life and higher resolution video capture over the current Ray-Ban Meta generation, offering up to “3K video” from the device’s ultra-wide 12MP camera and a typical battery life of eight hours between recharges via the supplied charging case.

Meanwhile, other tech giants are preparing their own entries into the segment. Google revealed back in May that it was partnering with eyewear makers Warby Parker and Gentle Monster to bring its first Android XR smart glasses to market. Recent industry reports allege Google is also looking to invest $100 million in the South Korea-based Gentle Monster.

Chinese tech giant Xiaomi recently released its Xiaomi AI Glasses, which match many of Oakley Meta HSTN’s biggest features but also add a few of their own, notably an increased continuous recording cap of 45 minutes and the option of electrochromic lenses for variable lens shading. For now, Xiaomi AI Glasses are only available in China.

Both Samsung and Apple are reportedly looking to launch their own smart glasses at some point in the future too. Separate reports maintain Samsung could release a device this year, and Apple as soon as 2026.

Filed Under: AR Development, AR Investment, News, XR Industry News

Smart Contact Maker Raises $250M Investment at a Whopping $1.35B Valuation

July 9, 2025 From roadtovr

Smart contact lens startup XPANCEO announced it’s secured $250 million in Series A funding, putting its valuation at $1.35 billion and minting it as XR’s most recent unicorn.

The funding round was led by Opportunity Venture (Asia), which led the company’s $40 million Seed round in 2023, bringing its overall funding to $290 million, according to Crunchbase data.

XPANCEO, a UAE-based company, says the new funding will “accelerate the company’s mission to launch the world’s first all-in-one smart contact lens,” which is targeted to arrive by 2026.

While the company’s smart contacts are still in the prototyping phase, XPANCEO says they will integrate XR, real-time health monitoring, night vision, and zoom features.

Display System with Sub-0.5 mm Projector | Image courtesy XPANCEO

“Becoming a unicorn is a powerful signal that we’re on the right path,” said Roman Axelrod, founder and Managing Partner at XPANCEO. “In just 24 months, we’ve developed 15 working prototypes, each unlocking a new layer of possibility. Our vision remains the same: to merge all your devices into a single, invisible interface – your eyes.”

Since its 2023 seed round, XPANCEO says its fleet of prototypes includes a lens for AR vision, a smart lens with intraocular pressure (IOP) sensing for glaucoma monitoring, a biochemical lens capable of measuring health parameters such as glucose directly from tear fluid, and a lens capable of real-time wireless charging and data reading.

Other prototypes feature nanoparticle-enhanced lenses for night vision and color correction, as well as lenses designed for 3D imaging, the company says.

Smart Contact Lens with Wireless Powering Companion | Image courtesy XPANCEO

Headed by serial entrepreneur Roman Axelrod and physicist Dr. Valentyn S. Volkov, XPANCEO has grown rapidly since its 2021 founding, expanding from 50 to 100 scientists, engineers, and business leaders. Meanwhile, its lab has expanded to support the increasing scope of its research, the company says.

Over the years, XPANCEO has collaborated with a number of institutions, including the University of Manchester, the National University of Singapore, Donostia International Physics Center, and the University of Dubai.

High-Sensitivity Compact IOP Sensor | Image courtesy XPANCEO

XPANCEO’s new unicorn status puts it alongside some of the most ambitious XR projects to date: AR headset company Magic Leap first broke the $1 billion valuation mark in 2014 with a $542 million Series B investment led by Google, and reached a peak valuation of $6.4 billion in 2018 following its landmark investment by Saudi Arabia’s Public Investment Fund (PIF).

Earlier this year, immersive web content company Infinite Reality announced it raised $3 billion from a private investor to build its “vision for the next generation of the internet,” bringing the company’s valuation to $12.25 billion.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

Xiaomi Unveils China’s Answer to Ray-Ban Meta Smart Glasses with a Few Killer Features

June 26, 2025 From roadtovr

Today at Xiaomi’s ‘Human x Car x Home’ event, the Chinese tech giant revealed its answer to Meta and EssilorLuxottica’s series of smart glasses: Xiaomi AI Glasses.

Reports from late last year alleged Xiaomi was partnering with China-based ODM Goertek to produce a new generation of AI-assisted smart glasses, which was rumored to “fully benchmark” against Ray-Ban Meta (notably not officially available in China).

Now, Xiaomi has unveiled its first Xiaomi AI Glasses, which include an onboard Hyper XiaoAI voice assistant, a 12MP camera with electronic image stabilization (EIS), five mics, and two speakers, all driven by a low-power BES2700 Bluetooth audio chip and Qualcomm’s Snapdragon AR1. So far, that’s pretty toe-to-toe with Ray-Ban Meta and the recently unveiled Oakley Meta HSTN glasses.

Image courtesy Xiaomi

And like Meta’s smart glasses, Xiaomi AI Glasses don’t include displays of any kind, instead relying on voice and touch input to interact with Hyper XiaoAI. The glasses also offer foreign language text translation in addition to photo and video capture, which can be toggled either with a voice prompt or a tap of the frames.

Xiaomi rates the glasses’ 263mAh silicon-carbon battery at 8.6 hours of use, which the company says can include things like 15 one-minute video recordings, 50 photos, 90 minutes of Bluetooth calls, or 20 minutes of XiaoAI voice conversations. Those are mixed-use estimates, though; Xiaomi says the glasses can also last up to 21 hours in standby mode, 7 hours of music listening, or 45 minutes of continuous video capture.

One of the most interesting native features, though, is the ability to simply look at an Alipay QR code (ubiquitous across China) and pay for goods and services with a vocal prompt. Xiaomi says the feature is expected to arrive as an OTA update in September 2025.

The device is set to launch today in China, although global availability is still in question at this time. Xiaomi says the glasses were “optimized for Asian face shapes,” which may rule out a broader global launch for this particular version.

Image courtesy Xiaomi

While there’s only a single frame shape to choose from, it will be offered in three colors (black, semi-transparent tortoiseshell brown, and parrot green) in addition to three lens options, which aim to beat Ray-Ban Meta and Oakley Meta in cool factor.

The base model with clear lenses is priced at ¥1,999 RMB (~$280 USD), while customers can also choose electrochromic shaded lenses at ¥2,699 RMB (~$380 USD) and colored electrochromic shaded lenses at ¥2,999 RMB (~$420 USD).

Xiaomi’s electrochromic lenses allow for gradual shading depending on the user’s comfort, letting you change the intensity of shading by touching the right frame. Notably, the company says its base model can optionally include prescription lenses through its online and offline partners.

Image courtesy Xiaomi

This makes Xiaomi AI Glasses the company’s first mass-produced smart glasses with cameras marketed under the Xiaomi brand.

Many of Xiaomi’s earlier glasses—such as the Mijia Smart Audio Glasses 2—were only sold in China and lacked camera sensors entirely, save for the limited-release Mijia Glasses Camera from 2022, which featured a 50MP primary camera, an 8MP periscope camera, and a micro-OLED heads-up display.

Here are the specs we’ve gathered so far from Xiaomi’s presentation. We’ll be filling in more as information comes in:

Camera: 12MP
Lens: ƒ/2.2 large aperture | 105° wide-angle lens
Photo & video capture: 4,032 × 3,024 photos | 2K/30FPS video recording | EIS video stabilization
Video length: 45-minute continuous recording cap
Weight: 40 g
Charging: USB Type-C
Charging time: 45 minutes
Battery: 263mAh silicon-carbon battery
Battery life: 8.6 hours of mixed use
Audio: two frame-mounted speakers
Mics: 4 mics + 1 bone conduction mic
Design: foldable


According to Chinese-language outlet Vrtuoluo, the device has already seen strong initial interest on e-commerce platform JD.com, totaling over 25,000 reservations as of 9:30 AM CST (local time).

Filed Under: AR Development, News, XR Industry News

Meta Reveals Oakley Smart Glasses, Promising Better Video Capture & Longer Battery Life at $400

June 20, 2025 From roadtovr

Meta today revealed its next smart glasses built in collaboration with EssilorLuxottica: Oakley Meta Glasses.

As a part of the extended collaboration, Meta and EssilorLuxottica today unveiled Oakley Meta HSTN (pronounced HOW-stuhn), the companies’ next smart glasses following the release of Ray-Ban Meta in 2023.

Pre-orders open on July 11th for the debut version, the Limited Edition Oakley Meta HSTN, priced at $499 and featuring gold accents and 24K PRIZM polarized lenses.

Image courtesy Meta, EssilorLuxottica

Meanwhile, the rest of the collection, which starts at $399, will be available “later this summer,” Meta says, and will include six frame and lens color combos, among them:

  • Oakley Meta HSTN Desert with PRIZM Ruby Lenses
  • Oakley Meta HSTN Black with PRIZM Polar Black Lenses
  • Oakley Meta HSTN Shiny Brown with PRIZM Polar Deep-Water Lenses
  • Oakley Meta HSTN Black with Transitions Amethyst Lenses
  • Oakley Meta HSTN Clear with Transitions Grey Lenses
Image courtesy Meta, EssilorLuxottica

It’s not just a style change though, as the next-gen promises better battery life and higher resolution video capture over Ray-Ban Meta.

In comparison to Ray-Ban Meta glasses, the new Oakley Meta HSTN are said to offer up to “3K video” from the device’s ultra-wide 12MP camera. The current second-gen Ray-Ban Meta glasses are capped at 1,376 × 1,824 pixels at 30 fps from their 12MP sensor, with both glasses offering up to three minutes of video capture.

What’s more, Oakley Meta HSTN is said to allow for up to eight hours of “typical use” and up to 19 hours on standby mode, effectively representing a doubling of battery life over Ray-Ban Meta.

Image courtesy Meta, EssilorLuxottica

And like Ray-Ban Meta, Oakley Meta HSTN come with a charging case, which extends total battery life from Ray-Ban Meta’s estimated 32 hours to 48 hours on Oakley Meta HSTN.

They also pack in five mics for doing things like taking calls and talking to Meta AI, as well as off-ear speakers for listening to music while on the go.

Notably, Oakley Meta glasses are said to be water-resistant up to an IPX4 rating—meaning they can take splashes, rain, and sweat, but not submersion or extended exposure to water or other liquids.

The companies say Oakley Meta HSTN will be available across a number of regions, including the US, Canada, UK, Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, and Denmark. The device is also expected to arrive in Mexico, India, and the United Arab Emirates later this year.

In the meantime, you can sign up for pre-order updates either through Meta or Oakley for more information.

Filed Under: AR Development, ar industry, News, XR Industry News

Vuzix Secures $5M Investment as Veteran Smart Glasses Maker Sets Sights on Consumers

June 17, 2025 From roadtovr

Vuzix, the veteran smart glasses maker, announced it’s secured a $5 million investment from Quanta Computer, the Taiwan-based ODM and major Apple assembler.

The latest investment was the second tranche following an initial $10 million investment made by Quanta in September 2024, which included the purchase of Vuzix common stock at $1.30 per share. At the time, Vuzix anticipated a total of $20 million from Quanta.

Paul Travers, President and CEO of Vuzix, notes the funding will be used to enhance Vuzix’s waveguide manufacturing capabilities, something he says will help Vuzix deliver “the world’s most affordable, lightweight, and performance-driven AI smart glasses for mass-market adoption.”

Additionally, Travers says the investment “marks another important milestone in strengthening our partnership with Quanta and expanding the capabilities of our cutting-edge waveguide production facility.”

Vuzix Z100 Smart Glasses | Image courtesy Vuzix

Founded in 1997, Vuzix has largely serviced enterprise with its evolving slate of smart glasses, which have typically targeted a number of industrial roles, including healthcare, manufacturing, and warehousing.

The company also produces its own waveguides for both in-house use and licensing. In the past, Vuzix has worked to integrate its waveguide tech with Garmin, Avegant, an unnamed US Fortune 50 tech company, and an unnamed U.S. defense supplier.

While the company made a few early consumer devices in the 2010s, including the V920 video eyewear and STAR 1200 AR headset, in November 2024 Vuzix introduced the Z100 smart glasses, its first pair of sleek, AI-assisted smart glasses, priced at $500.

Its Z100 smart glasses include a 640 × 480 monochrome green microLED waveguide display, and were designed to pair with smartphones to show notifications, fitness metrics, and maps, targeting everyday consumers and enterprise customers alike.

Notably, the investment also coincides with greater market interest in smart glasses on the whole. Google announced last month it’s partnering with eyewear companies Warby Parker and Gentle Monster to release a line of fashionable smart glasses running Android XR.

Meta also recently confirmed it’s expanding its partnership with Ray-Ban Meta-maker EssilorLuxottica to create Oakley-branded smart glasses, expected to launch on June 20th, 2025.

Meanwhile, rumors suggest that both Samsung and Apple are aiming to release their own smart glasses in the near future, with reports maintaining that Samsung could release a device this year, and Apple as soon as next year.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year

June 10, 2025 From roadtovr

Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today at AWE 2025, Snap announced it’s launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will be “a much smaller form factor, at a fraction of the weight, with a ton more capability.”

There’s no pricing or availability yet beyond the 2026 launch window. To boot, we haven’t even seen the device in question, although we’re betting they aren’t as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that Snap’s four million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change on the next version, although we’re certainly hoping for more than the original’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

And as the company is gearing up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including Depth Module API, which anchors AR content in 3D space, and Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.

For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.

Filed Under: AR Development, News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details; now, however, the company is gearing up to release the device to third-party researchers sometime next year, and in the process is showing what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs is leading the way for the AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.

To boot, Meta showed off the glasses in action inside of a room as it performed simultaneous localization and mapping (SLAM):

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • New SubGHz radio tech for sub-millisecond time alignment between devices, crucial for multi-device setups

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but the glasses are also remarkably light at just 74-76g. For reference, a typical pair of eyeglasses can weigh anywhere from 20-50g, depending on the materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although the device does feature a USB-C port on the glasses’ right arm, which could possibly be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 not only tracks and analyzes the user’s environment, but also the user’s physical perception of that environment, like the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimulus—it also captures the user’s relative position and movement through the environment, informed by its CV cameras, magnetometer, two inertial measurement units (IMUs), and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

Despite some early consumer AR glasses, such as XREAL One Pro, already being out there, packing thin displays, powerful processors, and enough battery to run all day into a glasses form factor isn’t a trivial feat; it’s something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more to come from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville TN.

Filed Under: AR Development, ar industry, News, XR Industry News
