
VRSUN

Hot Virtual Reality News


VITURE Launches ‘Luma Ultra’ AR Glasses with Sony Micro-OLED Panels

September 17, 2025 From roadtovr

VITURE has now launched its Luma Ultra AR glasses, which pack Sony’s latest micro-OLED panels along with spatial gesture tracking enabled by an onboard sensor array.

Priced at $600 and now shipping worldwide, Viture Luma Ultra targets prosumers, enterprise users, and business professionals looking for a personal, on-the-go workspace.

Notably, these aren’t standalone devices, instead relying on PC, console and mobile tethering for compute, which means they integrate as external (albeit very personal) monitors.

Image courtesy VITURE

Luma Ultra is said to include a 52-degree field of view (FOV), Sony’s latest micro-OLED panels with a resolution up to 1200p and 1,250 nits peak brightness. Two depth sensing cameras are onboard in addition to a single RGB camera for spatial tracking and hand gesture input.

Unlike some AR glasses, which rely on slimmer waveguide optics, Luma Ultra uses what’s called a ‘birdbath’ optic system, in which a curved, semi-transparent mirror projects the digital image into the user’s eyes. Birdbath optics are typically cheaper and easier to manufacture, and can also reach higher brightness, at the expense of more bulk and weight.

Image courtesy VITURE

The device also includes an electrochromic film for tint control, myopia adjustments up to -4.0 diopters, and support for 64 ± 6mm interpupillary distance (IPD).

Notably, the company also launched a slate of AR glasses alongside it, targeted at consuming traditional media, positioning Luma Ultra as the company’s flagship device.

Check out the full lineup and specs below:

Image courtesy VITURE

Viture Luma ($400), Luma Pro ($500), and Luma Ultra ($600) are all estimated to ship within two weeks of ordering, with the next device, Luma Beast ($550), slated to ship sometime in November.

None of the devices above (besides Luma Ultra) includes spatial tracking, due to the lack of depth sensors; however, Luma Beast is said to come with the same micro-OLED displays as Luma Ultra, a slightly larger 58-degree FOV, and an auto-adjusting electrochromic film for tint control.

This follows news of Viture’s latest funding round, which brought the San Francisco-based XR glasses company $100 million in Series B financing. Viture says the funding will aid in the global expansion of its consumer XR glasses.

Filed Under: AR Development, ar industry, News, XR Industry News

Snapchat CEO’s Open Letter Ties Spectacles AR Glasses to the Survival of the Company at Large

September 12, 2025 From roadtovr

According to Snap’s CEO Evan Spiegel, the company behind Snapchat has reached a “crucible moment” as it heads into 2026, which he says rests on the growth and performance of Spectacles, the company’s AR glasses, as well as AI, advertising and direct revenue streams.

Snap announced in June it was working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are expected to release to consumers sometime next year. Snap hasn’t revealed them yet, although the company says the new Specs will be smaller and lighter, feature see-through AR optics and include a built-in AI assistant.

Snap Spectacles (gen 5) | Image courtesy Snap Inc

Following the release of the fifth gen in 2024 to developers, next year will be “the most consequential year yet” in Snap’s 14-year history, Spiegel says, putting its forthcoming generation of Specs in the spotlight.

“After starting the year with considerable momentum, we stumbled in Q2, with ad revenue growth slowing to just 4% year-over-year,” Spiegel admits in his recent open letter. “Fortunately, the year isn’t over yet. We have an enormous opportunity to re-establish momentum and enter 2026 prepared for the most consequential year yet in the life of Snap Inc.”

Not only are Specs a key focus in the company’s growth, Spiegel thinks AR glasses, combined with AI, will drastically change the way people work, learn and play.

“The need for Specs has become urgent,” Spiegel says. “People spend over seven hours a day staring at screens. AI is transforming the way we work, shifting us from micromanaging files and apps to supervising agents. And the costs of manufacturing physical goods are skyrocketing.”

Image courtesy Snap Inc, Niantic

Those physical goods can be replaced with “photons, reducing waste while opening a vast new economy of digital goods,” Spiegel says, something the company hopes to tap into with Specs. And instead of replicating the smartphone experience into AR, Spiegel maintains the core of the device will rely on AI.

“Specs are not about cramming today’s phone apps into a pair of glasses. They represent a shift away from the app paradigm to an AI-first experience — personalized, contextual, and shared. Imagine pulling up last week’s document just by asking, streaming a movie on a giant, see-through, and private display that only you can see, or reviewing a 3D prototype at scale with your teammate standing next to you. Imagine your kids learning biology from a virtual cadaver, or your friends playing chess around a real table with a virtual board.”

Like many of its competitors, Spiegel characterizes Specs as “an enormous business opportunity,” noting the AR device can not only replace multiple physical screens, but the operating system itself will be “personalized with context and memory,” which he says will compound in value over time.

Meanwhile, Snap competitors Meta, Google, Samsung, and Apple are jockeying for position as they develop their own XR devices—the umbrella term for everything from mixed reality headsets, like Meta Quest 3 or Apple Vision Pro, to smart glasses like Ray-Ban Meta or Google’s forthcoming Android XR glasses, to full-AR glasses, such as Meta’s Orion prototype, which notably hopes to deliver many of the same features promised by the sixth gen Specs.

And as the company enters 2026, Spiegel says Snap is looking to organize differently, calling for “startup energy at Snap scale” by setting up a sort of internal accelerator of five to seven teams composed of 10-to-15-person squads. He says “weekly demo days, 90-day mission cycles, and a culture of fast failure will keep us moving.”

It’s a bold strategy, especially as the company looks to straddle the expected ‘smartphone-to-AR’ computing paradigm shift, with Spiegel noting that “Specs are how we move beyond the limits of smartphones, beyond red-ocean competition, and into a once-in-a-generation transformation towards human-centered computing.”


You can read Snap CEO Evan Spiegel’s full open letter here, which includes more on AI and the company’s strategies for growth, engagement and ultimately how it’s seeking to generate more revenue.

Filed Under: AR Development, ar industry, News, XR Industry News

Mojo Vision Secures $75M Investment to Commercialize Micro-LED Displays for XR Glasses

September 9, 2025 From roadtovr

Mojo Vision announced it’s secured a $75 million Series B Prime investment round, which the company says will support the commercialization of its powerful and flexible micro-LED platform for XR glasses.

The round was led by Vanedge Capital, and included investments from current shareholders Edge Venture Capital, New Enterprise Associates (NEA), Fusion Fund, Knollwood Capital, Dolby Family Ventures, and Khosla Ventures, and new shareholders, including imec.xpand, Keymaker, Ohio Innovation Fund, and Hyperlink Ventures.

This brings the company’s overall funding to $345 million, according to Crunchbase data; Mojo Vision’s penultimate round came in late 2023, amounting to $43.5 million.

While previously geared towards producing smart contact lenses, Mojo Vision is now all about the underlying micro-LED technology that initially generated headlines back in 2022.

Image courtesy Mojo Vision

At the time, it was expected Mojo Vision would commercialize a contact lens with an embedded micro-LED display; however, in April 2023 the company announced it was pivoting.

Founded in 2015, Mojo Vision is now building a type of micro-LED technology that allows micro-LEDs to be mass-produced on silicon chips, combining advanced components like gallium nitride (GaN) on silicon emitters, quantum dots, and micro-lens arrays. According to Mojo Vision, this makes the displays very bright, very small, and energy-efficient.

“Through our micro-LED technology development, Mojo has made significant advancements in establishing breakthrough performance standards while laying the foundation for micro-LEDs as a platform for AI innovation in large market segments,” said Nikhil Balram, CEO of Mojo Vision. “This oversubscribed funding round and strong industry support mark a new phase in the design and production of our next-generation micro-LED platform. The company is on an accelerated path to commercialize micro-LED applications that can power AI.”

The company says it’s targeting the micro-LED platform to build displays for XR glasses, but also large format displays and optical interconnects for AI infrastructure.

Filed Under: AR Development, ar industry, AR Investment, Investment, News, VR Development, vr industry, VR Investment, XR Industry News

Smart Contact Maker Raises $250M Investment at a Whopping $1.35B Valuation

July 9, 2025 From roadtovr

Smart contact lens startup XPANCEO announced it’s secured $250 million in Series A funding, putting its valuation at $1.35 billion and minting it as XR’s most recent unicorn.

The funding round was led by Opportunity Venture (Asia), which led the company’s $40 million Seed round in 2023, bringing its overall funding to $290 million, according to Crunchbase data.

XPANCEO, a UAE-based company, says the new funding will “accelerate the company’s mission to launch the world’s first all-in-one smart contact lens,” which is targeted to arrive by 2026.

While the company’s smart contacts are still in the prototyping phase, XPANCEO says they will integrate XR, real-time health monitoring, night vision, and zoom features.

Display System with Sub-0.5 mm Projector | Image courtesy XPANCEO

“Becoming a unicorn is a powerful signal that we’re on the right path,” said Roman Axelrod, founder and Managing Partner at XPANCEO. “In just 24 months, we’ve developed 15 working prototypes, each unlocking a new layer of possibility. Our vision remains the same: to merge all your devices into a single, invisible interface – your eyes.”

Since its 2023 seed round, XPANCEO says its fleet of prototypes include a lens for AR vision, a smart lens with intraocular pressure (IOP) sensing for glaucoma monitoring, a biochemical lens capable of measuring health parameters such as glucose directly from tear fluid, and a lens capable of real-time wireless charging and data reading.

Other prototypes feature nanoparticle-enhanced lenses for night vision and color correction, as well as lenses designed for 3D imaging, the company says.

Smart Contact Lens with Wireless Powering Companion | Image courtesy XPANCEO

Headed by serial entrepreneur Roman Axelrod and physicist Dr. Valentyn S. Volkov, XPANCEO has grown rapidly since its 2021 founding, expanding from 50 to 100 scientists, engineers, and business leaders. Meanwhile, its lab has expanded to support the increasing scope of its research, the company says.

Over the years, XPANCEO has collaborated with a number of institutions, including the University of Manchester, the National University of Singapore, Donostia International Physics Center, and the University of Dubai.

High-Sensitivity Compact IOP Sensor | Image courtesy XPANCEO

XPANCEO’s new unicorn status puts it alongside some of the most ambitious XR projects to date: AR headset company Magic Leap first broke the $1 billion valuation mark in 2014 with a $542 million Series B investment led by Google, and reached a peak valuation of $6.4 billion in 2018 following its landmark investment by Saudi Arabia’s Public Investment Fund (PIF).

Earlier this year, immersive web content company Infinite Reality announced it raised $3 billion from a private investor to build its “vision for the next generation of the internet,” bringing the company’s valuation to $12.25 billion.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

Meta Reveals Oakley Smart Glasses, Promising Better Video Capture & Longer Battery Life at $400

June 20, 2025 From roadtovr

Meta today revealed its next smart glasses built in collaboration with EssilorLuxottica: Oakley Meta Glasses.

As a part of the extended collaboration, Meta and EssilorLuxottica today unveiled Oakley Meta HSTN (pronounced HOW-stuhn), the companies’ next smart glasses following the release of Ray-Ban Meta in 2023.

Pre-orders open on July 11th for the debut version, priced at $499: the Limited Edition Oakley Meta HSTN, which features gold accents and 24K PRIZM polarized lenses.

Image courtesy Meta, EssilorLuxottica

Meanwhile, the rest of the collection will be available “later this summer,” Meta says, starting at $399 and including the following frame and lens color combos:

  • Oakley Meta HSTN Desert with PRIZM Ruby Lenses
  • Oakley Meta HSTN Black with PRIZM Polar Black Lenses
  • Oakley Meta HSTN Shiny Brown with PRIZM Polar Deep-Water Lenses
  • Oakley Meta HSTN Black with Transitions Amethyst Lenses
  • Oakley Meta HSTN Clear with Transitions Grey Lenses

Image courtesy Meta, EssilorLuxottica

It’s not just a style change though, as the next-gen promises better battery life and higher resolution video capture over Ray-Ban Meta.

In comparison to Ray-Ban Meta glasses, the new Oakley Meta HSTN are said to offer up to “3K video” from the device’s ultra-wide 12MP camera. Ray-Ban’s current second-gen glasses are capped at 1,376 × 1,824 pixels at 30 fps from their 12MP sensor, with both glasses offering up to three minutes of video capture.

What’s more, Oakley Meta HSTN is said to allow for up to eight hours of “typical use” and up to 19 hours on standby mode, effectively representing a doubling of battery life over Ray-Ban Meta.

Image courtesy Meta, EssilorLuxottica

And like Ray-Ban Meta, Oakley Meta HSTN comes with a charging case, which extends total battery life to 48 hours, up from Ray-Ban Meta’s estimated 32 hours.

It also packs in five mics for things like taking calls and talking to Meta AI, as well as off-ear speakers for listening to music on the go.

Notably, Oakley Meta glasses are said to be water-resistant up to an IPX4 rating, meaning they can take splashes, rain, and sweat, but not submersion or extended exposure to water or other liquids.

The companies say Oakley Meta HSTN will be available across a number of regions, including the US, Canada, UK, Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, and Denmark. The device is also expected to arrive in Mexico, India, and the United Arab Emirates later this year.

In the meantime, you can sign up for pre-order updates either through Meta or Oakley for more information.

Filed Under: AR Development, ar industry, News, XR Industry News

Vuzix Secures $5M Investment as Veteran Smart Glasses Maker Sets Sights on Consumers

June 17, 2025 From roadtovr

Vuzix, the veteran smart glasses maker, announced it’s secured a $5 million investment from Quanta Computer, the Taiwan-based ODM and major Apple assembler.

The latest investment was the second tranche following an initial $10 million investment made by Quanta in September 2024, which included the purchase of Vuzix common stock at $1.30 per share. At the time, Vuzix anticipated a total of $20 million from Quanta.

Paul Travers, President and CEO of Vuzix, notes the funding will be used to enhance Vuzix’s waveguide manufacturing capabilities, something he says will help Vuzix deliver “the world’s most affordable, lightweight, and performance-driven AI smart glasses for mass-market adoption.”

Additionally, Travers says the investment “marks another important milestone in strengthening our partnership with Quanta and expanding the capabilities of our cutting-edge waveguide production facility.”

Vuzix Z100 Smart Glasses | Image courtesy Vuzix

Founded in 1997, Vuzix has largely serviced enterprise with its evolving slate of smart glasses, which have typically targeted a number of industrial roles, including healthcare, manufacturing, and warehousing.

The company also produces its own waveguides for both in-house use and licensing. In the past, Vuzix has worked to integrate its waveguide tech with Garmin, Avegant, an unnamed US Fortune 50 tech company, and an unnamed U.S. defense supplier.

The company made a few early consumer devices in the 2010s, including the V920 video eyewear and the STAR 1200 AR headset. In November 2024, Vuzix introduced the Z100, its first pair of sleek, AI-assisted smart glasses, priced at $500.

The Z100 smart glasses include a 640 × 480 monochrome green microLED waveguide and were designed to pair with smartphones to display notifications, fitness metrics, and maps, targeting everyday consumers and enterprise customers alike.

Notably, the investment also coincides with greater market interest in smart glasses on the whole. Google announced last month it’s partnering with eyewear companies Warby Parker and Gentle Monster to release a line of fashionable smart glasses running Android XR.

Meta also recently confirmed it’s expanding its partnership with Ray-Ban Meta-maker EssilorLuxottica to create Oakley-branded smart glasses, expected to launch on June 20th, 2025.

Meanwhile, rumors suggest that both Samsung and Apple are aiming to release their own smart glasses in the near future, with reports maintaining that Samsung could release a device this year, and Apple as soon as next year.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details; now, however, the company is gearing up to release the device to third-party researchers sometime next year, showing in the process what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs is leading the way for the AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.

To boot, Meta showed off the glasses in action inside a room as they performed simultaneous localization and mapping (SLAM).

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • New SubGHz radio tech for sub-millisecond time alignment between devices, crucial for multi-device setups

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but the glasses are also remarkably light at just 74-76g. For reference, a typical pair of eyeglasses can weigh anywhere from 20-50g, depending on the materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although the glasses do feature a USB-C port on the right arm, which could possibly be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 not only tracks and analyzes the user’s environment, but also the user’s physical perception of that environment, like the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimulus—it also captures the relative position and movement through the environment, which is informed by its CV cameras, magnetometer, two inertial measurement units (IMUs) and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

While some early consumer AR glasses, such as XREAL One Pro, are already out there, packing thin displays, powerful processors, and enough battery to run them all day into a single device isn’t a trivial feat. It’s something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville, TN.

Filed Under: AR Development, ar industry, News, XR Industry News

Spacetop Launches Windows App to Turn Laptops into Large AR Workspaces

May 2, 2025 From roadtovr

Late last year, Sightful announced it was cancelling its unique laptop with built-in AR glasses, instead pivoting to build a version of its AR workspace software for Windows. Now the company has released Spacetop for Windows, which lets you transform your environment into a private virtual display for productivity on the go.

Like its previous hardware, Spacetop works with XREAL AR glasses, however the new subscription-based app is targeting a much broader set of AI PCs, including the latest hardware from Dell, HP, Lenovo, Asus, Acer and Microsoft.

Previously, the company was working on a ‘headless’ laptop of sorts, which ran an Android-based operating system called SpaceOS. However, Sightful announced in October 2024 that it was cancelling and refunding customers for its Spacetop G1 AR workspace device, which was slated to cost $1,900.

At the time, Sightful said the pivot came down to just how much neural processing units (NPUs) could improve processing power and battery efficiency when running AR applications.

Image courtesy Sightful

Now, Sightful has released its Spacetop Bundle at $899, which includes XREAL Air 2 Ultra AR glasses (regularly priced at $699) and a 12-month Spacetop subscription (renews annually at $200).

Additionally, Sightful is selling optional optical lenses at an added cost, including prescription single-vision lens inserts for $50, and prescription progressive-vision lens inserts for $150.

Recommended laptops include Dell XPS Core Ultra 7 (32GB), HP Elitebook, Lenovo Yoga Slim, ASUS Zenbook, Acer Swift Go 14, and Microsoft Surface Pro for Business (Ultra 7), however Sightful notes this list isn’t exhaustive, as the cadre of devices which integrate Intel Core Ultra 7/9 processors with Meteor Lake architecture (or newer) is continuously growing.

Key features include:

  • Seamless access to popular apps: Spacetop works with consumer and business apps that power productivity every day for Windows users
  • Push, slide, and rotate your workspace with intuitive keystrokes
  • Travel mode that keeps your workspace with you on the go, whether in a plane, train, coffee shop, Ubering, or on your sofa
  • Bright, crystal-clear display that adjusts to lighting for use indoors and out
  • Natural OS experience, designed to feel familiar yet unlock the potential of spatial computing vs. a simple screen extension
  • All-day comfort with lightweight glasses (83g)
  • Massive 100” display for a multi-monitor / multi-window expansive workspace
  • Ergonomic benefits help avoid neck strain, hunching, and squinting at a small display

Backed by over $61M in funding, Sightful was founded in 2020 by veterans from PrimeSense, Magic Leap, and Broadcom. It is headquartered in Tel Aviv with offices in Palo Alto, New York, and Taiwan. You can learn more about Spacetop for Windows here.

Filed Under: AR Development, ar industry, News, XR Industry News

Meta Reveals Next Generation Aria Glasses for Research and Experimentation

February 27, 2025 From roadtovr

Meta today revealed its next-gen glasses, Aria Gen 2, which the company intends to release to third-party researchers working on machine perception systems, AI and robotics.

The company revealed its first iteration of Project Aria back in 2020, showing off a sensor-rich pair of glasses which the company used internally to train its machine perception systems, ultimately tackling some of the most complex issues in creating practical, all-day augmented reality glasses of the future.

Since then, Meta’s first-gen Aria has found its way outside of company offices; early collaborations with BMW and a number of universities followed, including Carnegie Mellon, IIIT Hyderabad, the University of Bristol, and the University of Iowa, which used the glasses to tackle a host of machine perception challenges.

Now, Meta has revealed Aria Gen 2. Like the first-gen device, Gen 2 doesn’t include displays of any type, though it now houses an upgraded sensor suite, including an RGB camera, position-tracking cameras, eye-tracking cameras, spatial microphones, IMUs, barometer, magnetometer, GNSS, and custom Meta silicon.

New to Aria Gen 2 are two new sensors embedded in the device’s nosepad: a photoplethysmogram (PPG) sensor for measuring heart rate and a contact microphone to distinguish the wearer’s voice from that of bystanders.

What’s more, Meta touts the 75g device’s all-day usability, making for 6-8 hours of active use, and its foldable design.

The increasingly AI-rich device also features a slate of on-device machine perception systems, such as hand and eye-tracking, speech recognition, and simultaneous localization and mapping (SLAM) tracking for positional awareness.

Aria Gen 2 | Image courtesy Meta

Meta envisions Aria Gen 2’s SLAM tracking will allow users to map and navigate indoor areas that don’t have good or detailed GPS coverage: in effect, a visual positioning system (VPS) that could equally help you get around a city street or find a specific item in a store.

The company isn’t ready to distribute Aria Gen 2 just yet, although Meta says it will share more details over the coming months; the device is slated to target both commercial and academic researchers.

One such early collaboration was with Envision, which announced in October it was working with Meta to provide Aria with a ‘Personal Accessibility Assistant’ to help blind and low-vision users navigate indoor spaces, locate items, and essentially act as a pair of ‘seeing eye’ glasses.

Envision and Meta showed off their latest work in a video, revealing how Aria Gen 2’s SLAM tracking and spatial audio can assist a blind user in navigating a supermarket by following a spatially correct homing ping, perceived as emanating from the right location, which guides them to the desired item, such as a red onion or a Granny Smith apple.

This comes as Meta continues its push to release its first commercial AR device, which not only needs all of those systems highlighted in Aria, but also the ability to display stereo-correct information in a slim, all-day wearable package. It’s no small feat, considering displays have much higher compute and power requirements relative to Aria’s various machine perception systems.

One of Meta’s biggest ‘lighthouse’ moments was the September reveal of its AR prototype Orion, which does feature those compute- and power-hungry displays, yet still fits into an impressively slim form factor, owing to its separate wireless compute unit.

Orion | Image courtesy Meta

Orion, or rather an Orion-like AR device, isn’t going on sale anytime soon though. The internal prototype cost Meta nearly $10,000 per unit to build due to its difficult-to-scale silicon carbide lenses, which notably feature a class-leading 70-degree field of view (FOV).

Still, the race is heating up to get all of the right components and use cases up to snuff for a commercial product, one aiming to supplant smartphones as the dominant mobile computing platform. Meta hopes to launch such AR glasses before 2030, with other major companies hoping to do the same, including Apple, Samsung, and Google.

Filed Under: ar industry, AR News, News, XR Industry News
