
VRSUN

Hot Virtual Reality News

AR Development

Meta to Ship Project Aria Gen 2 to Researchers in 2026, Paving the Way for Future AR Glasses

October 29, 2025 From roadtovr

Meta announced it’s shipping out Project Aria Gen 2 to third-party researchers next year, which the company hopes will accelerate development of machine perception and AI technologies needed for future AR glasses and personal AI assistants.

The News

Meta debuted Project Aria Gen 1 back in 2020. The company used the sensor-packed research glasses internally to train various AR-focused perception systems, and in 2024 released them to third-party researchers across 300 labs in 27 countries.

Then, in February, the company announced Aria Gen 2, which Meta says includes improvements in sensing, comfort, interactivity, and on-device computation. Notably, neither generation contains a display of any type, unlike the company’s recently launched Meta Ray-Ban Display smart glasses.

Now the company is taking applications from researchers looking to use the device, which is said to ship to qualified applicants sometime in Q2 2026. That also means applications for Aria Gen 1 are now closed, with remaining requests still to be processed.

To get ahead of what Meta calls a “broad” rollout next year, the company is releasing two major resources: the Aria Gen 2 Device Whitepaper and the Aria Gen 2 Pilot Dataset.

The whitepaper details the device’s ergonomic design, expanded sensor suite, and Meta’s custom low-power co-processor for real-time perception, and compares Gen 1 and Gen 2’s capabilities.

Meanwhile, the pilot dataset provides examples of data captured by Aria Gen 2, showing its capabilities in hand and eye-tracking, sensor fusion, and environmental mapping. The dataset also includes example outputs from Meta’s own algorithms, such as hand-object interaction and 3D bounding box detection, as well as NVIDIA’s FoundationStereo for depth estimation.

Meta is accepting applications from both academic and corporate researchers for Aria Gen 2.

My Take

Meta doesn’t call Project Aria ‘AI glasses’ like it does with its various generations of Ray-Ban Meta or Meta Ray-Ban Display, or even ‘smart glasses’ like you might expect—even if they’re substantively similar on the face of things. They’re squarely considered ‘research glasses’ by the company.

Cool, but why? Why does a company that already makes smart glasses with and without displays, and cool prototype AR glasses, need to put out what’s substantively the skeleton of a future device?

What Meta is attempting to do with Project Aria is actually pretty smart for a few reasons: sure, it’s putting out a framework that research teams will build on, but it’s also doing it at a comparatively lower cost than outright hiring teams to directly build out future use cases, whatever those might be.

Aria Gen 2 | Image courtesy Meta

While the company characterizes its upcoming Aria Gen 2 rollout as “broad”, Meta is still filtering projects based on merit. That gives it a chance to guide research without really having to interface with what will likely be substantially more than 300 teams, all of whom will use the glasses to solve problems in how humans can more fluidly interact with an AI system that can see, hear, and know a heck of a lot more about your surroundings than you do at any given moment.

AI is also growing faster than supply chains can keep up with, which I think more than justifies an artisanal pair of smart glasses so teams can get to grips with what will drive the future of AR glasses: the real crux of Meta’s next big move.

Building out an AR platform that may one day supplant the smartphone is no small task, and its iterative steps have the potential to give Meta the sort of market share the company dreamt of way back in 2013 when it co-released the HTC First, which at the time was colloquially called the ‘Facebook phone’. The device was a flop, partly because the hardware was lackluster, but mostly (and I don’t think I’m alone in saying so) because people didn’t want a Facebook phone in their pockets at any price when the ecosystem had so many other, clearly better choices.

Looking back at the early smartphones, Apple teaches us that you don’t have to be first to be best, but it does help to have so many patents and underlying research projects that your position in the market is mostly assured. And Meta has that in spades.

Filed Under: AR Development, News, XR Industry News

Researchers Propose Novel E-Ink XR Display with Resolution Far Beyond Current Headsets

October 27, 2025 From roadtovr

A group of Sweden-based researchers has proposed a novel e-ink display solution that could pave the way for super-compact, retina-level VR headsets and AR glasses in the future.

The News

Traditional emissive displays are shrinking, but they face physical limits; smaller pixels tend to emit less uniformly and provide less intense light, which is especially noticeable in near-eye applications like virtual and augmented reality headsets.

In a recent research paper published in Nature, a team of researchers presents what it calls a “retinal e-ink display”, which hopes to offer a solution quite unlike the displays seen in modern VR headsets today, which are increasingly adopting micro-OLEDs to reduce size and weight.

The paper was authored by researchers affiliated with Uppsala University, Umeå University, University of Gothenburg, and Chalmers University of Technology in Gothenburg: Ade Satria Saloka Santosa, Yu-Wei Chang, Andreas B. Dahlin, Lars Österlund, Giovanni Volpe, and Kunli Xiong.

While conventional e-paper has struggled to reach the resolution necessary for realistic, high-fidelity images, the team proposes a new form of e-paper featuring electrically tunable “metapixels” only about 560 nanometres wide.

This promises a pixel density of over 25,000 pixels per inch (PPI)—an order of magnitude denser than displays currently used in headsets like Samsung Galaxy XR or Apple Vision Pro. Those headsets have a PPI of around 4,000.
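
For a sense of scale, here’s a quick back-of-envelope conversion between pixel pitch and PPI (my own arithmetic, not the paper’s); the roughly 1 µm effective pitch implied by the 25,000 PPI figure suggests some spacing between the 560 nm metapixels:

    # Back-of-envelope PPI from pixel pitch (illustrative; only the 560 nm
    # metapixel width and the ~25,000 / ~4,000 PPI figures come from the article).
    NM_PER_INCH = 25_400_000  # 1 inch = 25.4 mm = 25,400,000 nm

    def ppi_from_pitch(pitch_nm: float) -> float:
        # Pixel density for a given center-to-center pixel pitch.
        return NM_PER_INCH / pitch_nm

    print(ppi_from_pitch(560))   # ~45,357 PPI if metapixels were packed edge to edge
    print(ppi_from_pitch(1016))  # ~25,000 PPI, i.e. an effective pitch of ~1 µm
    print(ppi_from_pitch(6350))  # ~4,000 PPI, roughly today's micro-OLED headsets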

Image courtesy Nature

As the paper describes it, each metapixel is made from tungsten trioxide (WO₃) nanodisks that undergo a reversible insulator-to-metal transition when electrically reduced. This process dynamically changes the material’s refractive index and optical absorption, allowing nanoscale control of brightness and color contrast.

In effect, when lit by ambient light, the display can create bright, saturated colors from structures far thinner than a human hair, as well as deep blacks, with reported optical contrast ratios around 50%: a reflective equivalent of high dynamic range (HDR).

And the team says it could be useful in both AR and VR displays. The figure below shows a conceptual optical stack for both applications, with Figure A representing a VR display, and Figure B showing an AR display.

Image courtesy Nature

Still, there are some noted drawbacks. Beyond sheer resolution, the display delivers full-color video at “more than 25 Hz,” which is significantly lower than what VR users need for comfortable viewing. In addition to a relatively low refresh rate, researchers note the retina e-paper requires further optimization in color gamut, operational stability and lifetime.

“Lowering the operating voltage and exploring alternative electrolytes represent promising engineering routes to extend device durability and reduce energy consumption,” the paper explains. “Moreover, its ultra-high resolution also necessitates the development of ultra-high-resolution TFT arrays for independent pixel control, which will enable fully addressable, large-area displays and is therefore a critical direction for future research and technological development.”

And while the e-paper display itself is remarkably low-powered, packing in the graphical compute to put those metapixels to work will also be a challenge. It’s a good problem to have, but a problem nonetheless.

My Take

At least as the paper describes it, the underlying tech could produce XR displays with a combination of size and pixel density we’ve never seen before. And reaching the limits of human visual perception is one of those holy grail moments I’ve been waiting for.

Getting that refresh rate up well beyond 25 Hz is going to be extremely important though. As the paper describes it, 25 Hz is good for video playback, but driving an immersive VR environment requires at least 60 Hz refresh to be minimally comfortable. 72 Hz is better, and 90 Hz is the standard nowadays.
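
To put those refresh rates in concrete terms, here’s the simple frame-time arithmetic (mine, not the paper’s): each frame has to be delivered within 1000 / Hz milliseconds, and the gap between 25 Hz and 90 Hz is stark:

    # Frame-time budget at common XR refresh rates: 1000 ms divided by the rate.
    for hz in (25, 60, 72, 90):
        print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
    # 25 Hz leaves a leisurely 40.0 ms per frame; 90 Hz leaves just ~11.1 ms.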

I’m also curious to see the e-paper display stacked up against lower resolution micro-OLED contemporaries, if only to see how that proposed ambient lighting can achieve HDR. I have a hard time wrapping my head around it. Essentially, the display’s metapixels absorb and scatter ambient light, much like Vantablack does—probably something that needs to be truly seen in person to be believed.

Healthy skepticism aside, I find it truly amazing we’ve even arrived at the conversation in the first place: we’re at the point where XR displays could recreate reality, at least as far as your eyes are concerned.

Filed Under: AR Development, News, VR Development, XR Industry News

Former Oculus Execs’ AI Smart Glasses Startup ‘Sesame’ Raises $250M Series B Funding

October 24, 2025 From roadtovr

Sesame, an AI and smart glasses startup founded by former Oculus execs, raised $250 million in Series B funding, which the company hopes will accelerate its voice-based AI.

The News

As first reported by TechCrunch, lead investors in Sesame’s Series B include Spark Capital and Sequoia Capital, bringing the company’s overall funding to $307.6 million, according to Crunchbase data.

Exiting stealth earlier this year, Sesame was founded by Oculus co-founder and former CEO Brendan Iribe, former Oculus hardware architect Ryan Brown, and Ankit Kumar, former CTO of AR startup Ubiquity6. Additionally, Oculus co-founder Nate Mitchell announced in June he was joining Sesame as Chief Product Officer, which he noted was to “help bring computers to life.”

Image courtesy Sesame

Sesame is currently working on an AI assistant along with a pair of lightweight smart glasses. Its AI assistant aims to be “the perfect AI conversational partner,” Sequoia Capital says in a recent post.

“Sesame’s vision is to build an ambient interface that is always available and has contextual awareness of the world around you,” Sequoia says. “To achieve that, Sesame is creating their own lightweight, fashion-forward AI-enabled glasses designed to be worn all day. They’re intentionally crafted—fit for everyday life.”

Sesame is currently taking signups for beta access to its AI assistants Miles and Maya in an iOS app, and also has a public preview showcasing a ‘call’ function that allows you to speak with the chatbots.

My Take

Love it or hate it, AI is going to be baked into everything in the future, as contextually aware systems hope to bridge the gap between user input and the expectation of timely and intelligent output. That’s increasingly important when the hardware doesn’t include a display, requiring the user to interface almost entirely by voice.

Some things to watch out for: if the company does commercialize a pair of smart glasses to champion its AI assistant, it will be competing for some pretty exclusive real estate that companies like Meta, Google, Samsung, and Apple (still unconfirmed) are currently gunning for. That puts Sesame at somewhat of a disadvantage if it hopes to go it alone, but not if it’s hoping for a timely exit into the coming wave of smart glasses by being acquired by any of the above.

There’s also some pretty worrying precedent in the rear-view mirror: think Humane’s AI Pin or the Friend AI necklace, both of which were publicly lambasted for essentially releasing hardware that could just as easily have been apps on your smartphone.

Granted, Sesame hasn’t shown off its smart glasses hardware yet, so there’s no telling what the company hopes to bring to the table beyond an easy-to-wear pair of off-ear headphones for all-day AI stuff. That, to me, would be the worst-case scenario, as Meta refines its own smart glasses in partnership with EssilorLuxottica, Google releases Android XR frames with Gentle Monster and Warby Parker, Samsung releases its own Android XR glasses, and Apple does… something. We don’t know yet.

Whatever the case, I’m looking forward to it, if only based on the company’s combined experience in XR, which I’d argue any startup would envy as the race to build the next big computing platform truly takes off.

Filed Under: AR Development, AR Investment, News, XR Industry News

Amazon is Developing Smart Glasses to Allow Delivery Drivers to Work Hands-free

October 23, 2025 From roadtovr

Amazon announced it’s developing smart glasses for its delivery drivers, which include a display for real-time navigation and delivery instructions.

The News

Amazon announced the news in a blog post, which partially confirms a recent report from The Information, which alleged that Amazon is developing smart glasses both for its delivery drivers and consumers.

The report, released in September, maintained that Amazon’s smart glasses for delivery drivers will be bulkier and less sleek than the consumer model. Codenamed ‘Jayhawk’, the delivery-focused smart glasses are expected to roll out as soon as Q2 2026, with an initial production run of 100,000 units.

Image courtesy Amazon

Amazon says the smart glasses were designed and optimized with input from hundreds of delivery drivers, and include the ability to identify hazards, scan packages, capture proof of delivery, and navigate by serving up turn-by-turn walking directions.

The company hasn’t confirmed whether the glasses’ green monochrome heads-up display is monoscopic or stereoscopic, though images suggest it features a single waveguide in the right lens.

Moreover, the glasses aren’t meant to be used while driving: Amazon says they “automatically activate” when the driver parks their vehicle. Only then does the driver receive instructions, ostensibly to reduce the risk of driver distraction.

In addition to the glasses, the system also features what Amazon calls “a small controller worn in the delivery vest that contains operational controls, a swappable battery ensuring all-day use, and a dedicated emergency button to reach emergency services along their routes if needed.”

Additionally, Amazon says the glasses support prescription lenses along with transitional lenses that automatically adjust to light.

As for the reported consumer version, it’s possible Amazon may be looking to evolve its current line of ‘Echo Frames’ glasses. First introduced in 2019, Echo Frames support AI voice control, music playback, calls, and Alexa smart home control, although they notably lack any sort of camera or display.

My Take

I think Amazon has a good opportunity to dogfood (i.e., use its own technology) here on a pretty large scale, probably much larger than Meta or Google could initially with their first generation of smart glasses with displays.

That said, gains made in enterprise smart glasses can be difficult to translate to consumer products, which will necessarily include more functions and apps, and likely require more articulated input—all of the things that can make or break any consumer product.

Third-gen Echo Frames | Image courtesy Amazon

Amazon’s core strength, though, has generally been less about high-end innovation and more about creating cheap, reliable hardware that feeds into recurring revenue streams: Kindle, Fire TV, Alexa products, etc. Essentially, if Amazon can’t immediately figure out a way to make consumer smart glasses feed into its existing ecosystems, I wouldn’t expect to see the company put its full weight behind the device, at least not initially.

After the 2014 failure of Fire Phone, Amazon may still be gun-shy about going head-first into a segment it has near-zero experience in. And I really don’t count Echo Frames, because they’re primarily just Bluetooth headphones with Alexa support baked in. Still, real smart glasses with cameras and displays represent a treasure trove of data that the company may not be so keen to pass up.

Using object recognition to peep into your home or otherwise follow you around could allow Amazon to better target personalized suggestions, figure out brand preferences, and even track users as they shop at physical stores. Whatever the case, I bet the company will give it a go, if only to occupy the top slot when you search “smart glasses” on Amazon.

Filed Under: AR Development, News, XR Industry News

Meta Ray-Ban Display Repairability is Predictably Bad, But Less So Than You Might Think

October 9, 2025 From roadtovr

iFixit got their hands on a pair of Meta Ray-Ban Display smart glasses, so we finally get to see what’s inside. Is it repairable? Not really. But if you can somehow find replacement parts, you could at least potentially swap out the battery.

The News

Meta launched the $800 smart glasses in the US late last month, marking the company’s first pair with a heads-up display.

Serving up a monocular display, Meta Ray-Ban Display allows for basic app interaction beyond the standard stuff seen (or rather ‘heard’) in the company’s audio-only Ray-Ban Meta and Oakley Meta glasses. It can do things like let you view and respond to messages, get turn-by-turn walking directions, and even use the display as a viewfinder for photos and video.

And iFixit shows off in their latest video that cracking into the glasses and attempting repairs is pretty fiddly, but not entirely impossible.

Meta Ray-Ban Display’s internal battery | Image courtesy iFixit

The first thing you’d probably eventually want to do is replace the battery, which requires splitting the right arm down a glued seam, a common theme with the entire device. In getting to the 960 mWh internal battery, which is slightly larger than the one seen in the Oakley Meta HSTN, you’ll be sacrificing the device’s IPX4 splash resistance rating.

The work is delicate, but iFixit manages to go all the way down to the dual speakers, motherboard, Snapdragon AR1 chipset, and liquid crystal on silicon (LCoS) light engine, the latter of which was captured with a CT scanner to show off just how micro Meta has managed to get its most delicate part.

Granted, this is a teardown and not a repair guide as such. All of the components are custom, and replacement parts aren’t available yet. You would also need a few specialized tools and an appetite for the risk of destroying a pretty hard-to-come-by device.

For more, make sure to check out iFixit’s full article, which includes images and detailed info on each component. You can also see the teardown in action in the full nine-minute video below.

My Take

Meta isn’t really thinking deeply about repairability when it comes to smart glasses right now, which isn’t exactly shocking. Like earbuds, smart glasses are all about miniaturization to hit an all-day wearable form factor, making their plastic and glue-coated exteriors a pretty clear necessity in the near term.

Another big factor: the company is probably banking on the fact that prosumers willing to shell out $800 this year will likely be happy to do the same when Gen 2 eventually arrives. That could be in two years, but I’m betting less if the device performs well enough in the market. After all, Meta sold Quest 2 in 2020 just one year after releasing the original Quest, so I don’t see why they wouldn’t do the same here.

That said, I don’t think we’ll see any real degree of repairability in smart glasses until we get to the sort of sales volumes currently seen in smartphones. And that’s just for a baseline of readily available replacement parts, third-party or otherwise.

So while I definitely want a pair of smart glasses (and eventually AR glasses) that look indistinguishable from standard frames, that also kind of means I have to be okay with eventually throwing away a perfectly cromulent pair of specs just because I don’t have the courage to open it up, or know anyone who does.

Filed Under: AR Development, News, XR Industry News

Why Ray-Ban Meta Glasses Failed on Stage at Connect

September 19, 2025 From roadtovr

Meta CEO Mark Zuckerberg’s keynote at this year’s Connect wasn’t exactly smooth, especially if you count the two big hiccups that sidetracked live demos for both the latest Ray-Ban Meta smart glasses and the new Meta Ray-Ban Display glasses.

Ray-Ban Meta (Gen 2) smart glasses essentially bring the same benefits as Oakley Meta HSTN, which launched back in July: longer battery life and better video capture.

One of the biggest features though is its access to Meta’s large language model (LLM), Meta AI, which pops up when you say “Hey Meta”, letting you ask questions about anything, from the weather to what the glasses’ camera can actually see.

As part of the on-stage demo of its Live AI feature, which runs continuously instead of sporadically, food influencer Jack Mancuso attempted to create a Korean-inspired steak sauce using the AI as a guide.

And it didn’t go well, as Mancuso struggled to get the Live AI back on track after missing a key step in the sauce’s preparation. You can see the full cringe-inducing glory for yourself, timestamped below:

And the reason behind it is… well, just dumb. Jake Steinerman, Developer Advocate at Meta’s Reality Labs, explained what happened in an X post:

So here’s the story behind why yesterdays live #metaconnect demo failed – when the chef said “Hey Meta start Live AI” it activated everyone’s Meta AI in the room at once and effectively DDOS’d our servers 🤣

That’s what we get for doing it live!

— Jake Steinerman 🔜 Meta Connect (@jasteinerman) September 19, 2025

Unfortunate, yes. But also pretty foreseeable, especially considering the AI ‘wake word’ gaffe has been a thing since the advent of Google Nest (formerly Google Home) and Amazon Alexa.

Anyone with one of those friendly tabletop pucks has probably experienced what happens when a TV advert includes “Hey Google” or “Hey Alexa,” unwittingly commanding every device in earshot to tell them the weather, or even order items online.

What’s more surprising though: there were enough people using a Meta product in earshot to screw with its servers. Meta AI isn’t like Google Gemini or Apple’s Siri; it doesn’t have OS-level access to smartphones. The only devices with it enabled by default are the company’s Ray-Ban Meta and Oakley Meta glasses (and Quest, if you opt in), conjuring the image of a room full of confused, bespectacled Meta employees waiting out of shot.

As for the Meta Ray-Ban Display glasses, which the company is launching in the US for $799 on September 30th, the hiccup was much more forgivable. Zuckerberg was attempting to take a live video call from company CTO Andrew Bosworth, who, after several missed attempts, came on stage to do an ad hoc simulation of what it might have been like.

Those sorts of live product events are notoriously bad for both Wi-Fi and mobile connections, simply because of how many people are in the room, often with multiple devices per person. Still, Zuckerberg didn’t pull a Steve Jobs, who demanded everyone in attendance at iPhone 4’s June 2010 unveiling turn off their Wi-Fi after an on-stage connection flub.

You can catch the Meta Ray-Ban Display demo below (obligatory cringe warning):

Filed Under: AR Development, News, XR Industry News

Meta Reveals Next-Gen Ray-Ban & New Oakley Vanguard Smart Glasses

September 17, 2025 From roadtovr

Undoubtedly the smart glasses headliner of Meta Connect this year was the new $800 Meta Ray-Ban Display glasses, which pack a single display into a familiar Wayfarer-style package. Alongside it though, Meta showed off two new smart glasses: the Oakley Meta Vanguard and the next generation of Ray-Ban Meta.

Oakley Meta Vanguard – $499 (available Oct 21)

Oakley Meta Vanguard | Image courtesy Meta

Before Meta and EssilorLuxottica released Oakley Meta HSTN in July, we were definitely envisioning something more like the new Oakley Meta Vanguard. But it’s better late than never, as Meta has just unveiled the sleek, blade-like frames it says are “built for high-intensity sports.”

Rated IP67 for dust and water resistance, Oakley Meta Vanguard is supposedly durable enough for sweaty workouts or rainy rides, targeting sports like cycling, snowboarding, and running.

Oakley Meta Vanguard | Image courtesy Meta

Notably, like many of its traditional specs, the new smart glasses use Oakley’s Three-Point Fit system, which includes three interchangeable nose pads for a more secure fit, with Meta noting the frames are optimized for use with cycling helmets and hats.

They also include an onboard 12MP, 122° wide-angle camera sensor for capturing video up to 3K resolution, with modes including Slow Motion, Hyperlapse, and adjustable image stabilization.

And just like Ray-Ban Meta, they feature open-ear speakers, notably rated at six decibels louder than previous Oakley Meta HSTN models, along with a wind-optimized five-mic array to provide clear audio for taking calls, using voice commands, or listening to music while training.

The newest Oakleys also integrate with Garmin, Strava, Apple Health, and Android Health Connect, delivering post-workout summaries and real-time stats through Meta AI. Athletes can check heart rate, progress, or other data hands-free with voice prompts.

Oakley Meta Vanguard | Image courtesy Meta

Available in four frame/lens color combinations, the glasses weigh 66g and offer up to nine hours of mixed use (or six hours of music) on a single charge, with an additional 36 hours via the charging case. Quick charging is said to bring the glasses to 50% in just 20 minutes, Meta says.

Like all of the other Meta smart glasses on offer, they include 32GB of storage for over 1,000 photos or 100 short videos, the company says.

Since they’re built for high-intensity sports, the company is also introducing replaceable lenses, starting at $85. Here are all four models available for pre-order, including the lenses you’ll be able to mix and match later:

  • Oakley Meta Vanguard Black with PRIZM™ 24K
  • Oakley Meta Vanguard White with PRIZM™ Black
  • Oakley Meta Vanguard Black with PRIZM™ Road
  • Oakley Meta Vanguard White with PRIZM™ Sapphire

Oakley Meta Vanguard is now available for pre-order through Meta or Oakley, priced at $499 and launching October 21st.

They’ll be available in the US, Canada, UK, Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, Denmark, Switzerland, and the Netherlands. Meta says they should also eventually launch in Mexico, India, Brazil, and the United Arab Emirates later this year.

Ray-Ban Meta (Gen 2) – Starting at $379 (Now Available)

Ray-Ban Meta Wayfarer (Gen 2) | Image courtesy Meta

While the company considers its next Ray-Ban Meta glasses “Gen 2”, they’re technically the third generation, following the release of Facebook’s Ray-Ban Stories in 2021 and Ray-Ban Meta in 2023.

Naming scheme aside, the latest Ray-Ban Meta smart glasses deliver the same improvements seen in Oakley Meta HSTN, and essentially the same base functionality. They can play music, do real-time translation, and take hands-free calls, but also offer better photo and video capture than their predecessor.

The ultrawide 12MP camera sensor is rated for photo capture up to 3024 × 4032 pixels and video at 1200p at 60 FPS, 1440p at 30 FPS, and 3K at 30 FPS, all of which can be up to three minutes in length.
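
As a quick sanity check on those numbers (my arithmetic, not Meta’s spec sheet), the stated still-photo resolution lines up with the sensor’s 12MP rating:

    # 3024 x 4032 works out to ~12.2 million pixels, matching the 12MP rating.
    width, height = 3024, 4032
    print(f"{width * height / 1e6:.1f} MP")  # -> 12.2 MP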

Ray-Ban Meta Wayfarer (Gen 2) | Image courtesy Meta

Like Oakley Meta HSTN, Ray-Ban Meta (Gen 2) boasts up to eight hours of continuous use and an additional 48 hours from the charging case, plus quick charging to 50% in 20 minutes.

And it probably goes without saying, but all of Meta’s smart glasses make heavy use of its own Meta AI, which includes things like voice search queries (“Hey Meta!”), reading QR codes, suggesting recipes, saving notes, etc.

Ray-Ban Meta Skyler (Gen 2) | Image courtesy Meta

Additionally, the device includes Bluetooth 5.3, Wi-Fi 6, 32GB of storage, and an IPX4 water-resistance rating for light rain or splashes.

And like the 2023 model, the new Ray-Ban Meta smart glasses offer scads of frame and lens combinations: 27 in total across the Wayfarer and Skyler models, which include options for large or low nose bridges.

It’s also getting a price bump over the first gen, which launched in 2023 for $299. Ray-Ban Meta (Gen 2) starts at $379 for standard lens options, and will be available with polarized lenses ($409), Transitions lenses ($459), and prescription lenses (pricing varies).

You can find all of those models and lens combinations starting today over at Meta and Ray-Ban.com.


We’re currently on the ground at Meta Connect this year, so check back soon for all things XR.

Filed Under: AR Development, ar industry, News, XR Industry News

VITURE Launches ‘Luma Ultra’ AR Glasses with Sony Micro-OLED Panels

September 17, 2025 From roadtovr

VITURE has now launched its Luma Ultra AR glasses, which pack in Sony’s latest micro-OLED panels along with spatial gesture tracking thanks to an onboard sensor array.

Priced at $600, and now shipping worldwide, Viture Luma Ultra is targeting prosumers, enterprise and business professionals looking for a personal, on-the-go workspace.

Notably, these aren’t standalone devices, instead relying on PC, console and mobile tethering for compute, which means they integrate as external (albeit very personal) monitors.

Image courtesy VITURE

Luma Ultra is said to include a 52-degree field of view (FOV), Sony’s latest micro-OLED panels with a resolution up to 1200p and 1,250 nits peak brightness. Two depth sensing cameras are onboard in addition to a single RGB camera for spatial tracking and hand gesture input.

Unlike some AR glasses, which rely on slimming waveguide optics, Luma Ultra uses what’s called a ‘birdbath’ optic system, which uses a curved, semi-transparent mirror to project the digital image into the user’s eyes. It’s typically cheaper and easier to manufacture, and can also reach higher brightness at the expense of more bulk and weight.

Image courtesy VITURE

The device also includes an electrochromic film for tint control, myopia adjustments up to -4.0 diopters, and support for 64 ± 6mm interpupillary distance (IPD).

The company also launched a slate of AR glasses alongside it, which are targeted at consuming traditional media, positioning Viture Luma Ultra as the company’s flagship device.

Check out the full lineup and specs below:

Image courtesy VITURE

Viture Luma ($400), Luma Pro ($500) and Luma Ultra ($600) are all estimated to ship within two weeks of ordering, with the next device, Luma Beast ($550), slated to ship sometime in November.

None of the devices above (besides Luma Ultra) include spatial tracking due to the lack of depth sensors; however, Luma Beast is said to come with the same micro-OLED displays as Luma Ultra at a slightly larger 58-degree FOV, plus an auto-adjusting electrochromic film for tint control.

This follows news of Viture’s latest funding round, which brought the San Francisco-based XR glasses company $100 million in Series B financing. Viture says the funding will aid in the global expansion of its consumer XR glasses.

Filed Under: AR Development, ar industry, News, XR Industry News

Meta Leaks Next-gen Smart Glasses with Display Ahead of Connect This Week

September 16, 2025 From roadtovr

It seems Meta has a new generation of smart glasses to show off at Connect this week, and it appears we’ve just gotten an eyeful of the long-rumored version with a built-in display, previously codenamed ‘Hypernova’.

As noted by XR analyst Brad Lynch, Meta seems to have leaked the next slate of smart glasses built in collaboration with Essilor Luxottica.

The video, which was unlisted on Meta’s YouTube channel, has since been deleted.

New Meta smartglasses with display leaked via an unlisted video on their own YouTube channel

Along with their EMG wristband, and other smartglass models they plan to show off this week at Meta Connect pic.twitter.com/8tTlmaeQ0a

— SadlyItsDadley (@SadlyItsBradley) September 15, 2025

The video shows off four main models: the recently released Oakley Meta HSTN, the rumored Oakley Meta Sphaera model, what appears to be the next gen version of Ray-Ban Meta, and the rumored variant with display, which also comes with an electromyography (EMG) based wristband for input.

Meta also showed off a few use cases for the new display-clad smart glasses: typing on the back of a laptop to send a message, following turn-by-turn directions, identifying an object using AI, and real-time text translation.

Image courtesy Brad Lynch

Notably, prior to its unintentional unveiling, it was thought the display model would not be built in collaboration with EssilorLuxottica, and would instead be marketed under the Meta name, owing to the ‘Celeste’ branding seen in previous leaks. It appears, however, the company is co-opting a slightly larger Ray-Ban Wayfarer design and appending the name ‘Display’.

What’s more, the new smart glasses with heads-up display are also shown with the previously reported EMG wristband, which is meant to control the device’s UI. Meta has previously shown the wristband working with its prototype Orion AR glasses; it picks up movement in the wrist without needing line of sight to camera sensors, the way Meta Quest 3’s hand-tracking does.

There’s no confirmed pricing info yet, however a previous report from Bloomberg’s Mark Gurman maintains the display model and EMG wristband controller could cost “about $800.”

Meta Connect kicks off September 17th – 18th, where we expect to learn more about release dates and pricing for all of the company’s newest smart glasses.


We will be at Meta Connect this week, so make sure to check back soon for all of the latest in Meta’s XR hardware and software.

Filed Under: AR Development, News, XR Industry News

Virtualware Seals €5M Deal to Support Virtual Vocational Training in Spain

September 15, 2025 From roadtovr

Virtualware, the Spain-based XR and 3D simulation software company, announced it’s secured a €5 million ($5.8 million) deal to broadly roll out its VIROO platform in vocational training facilities supported by Spain’s Ministry of Education.

The six-year contract allows Virtualware to bring its XR enterprise platform VIROO to 66 new ‘Centres of Excellence for Vocational Training’ (VET), the company says in a press statement, which are run by Spain’s Ministry of Education, Vocational Training and Sport (MEFPD).

The rollout to Spain’s VET Centres will join the more than 25 vocational training centers across the country already equipped with VIROO. In Spain, VET supports initial training of young people as well as the continuing up-skilling and re-skilling of adults across a variety of industries.

“We are opening a new chapter of growth and pedagogical innovation, allowing thousands of students to train with state-of-the-art immersive simulators developed and deployed through VIROO platform, raising their technical skills from day one,” says Virtualware founder and CEO Unai Extremo. “Our goal is to bring immersive technology to every vocational training classroom in Spain, through a sustainable model for content creation and deployment.”

Founded in 2004 and acquired by Swedish company Simumatik in 2024, the Bilbao, Spain-based company has recently focused on expanding its capabilities to support a number of key industries, including energy, automotive, transportation, defense, manufacturing, education, and healthcare.

Among Virtualware’s clients are GE Vernova, Petronas, Volvo, Gestamp, Alstom, ADIF, Bosch, Biogen, Kessler Foundation, Invest WindsorEssex, McMaster University, the University of El Salvador, Ohio University, the Spanish Ministry of Defense and the Basque Government.

Check out VIROO in action in the video below, which showcases the company’s work with Spain’s national rail service, ADIF (Administrador de Infraestructuras Ferroviarias).

Filed Under: AR Development, AR Investment, VR Development, VR Investment, XR Industry News
