
VRSUN

Hot Virtual Reality News


Meta Says New ‘Horizon Worlds’ Engine Update Brings Faster Loading and Up to 100 Concurrent Users

September 18, 2025 From roadtovr

Today at Connect, Meta said it’s rolling out an updated version of the engine that powers Horizon Worlds. The new tech will purportedly speed up loading of Horizon Worlds spaces and allow up to 100 users in a single space.

The new tech, which Meta is calling the ‘Horizon Engine’, is said to replace the original foundation of Horizon Worlds, which was based on Unity. The engine has been rebuilt with the goals of Horizon Worlds in mind—namely, enabling players to hop between interconnected social spaces.

Meta says the new system can make loading Worlds spaces up to four times faster, making jumping between different spaces more seamless. The improved performance also means that Worlds experiences can now host up to 100 players simultaneously, five times the previous limit.

Meta says it has also rebuilt ‘Horizon Home’ using the new engine, which is the default space you see when you put on Quest. This purportedly brings improved visual quality and some functionality upgrades, like being able to pin apps to the walls for quick access.

The changes to Horizon Home appear to move Meta one step closer to merging Horizon Home and Horizon Worlds together. Now, running on the same engine, the space will also allow users to pin portals to various Worlds spaces for quick access.

At Connect, Meta also announced that it is working on an ‘agentic editor’ for Horizon Worlds called Meta Horizon Studio. While the company has already released AI features that allow creators to generate various assets for building Worlds experiences, the new agentic editor melds multiple tools together under a chat-based interface.

Image courtesy Meta

The new tool allows creators to build new Worlds experiences by asking for additions and changes in natural language, like ‘change the style to sci-fi’, or ‘add a new character that’s a talking bear who is lost and wants the player to help them get home’.

Meta Horizon Studio will be rolling out in beta in the near future, the company says.

Filed Under: Meta Quest 3 News & Reviews, News

Hands-on: Meta Ray-Ban Display Glasses & Neural Band Offer a Glimpse of Future AR Glasses

September 18, 2025 From roadtovr

The newly announced Meta Ray-Ban Display glasses, and the ‘Neural Band’ input device that comes with them, are still far from proper augmented reality. But Meta has made several clever design choices that will pay dividends once their true AR glasses are ready for the masses.

The Ray-Ban Display glasses are a new category for Meta. Previous products communicated to the user purely through audio. Now, a small, static monocular display adds quite a bit of functionality to the glasses. Check out the full announcement of the Meta Ray-Ban Display glasses here for all the details, and read on for my hands-on impressions of the device.

A Small Display is a Big Improvement

Meta Ray-Ban Display Glasses | Image courtesy Meta

A 20° monocular display isn’t remotely sufficient for proper AR (where virtual content floats in the world around you), but it adds a lot of new functionality to Meta’s smart glasses.

For instance, imagine you want to ask Meta AI for a recipe for teriyaki chicken. On the non-display models, you could definitely ask the question and get a response. But after the AI reads it out to you, how do you continue to reference the recipe? Well, you could either keep asking the glasses over and over, or you could pull your phone out of your pocket and use the Meta AI companion app (at which point, why not just pull the recipe up on your phone in the first place?).

Now with the Meta Ray-Ban Display glasses, you can actually see the recipe instructions as text in a small heads-up display, and glance at them whenever you need.

In the same way, almost everything you could previously do with the non-display Meta Ray-Ban glasses is enhanced by having a display.

Now you can see a whole thread of messages instead of just hearing one read through your ear. And when you reply you can actually read the input as it appears in real-time to make sure it’s correct instead of needing to simply hear it played back to you.

When capturing photos and videos you now see a real-time viewfinder to ensure you’re framing the scene exactly as you want it. Want to check your texts without needing to talk out loud to your glasses? Easy peasy.

And the real-time translation feature becomes more useful too. In current Meta glasses you have to listen to two overlapping audio streams at once. The first is the voice of the speaker and the second is the voice in your ear translating into your language, which can make it harder to focus on the translation. With the Ray-Ban Display glasses, now the translation can appear as a stream of text, which is much easier to process while listening to the person speaking in the background.

It should be noted that Meta has designed the screen in the Ray-Ban Display glasses to be off most of the time. The screen is set off and to the right of your central vision, making it more of a glanceable display than something that’s right in the middle of your field-of-view. At any time you can turn the display on or off with a double-tap of your thumb and middle finger.

Technically, the display is a 0.36MP (600 × 600) full-color LCoS display with a reflective waveguide. Even though the resolution is “low,” it’s plenty sharp across the small 20° field-of-view. Because only one eye can see it, the image does have a ghostly look to it. This doesn’t hamper the functionality of the glasses, but aesthetically it’s not ideal.
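Those two spec figures square with each other, and they explain why a “low” resolution can still look sharp. A quick back-of-the-envelope check (pure arithmetic on the numbers quoted above, not a measurement):

```python
# Specs quoted in the article: 600 x 600 pixels spanning a 20-degree FOV.
width_px, height_px = 600, 600
fov_deg = 20

megapixels = width_px * height_px / 1_000_000   # 0.36 MP, as stated
pixels_per_degree = width_px / fov_deg          # angular pixel density

print(f"{megapixels:.2f} MP, {pixels_per_degree:.0f} pixels per degree")
```

Roughly 30 pixels per degree is in the same ballpark as current VR headsets, which is why the image reads as crisp despite the modest pixel count: those pixels are packed into a narrow slice of your field of view.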

Meta hasn’t said if it designed the waveguide in-house or is working with a partner. I suspect the latter, and if I had to guess, Lumus would be the likely supplier. Meta says the display can output up to 5,000 nits brightness, which is enough to make the display readily usable even in full daylight (the included Transitions lenses also help).

From the outside, the waveguide is hardly visible in the lens. The most prominent feature is some small diagonal markings toward the temple side of the glasses.

Photo by Road to VR

Meanwhile, the final output gratings are very transparent. Even when the display is turned on, it’s nearly impossible to see a glint from the display in a normally lit room. Meta said the outward light-leakage is around 2%, which I am very impressed by.

 The waveguide is extremely subtle within the lens | Photo by Road to VR

Aside from the glasses being a little chonkier than normal glasses, the social acceptability here is very high—even more so because you don’t need to constantly talk to the glasses to use them, or even hold your hand up to tap the temple. Instead, the so-called Neural Band (based on EMG sensing), allows you to make subtle inputs while your hand is down at your side.

The Neural Band is an Essential Piece to the Input Puzzle

Photo by Road to VR

The included Neural Band is just as important to these new glasses as the display itself—and it’s clear that this will be equally important to future AR glasses.

To date, controlling XR devices has been done with controllers, hand-tracking, or voice input. All of these have their pros and cons, but none are particularly fitting for glasses that you’d wear around in public; controllers are too cumbersome, hand-tracking requires line of sight which means you need to hold your hands awkwardly out in front of you, and voice is problematic both for privacy and certain social settings where talking isn’t appropriate.

The Neural Band, on the other hand, feels like the perfect input device for all-day wearable glasses. Because it’s detecting muscle activity (instead of visually looking for your fingers) no line-of-sight is needed. You can have your arm completely to your side (or even behind your back) and you’ll still be able to control the content on the display.

The Neural Band offers several ways to navigate the UI of the Ray-Ban Display glasses. You can pinch your thumb and index finger together to ‘select’; pinch your thumb and middle finger to ‘go back’; and swipe your thumb across the side of your finger to make up, down, left, and right selections. There are a few other inputs too, like double-tapping fingers or pinching and rotating your hand.
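Meta hasn’t published a developer API for the Neural Band, but conceptually the control scheme described above boils down to a small mapping from recognized EMG gestures to UI actions. A purely hypothetical sketch of that mapping (every gesture and action name here is illustrative, not Meta’s):

```python
# Hypothetical gesture-to-action table mirroring the controls described
# in the article. None of these identifiers come from a real Meta API.
GESTURE_ACTIONS = {
    "pinch_thumb_index": "select",
    "pinch_thumb_middle": "go_back",
    "swipe_up": "move_up",
    "swipe_down": "move_down",
    "swipe_left": "move_left",
    "swipe_right": "move_right",
    "double_tap_thumb_middle": "toggle_display",
}

def handle_gesture(gesture: str) -> str:
    """Map a recognized gesture to a UI action; ignore anything unknown."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```

The nice property of this kind of scheme is that the recognizer only has to distinguish a handful of discrete gestures, which is a much easier EMG classification problem than tracking a full hand pose.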

As of now, you navigate the Ray-Ban Display glasses mostly by swiping around the interface and selecting. In the future, having eye-tracking on-board will make navigation even more seamless, by allowing you to simply look and pinch to select what you want. The look-and-pinch method, combined with eye-tracking, already works great on Vision Pro. But it still misses your pinches sometimes if your hand isn’t in the right spot, because the cameras can’t always see your hands at quite the right angle. If I could use the Neural Band for pinch detection on Vision Pro, I absolutely would—that’s how well it seems to work already.

While it’s easy enough to swipe and select your way around the Ray-Ban Display interface, the Neural Band has the same downside that all the aforementioned input methods have: text input. But maybe not for long.

In my hands-on with the Ray-Ban Display, the device was still limited to dictation input. So replying to a message or searching for a point of interest still means talking out loud to the glasses.

However, Meta showed me a demo (that I didn’t get to try myself) of being able to ‘write’ using your finger against a surface like a table or your leg. It’s not going to be nearly as fast as a keyboard (or dictation, for that matter), but private text input is an important feature. After all, if you’re out in public, you probably don’t want to be speaking all of your message replies out loud.

The ‘writing’ input method is said to be a forthcoming feature, though I didn’t catch whether they expected it to be available at launch or sometime after.

On the whole, the Neural Band seems like a real win for Meta: not just for making the Ray-Ban Display more useful, but because it looks like the ideal input method for more capable future glasses.

Photo by Road to VR

And it’s easy to see a future where the Neural Band becomes even more useful by evolving to include smartwatch and fitness tracking functions. I already wear a smartwatch most of the day anyway… making it my input device for a pair of smart glasses (or AR glasses in the future) is a smart approach.

Little Details Add Up

One thing I was not expecting to be impressed by was the charging case of the Ray-Ban Display glasses. Compared to the bulky charging cases of all of Meta’s other smart glasses, this clever origami-like case folds down flat to take up less space when you aren’t using it. It goes from being big enough to accommodate a charging battery and the glasses themselves, down to something that can easily go in a back pocket or slide into a small pocket in a bag.

This might not seem directly relevant to augmented reality, but it’s actually more important than you might think. It’s not like Meta invented a folding glasses case, but it shows that the company is really thinking about how this kind of device will fit into people’s lives. An analog to this for their MR headsets would be including a charging dock with every headset—something they’ve yet to do.

Now with a display on-board, Meta is also repurposing the real-time translation feature as a sort of ‘closed captioning’. Instead of translating to another language, you can turn on the feature and see a real-time text stream of the person in front of you, even if they’re already speaking your native language. That’s an awesome capability for those who are hard of hearing.

Live Captions in Meta Ray-Ban Display Glasses | Image courtesy Meta

And even those who aren’t might still find it useful… Meta says the beam-forming microphones in the Ray-Ban Display can focus on the person you’re looking at while ignoring other nearby voices. They showed me a demo of this in action in a room with one person speaking to me and three others having a conversation nearby to my left. It worked relatively well, but it remains to be seen if it will work in louder environments like a noisy restaurant or a club with thumping music.
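Meta hasn’t detailed its beam-forming implementation, but the textbook technique is delay-and-sum: shift each microphone channel by the known arrival delay from the direction you’re facing, then average, so sound from that direction adds coherently while off-axis sound smears out. A toy pure-Python sketch (the signal and delays are invented for illustration):

```python
import math

def delay_and_sum(mic_signals, delays):
    """Average mic channels after advancing each by its arrival delay
    (in samples), so the steered source adds coherently."""
    n = len(mic_signals[0]) - max(delays)
    out = []
    for i in range(n):
        out.append(sum(sig[i + d] for sig, d in zip(mic_signals, delays))
                   / len(mic_signals))
    return out

# Toy demo: the target waveform reaches mic 1 one sample later than mic 0.
target = [math.sin(0.3 * t) for t in range(64)]
mic0 = target[:]                # direct arrival
mic1 = [0.0] + target[:-1]      # same waveform, delayed one sample
steered = delay_and_sum([mic0, mic1], delays=[0, 1])
# After alignment both channels are identical, so the output matches the
# target; sound arriving with a different delay pattern would not align.
```

A real five-mic array would estimate the steering delays from the wearer’s head direction and use many more samples of filtering, but the core idea is the same: align, then sum.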

Meta wants to eventually pack full AR capabilities into glasses of a similar size. And even if they aren’t there yet, getting something out the door like the Ray-Ban Display gives them the opportunity to explore, iterate—and hopefully perfect—many of the key ‘lifestyle’ factors that need to be in place for AR glasses to really take off.


Disclosure: Meta covered lodging for one Road to VR correspondent to attend an event where information for this article was gathered.

Filed Under: Feature, News, Smart Glasses, XR Industry News

Meta Reveals Next-Gen Ray-Ban & New Oakley Vanguard Smart Glasses

September 17, 2025 From roadtovr

Undoubtedly the smart glasses headliner of Meta Connect this year was the new $800 Meta Ray-Ban Display Glasses, which pack a single display into a familiar Wayfarer-style package. Alongside it though, Meta showed off two new smart glasses: the Oakley Meta Vanguard and the next generation of Ray-Ban Meta.

Oakley Meta Vanguard – $499 (available Oct 21)

Oakley Meta Vanguard | Image courtesy Meta

Before Meta and Essilor Luxottica released Oakley Meta HSTN in July, we were expecting something more like the new Oakley Meta Vanguard. Better late than never: Meta has now unveiled the sleek, blade-like frames it says are “built for high-intensity sports.”

Rated IP67 for dust and water resistance, Oakley Meta Vanguard is supposedly durable enough for sweaty workouts or rainy rides, targeting sports like cycling, snowboarding, and running.

Oakley Meta Vanguard | Image courtesy Meta

Notably, like many of its traditional specs, the new smart glasses use Oakley’s Three-Point Fit system, which includes three interchangeable nose pads for a more secure fit, with Meta noting the frames are optimized for use with cycling helmets and hats.

They also include an onboard 12MP, 122° wide-angle camera sensor for capturing video up to 3K resolution, with modes including Slow Motion, Hyperlapse, and adjustable image stabilization.

And just like Ray-Ban Meta, it features open-ear speakers, notably rated at six decibels louder than the previous Oakley Meta HSTN models, along with a wind-optimized five-mic array to provide clear audio for taking calls, using voice commands, or listening to music while training.

The newest Oakleys also integrate with Garmin, Strava, Apple Health, and Android Health Connect, delivering post-workout summaries and real-time stats through Meta AI. Athletes can check heart rate, progress, or other data hands-free with voice prompts.

Oakley Meta Vanguard | Image courtesy Meta

Available in four frame/lens color combinations, the glasses weigh 66g and offer up to nine hours of mixed use (or six hours of music) on a single charge, with an additional 36 hours via the charging case. Quick charging brings the glasses to 50% in just 20 minutes, Meta says.

Like all of the other Meta smart glasses on offer, they include 32GB of storage for over 1,000 photos or 100 short videos, the company says.

Since they’re built for high-intensity sports, the company is also introducing replaceable lenses, starting at $85. Here are all four models available for pre-order, including the lenses you’ll be able to mix and match later.

  • Oakley Meta Vanguard Black with PRIZM™ 24K
  • Oakley Meta Vanguard White with PRIZM™ Black
  • Oakley Meta Vanguard Black with PRIZM™ Road
  • Oakley Meta Vanguard White with PRIZM™ Sapphire

Oakley Meta Vanguard is now available for pre-order through Meta or Oakley, priced at $499 and launching October 21st.

They’ll be available in the US, Canada, UK, Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, Denmark, Switzerland, and the Netherlands. Meta says they should also eventually launch in Mexico, India, Brazil, and the United Arab Emirates later this year.

Ray-Ban Meta (Gen 2) – Starting at $379 (Now Available)

Ray-Ban Meta Wayfarer (Gen 2) | Image courtesy Meta

While the company considers its next Ray-Ban Meta glasses “Gen 2”, they’re technically the third generation, following the release of Ray-Ban Stories in 2021 and Ray-Ban Meta in 2023.

Naming scheme aside, the latest Ray-Ban Meta smart glasses deliver the same improvements seen in Oakley Meta HSTN, and essentially the same base functionality: they can play music, do real-time translation, and take hands-free calls, but also offer better photo and video capture than their predecessor.

Its ultrawide 12MP camera sensor is rated for photo capture up to 3,024 × 4,032 pixels and video at 1200p/60 FPS, 1440p/30 FPS, and 3K/30 FPS—all of which are up to three minutes in length.

Ray-Ban Meta Wayfarer (Gen 2) | Image courtesy Meta

Like Oakley Meta HSTN, Ray-Ban Meta (Gen2) boasts up to eight hours of continuous use and an additional 48 hours from the charging case, plus quick charge to 50% in 20 minutes in the charging case.

And it probably goes without saying, but all of Meta’s smart glasses make heavy use of its own Meta AI, which includes things like voice search queries (“Hey Meta!”), reading QR codes, suggesting recipes, saving notes, etc.

Ray-Ban Meta Skyler (Gen 2) | Image courtesy Meta

Additionally, the device includes Bluetooth 5.3, Wi-Fi 6, 32GB of storage, and an IPX4 water-resistance rating for light rain or splashes.

And like the 2023 model, the new Ray-Ban Meta smart glasses offer scads of frame and lens combinations: 27 in total across its Wayfarer and Skyler models, which include options for large or low nose bridges.

It is also getting a price bump over the first gen, which launched in 2023 at $299. Ray-Ban Meta (Gen 2) starts at $379 for standard lens options, and will be available with polarized lenses ($409), Transitions lenses ($459), and prescription lenses (pricing varies).

You can find all of those models and lens combinations starting today over at Meta and Ray-Ban.com.


We’re currently on the ground at Meta Connect this year, so check back soon for all things XR.

Filed Under: AR Development, ar industry, News, XR Industry News

VITURE Launches ‘Luma Ultra’ AR Glasses with Sony Micro-OLED Panels

September 17, 2025 From roadtovr

VITURE has now launched its Luma Ultra AR glasses, which pack in Sony’s latest micro-OLED panels along with spatial gesture tracking, thanks to an onboard sensor array.

Priced at $600 and now shipping worldwide, Viture Luma Ultra targets prosumers, enterprise users, and business professionals looking for a personal, on-the-go workspace.

Notably, these aren’t standalone devices; they rely on tethering to a PC, console, or mobile device for compute, which means they integrate as external (albeit very personal) monitors.

Image courtesy VITURE

Luma Ultra is said to include a 52-degree field of view (FOV) and Sony’s latest micro-OLED panels with a resolution up to 1200p and 1,250 nits peak brightness. Two depth-sensing cameras are onboard, in addition to a single RGB camera, for spatial tracking and hand gesture input.

Unlike some AR glasses, which rely on slimmer waveguide optics, Luma Ultra uses what’s called a ‘birdbath’ optic system, which uses a curved, semi-transparent mirror to project the digital image into the user’s eyes. Birdbath optics are typically cheaper and easier to manufacture, and can also reach higher brightness, at the expense of more bulk and weight.

Image courtesy VITURE

The device also includes an electrochromic film for tint control, myopia adjustments up to -4.0 diopters, and support for 64 ± 6mm interpupillary distance (IPD).

In fact, the company launched a slate of AR glasses alongside it, targeted at consuming traditional media, positioning Luma Ultra as the company’s flagship device.

Check out the full lineup and specs below:

Image courtesy VITURE

Viture Luma ($400), Luma Pro ($500) and Luma Ultra ($600) are all estimated to ship within two weeks of ordering, with the next device, Luma Beast ($550) slated to ship sometime in November.

None of the other devices above include spatial tracking, due to their lack of depth sensors; however, Luma Beast is said to come with the same micro-OLED displays as Luma Ultra at a slightly larger 58-degree FOV, plus an auto-adjusting electrochromic film for tint control.

This follows the news of Viture’s latest funding round, which brought the San Francisco-based XR glasses company $100 million in Series B financing. Viture says the funding will aid in the global expansion of its consumer XR glasses.

Filed Under: AR Development, ar industry, News, XR Industry News

Meta Leaks Next-gen Smart Glasses with Display Ahead of Connect This Week

September 16, 2025 From roadtovr

It seems Meta has a new generation of smart glasses to show off at Connect this week, and it appears we’ve just gotten an eyeful of the long-rumored version with a built-in display, previously codenamed ‘Hypernova’.

As noted by XR analyst Brad Lynch, Meta seems to have leaked the next slate of smart glasses built in collaboration with Essilor Luxottica.

The video, which was unlisted on Meta’s YouTube channel, has since been deleted.

New Meta smartglasses with display leaked via an unlisted video on their own YouTube channel

Along with their EMG wristband, and other smartglass models they plan to show off this week at Meta Connect pic.twitter.com/8tTlmaeQ0a

— SadlyItsDadley (@SadlyItsBradley) September 15, 2025

The video shows off four main models: the recently released Oakley Meta HSTN, the rumored Oakley Meta Sphaera model, what appears to be the next gen version of Ray-Ban Meta, and the rumored variant with display, which also comes with an electromyography (EMG) based wristband for input.

Meta also showed off a few use cases for the new display-clad smart glasses: typing on the back of a laptop to send a message, following turn-by-turn directions, identifying an object using AI, and real-time text translation.

Image courtesy Brad Lynch

Notably, prior to its unintentional unveiling, it was thought the display model would not be built in collaboration with Essilor Luxottica, and would instead be marketed under the Meta name, owing to the ‘Celeste’ branding seen in previous leaks. It appears, however, the company is co-opting a slightly larger Ray-Ban Wayfarer design and appending the name ‘Display’.

What’s more, the new smart glasses with heads-up display are also shown with the previously reported EMG wristband, which is meant to control the device’s UI. Meta has previously shown the wristband input device working with its prototype Orion AR glasses; it picks up movement in the wrist without needing line of sight to camera sensors, unlike the hand-tracking on Meta Quest 3.

There’s no confirmed pricing info yet, however a previous report from Bloomberg’s Mark Gurman maintains the display model and EMG wristband controller could cost “about $800.”

Meta Connect kicks off September 17th – 18th, where we expect to learn more about release dates and pricing for all of the company’s newest smart glasses.


We will be at Meta Connect this week, so make sure to check back soon for all of the latest in Meta’s XR hardware and software.

Filed Under: AR Development, News, XR Industry News

Virtualware Seals €5M Deal to Support Virtual Vocational Training in Spain

September 15, 2025 From roadtovr

Virtualware, the Spain-based XR and 3D simulation software company, announced it’s secured a €5 million ($5.8 million) deal to broadly roll out its VIROO platform in vocational training facilities supported by Spain’s Ministry of Education.

The six-year contract allows Virtualware to bring its XR enterprise platform VIROO to 66 new ‘Centres of Excellence for Vocational Training’ (VET), the company says in a press statement, which are run by Spain’s Ministry of Education, Vocational Training and Sport (MEFPD).

The rollout to Spain’s VET Centres will join the more than 25 vocational training centers across the country already equipped with VIROO. In Spain, VET supports initial training of young people as well as the continuing up-skilling and re-skilling of adults across a variety of industries.

“We are opening a new chapter of growth and pedagogical innovation, allowing thousands of students to train with state-of-the-art immersive simulators developed and deployed through VIROO platform, raising their technical skills from day one,” says Virtualware founder and CEO Unai Extremo. “Our goal is to bring immersive technology to every vocational training classroom in Spain, through a sustainable model for content creation and deployment.”

Founded in 2004 and acquired by Swedish company Simumatik in 2024, the Bilbao, Spain-based company has recently focused on expanding its capabilities to support a number of key industries, including energy, automotive, transportation, defense, manufacturing, education, and healthcare.

Among Virtualware’s clients are GE Vernova, Petronas, Volvo, Gestamp, Alstom, ADIF, Bosch, Biogen, Kessler Foundation, Invest WindsorEssex, McMaster University, the University of El Salvador, Ohio University, the Spanish Ministry of Defense and the Basque Government.

Check out VIROO in action below, which was created to showcase the company’s work with the Spanish national rail service, ADIF (Administrador de Infraestructuras Ferroviarias).

Filed Under: AR Development, AR Investment, VR Development, VR Investment, XR Industry News

Samsung Preps iPhone-Style Spatial Videos & Photos for “Galaxy XR Headset”, Leak Suggests

September 12, 2025 From roadtovr

A new feature that leaked onto some Samsung smartphones is expected to bring the ability to capture 3D images and videos specifically for “Galaxy XR headsets,” SamMobile has discovered.

Samsung revealed its forthcoming XR headset, codenamed ‘Project Moohan’ (Korean for ‘Infinite’), late last year, which is slated to bring competition to Apple Vision Pro sometime later this year.

When it will launch, how much it will cost, and even the mixed reality headset’s official name are all still a mystery; however, a recent feature leak uncovered by SamMobile’s Asif Iqbal Shaik reveals Samsung smartphones could soon be able to capture 3D photos and video—just like iPhone does for Vision Pro.

Image courtesy SamMobile

Shaik maintains the latest version (4.0.0.3) of the Camera Assistant app contains the option to capture specifically for “Galaxy XR headsets,” initially hidden within an update to the app on Galaxy S25 FE. Transferring the APK file to a Galaxy S25 Ultra, however, reveals the option, seen above.

Speculation about the plurality of “Galaxy XR headsets” aside, Samsung has gone on record multiple times since Project Moohan’s late-2024 unveiling to say the mixed reality headset will indeed release later this year. That makes the recent software slip an understandable mistake, as the company ostensibly seeks to match Vision Pro feature-for-feature with its competing smartphones at the headset’s arrival.

Slated to be the first XR headset to run Google’s Android XR operating system, Moohan could be releasing sooner than you think. A recent report from Korea’s Newsworks maintained the device will be featured at a Samsung product event on September 29th. Notably, Moohan was a no-show at Samsung’s Galaxy event earlier this month, which saw the unveiling of Galaxy S25 FE, Galaxy Tab S11, and Galaxy Tab S11 Ultra.

Newsworks further suggests Moohan could launch first in South Korea on October 13th, priced somewhere between ₩2.5 million and ₩4 million (roughly $1,800 to $2,900 USD), with a global rollout set to follow.

Filed Under: News, VR Development, XR Industry News

Snapchat CEO’s Open Letter Ties Spectacles AR Glasses to the Survival of the Company at Large

September 12, 2025 From roadtovr

According to Snap’s CEO Evan Spiegel, the company behind Snapchat has reached a “crucible moment” as it heads into 2026, which he says rests on the growth and performance of Spectacles, the company’s AR glasses, as well as AI, advertising and direct revenue streams.

Snap announced in June it was working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are expected to release to consumers sometime next year. Snap hasn’t revealed them yet, although the company says the new Specs will be smaller and lighter, feature see-through AR optics and include a built-in AI assistant.

Snap Spectacles (gen 5) | Image courtesy Snap Inc

Following the 2024 release of the fifth-gen Specs to developers, next year will be “the most consequential year yet” in Snap’s 14-year history, Spiegel says, putting the forthcoming generation of Specs in the spotlight.

“After starting the year with considerable momentum, we stumbled in Q2, with ad revenue growth slowing to just 4% year-over-year,” Spiegel admits in his recent open letter. “Fortunately, the year isn’t over yet. We have an enormous opportunity to re-establish momentum and enter 2026 prepared for the most consequential year yet in the life of Snap Inc.”

Not only are Specs a key focus of the company’s growth plans; Spiegel thinks AR glasses, combined with AI, will drastically change the way people work, learn, and play.

“The need for Specs has become urgent,” Spiegel says. “People spend over seven hours a day staring at screens. AI is transforming the way we work, shifting us from micromanaging files and apps to supervising agents. And the costs of manufacturing physical goods are skyrocketing.”

Image courtesy Snap Inc, Niantic

Those physical goods can be replaced with “photons, reducing waste while opening a vast new economy of digital goods,” Spiegel says, something the company hopes to tap into with Specs. And instead of replicating the smartphone experience into AR, Spiegel maintains the core of the device will rely on AI.

“Specs are not about cramming today’s phone apps into a pair of glasses. They represent a shift away from the app paradigm to an AI-first experience — personalized, contextual, and shared. Imagine pulling up last week’s document just by asking, streaming a movie on a giant, see-through, and private display that only you can see, or reviewing a 3D prototype at scale with your teammate standing next to you. Imagine your kids learning biology from a virtual cadaver, or your friends playing chess around a real table with a virtual board.”

Like many of its competitors, Spiegel characterizes Specs as “an enormous business opportunity,” noting the AR device can not only replace multiple physical screens, but the operating system itself will be “personalized with context and memory,” which he says will compound in value over time.

Meanwhile, Snap competitors Meta, Google, Samsung, and Apple are jockeying for position as they develop their own XR devices—the umbrella term for everything from mixed reality headsets, like Meta Quest 3 or Apple Vision Pro, to smart glasses like Ray-Ban Meta or Google’s forthcoming Android XR glasses, to full-AR glasses, such as Meta’s Orion prototype, which notably hopes to deliver many of the same features promised by the sixth gen Specs.

And as the company enters 2026, Spiegel says Snap is looking to organize differently, calling for “startup energy at Snap scale” by setting up a sort of internal accelerator of five to seven teams composed of 10-to-15-person squads. “Weekly demo days, 90-day mission cycles, and a culture of fast failure will keep us moving,” he says.

It’s a bold strategy, especially as the company looks to straddle the expected ‘smartphone-to-AR’ computing paradigm shift, with Spiegel noting that “Specs are how we move beyond the limits of smartphones, beyond red-ocean competition, and into a once-in-a-generation transformation towards human-centered computing.”


You can read Snap CEO Evan Spiegel’s full open letter here, which includes more on AI and the company’s strategies for growth, engagement and ultimately how it’s seeking to generate more revenue.

Filed Under: AR Development, AR Industry, News, XR Industry News

XR Glasses Maker VITURE Secures $100M Investment as Wearable Segment Heats Up

September 11, 2025 From roadtovr

San Francisco-based XR glasses company VITURE announced it’s secured $100 million in Series B financing, which the company says will aid in global expansion of its consumer XR glasses.

Viture initially announced in October 2024 that it had secured a Series B; the company now reveals its most recent tranche has brought the Series B total to $100 million, and overall funding to $121.5 million, according to Crunchbase data.

Previous investors include Singtel Innov8, BlueRun Ventures, BAI Capital, and Verity Ventures, with the company noting that some strategic investors in the Series B “prefer to remain undisclosed at this time.”

Viture Luma | Image courtesy Viture

The company says its Series B will allow it to expand its consumer XR glasses globally through retail and distribution networks, grow its enterprise offerings, and further develop its hardware and AI-powered software ecosystems.

This follows the July announcement of the company’s Luma series and Beast, phone/PC-tethered XR glasses that use birdbath-style optics, which the company is targeting at casual content consumption and productivity.

Meanwhile, the XR glasses segment is heating up, although not uniformly in the direction of the sort of casual content-focused specs that Viture is developing.

More precisely, smart glasses with heads-up displays (i.e. not augmented reality) appear to be the next hot commodity among Meta, Google, Amazon and possibly even Apple, which generally see them as stepping stones to all-day wearable AR glasses of the future.

These sorts of smart glasses are very different from Viture’s devices, however, and from full-AR glasses like Meta’s Orion prototype; smart glasses are essentially designed to offload daily tasks from the user’s smartphone, such as notifications, turn-by-turn directions, AI queries, calls, and photo and video capture.


Check out this handy primer on the differences between smart glasses and AR glasses to learn more.

Filed Under: AR Development, News, XR Industry News

Amazon Reportedly Developing Smart Glasses with Display to Rival Meta

September 10, 2025 From roadtovr

Amazon is reportedly developing a pair of consumer smart glasses which are slated to rival Meta’s rumored ‘Hypernova’ smart glasses with display, according to a report by The Information.

Citing two people with direct knowledge of the plans, The Information maintains the glasses, internally codenamed ‘Jayhawk’, are set to include microphones, speakers, a camera, and a monocular, full-color display.

The report maintains Amazon is eyeing a consumer launch of Jayhawk in late 2026 or early 2027, however the price point is currently unknown.

Equally uncertain is whether Jayhawk will be marketed under Amazon’s ‘Echo Frames’ line, which first launched in 2019 as voice-controlled frames offering music playback, calls, and smart home control powered by Alexa.

Third-gen Echo Frames | Image courtesy Amazon

Now in its third generation, launched in late 2023, Echo Frames offer essentially the same set of features as the first and second, with the notable omission of any sort of camera (or display) for photo and video capture.

Additionally, The Information reports that Amazon is also creating smart glasses for its delivery drivers, said to be bulkier and less sleek than the consumer ‘Jayhawk’ model.

Codenamed ‘Amelia’, the glasses are reportedly set to provide instructions to help sort and deliver packages. Those are said to roll out as soon as Q2 2026, with an initial production run of 100,000 units.

In contrast, recent reports from supply chain analyst Ming-Chi Kuo and Bloomberg’s Mark Gurman maintain Meta is nearly ready to begin mass production of its own smart glasses with monocular display.

Internally codenamed ‘Hypernova’, and possibly marketed as ‘Celeste’, Meta’s forthcoming smart glasses are expected to cost around $800, according to Kuo.

– – — – –

The Information maintains in its reporting that the glasses will be “augmented reality”, however the device’s description puts it squarely in the smart glasses segment.

In short, AR glasses overlay spatially anchored 3D digital content onto the real environment, while smart glasses mainly provide heads-up information or notifications via monocular or even stereoscopic displays. You can learn more about the difference between AR and smart glasses here.

Filed Under: AR Development, News, XR Industry News
