
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


Vuzix Secures $5M Investment as Veteran Smart Glasses Maker Sets Sights on Consumers

June 17, 2025 From roadtovr

Vuzix, the veteran smart glasses maker, announced it’s secured a $5 million investment from Quanta Computer, the Taiwan-based ODM and major Apple assembler.

The latest investment was the second tranche following an initial $10 million investment made by Quanta in September 2024, which included the purchase of Vuzix common stock at $1.30 per share. At the time, Vuzix anticipated a total of $20 million from Quanta.

Paul Travers, President and CEO of Vuzix, notes the funding will be used to enhance Vuzix’s waveguide manufacturing capabilities, something he says will help Vuzix deliver “the world’s most affordable, lightweight, and performance-driven AI smart glasses for mass-market adoption.”

Additionally, Travers says the investment “marks another important milestone in strengthening our partnership with Quanta and expanding the capabilities of our cutting-edge waveguide production facility.”

Vuzix Z100 Smart Glasses | Image courtesy Vuzix

Founded in 1997, Vuzix has largely served enterprise customers with its evolving slate of smart glasses, which have typically targeted industrial roles in healthcare, manufacturing, and warehousing.

The company also produces its own waveguides for both in-house use and licensing. In the past, Vuzix has worked to integrate its waveguide tech with Garmin, Avegant, an unnamed US Fortune 50 tech company, and an unnamed U.S. defense supplier.

While the company made a few early consumer devices in the 2010s, including the V920 video eyewear and the STAR 1200 AR headset, it didn't return to consumers in earnest until November 2024, when it introduced the Z100 smart glasses, its first pair of sleek, AI‑assisted smart glasses, priced at $500.

The Z100 includes a 640 × 480 monochrome green microLED waveguide display and is designed to pair with smartphones to show notifications, fitness metrics, and maps, targeting everyday consumers and enterprise customers alike.

Notably, the investment also coincides with greater market interest in smart glasses on the whole. Google announced last month it’s partnering with eyewear companies Warby Parker and Gentle Monster to release a line of fashionable smart glasses running Android XR.

Meta also recently confirmed it’s expanding its partnership with Ray-Ban Meta-maker EssilorLuxottica to create Oakley-branded smart glasses, expected to launch on June 20th, 2025.

Meanwhile, rumors suggest that both Samsung and Apple are aiming to release their own smart glasses in the near future, with reports maintaining that Samsung could release a device this year, and Apple as soon as next year.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

What We Know So Far About Anduril’s ‘Eagle Eye’ Military XR Headset and Founder’s Reunion With Meta

June 16, 2025 From roadtovr

Palmer Luckey’s military tech company Anduril recently announced a partnership with Meta to build “the world’s best AR and VR systems for the US military.” In two recent public conversations, Luckey offered up some details on the XR helmet his company is building for the military and how this unlikely partnership arose years after his VR company Oculus was acquired by Meta, followed by his unceremonious firing.

Following the announcement, Luckey spoke to host Ashlee Vance on an episode of the Core Memory podcast, and on stage with author and creative technologist Stephanie Riggs during a conversation at the AWE USA 2025 conference. From these conversations, we’ve detailed the most interesting information about Anduril’s upcoming military XR headset.

Eagle Eye

Luckey said that Anduril’s upcoming military XR device is codenamed ‘Eagle Eye’. The goal is to build a complete helmet replacement (with built-in XR capabilities) for soldiers, rather than merely an add-on device that would be worn or attached to standard-issue helmets.

“Eagle Eye is not just a head mounted display. It’s a fully-integrated ballistic shell, with hearing protection, vision protection, head protection, on-board compute, on-board networking, radios… and also vision augmentation systems… sensor systems that enhance your perception,” Luckey said on Core Memory. “And what we’re doing is working with Meta to take the building blocks that they’ve invested enormous amounts of money and expertise in, and we’re able to use those building blocks in Eagle Eye without having to recreate them from scratch ourselves.”

More specifically, he explained at AWE that, “Eagle Eye is not one head mounted display. It’s actually a platform for building vision augmentation systems. We’re building different versions because you have different people who have different roles. The guy who is a front-line infantryman being shot at has a different job than the guy who’s a logistician, or aircraft maintainer, or somebody who works in a warehouse. The field-of-view they need, the level of ballistic rating they need—it’s very very different. So Eagle Eye is actually a platform for hosting multiple vision augmentation systems.”

While not many technical specifics have been shared thus far, Luckey mentioned the headset uses multiple microdisplays per eye. That tells us the headset could be a passthrough AR headset rather than one with transparent optics. That might seem surprising (considering the need for battlefield awareness), but he repeatedly emphasized that the goal is for the helmet to offer soldiers greater perception through augmentation, not less.

Luckey admitted that the multi-microdisplay layout results in a visible seam in the peripheral image (which reminds me of an old ultrawide field-of-view headset prototype from Panasonic).

He said the seam wouldn’t be acceptable for the consumer market, but because the headset is being built as a tool to keep people alive, the tradeoff is worth it.

“One of the things we’re doing with eagle eye is using multiple microdisplays per-eye, with a tiled seam. And so you end up with this small little kind of distorted seam that’s living out in your peripheral view. And you can see it really easily. It’s there. It doesn’t bother you. It doesn’t make you sick. But it’s definitely there,” he told host Ashlee Vance. “Apple [for example] can’t make something like that [because it wouldn’t be acceptable to the consumer market]. They can’t make a thing where there’s a seamless magical experience, except for this weird distorted bubble seam down both sides of your vision in your periphery. But for a tool [like Eagle Eye] you can do that… it’s not actually a problem.”

As for cost, at AWE Luckey suggested that the headset could cost in excess of $10,000.

“[The US military] would rather have something that is significantly more performant even if it’s somewhat more expensive. Now I’m not saying we should charge the government some obscene price, but if they can choose between a $1,000 sensor that lets them see things that are twice as far, or a $100 sensor that has half the range, every time they’re going to make the choice for the $1,000 sensor, because the cost of losing that soldier or failing the mission is so much higher than the cost of that headset,” he said. “So what’s fun for me—from a tech perspective—is we’re able to build a headset that costs tens of thousands of dollars to make. We can load it with image sensors that are nicer than even Apple would put in something like the Vision Pro. We can afford to put extremely high-end displays in it that are far beyond what the consumer market would reasonably bear today.”

Without a consumer cost restriction, Luckey said Eagle Eye will have some specs that are significantly beyond anything that’s available on the consumer market today.

“Eagle Eye is gonna be the best AR and VR device that’s ever been made; it’s not even close. We’re running at an extraordinarily high framerate and extraordinarily high resolution. I’d tell you the specs but unfortunately the customer doesn’t want me to at this point,” Luckey told Stephanie Riggs at AWE. “But I will tell you it’s several times higher resolution in capability than even Apple Vision Pro. There’s nothing in the consumer market that’s going to be able to meet it where it is, because I have a different set of requirements. I’m not making an entertainment device you buy at Best Buy, I’m building a tool that keeps you alive. And that’s something the Army is willing to pay for.”

He also emphasized not just the helmet's XR tech but also its integration of artificial intelligence, likening the end goal to something "in the vein of Cortana," the artificially intelligent sidekick of Master Chief (the hero of the Halo franchise).

“[…talking about Iron Man’s sci-fi armor suit] it wasn’t just the suit right? It was also the augmented vision paired with [some] kind of AI guardian angel in the form of Jarvis; that is what we were building. Eagle Eye has an onboard AI guardian angel, maybe less in the style of Jarvis and more in the vein of Cortana from Halo, but this idea of having this ever-present companion who can operate systems, who can communicate with others, that you can offload tasks onto, that is looking out for you with more eyes than you could ever look out for yourself, right there in your helmet—that is such a powerful thing to make real.”

One of the key capabilities of the headset involves threat detection, Luckey said at AWE.

“Eagle Eye has a 360° threat awareness system… that is able to detect drone threats, vehicular threats, threats on foot, and automatically categorize ‘what is a threat and what is not’ and then present that to you.”

Further, he spoke of the AI as a way to make all of the helmet’s capabilities easy to use without overwhelming the wearer.

“You shouldn’t be toggling between 10 different sensor menus. You should just see seamless view that’s built by kind of an AI interpolator that looks out into the world and says ‘ok well I know he probably wants to see all of the hot human signatures, I know he probably wants to see all the drones…’ you can build technology that is transparent to the user,” said Luckey. “[…] maybe I’m not the guy to argue that the tech is easy to use because I’m a hardcore technohead from birth and I can operate wacky stuff. But you can put it on a normal person… they can look out into the world and do things and see things with zero training that they never would have been able to do otherwise. I’m not concerned about information overload because I’m [confident in our ability to build the right tool for the job].”

Regarding manufacturing, Luckey said the Eagle Eye XR helmet will be built in the US or with US allies, with “no Chinese parts,” as a matter of operational security. He expects the first prototypes of Eagle Eye this year, and says the company already has working prototypes.

“We’re gonna be delivering the first prototypes to the army this year. That’s the intent anyway, if all goes according to plan in the way that I hope,” he told Vance. “But we’ve been working on the technology that underpins Eagle Eye for years. And we’ve been making a really serious hardware effort for over a year at this point. And so actually there’s an Eagle Eye sitting on my desk back at my office right now.”

Reunion with Meta and Zuckerberg

But how did Luckey go from having his VR startup (Oculus) acquired by Meta, getting fired from Meta amid political backlash, founding a military technology company (Anduril), and growing it to a multi-billion-dollar valuation, to partnering once again with the company that had booted him out?

Well, by Luckey’s telling, it started last year when Meta CEO Mark Zuckerberg offered a quote to an article about Luckey that was surprisingly conciliatory. That openness from Zuckerberg (and outright apology from Meta CTO Andrew “Boz” Bosworth) opened the door to a renewed relationship.

“We ended up reconnecting [after the article], talking about some of the problems that are going on with America, some of the inefficiencies that exist for terrible reasons… how there are people who are dying needlessly because of barriers between our technology industry and our national security community,” Luckey said on the Core Memory podcast. “We ended up deciding that this was something that we needed to work on together. Meta’s been doing a lot more on the national security front; they’ve been doing a lot more work with the government.”

Luckey says he's moved on from any anger he harbored over his firing, saying Meta is a different company than it was nine years ago, not just culturally: many of the people who advocated for his ousting no longer work there.

Luckey sees the partnership as a win for Anduril (as it doesn’t need to rebuild key XR technology), while saving the American taxpayer from paying for tech that already exists in the private sector.

“[…] there’s a lot of things in Meta that I invented, my team invented, before they acquired [Oculus]. There’s other things that I invented, that the team invented, while I was at Facebook (now Meta). And there was a bunch of technology that was invented after I was fired,” he explained to Vance. “And this partnership is about taking that entire base of technology and IP—around hardware, software, in AI, VR, AR space—and applying it to solving our military’s most pressing challenges. It’s taking a lot of the people who have been working on these technologies for consumer applications and adapting their work to solve national security problems at a very low cost to the taxpayer.”

Luckey says the partnership will allow Anduril to build “the world’s best” XR tech for the US government and allies.

On the other hand, he said that the details of the partnership with the likes of Meta and Qualcomm mean that future innovations will hopefully trickle back to the consumer side.

“The way I see this is: the tech that we’re building—working with partners like Qualcomm and Meta—they’re going to be able to bring back into their consumer devices. And that’s the way our licensing agreement works,” he told Riggs. “The tech that we co-develop together… I’m the guy who is going to be deploying it to the military; they’re going to be the people taking it back into the consumer realm.”

It’ll be some time yet until we know more about what Eagle Eye actually looks like and how it works, but there may well be some overlap with Microsoft’s prototype IVAS system, as that’s the helmet that Eagle Eye is being built to replace.

Filed Under: News, XR Industry News

Meta Teases Oakley Partnership for Sportier Smart Glasses, Reportedly Releasing This Year

June 16, 2025 From roadtovr

Meta officially confirmed the expansion of its EssilorLuxottica partnership to include a pair of Oakley smart glasses—possibly arriving soon.

Earlier this year, Bloomberg’s Mark Gurman reported that Meta was looking to expand its line of smart glasses beyond Ray-Ban Meta, which would include two possible new devices: a sportier Oakley-branded model, and a high-end model with built-in display—the latter has yet to be announced.

Now, Meta CTO Andrew 'Boz' Bosworth has confirmed in an X post that 'Oakley Meta' smart glasses are coming, showing a graphic of the two brands merging and linking to a new @oakleymeta profile.

🤘🏼 @oakleymeta pic.twitter.com/lRL6oimgMR

— Boz (@boztank) June 16, 2025

Details remain scarce; however, Gurman's January report maintained the Oakley smart glasses would be designed for athletes and could launch sometime this year.

Meta's EssilorLuxottica partnership has been growing steadily since the release of the first-gen Ray-Ban Stories (then under the Facebook brand) in 2021, prompting the company to offer a second-gen version in 2023, Ray-Ban Meta, which introduced updated styles, improved audio and cameras, and on-board AI features.

In late 2024, Meta announced it was expanding its smart glasses partnership with EssilorLuxottica into 2030. At the time, Meta CEO Mark Zuckerberg described its long-term roadmap as giving the companies “the opportunity to turn glasses into the next major technology platform, and make it fashionable in the process.”

In addition to Ray-Ban and Oakley, the French-Italian luxury eyewear company owns other major brands, including Persol, Oliver Peoples, and Vogue Eyewear, along with eyewear retailers LensCrafters, Pearle Vision, and Sunglass Hut.

Filed Under: News, XR Industry News

Disney in Talks with Jim Henson Company to Bring ‘The Muppets’ to VR

June 16, 2025 From roadtovr

Jim Henson’s Muppets could be coming to VR following talks with Disney—possibly offering a clue at the sort of content Meta reportedly hopes to bring to its next VR headset.

Disney held an event on June 14th celebrating the 70th anniversary of The Jim Henson Company. The event was also a bittersweet sendoff for one of Disney's Hollywood Studios' most famous long-running attractions, Muppet*Vision 3D.

As reported by Disney fan site Laughing Place, The Jim Henson Company CEO Lisa Henson announced at the event that, while Disney closed the physical attraction a few days prior, the company was now “exploring ways to preserve the film and other parts of the experience for fans to enjoy in the future.”

This, Henson said, included discussions with Disney about bringing the attraction's film to VR, with Laughing Place reporting that Muppet*Vision 3D was captured using VR cameras.

This follows a Wall Street Journal report from earlier this month alleging that Meta is currently shopping for branded immersive content from companies such as Disney, A24, and smaller production studios.

The WSJ report maintains Meta is hoping to sign timed-exclusive episodic and standalone immersive video content geared towards its next VR headset.

Codenamed 'Loma,' the reported device is said to feature a design similar to a pair of eyeglasses that connects to a tethered compute puck, which is described as having greater compute power than Meta's Quest 3 series of headsets, and a price of "less than $1,000."

Filed Under: Meta Quest 3 News & Reviews, News

Google’s First ‘Beam’ Videoconferencing Device is ‘HP Dimension’, Coming Late 2025 at $25,000

June 13, 2025 From roadtovr

HP announced last year it was going to be the first to offer hardware based on Google Beam (formerly ‘Project Starline’), the light field-based 3D videoconferencing platform. Now, HP unveiled ‘Dimension’, which is being pitched to enterprise at $25,000 a pop.

HP Dimension with Google Beam is said to use six cameras and “state of the art AI” to create a realistic 3D video of each participant, displayed on a special 65-inch light field display with realistic size, depth, color, and eye contact.

HP says the device, which will be sold to select partners starting in late 2025, will be priced at $25,000. This notably doesn’t come with the Google Beam license, which is sold separately.

Image courtesy Google, HP

As an enterprise-first device, HP Dimension is slated to support Zoom Rooms and Google Meet, handling both immersive 3D chats and traditional 2D group meetings, and to integrate cloud-based video services such as Teams and Webex.

“We believe that meaningful collaboration thrives on authentic human connections, which is why we partnered with Google to bring HP Dimension with Google Beam out of the lab and into the enterprise,” said Helen Sheirbon, SVP and President of Hybrid Systems, HP Inc. “HP Dimension with Google Beam bridges the gap between the virtual and physical worlds to create lifelike virtual communication experiences that brings us closer together.”

First introduced in 2021, Google Beam (formerly 'Project Starline') uses a light field display to show natural 3D depth without the need for an XR headset or glasses of any sort, essentially simulating a face-to-face chat between two people.

In its testing, HP says Beam makes for 39% more non-verbal behaviors being noticed, with 37% of users noting better turn-taking, and 28% noticing an increase in memory recall over traditional videoconferencing platforms.

Filed Under: News, XR Industry News

Report: Samsung’s Project Moohan XR Headset May Get a Launch Date at Unpacked Next Month

June 12, 2025 From roadtovr

Samsung Unpacked is expected to kick off next month with the usual slate of hardware announcements, which this year could include the company's latest foldable smartphones, Galaxy Z Flip 7 and Fold 7, and its latest Galaxy Watch 8. Rumors suggest, though, that the company is also looking to put its upcoming XR headset, Project Moohan, in the spotlight.

Project Moohan, which will be the first device to run Google's upcoming XR operating system, was announced alongside Android XR back in December 2024. Samsung has said in the past that consumers should expect Project Moohan's launch sometime this year, although it still doesn't have a specific date or official name.

Now, Samsung serial leaker 'Panda Flash' reports the company's upcoming mixed reality headset could finally get a release date at the event.

While we were initially expecting to hear something about Project Moohan at Google I/O last month (we didn't), Samsung might be keeping the device a little closer to home than initially thought.

Samsung Project Moohan | Image courtesy The Verge

Panda Flash, who has been following Galaxy Z Flip 7 and Fold 7 leaks and supply chain rumors, additionally reports the headset will launch first in South Korea, and then gradually launch globally sometime afterwards—essentially mirroring Apple’s US-first launch of Vision Pro before heading into other markets.

Samsung has shown its supposed Vision Pro competitor at a number of events over the past year, which includes our opportunity to go hands-on with Project Moohan in December, although the company has largely stayed mum on revealing the XR headset’s full spec sheet.

So far, we know the Android XR headset is packing a Qualcomm Snapdragon XR2+ Gen 2 chipset, Sony-sourced micro‑OLED panels (resolution still TBA), pancake lenses, automatic interpupillary distance (IPD) adjustment, support for eye- and hand-tracking, an optional magnetically attached light shield, and a removable external battery pack. It also supports VR motion controllers of some sort, although we haven't seen those either.

We’re also hoping to learn more about the company’s smart glasses efforts; Samsung is reportedly working on a pair of smart glasses that could launch sometime this year—ostensibly looking to serve up competition to Ray-Ban Meta Smart Glasses.

Whatever the case, we’ll be looking out for official dates for Samsung Unpacked, which is expected to take place sometime early next month in New York City.

Filed Under: News, XR Industry News

Apple Details Vision Pro’s New Persistent Widget System Coming to VisionOS 26

June 10, 2025 From roadtovr

At WWDC this week, Apple revealed a new persistent widget system coming to Vision Pro in visionOS 26. The system will allow users to anchor widgets against walls or on surfaces that will always stay in the same place with glanceable info.

In a developer session released during WWDC this week, Apple delved into the new widget system for Vision Pro, exploring how they work, customization options, and more. Unlike typical Vision Pro windows, which float in space wherever the user places them (and will relocate when the headset is recentered or rebooted), widgets can be placed against walls or on flat surfaces and will always stay in that place, even if the headset is restarted. This makes widgets on visionOS 26 act like persistent parts of your physical environment.

Apple explained that the system supports existing widgets built for other Apple platforms (like iOS and iPadOS), which means there will be a wealth of widgets for Vision Pro users. But Apple has also extended its WidgetKit platform with new options that are specific to Vision Pro. That includes new sizes and aspect ratios to choose from.

Image courtesy Apple

Additionally, developers will be able to choose between a ‘paper’ and ‘glass’ overlay which will define how the widget responds to the lighting in the user’s space.

Image courtesy Apple

For widgets with static info like photos or cover art, Apple suggests the paper style so that they will dim with the room’s lighting, making them appear more like part of the actual room.

For widgets that present dynamic information, Apple suggests the glass style, noting that information on the glass style will stay illuminated to ensure it’s always easy to see.

Widgets made for visionOS 26 can also be proximity-aware, allowing a widget to change states based on how near or far the user is. This enables developers to show simplified information (like current weather conditions) from across the room, then surface more detail (like the upcoming week's forecast) as the user gets closer.
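The proximity behavior described above boils down to swapping content tiers based on viewer distance. Here's a minimal, platform-agnostic sketch of that logic in Python; the function names, the 1.5-meter threshold, and the weather fields are illustrative assumptions, not Apple's actual API:

```python
def detail_level(distance_m: float, threshold_m: float = 1.5) -> str:
    """Pick a widget detail level from the viewer's distance.

    Mirrors the behavior described for visionOS 26 widgets: a simplified
    view from across the room, full detail up close. The 1.5 m threshold
    is an illustrative assumption.
    """
    return "detailed" if distance_m < threshold_m else "simplified"


def weather_widget_content(distance_m: float) -> dict:
    """Return the fields a hypothetical weather widget would render."""
    if detail_level(distance_m) == "simplified":
        # Glanceable info only: current conditions.
        return {"now": "72°F, Sunny"}
    # Up close: surface the upcoming week's forecast as well.
    return {"now": "72°F, Sunny", "week": ["Mon 75°F", "Tue 70°F", "Wed 68°F"]}
```

On the real platform the system drives this switch for the developer; the sketch just shows why only two content tiers are needed to cover the across-the-room and up-close cases.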

Image courtesy Apple

Among the new first-party Apple widgets that are specific to Vision Pro is an album poster which shows album art on the wall and then plays the album when the user clicks on it. Another is a ‘photo window’ that allows you to create a virtual window with a panoramic photo that makes it seem like you’re actually looking out into the scene. However, these photos are flat for now rather than spatial.

The Clock widget has gotten an overhaul with new designs and enhanced detail for viewing up close.

Apple really wants widgets to feel not like floating windows but part of the user’s actual space. Thus, they must be placed against walls or on flat surfaces, and they will also be occluded by other virtual content and by the real world environment, like furniture or walls.

To keep widgets grounded as part of the real environment, they are always contained within a frame that casts a realistic shadow.

While developers have new tools for making widgets on Vision Pro, users are also given a range of customization options.

Image courtesy Apple

Widgets can be adjusted to be 75% to 125% of their original size. They can be ‘elevated’ to sit on the wall like a picture frame, or ‘recessed’ which sinks them slightly into the wall, making them feel like part of it. When in ‘elevated’ mode, users can define the thickness of the frame.

Image courtesy Apple

Users can also choose between a handful of colors, in both light and dark variations.

Image courtesy Apple

When arranging widgets against a wall, nearby widgets will automatically snap into a grid arrangement for easy grouping.
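Grid snapping of this sort can be modeled as rounding each widget's wall-plane position to the nearest cell of a fixed-pitch grid. A hypothetical sketch of that idea (Apple hasn't published the actual grid spacing; the 10 cm pitch here is an assumption):

```python
def snap_to_grid(x: float, y: float, pitch: float = 0.10) -> tuple[float, float]:
    """Snap a widget's wall-plane position (in meters) to the nearest grid cell.

    Rounding each coordinate to a multiple of the pitch means widgets placed
    near one another automatically share rows and columns, which is what
    produces the tidy grouping effect.
    """
    return (round(x / pitch) * pitch, round(y / pitch) * pitch)
```

For example, a widget dropped at (0.23 m, 0.47 m) would land on the (0.20 m, 0.50 m) cell, aligning it with any neighbor already snapped to that row or column.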

Widgets can be interactive, allowing a user to, for instance, check off a to-do list item by using look-and-pinch or physically touching the widget. If a widget doesn’t have specific interactions, interacting with it will launch the parent application by default.

Widgets are already available in the visionOS 26 developer beta released this week, and the update is expected to roll out to the public this fall.

Filed Under: Apple Vision Pro News & Reviews, News

Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year

June 10, 2025 From roadtovr

Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today at AWE 2025, Snap announced it's launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will come in "a much smaller form factor, at a fraction of the weight, with a ton more capability."

There's no pricing or availability info yet beyond the 2026 launch window. What's more, we haven't even seen the device in question, although we're betting it isn't as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that Snap's four million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change on the next version, although we’re certainly hoping for more than the original’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

And as the company is gearing up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including Depth Module API, which anchors AR content in 3D space, and Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.

For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.

Filed Under: AR Development, News, XR Industry News

Spatial Photos on Vision Pro Are Getting a Volumetric Upgrade for Greater Immersion

June 9, 2025 From roadtovr

At WWDC today, Apple announced the headlining features of visionOS 26, its next big OS release for Vision Pro. Among them is a revamped spatial photos feature that ought to make images even more immersive.

Vision Pro launched with the ability to view spatial photos, captured either with the headset itself or with iPhone 16, 15 Pro, and 15 Pro Max. These spatial photos create a sense of depth and dimensionality by combining stereo capture with depth mapping applied to the image.
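The depth-mapping step relies on standard stereo triangulation: a point's depth is inversely proportional to its disparity, the horizontal shift of that point between the two captured views. A small illustrative sketch of that relationship (Apple hasn't detailed its actual pipeline; the numbers below are arbitrary examples):

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Estimate a point's depth (in meters) from its stereo disparity.

    disparity_px: horizontal pixel shift of the point between the two views.
    focal_px:     camera focal length expressed in pixels.
    baseline_m:   distance between the two camera centers in meters.

    Standard triangulation: depth = focal * baseline / disparity, so nearer
    points (larger disparity) map to smaller depths.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

With an assumed 1000 px focal length and a 10 cm baseline, a 50 px disparity corresponds to a point 2 m away, while a 100 px disparity corresponds to a point half as far; running this per pixel is what yields a depth map from a stereo pair.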

Now, Apple says it’s applied a new generative AI algorithm to create “spatial scenes with multiple perspectives, letting users feel like they can lean in and look around,” essentially ‘guessing’ at details not actually captured on camera.

With visionOS 26, Vision Pro users will be able to view spatial scenes in the Photos app, Spatial Gallery app, and Safari. The company says developers will also be able to use the Spatial Scene API to add the feature into their apps.

To show off the new AI-assisted spatial photos feature, real-estate marketplace Zillow says it's adopting the Spatial Scene API in its Zillow Immersive app for Vision Pro, which lets users see spatial images of homes and apartments.

Apple’s visionOS 26 is slated to arrive sometime later this year, although the company says testing is already underway.

Filed Under: Apple Vision Pro News & Reviews, News

Vision Pro’s Next Big Update Will Add Anchored Widgets That Live Around Your House

June 9, 2025 From roadtovr

Apple today announced at WWDC that Vision Pro is getting spatialized Widgets, coming along when visionOS 26 drops later this year.

On basically all of Apple’s devices, Widgets are designed to offer up personalized and useful info at a glance.

Now Apple says Vision Pro is getting spatial Widgets as well, letting you place a variety of these mini-apps around your house, where they'll reappear every time you put on the headset.

Apple says Widgets in visionOS 26 are “customizable, with a variety of options for frame width, color, and depth. Beautiful new widgets — including Clock, Weather, Music, and Photos — all offer unique interactions and experiences.”

Essentially, you'll be able to decorate your space with things like spatial photos, clocks with distinctive face designs, a calendar showing your events, and quick access to music playlists and songs, so you can, say, keep your favorite track in a specific part of your room.

Notably, Apple says developers will be able to create their own widgets using WidgetKit. There’s no word on exactly when visionOS 26 releases, although the company says we can expect it sometime later this year.


This story is breaking. We're at WWDC today and will report back when we learn more about all things Vision Pro.

Filed Under: Apple Vision Pro News & Reviews, News

