
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


Meta is Testing a Quest UI Overhaul and 3D Instagram Photos in Latest Horizon OS Release

May 23, 2025 From roadtovr

Meta announced it’s now running a test in Quest’s latest Horizon OS release (v77) that replaces the platform’s dock-based UI with a new launcher overlay. Additionally, Meta says some users will see 3D Instagram photos in their feed on Quest.

First teased at Connect 2024, Meta is finally bringing Navigator to Quest, which serves as a new centralized hub for apps, quick actions, and system functions.

“As part of our work to develop a fully spatial operating system designed around people, Navigator gives you convenient access to your recently used applications, with the added ability to pin up to 10 items in your library for quick access and seamless task resumption. This makes it easier to multitask in-headset and connect with the people and things you care about most,” Meta says in the v77 patch notes.

Essentially, Navigator is supposed to make it easier to access system-level controls and then quickly return to what you were doing in-headset. More specifically, the new UI should feel pretty familiar to smartphone users thanks to its more traditional layout.

YouTuber ‘The Construct’ has shown off Navigator in a tutorial video alongside hands-on impressions.

“We designed Navigator based on everything we’ve learned over the last decade. It’s unobtrusive, intuitive, and built from the ground up for the unique needs of spatial computing,” Meta says.

The company says Navigator will debut as a limited test for some users on the Public Test Channel (PTC) v77, then gradually roll out to all users over the coming months.

Additionally, Instagram is getting a little love on Quest too, as Meta says it’s currently testing 3D-ified photos on the platform. For some users on PTC v77, Meta’s AI will automatically transform existing 2D photos not originally captured in 3D into an immersive format.

“And it’s an early look at our plans to continue bringing more social and entertainment experiences that are 2D today into a more immersive, 3D future,” Meta says.

Note: To enroll in Quest’s Public Test Channel (PTC), you need to use the Meta Horizon app on your phone and navigate to the ‘Devices’ section. Select your Quest headset and then go to ‘Headset settings’ and then ‘Advanced Settings’. Finally, toggle on ‘Public Test Channel’.

Filed Under: Meta Quest 3 News & Reviews, News

Google Teases Next Android XR Device: XREAL’s Upcoming AR Glasses ‘Project Aura’

May 21, 2025 From roadtovr

When it launches later this year, Android XR is coming first to Samsung’s mixed reality headset, Project Moohan. Now, Google has tapped AR glasses creator XREAL to be the second with its newly unveiled Project Aura.

Google announced at its I/O developer event that China-based XREAL will be the second device officially slated to run Android XR, the company’s forthcoming XR operating system currently in developer preview.

Codenamed Project Aura, the companies describe the optical see-through (OST) device as “a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR.”

Information is still thin; however, XREAL says Project Aura was created in collaboration with Google and chipmaker Qualcomm, and will be made available to developers “soon after” the launch of Project Moohan, which was recently affirmed to arrive later this year.

Image courtesy XREAL

XREAL hasn’t released specs, although the company has a track record of pairing micro-OLEDs with birdbath optics, which differs from the more expensive waveguide optics seen in devices such as Microsoft HoloLens, Magic Leap One, or Meta’s Orion AR glasses prototype.

Birdbath optics use a curved mirror system to deliver brighter, wider field-of-view (FOV), lower-cost AR displays, although this typically results in bulkier designs. Waveguides are thinner but more expensive to manufacture, providing more wearable form factors with better transparency; they also typically offer a lower FOV, although prototypes like Meta Orion are bucking that trend.

Like the Android XR glasses seen on stage at Google I/O, which are coming from eyewear companies Warby Parker and Gentle Monster, XREAL Project Aura is expected to feature built-in Gemini AI, allowing it to do things like real-time translation, AI assistant chats, web searches, object recognition, and displaying contextual info.

Choosing XREAL as its next Android XR hardware partner makes a good deal of sense. Founded in 2017, XREAL (previously Nreal) has developed a number of AR glasses generations over the years, and has built its own custom Android launcher, Nebula, to handle native AR experiences on Android devices.

Like previous XREAL devices, Project Aura is meant to be tethered, not standalone. It’s uncertain just what external device will run Android XR for the glasses, be it a standard smartphone or a dedicated ‘puck’ like XREAL Beam.

That said, XREAL says they’ll be talking more about Project Aura at the Augmented World Expo (AWE) next month, which takes place June 10th – 12th in Long Beach, California. We’re going to present at AWE this year, so check back soon for more on all things XR to come from the event.

Filed Under: AR News, News

Google Partners with Prominent Eyewear Makers for Upcoming Android XR Smartglasses

May 20, 2025 From roadtovr

Google today announced that it is working with eyewear makers Warby Parker and Gentle Monster to bring the first Android XR smartglasses to market. The move mirrors Meta’s early partnership with EssilorLuxottica, the dominant eyewear maker that’s behind Meta’s Ray-Ban smartglasses.

While no productized Android XR smartglasses have been announced, Google said today it is working with eyewear makers Warby Parker and Gentle Monster on the first generation of products. Android XR smartglasses will prominently feature Google’s Gemini AI, and some will include on-board displays for visual output.

Image courtesy Google

Warby Parker is a well-known American eyewear brand, founded in 2010, which pioneered a lower-cost, direct-to-consumer glasses business. Gentle Monster, founded in 2011, is a well-known South Korean eyewear brand with a similar approach.

While influential, both eyewear makers pale in comparison to EssilorLuxottica, the massive eyewear and lens conglomerate behind brands like Ray-Ban and Oakley.

EssilorLuxottica and Meta partnered several years ago around their smartglasses ambitions. Things seem to be going well for the partnership as the duo has launched several iterations of the Meta Ray-Ban smartglasses featuring classic Ray-Ban designs.

Ray-Ban Meta Glasses, Image courtesy Meta, EssilorLuxottica

Google is now taking the same tack by partnering with two well-known glasses makers to ensure it has strong brand and fashion credibility behind its upcoming Android XR smartglasses.

The company’s first pair of smartglasses, Google Glass, launched way back in 2012. Although they were impressively compact for their time (especially considering the inclusion of a display), the asymmetrical design of the bulky display optics was seen as socially off-putting—just a bit too weird to pass as regular glasses.

That sent Google (and others) back to the drawing board for years, waiting until the tech could advance enough to make smartglasses that looked more socially acceptable.

It’s unclear when the first Android XR smartglasses will launch, or what they might cost, but Google also said today that developers will be able to start developing for Android XR smartglasses later this year.

Filed Under: News, XR Industry News

Project Starline Immersive Videoconferencing Now Branded Google Beam, Heading to Market with HP

May 20, 2025 From roadtovr

Today at its annual I/O developer conference, Google affirmed plans to bring its Project Starline immersive videoconferencing platform to market with HP. While this partnership was confirmed last year, the product is now officially called Google Beam, with more info promised soon.

Google’s Project Starline is a platform for immersive videoconferencing which was first introduced in 2021. But rather than using a headset, the platform is built around cameras and a light-field display. The light-field display shows natural 3D depth without the need for the viewer to wear a headset or glasses. The goal, the company says, is to create a system that feels like two people are talking to each other face-to-face in the same room, rather than feeling like they are separated by a screen and cameras.

Image courtesy Google

Google has been evolving the system over the years to improve usability and quality. Today the company showed a glimpse of the latest version of the system which it says is coming to market under the name Google Beam.

Image courtesy Google

As confirmed last year, Google is working with HP to bring Google Beam to market starting this year with an initial focus on enterprise customers seeking high-quality videoconferencing. While details are still light, Google says that “HP will have a lot more to share a few weeks from now.”

Image courtesy Google

Filed Under: News, XR Industry News

Google Teases Android Smart Glasses Ahead of I/O Developer Conference Next Week

May 16, 2025 From roadtovr

Google may be getting ready to unveil a pair of smart glasses at its Google I/O developer conference next week, ostensibly hoping to take on Ray-Ban Meta Glasses.

In a promo for Google I/O, Android Ecosystem President Sameer Samat showed off what appears to be a pair of smart glasses.

While Samat didn’t speak directly about the device, when donning the glasses, he said Google I/O attendees will have a chance to see “a few more really cool Android demos.”

Using our CSI-style enhancement abilities (aka ‘crop a YouTube screenshot’), the distinctly Ray-Ban Wayfarer-style glasses appear to have a single camera sensor on the left temple.

Image courtesy Google

There’s also what appears to be an LED above the camera sensor, likely to inform others when video or photos are being captured, which may indicate Google is going for feature parity with Ray-Ban Meta Glasses.

The glasses’ chunky arms are also likely packed with batteries and onboard processors, and, owing to Samat’s tease, the device is probably running some version of Google’s upcoming Android XR operating system. Notably, just under the left arm we can see a small slit close to Samat’s ear, possibly for integrated audio. Alternatively, it may not be a slit at all, but rather a button of some sort.

Meanwhile, Apple may be readying its own pair of smart glasses, with a recent Bloomberg report maintaining the company is now developing a processor specifically optimized for the task.

In any case, we’re hoping to find out more at Google I/O, which runs May 20th – 21st and will feature livestreamed keynotes, developer sessions, and more. Beyond the keynote, which may itself mention Android XR, the event includes two developer talks specifically dedicated to Android XR.

We’ll of course be tuning in, although you can watch the keynote live on YouTube starting on Tuesday, May 20th at 10 AM PT.


Filed Under: News, XR Industry News

Apple is Reportedly Developing Smart Glasses to Rival Ray-Ban Meta Glasses

May 13, 2025 From roadtovr

Apple is reportedly developing a new chip for an upcoming pair of smart glasses which is aiming to compete with Ray-Ban Meta Glasses, according to a recent Bloomberg report from Mark Gurman.

Apple’s smart glasses chip is reportedly based on the low-energy processors used in Apple Watches and is being optimized for power efficiency and the ability to control multiple cameras.

The report maintains production of the chip is expected to start by late 2026 or 2027, positioning the device for a market launch within the next two years. Apple’s long-time chips partner Taiwan Semiconductor Manufacturing Co. is expected to handle production.

“Apple is currently exploring non-AR [smart] glasses that use cameras to scan the surrounding environment and rely on AI to assist users,” Gurman says. “That would make the device similar to the Meta product, though Apple is still figuring out the exact approach it wants to take. The iPhone maker also needs its own artificial intelligence technology to vastly improve before the company can roll out a compelling AI-centric device.”

As for Apple’s continued augmented reality efforts, Bloomberg reported in April that Apple CEO Tim Cook “cares about nothing else” than beating Meta to market with a pair of AR glasses, representing a multi-year challenge that goes far beyond creating a pair of smart glasses.

In short, smart glasses like Ray-Ban Meta Glasses can play audio, take pictures, make phone calls, and access a voice assistant. The latest version of the device, released in 2023, has reportedly been so successful that Meta is set to release a next generation that includes a single heads-up display.

Meanwhile, the sort of all-day AR glasses companies like Apple, Google, and Meta hope to build go several steps further. AR glasses overlay digital content onto the real world, blending virtual objects and information with the physical environment through transparent displays, which requires more advanced sensors, displays, optics, processors, batteries, and thermal management.

Filed Under: Apple Vision Pro News & Reviews, News

Pimax Delays Thin & Light ‘Dream Air’ PC VR Headset to Q3 2025, Reveals Cheaper ‘Dream Air SE’ Version

May 12, 2025 From roadtovr

Pimax announced Dream Air last December, aiming to take on the emerging segment of compact high-end PC VR headsets, such as Bigscreen Beyond and Shiftall MeganeX Superlight 8K. And before the company has even released Dream Air, Pimax has revealed it’s also producing a cheaper version: Dream Air SE.

Previously expected to launch in May, Dream Air has been delayed to August-September 2025, the company announced during its Pimax Connect event, as it waits on high-end Sony micro-OLED panels (3,840 × 3,552 per eye).

“Sony’s micro-OLED panels are top-tier, also used by Apple and Google,” says Pimax European Marketing Director Martin Lammi. “They have an excellent quality consistency across all panels and their visual effect is better. This is because the brightness is higher and the pixels have a wider view angle or ‘chief ray angle’, up from 15 degrees to 20 degrees.”

Other updates to Dream Air include a more balanced split-cable design, which was revealed in March, as well as an optional flip-up style halo headstrap, and support for third-party head straps, such as HTC’s Deluxe Audio Strap.

In the meantime, Pimax revealed Dream Air SE, which includes many of the same features of Dream Air, including micro-OLED panels, integrated audio, self-adjusting strap, pancake lenses, hand-tracking, and Tobii eye-tracking. The standout difference though is Dream Air SE’s 2,560 × 2,560 resolution micro-OLEDs and lower price.

Image courtesy Pimax

Dream Air SE starts at $899 for the Lighthouse version, appealing to those with existing SteamVR base stations and controllers. The SLAM version, priced at $1,199, includes controllers and inside-out tracking. You can find them both available for pre-order on Pimax’s website.

Like Pimax’s other headsets, users pay an upfront cost for the headset, which comes with a 14-day trial period. Afterwards, if users want to keep the headset, they pay for a Pimax Prime software membership for continued access. Here’s how that breaks down:

  • Dream Air SE – SLAM Version: $699 upfront + $500 Prime = $1,199 total
  • Dream Air SE – Lighthouse Version (no controllers or basestation): $599 upfront + $300 Prime = $899 total

For comparison, Dream Air Lighthouse version starts at $1,899 ($1,199 upfront + $700 Prime), with the Dream Air SLAM version priced at $2,199 ($1,399 upfront + $800 Prime).
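The two-part pricing above is easy to mis-add, so here's a small Python sketch that tallies the upfront and Prime components for each version (figures taken directly from the article; the `total` helper is purely illustrative):

```python
# Pimax's split pricing: an upfront hardware cost plus a one-time
# "Pimax Prime" software fee paid after the 14-day trial period.
# All figures are the USD prices quoted in the announcement.
PRICING = {
    "Dream Air SE (Lighthouse)": (599, 300),
    "Dream Air SE (SLAM)":       (699, 500),
    "Dream Air (Lighthouse)":    (1199, 700),
    "Dream Air (SLAM)":          (1399, 800),
}

def total(upfront: int, prime: int) -> int:
    """Total cost of ownership once the Prime fee is paid."""
    return upfront + prime

for name, (upfront, prime) in PRICING.items():
    print(f"{name}: ${upfront} + ${prime} Prime = ${total(upfront, prime)}")
```

Running this reproduces the totals quoted above: $899 and $1,199 for the SE versions, $1,899 and $2,199 for the full Dream Air.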


Filed Under: News, PC VR News & Reviews

Half the Size & Half the Price is What Vision Pro Needs to Take Off

May 6, 2025 From roadtovr

Apple has set the bar for UX on a standalone headset. As soon as the company can get the same experience into a smaller and cheaper package, it’s going to become significantly more appealing to a wider range of people.

Apple has billed Vision Pro as “tomorrow’s technology, today.” And frankly, that feels pretty accurate if we’re talking about the headset’s core user experience, which is far beyond other products on the market. Vision Pro is simple and intuitive to use. It might not do as much as a headset like Quest, but what it does do, it does extremely well. But it’s still undeniably big, bulky, and expensive… my recommendation is that it’s not worth buying for most people.

And that’s probably why there seems to be a broadly held notion that Vision Pro is a bad product… a rare flop for Apple. But as someone who has used the headset since launch, I can plainly see all the ways the headset is superior to what else is out there.

Saying Vision Pro is a bad product is a bit like saying a Ferrari is a bad car for not being as widespread as a Honda Accord.

I don’t know if the first generation of Vision Pro met Apple’s sales expectations or fell short of them. But what I do know is that the headset offers an incredibly compelling experience that’s significantly held back by its price and size.

If Apple can take the exact same specs, capabilities, and experience, and fit them into something that’s half the size and costs half as much, I’m certain the headset will see a massive boost in demand.

A more compact Vision Pro concept | Photo generated by Road to VR

Cutting it down to half the size would mean bringing it to around 310 grams; certainly not easy, but also not entirely unrealistic, especially if Apple sticks to an off-board battery. After all, Bigscreen Beyond weighs around 180 grams. It might not be a standalone headset, but it shows how compact the housing, optics, and displays can be.

And half the cost would mean a price tag of roughly $1,750. Still not cheap compared to most headsets out there, but significantly more attainable, especially if Apple can market it as also being the best TV most people will have in their home.
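For the curious, the arithmetic behind those targets is simple. The sketch below assumes Vision Pro's $3,499 launch price and a headset weight of roughly 620 grams without the external battery pack (actual weight varies by light seal and headband, so treat the figure as approximate):

```python
# Back-of-the-envelope arithmetic behind the "half the size,
# half the price" targets. Assumed baseline: $3,499 launch price,
# ~620 g headset weight (excluding the off-board battery).
VISION_PRO_PRICE_USD = 3499
VISION_PRO_WEIGHT_G = 620
BIGSCREEN_BEYOND_G = 180  # for comparison: a tethered PC VR headset

half_price = VISION_PRO_PRICE_USD / 2   # ~$1,750
half_weight = VISION_PRO_WEIGHT_G / 2   # ~310 g

print(f"Half price target: ~${round(half_price)}")
print(f"Half weight target: ~{round(half_weight)} g "
      f"(Bigscreen Beyond is ~{BIGSCREEN_BEYOND_G} g)")
```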

This might seem obvious. Making any tech product smaller and cheaper is a good thing.

But my point here is that Vision Pro is disproportionately held back by its size and cost. It has way more to be gained by halving its size and cost than Quest, for instance, because Quest’s core UX is still very clunky.

Fitting the Quest experience into something half the size and half the cost would be nice, but the core UX would still be holding it back in a big way.

On the other hand, Vision Pro feels like its core UX is just waiting to be unleashed… halving the size and cost wouldn’t just be nice, it would be transformative.

Of course this is much easier said than done. After all, you might counter that the very reason why Vision Pro’s core UX is so great is because it costs so much. It must be the expensive hardware that makes the difference between Quest and Vision Pro.

While this is perhaps true in some specific cases, far more often it’s the software experience that makes Vision Pro excel in usability. For instance, we explained previously that Quest 3 actually has higher effective resolution than Vision Pro, but it’s Vision Pro’s thoughtful software design that leads most people to conclude it looks much better visually.

And when I say that Vision Pro will take off when it reaches half the size and half the price, I’m not even factoring in several key improvements that will hopefully come with future versions of the headset (like sharper passthrough with less motion blur and some enhancements to the software).

Apple has set a high bar for how its headset should feel and how easy it is to use. The question now is not if, but when, the company can deliver the same experience in a smaller and less expensive package.

Filed Under: Apple Vision Pro News & Reviews, News, XR Industry News

Spacetop Launches Windows App to Turn Laptops into Large AR Workspaces

May 2, 2025 From roadtovr

Late last year, Sightful announced it was cancelling its unique laptop with built-in AR glasses, instead pivoting to build a version of its AR workspace software for Windows. Now the company has released Spacetop for Windows, which lets you transform your environment into a private virtual display for productivity on the go.

Like its previous hardware, Spacetop works with XREAL AR glasses; however, the new subscription-based app targets a much broader set of AI PCs, including the latest hardware from Dell, HP, Lenovo, Asus, Acer, and Microsoft.

Previously, the company was working on its own ‘headless’ laptop of sorts, which ran an Android-based operating system called SpaceOS. However, Sightful announced in October 2024 that it was cancelling the Spacetop G1 AR workspace device, slated to cost $1,900, and refunding customers.

At the time, Sightful said the pivot came down to just how much neural processing units (NPUs) could improve processing power and battery efficiency when running AR applications.

Image courtesy Sightful

Now, Sightful has released its own Spacetop Bundle at $899, which includes XREAL Air 2 Ultra AR glasses (regularly priced at $699) and a 12-month Spacetop subscription (renews annually at $200).

Additionally, Sightful is selling optional optical lenses at an added cost, including prescription single-vision lens inserts for $50, and prescription progressive-vision lens inserts for $150.

Recommended laptops include the Dell XPS Core Ultra 7 (32GB), HP EliteBook, Lenovo Yoga Slim, ASUS Zenbook, Acer Swift Go 14, and Microsoft Surface Pro for Business (Ultra 7); however, Sightful notes this list isn’t exhaustive, as the range of devices integrating Intel Core Ultra 7/9 processors with Meteor Lake architecture (or newer) is continuously growing.

Key features include:

  • Seamless access to popular apps: Spacetop works with consumer and business apps
    that power productivity every day for Windows users
  • Push, slide, and rotate your workspace with intuitive keystrokes
  • Travel mode that keeps your workspace with you on the go, whether on a plane or train, in a coffee shop, in an Uber, or on your sofa
  • Bright, crystal-clear display that adjusts to lighting for use indoors and out
  • Natural OS experience, designed to feel familiar yet unlock the potential of spatial computing vs. a simple screen extension
  • All-day comfort with lightweight glasses (83g)
  • Massive 100” display for a multi-monitor / multi-window expansive workspace
  • Ergonomic benefits help avoid neck strain, hunching, and squinting at a small display

Backed by over $61M in funding, Sightful was founded in 2020 by veterans from PrimeSense, Magic Leap, and Broadcom. It is headquartered in Tel Aviv with offices in Palo Alto, New York, and Taiwan. You can learn more about Spacetop for Windows here.

Filed Under: AR Development, ar industry, News, XR Industry News

Quest Devs Can Now Publish Apps That Use the Headset’s Cameras to Scan the World

May 1, 2025 From roadtovr

While Meta’s Quest has always relied heavily on cameras to track the location of the headset, controllers, and the world around the user, developers haven’t had the same privileged access to the headset’s cameras. Earlier this year, Meta gave developers the ability to experiment with direct access to those cameras in private projects; starting this week, developers can publicly release apps that make use of the new feature.

This week’s update of the Passthrough Camera API for Quest means that developers can now publish apps to the Horizon store that directly access the front-facing cameras of Quest 3 and 3S. This opens the door to third-party applications which can scan the world around the user to understand more about it. For instance, developers could add computer-vision capabilities to track objects or people in the scene, or to build a map of the environment for analysis and interaction.

For a long time this was impossible due to limitations Meta placed on what developers could and couldn’t do with the headset’s hardware. Despite computer-vision capabilities being widely available to developers on smartphones, Meta was hesitant to allow the same on its headsets, apparently due to privacy concerns (and surely amplified by the many privacy controversies the company has faced in the past).

Previously, third-party apps could learn some information about the world around the user—like the shape of the room and objects within it—but this information was provided by the system in a way that prevented apps from directly seeing what the cameras could see. This made it possible for developers to build mixed reality applications that were, to some extent, aware of the space around the user. But it made some use-cases difficult or even impossible; for example, tracking a specific object held by the user.

Last year Meta announced it would finally unlock direct access to the headset’s cameras. In March, it began offering an experimental version of the capability to developers, allowing them to build apps that accessed the headset’s cameras. But they weren’t allowed to publish those apps to the public, until now.

The company has also specified the technical capabilities and performance of the cameras that developers can access on Quest 3 and 3S:

  • Image capture latency: 40-60ms
  • GPU overhead: ~1-2% per streamed camera
  • Memory overhead: ~45MB
  • Data rate: 30Hz
  • Max resolution: 1280×960
  • Internal data format: YUV420
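From those published specs you can roughly estimate how much raw image data an app would be handling per camera stream. The sketch below assumes the standard YUV420 layout of 12 bits (1.5 bytes) per pixel; actual memory use in an app will differ from this raw-stream figure:

```python
# Rough bandwidth estimate from Meta's published Quest camera specs.
# YUV420 stores 1.5 bytes per pixel: a full-resolution luma plane
# plus two quarter-resolution chroma planes.
WIDTH, HEIGHT = 1280, 960   # max resolution
FPS = 30                    # data rate
BYTES_PER_PIXEL = 1.5       # YUV420

frame_bytes = int(WIDTH * HEIGHT * BYTES_PER_PIXEL)  # bytes per frame
stream_bytes_per_s = frame_bytes * FPS               # bytes per second

print(f"Per frame: {frame_bytes / 1e6:.2f} MB")            # ~1.84 MB
print(f"Per camera stream: {stream_bytes_per_s / 1e6:.1f} MB/s")  # ~55.3 MB/s
```

At roughly 55 MB/s of raw pixels per camera, the ~45MB memory overhead and 1-2% GPU cost Meta quotes suggest the system hands apps buffers rather than copies of every frame.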

Meta says that a developer’s use of camera data on Quest is covered under its Developer Data Use Policy, including a section on “Prohibited Uses of User Data,” which prohibits certain uses of data, including to “perform, facilitate, or provide tools for surveillance,” and “uniquely identifying a device or user, except as permitted [in the policy].”

Filed Under: Meta Quest 3 News & Reviews, News, XR Industry News
