
VRSUN

Hot Virtual Reality News



Unofficial SteamVR Driver Will Reportedly Enable Support for WMR Headsets on Latest Windows Versions

June 9, 2025 From roadtovr

Microsoft pulled the plug on support for its entire WMR platform on Windows 11 last year, putting an official end to the company’s foray into PC VR headsets. Now, an unofficial SteamVR driver hopes to bring it back.

Microsoft deprecated the Mixed Reality Portal app, Windows Mixed Reality for SteamVR, and SteamVR Beta when the Windows 11 24H2 update rolled out last October, making a fleet of PC VR headsets from Acer, Asus, Dell, Lenovo, HP, and Samsung essentially expensive paperweights.

Granted, if you haven’t updated to Windows 11 24H2, or are still on Windows 10, Microsoft says you’ll be able to play SteamVR content through November 2026 before the plug is pulled for good. Still, that’s a bitter pill to swallow for users of WMR’s most modern headset, the HP Reverb G2, which was released less than five years ago.

Now, Reddit user ‘mbucchia’ claims an unofficial SteamVR driver is in the works, which aims to bring all WMR headsets back into the fold sometime this Fall. Below you can see the first HP Reverb (2019) in action:

Mbucchia says the driver, dubbed ‘Oasis’, “does not need the Mixed Reality Portal,” which was deprecated in Windows 11 24H2 last year alongside SteamVR beta support.

“This means it can work on Windows 11 24H2 and newer. It supports full 6DoF tracking along with motion controllers,” mbucchia says.

“As mentioned on another post, I don’t have all WMR headsets to test with. Though I can tell you that it [also] works on the original Acer AH100. It should in theory work on any brand/model,” mbucchia says.

The first wave of WMR headsets launched in 2017 | Image courtesy Microsoft

Work on the driver remains behind closed doors, mbucchia says, noting that Oasis will be restricted to Nvidia GPUs due to the way SteamVR interfaces with the GPU drivers.

Most interesting of all, though, is that Oasis isn’t being undertaken by just anyone. Mbucchia claims to be a current Microsoft employee who previously worked in the company’s Mixed Reality division.

“I am bound by NDAs and other obligations. I want to be clear that I have taken much care to NOT BREACH any of these agreements while working on this project. In particular, I am leveraging SteamVR for a lot of heavy lifting and I am not borrowing any Microsoft intellectual property,” mbucchia says.

For these reasons, Oasis won’t feature a beta, or similar early access, before its release in Fall 2025. It also won’t be open source.

“Much of the code is the result of deep reverse-engineer. Reverse-engineering that if shared, could be construed as exposing internals of programs like SteamVR or the Nvidia GPU drivers,” mbucchia explains. “Not[e] that here again, I am NOT BREACHING any proprietary/intellectual property. Having respect for both Valve and Nvidia, I will not divulge any of the code that they do not consider public.”

Mbucchia says they’ll reveal more about the project in the Windows Mixed Reality subreddit leading up to its Fall 2025 release.

Filed Under: News, PC VR News & Reviews

AWE 2025 Preview: 4 Companies Building XR’s Future

June 6, 2025 From roadtovr

AWE USA 2025, one of the XR industry’s largest annual conferences, kicks off next week. We got a preview of what four interesting companies attending the event will be showing.

As far as industry events go, AWE USA has become our must-attend XR event of the year. It kicks off next week on June 10–12 in Long Beach, CA. As the Premier Media Partner of this year’s event, our exclusive 20% discount on tickets is still available.

We’ll be on site at the event, reporting on the most important developments. Ahead of AWE, though, we asked four interesting companies for a preview of what they’re bringing to the show.

CREAL

At AWE 2025, CREAL will showcase its Clarity light-field display. Since the display’s release at the beginning of the year, CREAL has continuously improved its image quality by innovating on the spatial light modulator. Visitors will be able to experience the new display technology through a headset as well as a tabletop pair of glasses.

Both prototypes feature CREAL’s Clarity display, which includes the light field optical engine and holographic lenses. Beyond the display, the headset prototype integrates off-the-shelf components to enable full-scale demonstrations of our technology, while the glasses prototype is designed with custom components to showcase our ultimate form factor. | Image courtesy CREAL

XREAL

At AWE, XREAL will be demoing the ultra-popular XREAL One Series AR glasses with spatial computing capabilities. Also available for demo will be the XREAL EYE, a modular camera attachment for the One Series. XREAL will also be unveiling an exciting new accessory and showing it off in person for the very first time.

Image courtesy XREAL

ForgeFX

At AWE 2025, ForgeFX Simulations will unveil VORTEX, a next-generation XR training platform engineered for high-risk, high-consequence environments where traditional training methods fall short. Built on the proprietary ForgeSIM framework, VORTEX delivers immersive, AI-enhanced, scenario-based mission rehearsal through photorealistic LiDAR environments, GIS-enabled sand tables, voice-activated SMEs, and real-time performance analytics—already piloted by JPEO-CBRND for CBRN response. ForgeFX is also debuting an enhanced Horizontal Directional Drill (HDD) Simulator for the Meta Quest 3 PCVR, co-developed with Vermeer Corporation, featuring authentic drill controls and a new Auto Rod Exchange module that trains on a previously unsimulated, safety-critical task. At Booth #346, attendees can experience six interactive demos, including the JLG Access Ready XR trainer, Somero S-22EZ Laser Screed simulator, CBRND HoloTrainer, Trumpf Laser Cutting simulator, ForgeFX Excavator trainer, and Ocuweld welding VR simulator, each showcasing ForgeFX’s leadership in immersive, equipment-integrated training solutions.

Image courtesy ForgeFX

PICO

At AWE USA 2025, PICO will showcase the PICO 4 Ultra Enterprise, its latest all-in-one mixed reality headset designed for enterprise applications. Equipped with advanced MR capabilities and the PICOMotion Tracker for full-body and object tracking, the headset empowers industries to deliver highly immersive, practical solutions. PICO has successfully expanded into education, training and location-based entertainment (LBE), and visitors to the booth will have the opportunity to experience a selection of these real-world use cases firsthand. A private meeting space will also be available for deeper conversations about how PICO’s solutions can accelerate business strategies. PICO will also host two featured speaking sessions: ‘Unlocking the Potential of LBE: Scaling with PICO’s XR Solutions’ and ‘Superpowers for Spatial Developers: WebSpatial and SpatialML.’

Image courtesy PICO

What are you hoping to see from AWE 2025? Let us know in the comments below.

Filed Under: News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details, but now the company is gearing up to release the device to third-party researchers sometime next year, and in the process it’s showing what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs are leading the way for AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.

To boot, Meta showed off the glasses in action inside of a room as it performed simultaneous localization and mapping (SLAM):

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • New SubGHz radio tech for sub-millisecond time alignment between devices, crucial for multi-device setups

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but the glasses are also remarkably light at just 74-76g. For reference, a typical pair of eyeglasses can weigh anywhere from 20-50g, depending on the materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although the device does feature a USB-C port on the glasses’ right arm, which could possibly be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 not only tracks and analyzes the user’s environment, but also the user’s physical perception of that environment, like the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimulus—it also captures the relative position and movement through the environment, which is informed by its CV cameras, magnetometer, two inertial measurement units (IMUs) and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

While some early consumer AR glasses, such as XREAL One Pro, are already out there, packing thin displays, powerful processors, and enough battery to run it all day into a standalone pair of glasses isn’t a trivial feat—something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville, TN.

Filed Under: AR Development, ar industry, News, XR Industry News

Meta Partners with Ousted Oculus Founder’s Company to Build “the world’s best AR and VR systems for the US military”

May 29, 2025 From roadtovr

In a twist that promises to make the inevitable Palmer Luckey documentary even more dramatic, Palmer Luckey’s military tech company Anduril has now officially partnered with Meta to build “the world’s best AR and VR systems for the US military.”

Luckey founded Oculus in 2012, the company whose Rift headset was the spark that rebooted the modern era of VR. As a rapidly growing startup, Oculus attracted the attention of Meta (at the time Facebook), which acquired the company in 2014 for more than $2 billion. Luckey continued in VR under Meta’s roof for several years but was eventually pushed out of the company due to backlash over his politics. After leaving Meta, Luckey went on to found Anduril, a tech-defense startup which itself went on to achieve a multi-billion valuation.

Unsurprisingly, given Luckey’s background, Anduril itself has been developing XR tech alongside more traditional military products like drones and sensors. In February, Anduril announced that it was taking over Microsoft’s beleaguered Integrated Visual Augmentation System (IVAS) program, which seeks to produce AR helmets for the United States Army.

An early version of the IVAS helmet | Image courtesy Microsoft

Now Anduril says it’s working in concert with Meta to build “the world’s best AR and VR systems for the US military.”

“Anduril and Meta are partnering to design, build, and field a range of integrated XR products that provide warfighters with enhanced perception and enable intuitive control of autonomous platforms on the battlefield,” the announcement reads. “The capabilities enabled by the partnership will draw on more than a decade of investment by both companies in advanced hardware, software, and artificial intelligence. The effort has been funded through private capital, without taxpayer support, and is designed to save the U.S. military billions of dollars by utilizing high-performance components and technology originally built for commercial use.”

“I am glad to be working with Meta once again.” says Luckey. “Of all the areas where dual-use technology can make a difference for America, this is the one I am most excited about. My mission has long been to turn warfighters into technomancers, and the products we are building with Meta do just that.”

Both Meta CEO Mark Zuckerberg and CTO Andrew “Boz” Bosworth—who were publicly at odds with Luckey following his ousting from Meta—provided quotes as part of the announcement, further cementing a renewed relationship between Meta and Luckey.

Oculus & Anduril founder Palmer Luckey (left) and Meta CEO Mark Zuckerberg (right) pose for a new image demonstrating their renewed relationship | Image courtesy Palmer Luckey

Thus far it sounds like the work between the companies will largely center on the headset being built for the IVAS project, a $20 billion program to build an AR helmet for ground soldiers. The program was initially headed by Microsoft, but Anduril has purportedly taken a leading role, and has now tapped Meta to bring some of its technology to the battlefield.

Filed Under: News, XR Industry News

Meta is Testing a Quest UI Overhaul and 3D Instagram Photos in Latest Horizon OS Release

May 23, 2025 From roadtovr

Meta announced it’s now running a test in Quest’s latest Horizon OS release (v77) that overhauls the platform’s dock-based UI with a new launcher overlay. Additionally, Meta says some users will see 3D Instagram photos in their feed on Quest, which is neat.

First teased at Connect 2024, Meta is finally bringing Navigator to Quest, which serves as a new centralized hub for apps, quick actions, and system functions.

“As part of our work to develop a fully spatial operating system designed around people, Navigator gives you convenient access to your recently used applications, with the added ability to pin up to 10 items in your library for quick access and seamless task resumption. This makes it easier to multitask in-headset and connect with the people and things you care about most,” Meta says in the v77 patch notes.

Essentially, Navigator is supposed to make it easier to access system-level controls and then quickly return to what you were doing in-headset. More specifically, the new UI should feel pretty familiar to smartphone users thanks to its more traditional layout.

YouTuber ‘The Construct’ shows off Navigator, including a tutorial video and hands-on impressions:

“We designed Navigator based on everything we’ve learned over the last decade. It’s unobtrusive, intuitive, and built from the ground up for the unique needs of spatial computing,” Meta says.

The company says Navigator will begin rolling out as a limited test to some people on the Public Test Channel (PTC) v77, and is expected to reach all users gradually over the coming months.

Additionally, Instagram is getting a little love on Quest too, as Meta says it’s currently testing 3D-ified photos on the platform. For some users on PTC v77, Meta’s AI will automatically transform existing 2D photos not originally captured in 3D into an immersive format.

“And it’s an early look at our plans to continue bringing more social and entertainment experiences that are 2D today into a more immersive, 3D future,” Meta says.

Note: To enroll in Quest’s Public Test Channel (PTC), you need to use the Meta Horizon app on your phone and navigate to the ‘Devices’ section. Select your Quest headset and then go to ‘Headset settings’ and then ‘Advanced Settings’. Finally, toggle on ‘Public Test Channel’.

Filed Under: Meta Quest 3 News & Reviews, News

Google Teases Next Android XR Device: XREAL’s Upcoming AR Glasses ‘Project Aura’

May 21, 2025 From roadtovr

When it launches later this year, Android XR is coming first to Samsung’s mixed reality headset, Project Moohan. Now, Google has tapped AR glasses creator XREAL to be the second with its newly unveiled Project Aura.

Google announced at its I/O developer event that China-based XREAL will be the second device officially slated to run Android XR, the company’s forthcoming XR operating system currently in developer preview.

Codenamed Project Aura, the companies describe the optical see-through (OST) device as “a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR.”

Information is still thin; however, XREAL says Project Aura was created in collaboration with Google and chip-maker Qualcomm, and will be made available to developers “soon after” the launch of Project Moohan, which was recently affirmed to arrive later this year.

Image courtesy XREAL

XREAL hasn’t released specs, although the company has a track record of pairing micro-OLEDs with birdbath optics, which differs from the more expensive waveguide optics seen in devices such as Microsoft HoloLens, Magic Leap One, or Meta’s Orion AR glasses prototype.

Birdbath optics use a curved mirror system for brighter, higher field-of-view (FOV) and lower-cost AR displays, although this typically results in bulkier designs. Waveguides are often thinner and more expensive to manufacture, but provide more wearable form factors with better transparency; waveguides also typically feature a lower FOV, although prototypes like Meta Orion are bucking that trend.

Like the Android XR glasses seen on stage at Google I/O, which are coming from eyewear companies Warby Parker and Gentle Monster, XREAL’s Project Aura is expected to feature built-in Gemini AI, allowing it to do things like real-time translation, AI assistant chats, web searches, object recognition, and displaying contextual info.

Choosing XREAL as its next Android XR hardware partner makes a good deal of sense here. Founded in 2017, XREAL (previously Nreal) has developed a number of AR glasses generations over the years, including its own custom Android launcher, Nebula, to handle native AR experiences on Android devices.

Like previous XREAL devices, Project Aura is meant to be tethered, not standalone. It’s uncertain just what external device will run Android XR for the glasses, be it a standard smartphone or a dedicated ‘puck’ like XREAL Beam.

That said, XREAL says it’ll be talking more about Project Aura at the Augmented World Expo (AWE) next month, which takes place June 10th – 12th in Long Beach, California. We’re going to be present at AWE this year, so check back soon for more on all things XR to come from the event.

Filed Under: AR News, News

Google Partners with Prominent Eyewear Makers for Upcoming Android XR Smartglasses

May 20, 2025 From roadtovr

Google today announced that it is working with eyewear makers Warby Parker and Gentle Monster to bring the first Android XR smartglasses to market. The move mirrors Meta’s early partnership with EssilorLuxottica, the dominant eyewear maker that’s behind Meta’s Ray-Ban smartglasses.

While no productized Android XR smartglasses have been announced, Google said today it is working with eyewear makers Warby Parker and Gentle Monster on the first generation of products. Android XR smartglasses will prominently feature Google’s Gemini AI, and some will include on-board displays for visual output.

Image courtesy Google

Warby Parker is a well-known American eyewear brand, founded in 2010, which pioneered a lower-cost, direct-to-consumer glasses business. Gentle Monster, founded in 2011, is a well-known South Korean eyewear brand with a similar approach to Warby Parker’s.

While influential, both eyewear makers pale in comparison to EssilorLuxottica, the massive eyewear and lens conglomerate behind brands like Ray-Ban and Oakley.

EssilorLuxottica and Meta partnered several years ago around their smartglasses ambitions. Things seem to be going well for the partnership as the duo has launched several iterations of the Meta Ray-Ban smartglasses featuring classic Ray-Ban designs.

Ray-Ban Meta Glasses, Image courtesy Meta, EssilorLuxottica

Google is now taking the same tack by partnering with two well-known glasses-makers to ensure that it has strong brand and fashion credibility behind its upcoming Android XR smartglasses.

The company’s first pair of smartglasses, Google Glass, launched way back in 2012. Although they were impressively compact for their time (especially considering the inclusion of a display), the asymmetrical design of the bulky display optics was seen as socially off-putting—just a bit too weird to pass as regular glasses.

That sent Google (and others) back to the drawing board for years, waiting until the tech could advance enough to make smartglasses that looked more socially acceptable.

It’s unclear when the first Android XR smartglasses will launch, or what they might cost, but Google also said today that developers will be able to start developing for Android XR smartglasses later this year.

Filed Under: News, XR Industry News

Project Starline Immersive Videoconferencing Now Branded Google Beam, Heading to Market with HP

May 20, 2025 From roadtovr

Today at its annual I/O developer conference, Google affirmed plans to bring its Project Starline immersive videoconferencing platform to market with HP. While this partnership was confirmed last year, the product is now officially called Google Beam, with more info promised soon.

Google’s Project Starline is a platform for immersive videoconferencing which was first introduced in 2021. But rather than using a headset, the platform is built around cameras and a light-field display. The light-field display shows natural 3D depth without the need for the viewer to wear a headset or glasses. The goal, the company says, is to create a system that feels like two people are talking to each other face-to-face in the same room, rather than feeling like they are separated by a screen and cameras.

Image courtesy Google

Google has been evolving the system over the years to improve usability and quality. Today the company showed a glimpse of the latest version of the system which it says is coming to market under the name Google Beam.

Image courtesy Google

As confirmed last year, Google is working with HP to bring Google Beam to market starting this year with an initial focus on enterprise customers seeking high-quality videoconferencing. While details are still light, Google says that “HP will have a lot more to share a few weeks from now.”

Image courtesy Google

Filed Under: News, XR Industry News

Google Teases Android Smart Glasses Ahead of I/O Developer Conference Next Week

May 16, 2025 From roadtovr

Google may be getting ready to unveil a pair of smart glasses at its Google I/O developer conference next week, ostensibly hoping to take on Ray-Ban Meta Glasses.

In a promo for Google I/O, Android Ecosystem President Sameer Samat showed off what appears to be a pair of smart glasses.

While Samat didn’t speak directly about the device, when donning the glasses, he said Google I/O attendees will have a chance to see “a few more really cool Android demos.”

Using our CSI-style enhancement abilities (aka ‘crop a YouTube screenshot’), the distinctly Ray-Ban Wayfarer-style glasses appear to have a single camera sensor on the left temple.

Image courtesy Google

There’s also what appears to be an LED above the camera sensor, likely to inform others when video or pictures are being taken, which may indicate it’s going for feature parity with Ray-Ban Meta Glasses.

The glasses’ chunky arms are also likely packed with batteries and onboard processors, and, owing to Samat’s tease, the device is probably running some version of Google’s upcoming Android XR operating system. Notably, just under the left arm we can see a small slit close to Samat’s ear, possibly for integrated audio. Alternatively, it may not be a slit at all, but rather a button of some sort.

Meanwhile Apple may be readying its own pair of smart glasses, with a recent Bloomberg report maintaining the company is now developing a processor specifically optimized for the task.

In any case, we’re hoping to find out more at Google I/O, which is slated to kick off May 20th – 21st, where the company will feature livestreamed keynotes, developer sessions, and more. Outside of the keynote, which may actually mention Android XR, the event is set to include two developer talks specifically dedicated to Android XR.

We’ll of course be tuning in, although you can watch the keynote live on YouTube starting on Tuesday, May 20th at 10 AM PT.

Check out the moment below:

Filed Under: News, XR Industry News

Apple is Reportedly Developing Smart Glasses to Rival Ray-Ban Meta Glasses

May 13, 2025 From roadtovr

Apple is reportedly developing a new chip for an upcoming pair of smart glasses which is aiming to compete with Ray-Ban Meta Glasses, according to a recent Bloomberg report from Mark Gurman.

Apple’s smart glasses chip is reportedly based on the low-energy processors used in Apple Watches, which are being optimized for power efficiency and the ability to control multiple cameras.

The report maintains production of the chip is expected to start by late 2026 or 2027, positioning the device for a market launch within the next two years. Apple’s long-time chips partner Taiwan Semiconductor Manufacturing Co. is expected to handle production.

“Apple is currently exploring non-AR [smart] glasses that use cameras to scan the surrounding environment and rely on AI to assist users,” Gurman says. “That would make the device similar to the Meta product, though Apple is still figuring out the exact approach it wants to take. The iPhone maker also needs its own artificial intelligence technology to vastly improve before the company can roll out a compelling AI-centric device.”

As for Apple’s continued augmented reality efforts, Bloomberg reported in April that Apple CEO Tim Cook “cares about nothing else” than beating Meta to market with a pair of AR glasses, representing a multi-year challenge that goes far beyond creating a pair of smart glasses.

In short, smart glasses like Ray-Ban Meta Glasses can play audio, take pictures, make phone calls, and access a voice assistant. The latest version of the device, released in 2023, has been so successful that Meta is reportedly set to release a next generation that includes a single heads-up display.

Meanwhile, the sort of all-day AR glasses companies like Apple, Google, and Meta are hoping to build go several steps further. AR glasses overlay digital content onto the real world, blending virtual objects and information with the physical environment through transparent displays, which requires more advanced sensors, displays, optics, processors, batteries, and thermal management.

Filed Under: Apple Vision Pro News & Reviews, News
