
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


XR Industry News

Meta Teases Oakley Partnership for Sportier Smart Glasses, Reportedly Releasing This Year

June 16, 2025 From roadtovr

Meta officially confirmed the expansion of its EssilorLuxottica partnership to include a pair of Oakley smart glasses—possibly arriving soon.

Earlier this year, Bloomberg’s Mark Gurman reported that Meta was looking to expand its line of smart glasses beyond Ray-Ban Meta, which would include two possible new devices: a sportier Oakley-branded model, and a high-end model with built-in display—the latter has yet to be announced.

Now, Meta CTO Andrew ‘Boz’ Bosworth has confirmed in an X post that ‘Oakley Meta’ smart glasses are coming, showing a graphic of both brands merging and linking to a new @oakleymeta profile.

🤘🏼 @oakleymeta pic.twitter.com/lRL6oimgMR

— Boz (@boztank) June 16, 2025

Details remain scarce; however, Gurman’s January report maintained that the Oakley smart glasses would be designed for athletes and could launch sometime this year.

Meta’s EssilorLuxottica partnership has been growing steadily since the release of the first-gen Facebook Ray-Ban Stories in 2021, prompting the company to offer a second-gen version in 2023, Ray-Ban Meta, which introduced updated styles, improved audio and cameras, and on-board AI features.

In late 2024, Meta announced it was expanding its smart glasses partnership with EssilorLuxottica into 2030. At the time, Meta CEO Mark Zuckerberg described its long-term roadmap as giving the companies “the opportunity to turn glasses into the next major technology platform, and make it fashionable in the process.”

In addition to Ray-Ban and Oakley, the French-Italian luxury eyewear company owns other major brands, including Persol, Oliver Peoples, and Vogue Eyewear, along with eyewear retailers LensCrafters, Pearle Vision, and Sunglass Hut.

Filed Under: News, XR Industry News

Google’s First ‘Beam’ Videoconferencing Device is ‘HP Dimension’, Coming Late 2025 at $25,000

June 13, 2025 From roadtovr

HP announced last year that it would be the first to offer hardware based on Google Beam (formerly ‘Project Starline’), the light field-based 3D videoconferencing platform. Now, HP has unveiled ‘Dimension’, which is being pitched to enterprise at $25,000 a pop.

HP Dimension with Google Beam is said to use six cameras and “state of the art AI” to create a realistic 3D video of each participant, displayed on a special 65-inch light field display with realistic size, depth, color, and eye contact.

HP says the device, which will be sold to select partners starting in late 2025, will be priced at $25,000. This notably doesn’t come with the Google Beam license, which is sold separately.

Image courtesy Google, HP

As an enterprise-first device, HP Dimension is slated to support Zoom Rooms and Google Meet for both 3D immersive chats and traditional 2D group meetings, and to integrate with cloud-based video services such as Teams and Webex.

“We believe that meaningful collaboration thrives on authentic human connections, which is why we partnered with Google to bring HP Dimension with Google Beam out of the lab and into the enterprise,” said Helen Sheirbon, SVP and President of Hybrid Systems, HP Inc. “HP Dimension with Google Beam bridges the gap between the virtual and physical worlds to create lifelike virtual communication experiences that brings us closer together.”

First introduced in 2021, Google Beam (formerly ‘Project Starline’) uses a light-field display to show natural 3D depth without the need for an XR headset or glasses of any sort—essentially simulating a face-to-face chat between two people.

In its testing, HP says Beam resulted in 39% more non-verbal behaviors being noticed, with 37% of users reporting better turn-taking and 28% reporting improved memory recall over traditional videoconferencing platforms.

Filed Under: News, XR Industry News

Report: Samsung’s Project Moohan XR Headset May Get a Launch Date at Unpacked Next Month

June 12, 2025 From roadtovr

Samsung Unpacked is expected to kick off next month with the usual slate of hardware announcements, which this year could include the company’s latest foldable smartphones, Galaxy Z Flip 7 and Fold 7, and its latest Galaxy Watch 8. Rumors suggest, though, that the company is also looking to put its upcoming XR headset, Project Moohan, in the spotlight.

Project Moohan, which will be the first device to run Google’s upcoming XR operating system, was announced alongside Android XR back in December 2024. Samsung has said in the past that consumers should expect Project Moohan’s launch sometime this year, although the headset still doesn’t have a specific date or official name.

Now, serial Samsung leaker ‘Panda Flash’ reports the company’s upcoming mixed reality headset could finally get a release date at the event.

While we were initially expecting to hear something about Project Moohan at Google I/O last month (we didn’t), Samsung might be keeping the device a little closer to home than initially thought.

Samsung Project Moohan | Image courtesy The Verge

Panda Flash, who has been following Galaxy Z Flip 7 and Fold 7 leaks and supply chain rumors, additionally reports the headset will launch first in South Korea, and then gradually launch globally sometime afterwards—essentially mirroring Apple’s US-first launch of Vision Pro before heading into other markets.

Samsung has shown its supposed Vision Pro competitor at a number of events over the past year, which includes our opportunity to go hands-on with Project Moohan in December, although the company has largely stayed mum on revealing the XR headset’s full spec sheet.

So far, we know the Android XR headset is packing in a Qualcomm Snapdragon XR2+ Gen 2, Sony-sourced micro‑OLED panels (resolution still TBA), pancake lenses, automatic interpupillary distance (IPD) adjustment, support for eye and hand-tracking, an optional magnetically-attached light shield, and a removable external battery pack. It also supports VR motion controllers of some sort, although we haven’t seen those either.

We’re also hoping to learn more about the company’s smart glasses efforts; Samsung is reportedly working on a pair of smart glasses that could launch sometime this year—ostensibly looking to serve up competition to Ray-Ban Meta Smart Glasses.

Whatever the case, we’ll be looking out for official dates for Samsung Unpacked, which is expected to take place sometime early next month in New York City.

Filed Under: News, XR Industry News

Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year

June 10, 2025 From roadtovr

Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today at AWE 2025, Snap announced it’s launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will be “a much smaller form factor, at a fraction of the weight, with a ton more capability.”

There’s no pricing or availability yet beyond the 2026 launch window. To boot, we haven’t even seen the device in question, although we’re betting they aren’t as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that Snap’s four million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change on the next version, although we’re certainly hoping for more than the original’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

And as the company is gearing up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including Depth Module API, which anchors AR content in 3D space, and Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.

For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.

Filed Under: AR Development, News, XR Industry News

Vision Pro is Getting a Major Visual Upgrade to Its ‘Persona’ Avatars

June 9, 2025 From roadtovr

Apple ‘Personas’ on Vision Pro are already the most lifelike real-time avatars you can find on any headset today, but in the next version of visionOS, they’re taking another step forward.

Apple today announced that its Persona avatars for Vision Pro will get a major visual upgrade with the launch of visionOS 26, due out later this year.

Personas on Vision Pro are generated on-device after users take a short scan of their face using the headset. Once generated, the avatar is used for social experiences like FaceTime.

Although they impressively capture subtle motion from the user, Personas have always felt somewhat blurry or ghostly.

VisionOS 26 promises a big visual update that will greatly reduce that ghostly look, and present a more complete view of the user’s head, including a “full side profile view.” Apple is also promising more realistic hair and lashes, and more than 1,000 variations of glasses, so glasses-wearers can find something that looks just right.


Although visionOS 26 will be available as a developer beta starting today, it isn’t yet clear if the Personas upgrade will be available in the first version, or roll out in later versions of the beta.

Beyond the visual upgrade to Personas, visionOS 26 will also make improvements to how social experiences work on the headset. New developer tools will allow for the creation of co-located virtual experiences; meaning two headset users in the same physical space will be able to see a shared virtual experience that’s visually anchored in the same space for both. That same system will allow for remote participants to join as Persona avatars, making for a mixture of in-person headset users and remote participants in the same virtual experience.

Filed Under: Apple Vision Pro News & Reviews, XR Industry News

AWE 2025 Preview: 4 Companies Building XR’s Future

June 6, 2025 From roadtovr

AWE USA 2025, one of the XR industry’s largest annual conferences, kicks off next week. We got a preview of what four interesting companies attending the event will be showing.

As far as industry events go, AWE USA has become our must-attend XR event of the year. It kicks off next week on June 10–12 in Long Beach, CA. As the Premier Media Partner of this year’s event, our exclusive 20% discount on tickets is still available.

We’ll be on site at the event, reporting on the most important developments. Ahead of AWE, though, we asked four interesting companies for a preview of what they’re bringing to the show.

CREAL

At AWE 2025, CREAL will showcase its Clarity light-field display. First shown at the beginning of the year, the display has since seen continuous image-quality improvements through innovation on the spatial light modulator. Visitors will be able to experience the new display technology through a headset as well as a tabletop pair of glasses.

Both prototypes feature CREAL’s Clarity display, which includes the light field optical engine and holographic lenses. Beyond the display, the headset prototype integrates off-the-shelf components to enable full-scale demonstrations of our technology, while the glasses prototype is designed with custom components to showcase our ultimate form factor. | Image courtesy CREAL

XREAL

At AWE, XREAL will be demoing the ultra-popular XREAL One Series AR glasses with spatial computing capabilities. Also available for demo will be the XREAL EYE, a modular camera attachment for the One Series. XREAL will also be unveiling an exciting new accessory and showing it off in person for the very first time.

Image courtesy XREAL

ForgeFX

At AWE 2025, ForgeFX Simulations will unveil VORTEX, a next-generation XR training platform engineered for high-risk, high-consequence environments where traditional training methods fall short. Built on the proprietary ForgeSIM framework, VORTEX delivers immersive, AI-enhanced, scenario-based mission rehearsal through photorealistic LiDAR environments, GIS-enabled sand tables, voice-activated SMEs, and real-time performance analytics—already piloted by JPEO-CBRND for CBRN response.

ForgeFX is also debuting an enhanced Horizontal Directional Drill (HDD) Simulator for Meta Quest 3 PCVR, co-developed with Vermeer Corporation, featuring authentic drill controls and a new Auto Rod Exchange module that trains on a previously unsimulated, safety-critical task.

At Booth #346, attendees can experience six interactive demos—including the JLG Access Ready XR trainer, Somero S-22EZ Laser Screed simulator, CBRND HoloTrainer, Trumpf Laser Cutting simulator, ForgeFX Excavator trainer, and Ocuweld welding VR simulator—each showcasing ForgeFX’s leadership in immersive, equipment-integrated training solutions.

Image courtesy ForgeFX

PICO

At AWE USA 2025, PICO will showcase the PICO 4 Ultra Enterprise, its latest all-in-one mixed reality headset designed for enterprise applications. Equipped with advanced MR capabilities and the PICOMotion Tracker for full-body and object tracking, the headset empowers industries to deliver highly immersive, practical solutions. PICO has successfully expanded into education, training and location-based entertainment (LBE), and visitors to the booth will have the opportunity to experience a selection of these real-world use cases firsthand. A private meeting space will also be available for deeper conversations about how PICO’s solutions can accelerate business strategies. PICO will also host two featured speaking sessions: ‘Unlocking the Potential of LBE: Scaling with PICO’s XR Solutions’ and ‘Superpowers for Spatial Developers: WebSpatial and SpatialML.’

Image courtesy PICO

What are you hoping to see from AWE 2025? Let us know in the comments below.

Filed Under: News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details; now, however, the company is gearing up to release the device to third-party researchers sometime next year, and in the process is showing what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs are leading the way for AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.
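To see why wider stereo overlap matters for depth perception, it helps to recall classic stereo triangulation: any point visible to two overlapping cameras can be ranged from how far it shifts between the two views. The sketch below is a minimal illustration of that principle with entirely hypothetical numbers—it is not Meta's actual pipeline, and Aria's real camera parameters are unpublished.

```python
# Minimal stereo-triangulation sketch (hypothetical values, not Meta's
# pipeline). Wider stereo overlap means more of the scene is seen by
# both cameras, so more points can be ranged this way.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth (meters) of a point seen by two overlapping cameras.

    focal_px:     camera focal length in pixels
    baseline_m:   distance between the two cameras in meters
    disparity_px: horizontal shift of the point between the two views
    """
    if disparity_px <= 0:
        # Zero disparity means the point is effectively at infinity
        # (or outside the shared field of view of both cameras).
        raise ValueError("point must be visible in both cameras")
    return focal_px * baseline_m / disparity_px

# A feature shifted 40 px between views, with a 0.10 m baseline and a
# 500 px focal length, sits 1.25 m away; smaller disparity = farther.
print(depth_from_disparity(500.0, 0.10, 40.0))  # → 1.25
```

Note the inverse relationship: halving the disparity doubles the estimated depth, which is why distant objects (tiny disparities) are the hardest to range accurately.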

To boot, Meta showed off the glasses in action inside a room, performing simultaneous localization and mapping (SLAM) in real time.

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • SubGHz radio for sub-millisecond time alignment between devices, crucial for multi-device setups

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but the glasses are also remarkably light at just 74-76g. For reference, a typical pair of eyeglasses can weigh anywhere from 20-50g, depending on materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although the glasses do feature a USB-C port on the right arm, which could possibly be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 not only tracks and analyzes the user’s environment, but also the user’s physical perception of that environment, like the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimulus—it also captures the relative position and movement through the environment, which is informed by its CV cameras, magnetometer, two inertial measurement units (IMUs) and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

While some early consumer AR glasses are already out there, such as XREAL One Pro, packing thin displays, powerful processors, and enough battery for all-day use into a pair of glasses isn’t a trivial feat—something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more to come from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville TN.

Filed Under: AR Development, ar industry, News, XR Industry News

Meta Partners with Ousted Oculus Founder’s Company to Build “the world’s best AR and VR systems for the US military”

May 29, 2025 From roadtovr

In a twist that promises to make the inevitable Palmer Luckey documentary even more dramatic, Palmer Luckey’s military tech company Anduril has now officially partnered with Meta to build “the world’s best AR and VR systems for the US military.”

Luckey founded Oculus in 2012, the company whose Rift headset was the spark that rebooted the modern era of VR. As a rapidly growing startup, Oculus attracted the attention of Meta (at the time Facebook), which acquired the company in 2014 for more than $2 billion. Luckey continued in VR under Meta’s roof for several years but was eventually pushed out of the company due to backlash over his politics. After leaving Meta, Luckey went on to found Anduril, a tech-defense startup which itself went on to achieve a multi-billion-dollar valuation.

Unsurprisingly, given Luckey’s background, Anduril itself has been developing XR tech alongside more traditional military products like drones and sensors. In February, Anduril announced that it was taking over Microsoft’s beleaguered Integrated Visual Augmentation System (IVAS) program, which seeks to produce AR helmets for the United States Army.

An early version of the IVAS helmet | Image courtesy Microsoft

Now Anduril says it’s working in concert with Meta to build “the world’s best AR and VR systems for the US military.”

“Anduril and Meta are partnering to design, build, and field a range of integrated XR products that provide warfighters with enhanced perception and enable intuitive control of autonomous platforms on the battlefield,” the announcement reads. “The capabilities enabled by the partnership will draw on more than a decade of investment by both companies in advanced hardware, software, and artificial intelligence. The effort has been funded through private capital, without taxpayer support, and is designed to save the U.S. military billions of dollars by utilizing high-performance components and technology originally built for commercial use.”

“I am glad to be working with Meta once again.” says Luckey. “Of all the areas where dual-use technology can make a difference for America, this is the one I am most excited about. My mission has long been to turn warfighters into technomancers, and the products we are building with Meta do just that.”

Meta CEO Mark Zuckerberg and CTO Andrew “Boz” Bosworth—who were publicly at odds with Luckey following his ousting from Meta—both provided quotes as part of the announcement, further cementing a renewed relationship between Meta and Luckey.

Oculus & Anduril founder Palmer Luckey (left) and Meta CEO Mark Zuckerberg (right) pose for a new image demonstrating their renewed relationship | Image courtesy Palmer Luckey

Thus far it sounds like the work between the companies will largely be around the headset that’s being built for the IVAS project, a $20 billion program to build an AR helmet for ground soldiers. Initially headed by Microsoft, the program has purportedly seen Anduril take a leading role, and Anduril has now tapped Meta to bring some of its technology to the battlefield.

Filed Under: News, XR Industry News

Google Partners with Prominent Eyewear Makers for Upcoming Android XR Smartglasses

May 20, 2025 From roadtovr

Google today announced that it is working with eyewear makers Warby Parker and Gentle Monster to bring the first Android XR smartglasses to market. The move mirrors Meta’s early partnership with EssilorLuxottica, the dominant eyewear maker that’s behind Meta’s Ray-Ban smartglasses.

While no productized Android XR smartglasses have been announced, Google said today it is working with eyewear makers Warby Parker and Gentle Monster on the first generation of products. Android XR smartglasses will prominently feature Google’s Gemini AI, and some will include on-board displays for visual output.

Image courtesy Google

Warby Parker is a well known American eyewear brand, founded in 2010, which pioneered a lower-cost, direct-to-consumer glasses business. Gentle Monster, founded in 2011, is a well known South Korean eyewear brand with a similar approach to Warby Parker’s.

While influential, both eyewear makers pale in comparison to EssilorLuxottica, the massive eyewear and lens conglomerate behind brands like Ray-Ban and Oakley.

EssilorLuxottica and Meta partnered several years ago around their smartglasses ambitions. Things seem to be going well for the partnership as the duo has launched several iterations of the Meta Ray-Ban smartglasses featuring classic Ray-Ban designs.

Ray-Ban Meta Glasses, Image courtesy Meta, EssilorLuxottica

Google is now taking the same tack by partnering with two well known glasses-makers to ensure that it has strong brand and fashion credibility behind its upcoming Android XR smartglasses.

The company’s first pair of smartglasses, Google Glass, launched way back in 2012. Although they were impressively compact for their time (especially considering the inclusion of a display), the asymmetrical design of the bulky display optics was seen as socially off-putting—just a bit too weird to pass as regular glasses.

That sent Google (and others) back to the drawing board for years, waiting until the tech could advance enough to make smartglasses that looked more socially acceptable.

It’s unclear when the first Android XR smartglasses will launch, or what they might cost, but Google also said today that developers will be able to start developing for Android XR smartglasses later this year.

Filed Under: News, XR Industry News

Project Starline Immersive Videoconferencing Now Branded Google Beam, Heading to Market with HP

May 20, 2025 From roadtovr

Today at its annual I/O developer conference, Google affirmed plans to bring its Project Starline immersive videoconferencing platform to market with HP. While this partnership was confirmed last year, the product is now officially called Google Beam, with more info promised soon.

Google’s Project Starline is a platform for immersive videoconferencing which was first introduced in 2021. But rather than using a headset, the platform is built around cameras and a light-field display. The light-field display shows natural 3D depth without the need for the viewer to wear a headset or glasses. The goal, the company says, is to create a system that feels like two people are talking to each other face-to-face in the same room, rather than feeling like they are separated by a screen and cameras.

Image courtesy Google

Google has been evolving the system over the years to improve usability and quality. Today the company showed a glimpse of the latest version of the system which it says is coming to market under the name Google Beam.

Image courtesy Google

As confirmed last year, Google is working with HP to bring Google Beam to market starting this year with an initial focus on enterprise customers seeking high-quality videoconferencing. While details are still light, Google says that “HP will have a lot more to share a few weeks from now.”

Image courtesy Google

Filed Under: News, XR Industry News
