
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year

June 10, 2025 From roadtovr

Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today at AWE 2025, Snap announced it’s launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will be “a much smaller form factor, at a fraction of the weight, with a ton more capability.”

There’s no pricing or availability yet beyond the 2026 launch window. To boot, we haven’t even seen the device in question, although we’re betting they aren’t as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that its four million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.
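Those two figures relate in a simple way: pixels per degree is just the display’s horizontal pixel count divided by the horizontal field of view. A quick back-of-the-envelope sketch (Snap hasn’t published the panel resolution, so the pixel count below is inferred from the quoted specs, not an official number):

```python
# Back-of-the-envelope optics math for the Spectacles '24 display specs.
# Angular resolution: PPD = horizontal pixels / horizontal FOV in degrees.

def implied_horizontal_pixels(ppd: float, fov_degrees: float) -> int:
    """Estimate the horizontal pixel count implied by a PPD figure and FOV."""
    return round(ppd * fov_degrees)

def pixels_per_degree(h_pixels: int, fov_degrees: float) -> float:
    """Angular resolution given a panel width and field of view."""
    return h_pixels / fov_degrees

# Snap's quoted figures: 46-degree FOV at 37 pixels per degree.
print(implied_horizontal_pixels(37, 46))  # → 1702
```

So the quoted 37 PPD over a 46° field works out to roughly a 1,700-pixel-wide image per eye.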

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change on the next version, although we’re certainly hoping for more than the original’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

And as the company is gearing up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including Depth Module API, which anchors AR content in 3D space, and Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.

For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.

Filed Under: AR Development, News, XR Industry News

Spatial Photos on Vision Pro Are Getting a Volumetric Upgrade for Greater Immersion

June 9, 2025 From roadtovr

At WWDC today, Apple announced the headlining features of visionOS 26, its next big OS release for Vision Pro. Among them is a revamped spatial photos feature that ought to make them even more immersive.

Vision Pro launched with the ability to view spatial photos, captured either with the headset itself or with iPhone 16, 15 Pro and Pro Max. These spatial photos created a sense of depth and dimensionality by combining stereo capture and applying depth mapping to the image.

Now, Apple says it’s applied a new generative AI algorithm to create “spatial scenes with multiple perspectives, letting users feel like they can lean in and look around,” essentially ‘guessing’ at details not actually captured on camera.

With visionOS 26, Vision Pro users will be able to view spatial scenes in the Photos app, Spatial Gallery app, and Safari. The company says developers will also be able to use the Spatial Scene API to add the feature into their apps.

To show off the new AI-assisted spatial photos feature, real-estate marketplace Zillow says it’s adopting the Spatial Scene API in the Zillow Immersive app for Vision Pro, which lets users see spatial images of homes and apartments.

Apple’s visionOS 26 is slated to arrive sometime later this year, although the company says testing is already underway.

Filed Under: Apple Vision Pro News & Reviews, News

Vision Pro’s Next Big Update Will Add Anchored Widgets That Live Around Your House

June 9, 2025 From roadtovr

Apple today announced at WWDC that Vision Pro is getting spatialized Widgets, coming along when visionOS 26 drops later this year.

On basically all of Apple’s devices, Widgets are designed to offer up personalized and useful info at a glance.

Now Apple says Vision Pro is getting spatial Widgets too, which will let you place a variety of these mini-apps around your house, where they’ll reappear every time you put on Vision Pro.

Apple says Widgets in visionOS 26 are “customizable, with a variety of options for frame width, color, and depth. Beautiful new widgets — including Clock, Weather, Music, and Photos — all offer unique interactions and experiences.”

Essentially, you’ll be able to decorate your space with things like spatial photos, clocks with distinctive face designs, a calendar with your events, and quick access to music playlists and songs so you can, say, keep your favorite track in a specific part of your room.

Notably, Apple says developers will be able to create their own widgets using WidgetKit. There’s no word on exactly when visionOS 26 releases, although the company says we can expect it sometime later this year.


This story is breaking. We’re currently at WWDC today, and will report back when we learn more about all things Vision Pro.

Filed Under: Apple Vision Pro News & Reviews, News

Unofficial SteamVR Driver Will Reportedly Enable Support for WMR Headsets on Latest Windows Versions

June 9, 2025 From roadtovr

Microsoft pulled the plug on support for its entire WMR platform on Windows 11 last year, putting an official end to the company’s foray into PC VR headsets. Now, an unofficial SteamVR driver hopes to bring it back.

Microsoft deprecated the Mixed Reality Portal app, Windows Mixed Reality for SteamVR, and SteamVR Beta when the Windows 11 24H2 update rolled out last October, making a fleet of PC VR headsets from Acer, Asus, Dell, Lenovo, HP, and Samsung essentially expensive paperweights.

Granted, if you haven’t updated to Windows 11 24H2, or are still on Windows 10, Microsoft says you’ll be able to play SteamVR content through November 2026 before the plug is pulled for good. Still, that’s a bitter pill to swallow for users of WMR’s most modern headset, HP Reverb G2, which released less than five years ago.

Now, Reddit user ‘mbucchia’ claims an unofficial SteamVR driver is in the works, which aims to bring all WMR headsets back into the fold sometime this Fall. Below you can see the first HP Reverb (2019) in action:

Dubbed the ‘Oasis’ driver, mbucchia says the SteamVR driver “does not need the Mixed Reality Portal,” which was deprecated in Windows 11 24H2 last year alongside SteamVR beta support.

“This means it can work on Windows 11 24H2 and newer. It supports full 6DoF tracking along with motion controllers,” mbucchia says.

“As mentioned on another post, I don’t have all WMR headsets to test with. Though I can tell you that it [also] works on the original Acer AH100. It should in theory work on any brand/model,” mbucchia says.

The first wave of WMR headsets launched in 2017 | Image courtesy Microsoft

Work on the driver remains behind closed doors, mbucchia says, noting that Oasis will be restricted to Nvidia GPUs due to the way SteamVR interfaces with the GPU drivers.

Most interesting of all, though, is that Oasis isn’t being undertaken by just anyone. Mbucchia claims they are currently an employee at Microsoft who previously worked in the company’s Mixed Reality division.

“I am bound by NDAs and other obligations. I want to be clear that I have taken much care to NOT BREACH any of these agreements while working on this project. In particular, I am leveraging SteamVR for a lot of heavy lifting and I am not borrowing any Microsoft intellectual property,” mbucchia says.

For these reasons, Oasis won’t feature a beta, or similar early access, before its release in Fall 2025. It also won’t be open source.

“Much of the code is the result of deep reverse-engineer. Reverse-engineering that if shared, could be construed as exposing internals of programs like SteamVR or the Nvidia GPU drivers,” mbucchia explains. “Not[e] that here again, I am NOT BREACHING any proprietary/intellectual property. Having respect for both Valve and Nvidia, I will not divulge any of the code that they do not consider public.”

Mbucchia says they’ll reveal more about the project in the Windows Mixed Reality subreddit leading up to its Fall 2025 release.

Filed Under: News, PC VR News & Reviews

AWE 2025 Preview: 4 Companies Building XR’s Future

June 6, 2025 From roadtovr

AWE USA 2025, one of the XR industry’s largest annual conferences, kicks off next week. We got a preview of what four interesting companies attending the event will be showing.

As far as industry events go, AWE USA has become our must-attend XR event of the year. It kicks off next week, running June 10–12 in Long Beach, CA. As the Premier Media Partner of this year’s event, our exclusive 20% discount on tickets is still available.

We’ll be on site at the event, reporting on the most important developments. Ahead of AWE, though, we asked four interesting companies for a preview of what they’re bringing to the show.

CREAL

At AWE 2025, CREAL will showcase its Clarity light-field display. Since releasing the display at the beginning of the year, CREAL has continuously improved its image quality by innovating on the spatial light modulator. Visitors will be able to experience the new display technology through a headset as well as a tabletop pair of glasses.

Both prototypes feature CREAL’s Clarity display, which includes the light field optical engine and holographic lenses. Beyond the display, the headset prototype integrates off-the-shelf components to enable full-scale demonstrations of our technology, while the glasses prototype is designed with custom components to showcase our ultimate form factor. | Image courtesy CREAL

XREAL

At AWE, XREAL will be demoing the ultra-popular XREAL One Series AR glasses with spatial computing capabilities. Also available for demo will be the XREAL EYE, a modular camera attachment for the One Series. XREAL will also be unveiling an exciting new accessory and showing it off in person for the very first time.

Image courtesy XREAL

ForgeFX

At AWE 2025, ForgeFX Simulations will unveil VORTEX, a next-generation XR training platform engineered for high-risk, high-consequence environments where traditional training methods fall short. Built on the proprietary ForgeSIM framework, VORTEX delivers immersive, AI-enhanced, scenario-based mission rehearsal through photorealistic LiDAR environments, GIS-enabled sand tables, voice-activated SMEs, and real-time performance analytics, and has already been piloted by JPEO-CBRND for CBRN response.

ForgeFX is also debuting an enhanced Horizontal Directional Drill (HDD) Simulator for Meta Quest 3 PC VR, co-developed with Vermeer Corporation, featuring authentic drill controls and a new Auto Rod Exchange module that trains on a previously unsimulated, safety-critical task.

At Booth #346, attendees can experience six interactive demos, including the JLG Access Ready XR trainer, Somero S-22EZ Laser Screed simulator, CBRND HoloTrainer, Trumpf Laser Cutting simulator, ForgeFX Excavator trainer, and Ocuweld welding VR simulator, each showcasing ForgeFX’s leadership in immersive, equipment-integrated training solutions.

Image courtesy ForgeFX

PICO

At AWE USA 2025, PICO will showcase the PICO 4 Ultra Enterprise, its latest all-in-one mixed reality headset designed for enterprise applications. Equipped with advanced MR capabilities and the PICOMotion Tracker for full-body and object tracking, the headset empowers industries to deliver highly immersive, practical solutions. PICO has successfully expanded into education, training and location-based entertainment (LBE), and visitors to the booth will have the opportunity to experience a selection of these real-world use cases firsthand. A private meeting space will also be available for deeper conversations about how PICO’s solutions can accelerate business strategies. PICO will also host two featured speaking sessions: ‘Unlocking the Potential of LBE: Scaling with PICO’s XR Solutions’ and ‘Superpowers for Spatial Developers: WebSpatial and SpatialML.’

Image courtesy PICO

What are you hoping to see from AWE 2025? Let us know in the comments below.

Filed Under: News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details; now, however, the company is gearing up to release the device to third-party researchers sometime next year, and in the process, it’s showing what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs is leading the way for AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.

To boot, Meta showed off the glasses in action inside of a room as it performed simultaneous localization and mapping (SLAM):

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • SubGHz radio tech for sub-millisecond time alignment between devices, crucial for multi-device setups

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but they’re also remarkably light at just 74-76g. For reference, a pair of typical eyeglasses can weigh anywhere from 20-50g, depending on materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although it does feature a USB-C port on the glasses’ right arm, which could possibly be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 not only tracks and analyzes the user’s environment, but also the user’s physical perception of that environment, like the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimulus—it also captures the relative position and movement through the environment, which is informed by its CV cameras, magnetometer, two inertial measurement units (IMUs) and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

While some early consumer AR glasses, such as XREAL One Pro, are already out there, packing thin displays, powerful processors, and enough battery to run it all day into a single device isn’t a trivial feat. It’s something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville, TN.

Filed Under: AR Development, ar industry, News, XR Industry News

Meta Partners with Ousted Oculus Founder’s Company to Build “the world’s best AR and VR systems for the US military”

May 29, 2025 From roadtovr

In a twist that promises to make the inevitable Palmer Luckey documentary even more dramatic, Palmer Luckey’s military tech company Anduril has now officially partnered with Meta to build “the world’s best AR and VR systems for the US military.”

Luckey founded Oculus in 2012, the company whose Rift headset was the spark that rebooted the modern era of VR. As a rapidly growing startup, Oculus attracted the attention of Meta (at the time Facebook), which acquired the company in 2014 for more than $2 billion. Luckey continued in VR under Meta’s roof for several years but was eventually pushed out of the company due to backlash over his politics. After leaving Meta, Luckey founded Anduril, a defense-tech startup that has since achieved a multi-billion-dollar valuation.

Unsurprisingly, given Luckey’s background, Anduril itself has been developing XR tech alongside more traditional military products like drones and sensors. In February, Anduril announced that it was taking over Microsoft’s beleaguered Integrated Visual Augmentation System (IVAS) program, which seeks to produce AR helmets for the United States Army.

An early version of the IVAS helmet | Image courtesy Microsoft

Now Anduril says it’s working in concert with Meta to build “the world’s best AR and VR systems for the US military.”

“Anduril and Meta are partnering to design, build, and field a range of integrated XR products that provide warfighters with enhanced perception and enable intuitive control of autonomous platforms on the battlefield,” the announcement reads. “The capabilities enabled by the partnership will draw on more than a decade of investment by both companies in advanced hardware, software, and artificial intelligence. The effort has been funded through private capital, without taxpayer support, and is designed to save the U.S. military billions of dollars by utilizing high-performance components and technology originally built for commercial use.”

“I am glad to be working with Meta once again.” says Luckey. “Of all the areas where dual-use technology can make a difference for America, this is the one I am most excited about. My mission has long been to turn warfighters into technomancers, and the products we are building with Meta do just that.”

Meta CEO Mark Zuckerberg and CTO Andrew “Boz” Bosworth, both of whom were publicly at odds with Luckey following his ousting from Meta, provided quotes as part of the announcement, further cementing a renewed relationship between Meta and Luckey.

Oculus & Anduril founder Palmer Luckey (left) and Meta CEO Mark Zuckerberg (right) pose for a new image demonstrating their renewed relationship | Image courtesy Palmer Luckey

Thus far it sounds like the work between the companies will largely center on the headset being built for the IVAS project, a $20 billion program to build an AR helmet for ground soldiers. The project, initially headed by Microsoft, has purportedly passed to Anduril in a leading role, and Anduril has now tapped Meta to bring some of its technology to the battlefield.

Filed Under: News, XR Industry News

Meta is Testing a Quest UI Overhaul and 3D Instagram Photos in Latest Horizon OS Release

May 23, 2025 From roadtovr

Meta announced it’s now running a test in Quest’s latest Horizon OS release (v77) that overhauls the platform’s dock-based UI for a new launcher overlay. Additionally, Meta says some users will also see 3D Instagram photos in their feed on Quest too, which is neat.

First teased at Connect 2024, Meta is finally bringing Navigator to Quest, which serves as a new centralized hub for apps, quick actions, and system functions.

“As part of our work to develop a fully spatial operating system designed around people, Navigator gives you convenient access to your recently used applications, with the added ability to pin up to 10 items in your library for quick access and seamless task resumption. This makes it easier to multitask in-headset and connect with the people and things you care about most,” Meta says in the v77 patch notes.

Essentially, Navigator is supposed to make it easier to access system-level controls and then quickly return to what you were doing in-headset. More specifically, the new UI should feel pretty familiar to smartphone users thanks to its more traditional layout.

YouTuber ‘The Construct’ shows off Navigator, including a tutorial video and hands-on impressions:

“We designed Navigator based on everything we’ve learned over the last decade. It’s unobtrusive, intuitive, and built from the ground up for the unique needs of spatial computing,” Meta says.

The company says Navigator will begin rolling out as a limited test to some users on the Public Test Channel (PTC) v77, with a gradual rollout to all users expected over the coming months.

Additionally, Instagram is getting a little love on Quest too, as Meta says it’s currently testing 3D-ified photos on the platform. For some users on PTC v77, Meta’s AI will automatically transform existing 2D photos not originally captured in 3D into an immersive format.

“And it’s an early look at our plans to continue bringing more social and entertainment experiences that are 2D today into a more immersive, 3D future,” Meta says.

Note: To enroll in Quest’s Public Test Channel (PTC), you need to use the Meta Horizon app on your phone and navigate to the ‘Devices’ section. Select your Quest headset and then go to ‘Headset settings’ and then ‘Advanced Settings’. Finally, toggle on ‘Public Test Channel’.

Filed Under: Meta Quest 3 News & Reviews, News

Google Teases Next Android XR Device: XREAL’s Upcoming AR Glasses ‘Project Aura’

May 21, 2025 From roadtovr

When it launches later this year, Android XR is coming first to Samsung’s mixed reality headset, Project Moohan. Now, Google has tapped AR glasses creator XREAL to be the second with its newly unveiled Project Aura.

Google announced at its I/O developer event that China-based XREAL will be the second device officially slated to run Android XR, the company’s forthcoming XR operating system currently in developer preview.

Codenamed Project Aura, the companies describe the optical see-through (OST) device as “a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR.”

Information is still thin, however XREAL says Project Aura was created in collaboration with Google and chip-maker Qualcomm, and will be made available to developers “soon after” the launch of Project Moohan, which was recently affirmed to arrive later this year.

Image courtesy XREAL

XREAL hasn’t released specs, although the company has a track record of pairing micro-OLEDs with birdbath optics, which differs from the more expensive waveguide optics seen in devices such as Microsoft HoloLens, Magic Leap One, or Meta’s Orion AR glasses prototype.

Birdbath optics use a curved mirror system for brighter, higher field-of-view (FOV) and lower-cost AR displays, although this typically results in bulkier designs. Waveguides are often thinner and more expensive to manufacture, but provide more wearable form factors with better transparency; waveguides also typically feature a lower FOV, although prototypes like Meta Orion are bucking that trend.

Like the Android XR glasses seen on stage at Google I/O, which are coming from eyewear companies Warby Parker and Gentle Monster, XREAL’s Project Aura is expected to feature built-in Gemini AI, allowing it to do things like real-time translation, AI assistant chats, web searches, object recognition, and displaying contextual info.

Choosing XREAL as its next Android XR hardware partner makes a good deal of sense here. Founded in 2017, XREAL (previously Nreal) has developed a number of AR glasses generations over the years, including its own custom Android launcher, Nebula, to handle native AR experiences on Android devices.

Like previous XREAL devices, Project Aura is meant to be tethered, not standalone. It’s uncertain just what external device will run Android XR for the glasses, be it a standard smartphone or a dedicated ‘puck’ like XREAL Beam.

That said, XREAL says they’ll be talking more about Project Aura at the Augmented World Expo (AWE) next month, which takes place June 10th – 12th in Long Beach, California. We’re going to present at AWE this year, so check back soon for more on all things XR to come from the event.

Filed Under: AR News, News

Google Partners with Prominent Eyewear Makers for Upcoming Android XR Smartglasses

May 20, 2025 From roadtovr

Google today announced that it is working with eyewear makers Warby Parker and Gentle Monster to bring the first Android XR smartglasses to market. The move mirrors Meta’s early partnership with EssilorLuxottica, the dominant eyewear maker that’s behind Meta’s Ray-Ban smartglasses.

While no productized Android XR smartglasses have been announced, Google said today it is working with eyewear makers Warby Parker and Gentle Monster on the first generation of products. Android XR smartglasses will prominently feature Google’s Gemini AI, and some will include on-board displays for visual output.

Image courtesy Google

Warby Parker is a well-known American eyewear brand, founded in 2010, which has pioneered a lower-cost, direct-to-consumer glasses business. Gentle Monster, founded in 2011, is a well-known South Korean eyewear brand with a similar approach to Warby Parker’s.

While influential, both eyewear makers pale in comparison to EssilorLuxottica, the massive eyewear and lens conglomerate behind brands like Ray-Ban and Oakley.

EssilorLuxottica and Meta partnered several years ago around their smartglasses ambitions. Things seem to be going well for the partnership as the duo has launched several iterations of the Meta Ray-Ban smartglasses featuring classic Ray-Ban designs.

Ray-Ban Meta Glasses, Image courtesy Meta, EssilorLuxottica

Google is now taking the same tack by partnering with two well-known glasses-makers to ensure that it has strong brand and fashion credibility behind its upcoming Android XR smartglasses.

The company’s first pair of smartglasses, Google Glass, launched way back in 2012. Although they were impressively compact for their time (especially considering the inclusion of a display), the asymmetrical design of the bulky display optics was seen as socially off-putting—just a bit too weird to pass as regular glasses.

That sent Google (and others) back to the drawing board for years, waiting until the tech could advance enough to make smartglasses that looked more socially acceptable.

It’s unclear when the first Android XR smartglasses will launch, or what they might cost, but Google also said today that developers will be able to start developing for Android XR smartglasses later this year.

Filed Under: News, XR Industry News
