
VRSUN

Hot Virtual Reality News


Vision Pro Knock-off Gets High Praise From Former Quest Engineer

January 10, 2025 From roadtovr

Wait, Apple Vision Pro doesn’t come in black, does it? Nope, but Play For Dream MR does. And with what some are calling the ‘Android Vision Pro’, owing to its Android-based OS, Play For Dream seems to have turned some heads at CES 2025 this past week.

Initially launched in Asia last year, the heavily Vision Pro-inspired mixed reality headset is now being brought West by China-based creator Play For Dream. The Kickstarter campaign it launched in September went on to garner HK$2,271,650 (~$292,000 USD).

Play For Dream MR has packed in a laundry list of modern XR features, including a Snapdragon XR2+ Gen 2 chipset running Android 15, dual 3,840 × 3,552 micro-OLED displays (90Hz), eye-tracking, auto IPD adjustment, wired and wireless PC streaming, and also a Quest Pro-inspired rear-mounted battery and Touch-style controllers.

In short, the headset appears to have it all—even Vision Pro’s user interface.

Design inspirations aside, former Quest engineer Amanda Watson got a chance to go hands-on with Play For Dream’s MR headset, noting in an X post it was “absolutely the best all around HMD demo I saw on the floor today.”

“It is quite literally an ‘Android Apple Vision Pro’. but the execution was excellent. Great performance, optics, UI and media capture/playback features,” continued Watson, who departed Meta in 2022.

During her time at Meta/Oculus, Watson worked on a number of Quest-related projects, including both the tethered Link and the company’s Wi-Fi streaming tool, Air Link. At one time, she was the sole developer of Air Link for 13 months prior to its release. So when Watson says something is good, it probably is.

“It has USB and wireless PCVR streaming (I tried USB) — this was more [work-in-progress] quality (frame rate and latency) compared to other features, but it’s a relatively recent feature [as I understand it]. The basics like controller motion were nailed down and resolution was solid.”

Image courtesy Play For Dream

Furthermore, Watson reports its Touch-style controllers were “also very good. They said hand tracking exists, but they didn’t demo it.” Notably, the headset’s pancake lenses had “excellent distortion correction,” which Watson says is “the biggest thing to me personally.”

Established in 2020 under the name YVR, Play For Dream has already launched two generations of standalone VR headsets, its YVR 1 and YVR 2, both of which were released in China in 2022.

Play For Dream MR doesn’t have a firm release date or price yet; however, the company has said it will come in under $2,000. For more, check out Play For Dream’s website for detailed specs and ordering details when they arrive.

Filed Under: News, PC VR News & Reviews, XR Industry News

Immersive Web Company Infinite Reality Raises $3 Billion From a Private Investor

January 8, 2025 From roadtovr

Infinite Reality, a company building an engine for creating immersive web content underpinned with WebXR support, announced it has raised a whopping $3 billion to continue building the company’s “vision for the next generation of the internet.” Not only is this a huge fundraising round; it’s also highly uncommon for such an investment to come from a single private investor.

Founded in 2019, Infinite Reality’s flagship product is iR Studio, an engine for building immersive websites that represent visitors as avatars in a virtual environment. iR Studio sites support multi-user and WebXR by default, meaning sites built with the engine can be visited in any XR headset that supports WebXR, directly through a web browser.

The company announced today that it has raised $3 billion in additional funding, bringing its valuation to $12.25 billion. For comparison, a ‘unicorn’ (a startup valued at $1 billion) is a commonly cited benchmark for a highly successful tech startup. That makes Infinite Reality’s $12.25 billion valuation quite rare.

Even rarer, however, is that this $3 billion investment was purportedly made by a single private investor. While there are often $1+ billion investments or acquisitions in the tech world, the money usually comes from an investment firm, a major corporation, or a combination of the two, rather than an individual.

The investor is said to have a portfolio that “focuses on global technology and real estate investments.”

While the identity of the investor has not been disclosed, the sheer amount of the investment likely whittles the list of possible investors down to someone among the 1,000 wealthiest people in the world.

Ashcroft Law Firm, the firm representing the investor in the deal, provided a statement on the client’s behalf:

Our client has made and evaluated several investments in the technology sector. What compelled this investment was not just [Infinite Reality CEO’s] stated vision for the product itself, but his commitment to giving customers ownership of their data. As everyone understands, it is a crucial time for businesses of all sizes to own their data, customers, and intellectual property as AI gains momentum in the marketplace. Recognizing the significance of scalability and mass marketability, our client was particularly impressed by what he believes is iR’s revolutionary product, which caters to individuals and artists seeking to build their brand image with immediate global reach while servicing clientele from small businesses to Fortune 500 companies. It is our client’s belief this investment underscores a pivotal move towards empowering users and redefining ownership in the digital age.

From Infinite Reality’s side, the company says the funding is a win for building a next-generation immersive internet and giving businesses control of their own data.

“While we are excited that our ability to secure this level of funding validates our mission to build an immersive experience platform that will power the next generation of the web, we are absolutely ecstatic to share this news with our customers: businesses all over the world,” says John Acunto, co-founder and CEO of Infinite Reality. “The ability to provide them a platform where they can not only create a great immersive environment, but one where they own their data, own their customer, and own their experience means the world to Infinite Reality—and to me personally.”

Despite its focus on the immersive web, and considering the amount the company has raised, Infinite Reality is not particularly well known in the XR space. In fact the company has seemingly grown into its own little conglomerate, having made several acquisitions of its own over the years.

On one hand, the company has made strategic acquisitions like the AR tech company Zappar, avatar company Action Face, and the immersive web company Ethereal Engine. But the company has also made a series of seemingly disjointed acquisitions like the entertainment and esports business RektGlobal. And, further afield still, the acquisition of the Drone Racing League.

The huge investment by a private investor and Infinite Reality’s assimilation of several different companies certainly make for a unique—but uncertain—situation.

Filed Under: News, XR Industry News

Sony’s Standalone MR Headset Now Called ‘XYN’, Release Date & Pricing Still to Be Revealed

January 7, 2025 From roadtovr

Sony announced its previously revealed XR standalone for enterprise is now called ‘XYN’, which the company is targeting at spatial content creators.

Initially unveiled at CES last year, XYN (pronounced ‘zin’) packs in some impressive displays, offering 13.6MP (3,552 × 3,840) per-eye using Sony’s own ECX344A OLED microdisplay.

The display is capable of 90 FPS and 1,000 nits (at 20% duty cycle), with 96% DCI-P3 color coverage, putting it above Apple Vision Pro in terms of resolution and color accuracy.
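As a quick sanity check, the 13.6MP figure follows directly from the per-eye pixel count stated above; here is the arithmetic as a short Python sketch (figures taken from the specs, nothing else assumed):

```python
# Per-eye pixel math for Sony's ECX344A micro-OLED,
# using the resolution stated above (3,552 × 3,840 per eye).
width, height = 3552, 3840

pixels = width * height           # total pixels per eye
megapixels = pixels / 1_000_000   # convert to megapixels

print(f"{pixels:,} pixels per eye ≈ {megapixels:.1f}MP")
# → 13,639,680 pixels per eye ≈ 13.6MP
```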

We got an opportunity to go hands-on with the pre-XYN prototype in July, back when it was still being referred to as the Sony SRH-S1 “content creation system.”

Here’s the short of it: on one hand, the headset’s ergonomics, flip-up design, and display clarity were all great. On the other, the system’s stylus-ring controller combo was very poorly tracked during our demo, and the content shown wasn’t well optimized for its internal Snapdragon XR2 Gen 2 chipset.

Beyond that, information is still thin on the ground. XYN’s price and release date are still uncertain, which is a baffling move as far as product announcements go.

What is known, however, is that its XYN Motion Studio PC companion software is coming out in March 2025, supporting connections with 12 ‘mocopi’ sensors for a more seamless motion capture workflow.

mocopi sensors | Image courtesy Sony

Additionally, Sony is launching what it calls its XYN Spatial capture solution, which uses proprietary algorithms to convert images captured with a mirrorless camera into high-quality, photorealistic 3D CG assets.

Notably, the headset itself is said to support “a wide range of third-party tools,” according to XYN’s press release.

That said, you probably shouldn’t expect XYN to compete with Quest on the lower end in terms of price-performance, as Sony’s standalone is targeting businesses and professional users. Notably, two colorways have been seen on the show floor at CES 2025 this week, with black for prosumers and grey for enterprise.

Filed Under: News, XR Industry News

Pimax Reveals Dream Air Prototypes and Answers Key Questions

December 30, 2024 From roadtovr

With the reveal of Pimax’s upcoming Dream Air headset, the VR community at large had plenty of questions. We put those questions straight to the company, and also got a glimpse of early prototypes, a full list of specs, and an update on unreleased products.

Pimax is, at this point, a seasoned maker of VR headsets. But the company has faced recurring criticism regarding product polish, strategic focus, missed release dates, and announcing new products before fulfilling older promises.

The company’s latest product announcement, the compact Dream Air headset, naturally resurfaced these complaints, with many people asking how Pimax would do better this time around. So we sent many of the most commonly asked questions directly to the company. Here’s what we got back, including photos of Dream Air prototypes, a full list of specs, and an update on previously announced (but still unreleased) products.

Q: How confident is Pimax that Dream Air will be completed and ship in meaningful quantities by May 2025?

A: Internally, we’ve been developing the Crystal Super micro-OLED and Dream Air for over a year now (internally, they’re largely the same headset). We have a fully working optical engine, and we think the remaining time to May is enough to get the rest done, similar to the timeframe of the Crystal Super’s development over the past year.

The Dream Air utilizes the same optical engine solution as the Crystal Super, along with its underlying technologies, but in a new form-factor design. You can read more here about how the Dream Air and the Crystal Super micro-OLED share the same technical components.

The main challenge is the supply of micro-OLED panels, and perhaps the ringless controllers. (We currently think that the first batches of the headset may ship with ringed controllers as on the Crystal/Light/Super, which we can exchange for ringless controllers later).

We’re confident of shipping around 200 to 300 headsets in May. This is also why we had to announce the headset now. (Several reasons addressed below.)

Q: Why was the headset announced so soon after Super? And why already open up pre-orders?

A: Several reasons. We announced the Dream Air now because we don’t want to announce this after the Super starts shipping, and then have users feel they would have ordered this one if they knew. We already see this remark now in our Discord, but actually—customers can still change their pre-order from the Super to the Dream Air if they wish.

Another reason is the scarcity of micro-OLED panels. In the micro-OLED panel market, demand currently far outstrips supply, so delivery times after we place an order run to several months. We open up pre-orders to get a better idea of how many headsets our users want, and also to place the panel order for the Dream Air units shipping in May. This order needs to be placed before or in early January, as suppliers also take holidays during Chinese New Year.

The long wait time for micro-OLED panels isn’t unique to Pimax. We also see similar products from competitors with the same issue, and therefore they don’t offer refundable pre-orders.

That said, our pre-orders are refundable before shipping (and users also have a trade-in window once the headset arrives), and we have added a $1 reservation option.

Q: What do you say to people who think Pimax should focus on fewer products?

A: We have the strong ambition to be a multi-SKU company, as VR headsets are also quickly diversifying. Our focus is always on providing the ultimate experience, and for different use cases we’ll have the Crystal line, as well as the new Dream line.

That said, all our headsets share a lot of the same core technology, from software to hardware. All headsets are focused firmly on PCVR. We have learned from the past (e.g. Portal, which wasn’t PCVR).

Pimax has a nine-year history of making VR headsets; we own two R&D offices and are opening our second assembly line to support this multi-SKU strategy.

Providing multiple SKUs built on shared technology allows us to pour more resources into developing technology that benefits all headsets. It also prevents us from having just one huge sales peak in the year, spreading orders more evenly across the whole year, which makes supply and production resources easier to manage (we own our own factory with our own staff). Peaks are generally really bad for efficiency.

Q: Any more headsets coming from Pimax?

A: We’ll update some old models, but there are no more headsets coming that are more advanced in specs than the Dream Air and Crystal Super, except for the 12K.

Q: How far along is the design of the Dream Air? Were the renders shown in the announcement just a mockup or a fully realized design? Is there a functional prototype yet?

A: The internals of the headset are fully designed, and we’re testing with a fully working optical engine. Software-wise, everything is shared with the Crystal Super, including SLAM tracking of the headset and the controllers, eye-tracking, hand-tracking, and all settings in Pimax Play.

On the exterior: We are currently testing and developing this in the Crystal Super housing (micro-OLED optical engine), while we’re developing the Dream Air’s exterior housing.

Here is a look at two prototypes made during development.

Newer:

Image courtesy Pimax

Older:

Image courtesy Pimax

Update (December 31st, 2024): A prior version of this article mixed up the ‘older’ and ‘newer’ labels on the above prototype images, this has been fixed.

Q: Will Cobb [the standalone module for Dream Air] ship in 2025?

A: We have no exact ETA on Cobb yet. Cobb is an add-on for the Dream Air and we still want to add some features that we did not communicate in our Frontier announcement.

Q: What safety mechanisms are in place to ensure the auto-tightening headstrap can’t be dangerous if it malfunctions?

A: The main thing is that it’s strong enough to hold the lightweight headset, but not strong enough to hurt anyone. The internal straps are made of elastic rubber. (Also this is not new technology, the same is used in self-lacing shoes such as the Nike Auto Adapt.)

Q: Can the head straps be replaced, and how?

A: Yes, the head strap can be taken off at the stems.

Q: Would we be able to see this running HorizonOS or AndroidXR in the future?

A: There are no plans for this. Internally it’s exactly the same headset as the micro-OLED optical engine of the Crystal Super, and so it runs with Pimax Play as a PC VR headset (also with the OpenXR/OpenVR runtimes and with SteamVR).

Pimax also shared a detailed list of specifications for the headset:

Pimax Dream Air Specs

Visuals

  • Display: 2 × micro-OLED, 100% DCI-P3 color
  • Resolution (per-eye): 13MP (3,840 × 3,552)
  • Pixels per-degree: unknown
  • Max refresh rate: 90Hz
  • Optics: pancake
  • Field-of-view: 102°H
  • Pass-through view: black & white
  • Optical adjustments: continuous IPD (automatic), prescription lenses (optional)
  • IPD adjustment range: 58–72mm

Input & Output

  • Connectors: DP 1.4 (PC) to USB-C (headset), 1 × USB-C accessory port
  • Input: Dream Air controllers (rechargeable battery), hand-tracking
  • Audio: in-headstrap speakers
  • Microphone: dual-microphone
  • Weight: 200g

Sensing

  • Headset-tracking: inside-out (no external beacons); SteamVR Tracking (external beacons, optional)
  • Controller-tracking: headset-tracked (headset line-of-sight needed)
  • Eye-tracking: yes
  • Expression-tracking: no
  • On-board cameras: 4 × tracking, 2 × passthrough
  • Depth-sensor: no

Price

  • MSRP: $1,900

Pimax Product Shipping Update

Q: Can you provide the latest estimated shipping time for all unreleased Pimax products?

A: The Crystal Super is ready to be demoed at CES 2025, especially the QLED 57 PPD optical engine, which is shipping at the end of January. The 50 PPD and micro-OLED optical engines are also nearing readiness, shipping in March and April respectively.

The non-local dimming version of the Crystal Light is coming out around June 2025, pushing that price down even further.

The 60G Airlink for the original Crystal is also being demoed at CES 2025, with its external beta test starting almost any moment now. It is shipping in April 2025.

For the 12K, we cannot give an exact ETA now. When we announced it, we had solutions for each of the key technical challenges. Unfortunately, some of those solutions did not meet our quality requirements. Some just didn’t work out well, like a dual DP 1.4 solution, as well as a panel solution we can’t share more about.


More questions for Pimax? Drop them in the comments below.

Filed Under: PC VR News & Reviews, XR Industry News

Pimax Announces Dream Air, a Compact VR Headset With One Totally Unique Feature

December 23, 2024 From roadtovr

The Pimax Dream Air headset represents a new area of focus for the company. While most of its headsets up to this point have been necessarily bulky to achieve their signature large field-of-view, the Dream Air aims to make a headset that’s compact but still feature-rich. One of those features—a headstrap that automatically tightens—would be an industry first.

Priced at $1,900 and purportedly shipping in May 2025, Pimax’s Dream Air headset aims to take on an emerging segment of compact high-end PC VR headsets like the Bigscreen Beyond and Shiftall MeganeX Superlight.

Image courtesy Pimax

But it wouldn’t be Pimax if it didn’t make additional ambitious promises which risk pulling the company’s attention away from delivering its products on time and as promised. For the Dream Air, that additional promise is an optional compute puck which the headset can plug into to become a standalone VR headset. The company is calling the puck ‘Cobb’, and says it will include a Snapdragon XR2 chip and battery. Oh, and don’t forget the optional SteamVR Tracking faceplate.

Speaking of pulling the company’s attention… the announcement of the Dream Air continues Pimax’s trend of revealing new products before delivering on those it has previously announced. The company’s Crystal Super headset was announced back in April 2024 and originally planned for a Q4 2024 release, but is now said to be releasing sometime in Q1 2025.

As for the Dream Air, it will purportedly be compact and also full of a wishlist of specs and features:

  • Weight of 200g
  • Resolution: 13MP (3,840 × 3,552) micro-OLED per-eye @ 90Hz and “HDR”
  • 102° field-of-view
  • Inside-out tracking
  • Motion controllers & hand-tracking
  • On-board audio
  • Optional prescription lenses
  • Eye-tracking
  • Automatic IPD and automatic strap tightening

That last one—automatic strap tightening—is a feature that hasn’t been included in any major headset to date. It’s an interesting idea considering the challenge of fitting a headset comfortably; many users want to crank their headset tight to their face so it won’t move, but the most comfortable way to use a headset is to balance tightness with stability.

The design of the auto-tightening strap also looks carefully considered. While we’ve only seen renders so far, it appears the tightening mechanism is hidden under fabric, making the headstrap look like it’s simply shrinking in place as it tightens.

If the headset could effectively dial in the ideal tightness, it would be a boon for many users. Dream Air also has automatic IPD adjustment, which sets the distance between the lenses to match the user’s eye width (something most people also aren’t good at doing manually).

While it remains to be seen if Pimax can deliver something as svelte as promised, for now it looks like the company is flexing an industrial design muscle that’s been largely hidden by the utilitarian and boxy style of its previous headsets.

Image courtesy Pimax

However, Pimax isn’t giving up those boxy designs of yore. The company says that a compact headset is a new area of focus for the company, but it will continue developing its larger and wider field-of-view headsets.

Pimax is already taking pre-orders for the Dream Air, with a price of $1,900 and an expected release date of May 2025.

Filed Under: News, XR Industry News

Google & Pico Adoption Further Cements OpenXR as Industry Standard, With One Major Holdout

December 18, 2024 From roadtovr

OpenXR is an open standard made to improve compatibility between XR software and XR headsets. Google—one of the biggest tech companies in the world—is adopting the standard right out of the gate, joining other major firms like Meta and Microsoft. Other players (like ByteDance recently) also support the standard, cementing it as not just an open standard, but an industry standard. And while the vast majority of major XR companies now support OpenXR, a major holdout remains.

Initially announced in 2017, OpenXR is an open standard that makes it easier for developers to build XR applications that can run on a wide range of XR headsets with little to no modifications. While major players in the space like Meta, Microsoft, Valve, HTC, and plenty more all support OpenXR, the industry’s big holdout is—can you guess? Apple.

Apple is somewhat notorious for rejecting industry standards and forging its own path; sometimes the company sticks to its own proprietary formats and other times ends up adopting the industry standard in the end.

Vision Pro not only doesn’t support OpenXR, but it doesn’t have built-in support for motion-tracked controllers (which most existing XR content requires). If Vision Pro supported OpenXR, it would be significantly less work for developers to bring their XR apps to the headset (though the lack of controllers still poses a major hurdle).

As ever, Apple is the odd one out.

Meanwhile, Google wasted no time confirming its newly announced Android XR platform will support OpenXR, making it easier for developers to port XR apps that were built for headsets like Quest.

Google says Android XR is already compatible with OpenXR 1.1, and the company has built out some of its own ‘vendor extensions’ which are new capabilities that extend what OpenXR can do on specific devices. Vendor extensions sometimes go on to become part of future versions of OpenXR.

Last month Pico (ByteDance’s XR division) also announced that its runtime is now compliant with the OpenXR 1.1 standard on Pico 4 Ultra, with plans to bring support to Pico 4 and Neo 3 by mid-2025.

Pico also has its own ideas about where the standard should go in the future. The company recently presented a framework for standardizing the way that XR applications can run simultaneously, so users can run multiple XR applications in a shared space. Pico says it’s advocating for this approach to the OpenXR working group, an industry body which guides the evolution of the standard.

With the addition of support from both Google and Pico, OpenXR has truly achieved industry standard status, even if the odds of Apple ever adopting it remain slim.

Filed Under: News, XR Industry News

Lynx Confirms Android XR For Next Headset, Sony & XREAL Also On-board for Google’s OS

December 17, 2024 From roadtovr

Although Android XR isn’t properly open-source for the time being, Google hopes the OS will run on multiple partner headsets. While Samsung is said to be the first to launch an Android XR headset, Sony, Lynx, and XREAL are also planning to use the operating system.

Meta announced earlier this year that it intends to open Quest’s Horizon OS operating system to third parties, but now Android XR presents another choice for headset makers.

According to Google, Sony, Lynx, and XREAL are on board with Android XR.

Sony SRH-S1 MR headset | Image courtesy Sony

For Sony’s part, its SRH-S1 enterprise-focused MR headset is very likely the first target for Android XR. When we went hands-on with the headset earlier this year, the company was tight-lipped about whether it was building its own platform and where users could source content from. Android XR makes a lot more sense for the company than trying to build out its own XR OS and platform.

As for Sony’s current and future PSVR headsets, we expect they’ll continue to be tied directly to the PlayStation OS rather than switch to Android XR.

Lynx R-1 MR headset | Image courtesy Lynx

Lynx R-1 is a long-in-development MR headset that has struggled to make it fully to market. Part of that struggle, naturally, is building out a software stack that does everything an XR headset needs to do.

Lynx founder Stan Larroque tells Road to VR the R-1 won’t adopt Android XR, but future headsets from the company will. Making this move could well put the company in a better position for the future, by reducing software development costs and giving its headsets access to a larger ecosystem of apps and content.

XREAL Air 2 Ultra AR Glasses | Image courtesy XREAL

As for XREAL—a company building AR glasses primarily made to provide a large floating screen that projects content from other devices—it’s not clear yet exactly how they will use Android XR. But a good bet is that it will be the basis for future devices from the company.

While both Meta and Google are open to allowing their XR OS to work on third-party headsets, they’re still the gatekeepers. Neither Horizon OS nor Android XR is actually ‘open’ at this point. Only hand-picked partners can build on either OS.

But now that both operating systems are in play, there’s increased pressure for both to strive to be the ‘more open’ of the two. That pressure could quickly lead one or both companies to make their XR OS properly open for anyone to use.

Filed Under: XR Industry News

Blackmagic’s New 8K Camera for Apple Immersive Video is Pre-ordering Now for $30,000

December 16, 2024 From roadtovr

Blackmagic Design has revealed full specs and details for its new URSA Cine Immersive camera, specially designed to shoot 8K VR180 footage for the Apple Immersive Video format. Pre-orders for the $30,000 camera are open now, with shipping planned for Q1 2025. A forthcoming update to DaVinci Resolve Studio (also made by Blackmagic) adds editing tools specifically for Apple Immersive Video, including support for calibration data from the camera.

Apple Immersive Video is a 180° 3D video format intended for playback on Apple Vision Pro. Early versions of Blackmagic’s URSA Cine Immersive are likely the cameras used to film Apple Immersive Video content currently available on the headset.

Now the camera is being made available commercially, with pre-orders available for a cool $30,000. Though certainly expensive, this is in-line with many other high-end cinema cameras.

The URSA Cine Immersive is specially made to capture Apple Immersive Video, featuring a pair of 180° stereo lenses, each capturing 59MP (8,160 × 7,200), with 16 stops of dynamic range. The camera can shoot up to 90 FPS in the Blackmagic RAW format, which also embeds calibration data (unique to each camera) that’s carried into the editing process for more precise and stable footage.
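To put those capture specs in perspective, some back-of-the-envelope arithmetic (using only the figures stated above; compression and bit depth are not factored in) shows the raw pixel throughput the camera and editing pipeline have to handle:

```python
# Rough pixel-throughput math for the URSA Cine Immersive,
# from the stated specs: 8,160 × 7,200 per lens, 2 lenses, up to 90 FPS.
width, height = 8160, 7200
lenses = 2
fps = 90

per_lens_mp = width * height / 1_000_000
pixels_per_second = width * height * lenses * fps

print(f"{per_lens_mp:.2f}MP per lens (marketed as 59MP)")
print(f"{pixels_per_second / 1e9:.1f} billion pixels/s at {fps} FPS")
# → 58.75MP per lens; about 10.6 billion pixels/s before compression
```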

The forthcoming update to the DaVinci Resolve Studio editing software will include features specific to editing footage from the camera:

  • Immersive Video Viewer: Pan, tilt, and roll clips on 2D monitors or directly on Apple Vision Pro
  • Seamless Transitions: Clean master files using metadata-based bypass for Apple Vision Pro transitions
  • Export Presets: Streamlined delivery to Apple Vision Pro-ready packages

Both Blackmagic and Apple hope the release of the camera and streamlined editing workflow will make it easier for filmmakers to capture and release content in the Apple Immersive Video format.

It’s unclear if the camera and editor will work equally well for capturing VR180 footage for playback on other platforms and headsets, or if there’s something proprietary to the Apple Immersive Video format that would prevent straightforward compatibility and multi-platform releases.

Filed Under: Apple Vision Pro News & Reviews, XR Industry News

Hands-on: Samsung’s Android XR Headset is a Curious Combo of Quest & Vision Pro, With One Stand-out Advantage

December 12, 2024 From roadtovr

Samsung is the first partner to formally announce a new MR headset based on the newly announced Android XR. The device, codenamed “Project Moohan,” is planned for consumer release in 2025. We went hands-on with an early version.

Note: Samsung and Google aren’t yet sharing any key details for this headset like resolution, weight, field-of-view, or price. During my demo I also wasn’t allowed to capture photos or videos, so we only have an official image for the time being.

If I told you that Project Moohan felt like a mashup of Quest and Vision Pro, you’d probably get the idea that it has a lot of overlapping capabilities. But I’m not just making a rough analogy. Just looking at the headset, it’s clear that it has taken significant design cues from Vision Pro. Everything from the colors to the button placement to the calibration steps makes the influence of other products on the market unmistakable.

And then on the software side, if I had told you “please make an OS that mashes together Horizon OS and VisionOS,” and you came back to me with Android XR, I’d say you nailed the assignment.

It’s actually uncanny just how much Project Moohan and Android XR feel like a riff on the two other biggest headset platforms.

But this isn’t a post to say someone stole something from someone else. Tech companies are always borrowing good ideas and good designs from each other—sometimes improving them along the way. So as long as Android XR and Project Moohan got the good parts of others, and avoided the bad parts, that’s a win for developers and users.

And many of the good parts do indeed appear to be there.

Hands-on With Samsung Project Moohan Android XR Headset

Image courtesy Google

Starting from the Project Moohan hardware—it’s a good-looking device, no doubt. It definitely has the ‘goggles’-style look of Vision Pro, as well as a tethered battery pack (not pictured above).

But where Vision Pro has a soft strap (that I find rather uncomfortable without a third-party upgrade), Samsung’s headset has a rigid strap with tightening dial, and an overall ergonomic design that’s pretty close to Quest Pro. That means an open-peripheral design which is great for using the headset for AR. Also like Quest Pro, the headset has some magnetic snap-on blinders for those that want a blocked-out peripheral for fully immersive experiences.

And though the goggles-look and even many of the button placements (and shapes) are strikingly similar to Vision Pro, Project Moohan doesn’t have an external display to show the user’s eyes. Vision Pro’s external ‘EyeSight’ display has been criticized by many, but I maintain it’s a desirable feature, and one that I wish Project Moohan had. Coming from Vision Pro, it’s just kind of awkward to not be able to ‘see’ the person wearing the headset, even though they can see you.

Samsung has been tight-lipped about the headset’s tech details, insisting that it’s still a prototype. However, we have learned the headset is running a Snapdragon XR2+ Gen 2 processor, a more powerful version of the chip in Quest 3 and Quest 3S.

In my hands-on I was able to glean a few details. For one, the headset is using pancake lenses with automatic IPD adjustment (thanks to integrated eye-tracking). The field-of-view feels smaller than Quest 3 or Vision Pro, but before I say that definitively, I first need to try different forehead pad options (confirmed to be included) which may be able to move my eyes closer to the lenses for a wider field-of-view.

From what I got to try however, the field-of-view did feel smaller—albeit, enough to still feel immersive—and so did the sweet spot due to brightness fall-off toward the outer edges of the display. Again, this is something that may improve if the lenses were closer to my eyes, but the vibe I got for now is that, from a lens standpoint, Meta’s Quest 3 is still leading, followed by Vision Pro, with Project Moohan a bit behind.

Although Samsung has confirmed that Project Moohan will have its own controllers, I didn’t get to see or try them yet. I was told they haven’t decided if the controllers will ship with the headset by default or be sold separately.

So it was all hand-tracking and eye-tracking input in my time with the headset. Again, this was a surprisingly similar mashup of both Horizon OS and VisionOS. You can use raycast cursors like Horizon OS or you can use eye+pinch inputs like VisionOS. The Samsung headset also includes downward-facing cameras so pinches can be detected when your hands are comfortably in your lap.

When I actually got to put the headset on, the first thing I noticed was how sharp my hands appeared. From memory, the headset’s passthrough cameras seem to deliver a sharper image than Quest 3 and less motion blur than Vision Pro (though I only got to test in excellent lighting conditions). And since my hands looked sharp while objects further away looked less so, it almost felt like the passthrough cameras were focused at roughly arm’s-length distance.

Inside Android XR

Anyway, onto Android XR. As said, it’s immediately comparable to a mashup of Horizon OS and VisionOS. You’ll see the same kind of ‘home screen’ as Vision Pro, with app icons on a transparent background. Look and pinch to select one and you get a floating panel (or a few) containing the app. It’s even the same gesture to open the home screen (look at your palm and pinch).

The system windows themselves look closer to those of Horizon OS than VisionOS, with mostly opaque backgrounds and the ability to move the window anywhere by reaching for an invisible frame that wraps around the entire panel.

In addition to flat apps, Android XR can do fully immersive stuff too. I got to see a VR version of Google Maps which felt very similar to Google Earth VR, allowing me to pick anywhere on the globe to visit, including the ability to see locations like major cities modeled in 3D, Street View imagery, and, newly, volumetric captures of interior spaces.

While Street View is monoscopic 360 imagery, the volumetric captures are rendered in real-time and fully explorable. Google said this was a Gaussian splat solution, though I’m not clear on whether it was generated from existing interior photography that’s already available on standard Google Maps, or if it required a brand new scan. It wasn’t nearly as sharp as you’d expect from a photogrammetry scan, but not bad either. Google said the capture was running on-device and not streamed, and that sharpness is expected to improve over time.

Google Photos has also been updated for Android XR, including the ability to automatically convert any existing 2D photo or video from your library into 3D. In the brief time I had with it, the conversions looked really impressive; similar in quality to the same feature on Vision Pro.

YouTube is another app Google has updated to take full advantage of Android XR. In addition to watching regular flatscreen content on a large, curved display, you can also watch the platform’s existing library of 180, 360, and 3D content. Not all of it is super high quality, but it’s nice that it’s not being forgotten—and will surely be added to as more headsets are able to view this kind of media.

Google also showed me a YouTube video that was originally shot in 2D but automatically converted to 3D for viewing on the headset. It looked pretty good, seemingly similar in quality to the Google Photos 3D conversion tech. It wasn’t made clear whether this is something YouTube creators would need to opt in to have generated, or something YouTube would just do automatically. I’m sure there are more details to come.

The Stand-out Advantage (for now)

Android XR and Project Moohan, both from a hardware and software standpoint, feel very much like a Google-fied version of what’s already on the market. But what it clearly does better than any other headset right now is conversational AI.

Google’s AI agent, Gemini (specifically the ‘Project Astra‘ variant) can be triggered right from the home screen. Not only can it hear you, but it can see what you see in both the real world and the virtual world—continuously. Its ongoing perception of what you’re saying and what you’re seeing makes it feel smarter, better integrated, and more conversational than the AI agents on contemporary headsets.

Yes, Vision Pro has Siri, but Siri can only hear you and is mostly focused on single tasks rather than an ongoing conversation.

And Quest has an experimental Meta AI agent that can hear you and see what you’re seeing—but only the real world. It has no sense of what virtual content is in front of you, which creates a weird disconnect. Meta says this will change eventually, but for now that’s how it works. And in order to ‘see’ things, you have to ask it a question about your environment and then stand still while it makes a ‘shutter’ sound, then starts thinking about that image.

Gemini, on the other hand, gets something closer to a low-framerate video feed of what you’re seeing in both the real and virtual worlds, which means no awkward pauses to make sure you’re looking directly at the thing you asked about while a single picture is taken.
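The practical difference between a one-shot capture and a continuous low-framerate feed can be sketched as a simple throttling loop. To be clear, all the names and the frame rate below are hypothetical illustrations; Gemini’s actual perception pipeline hasn’t been made public.

```python
import math


class PerceptionFeed:
    """Toy model of a continuous, throttled visual feed for an AI agent.

    Instead of capturing one still image per question, frames are sampled
    at a low fixed rate so the agent always has a recent view of both real
    and virtual content. All names here are hypothetical illustrations.
    """

    def __init__(self, fps=2.0):
        self.interval = 1.0 / fps          # e.g. 2 fps -> one frame every 0.5 s
        self.last_sent = -math.inf         # timestamp of the last forwarded frame
        self.frames_sent = 0

    def maybe_send(self, frame, now):
        """Forward a frame only if enough time has passed since the last one."""
        if now - self.last_sent >= self.interval:
            self.last_sent = now
            self.frames_sent += 1
            return True                    # frame goes to the model
        return False                       # frame dropped to save bandwidth/compute


feed = PerceptionFeed(fps=2.0)
# Simulate a 10-frame burst captured every 0.1 s; only two pass the throttle.
sent = [feed.maybe_send(f"frame{i}", now=i * 0.1) for i in range(10)]
```

The point of the sketch is that the agent’s view refreshes on its own schedule, so the user never has to hold still and wait for a deliberate “shutter” moment.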

Gemini on Android XR also has memory, which gives it a boost when it comes to contextual understanding. Google says it has a rolling 10-minute memory and retains “key details of past conversations,” which means you can refer back not only to things you talked about, but also to things you saw.
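A rolling time window like the one Google describes can be modeled with a simple deque that evicts stale entries. This is only a sketch of the general data structure, assuming the stated 10-minute retention; the class and method names are hypothetical, and Gemini’s real implementation is not public.

```python
from collections import deque


class RollingMemory:
    """Sketch of a rolling conversational memory window.

    Assumes the 10-minute retention Google describes; everything else
    here is a hypothetical illustration, not Gemini's actual design.
    """

    WINDOW_SECONDS = 10 * 60  # rolling 10-minute window

    def __init__(self):
        self.events = deque()  # (timestamp, description) pairs, oldest first

    def observe(self, timestamp, description):
        """Record something the user said or saw, evicting expired entries."""
        self.events.append((timestamp, description))
        cutoff = timestamp - self.WINDOW_SECONDS
        while self.events and self.events[0][0] < cutoff:
            self.events.popleft()

    def recall(self, keyword):
        """Return the most recent remembered event matching a keyword, if any."""
        for ts, desc in reversed(self.events):
            if keyword in desc:
                return desc
        return None


mem = RollingMemory()
mem.observe(0, "saw a Spanish sign: 'Salida'")
mem.observe(120, "saw a French sign: 'Sortie'")
mem.observe(700, "asked about a landmark")  # the t=0 entry is now older than 10 min
```

After the last observation, a query about the Spanish sign finds nothing (it has aged out of the window), while the French sign is still recallable.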

I was shown what is by now becoming a common AI demo: you’re in a room filled with stuff and you can ask questions about it. I tried to trip the system up with a few sly questions, and was impressed at its ability to avoid the diversions.

I asked Gemini on Android XR to translate a sign written in Spanish into English. It quickly gave me a translation. Then I asked it to translate another nearby sign into French—knowing full well that this sign was already in French. Gemini had no problem with this, correctly noting, “this sign is already in French, it says [xyz],” and it even read the French words in a French accent.

I moved on to asking about some other objects in the room, and after a few minutes had passed since asking about the signs, I asked, “what did that sign say earlier?” It knew what I was talking about and read the French sign aloud. Then I said, “what about the one before that?”…

A few years ago this question—”what about the one before that?”—would have been a wildly challenging question for any AI system (and it still is for many). Answering it correctly requires multiple levels of context from our conversation up to that point, and an understanding of how the thing I had just asked about relates to another thing we had talked about previously.

But it knew exactly what I meant, and quickly read the Spanish sign back to me. Impressive.

Gemini on Android XR can also do more than just answer general questions. It remains to be seen how deep this will be at launch, but Google showed me a few ways that Gemini can actually control the headset.

For one, asking it to “take me to the Eiffel Tower” pulls up an immersive Google Maps view so I can see it in 3D. And since it can see virtual content as well as real, I can continue having a fairly natural conversation, with questions like “how tall is it?” or “when was it built?”

Gemini can also fetch specific YouTube videos that it thinks are the right answer to your query. So saying something like “show a video of the view from the ground,” while looking at the virtual Eiffel Tower, will pop up a YouTube video showing what you asked for.

Ostensibly, Gemini on Android XR should also be able to do the usual assistant stuff that most phone AI can do (i.e. sending text messages, composing an email, setting reminders), but it will be interesting to see how deep it goes with XR-specific capabilities.

Gemini on Android XR feels like the best version of an AI agent on a headset yet (including what Meta has right now on their Ray-Ban smartglasses) but Apple and Meta are undoubtedly working toward similar capabilities. How long Google can maintain the lead here remains to be seen.

Gemini on Project Moohan feels like a nice value-add when using the headset for spatial productivity purposes, but its true destiny probably lies on smaller, everyday wearable smartglasses, which I also got to try… but more on that in another article.

Filed Under: Feature, hardware preview, News, XR Industry News

Google Announces Android XR Operating System Alongside Samsung MR Headset

December 12, 2024 From roadtovr

Google today announced Android XR, a new core branch of Android, designed as a spatial operating system for XR headsets and glasses. The company is pitching this as a comprehensive spatial computing platform, and hopes to establish its own territory in the XR landscape against incumbents Meta and Apple.

Google has revealed Android XR and it’s basically what the name implies: a full-blown version of Android that’s been adapted to run on XR headsets, supports the entire existing library of flat Android apps, and opens the door to “spatialized” versions of those apps, as well as completely immersive VR content.

Samsung’s newly announced headset, codenamed Project Moohan, will be the first MR headset to launch with Android XR next year. Check out our early hands-on with the headset.

Samsung Project Moohan | Image courtesy Google

Google tells us that Android apps currently on the Play Store will be available on immersive Android XR headsets by default, with developers able to opt-out if they choose. That means a huge library of existing flat apps will be available on the device on day one—great for giving the headset a baseline level of productivity.

That includes all of Google’s major first-party apps like Chrome, Gmail, Calendar, Drive, and more. Some of Google’s apps have been updated to uniquely take advantage of Android XR (or, as Google says, they have been “spatialized”).

Google TV, for instance, can be watched on a large, curved screen, with info panels popping out of the main window for better use of screen real estate.

Google Photos has been redesigned with a layout that’s unique to Android XR, and the app can automatically convert photos and videos to 3D (with pretty impressive results).

YouTube not only supports a large curved screen for viewing but also supports the platform’s existing library of 360, 180, and 3D content.

Chrome supports multiple browser windows for multi-tasking while web-browsing, which pairs nicely with Android XR’s built-in support for bluetooth mice and keyboards.

And Google Maps has a fully immersive view that’s very similar to Google Earth VR, including the ability to view Street View photography and newly added volumetric captures of business interiors and other places (based on Gaussian splats).

Functionally, this is all pretty similar to what Apple is doing with VisionOS, but Android flavored.

Where Android XR significantly differentiates itself is through its AI integration. Gemini is built right into Android XR. But this goes far beyond a chat agent. Gemini on Android XR is a conversational agent which allows you to have free-form voice conversations about what you see in both the real world and the virtual world. That means you can ask it for help in an app that’s floating in front of you, or ask it something about things you see around you via passthrough.

Apple has Siri on VisionOS, but it can’t see anything in or out of the headset. Meta has an experimental AI on Horizon OS that can see things in the real world around you, but it can’t see things in the virtual world. Gemini’s ability to consider both real and virtual content makes it feel more seamlessly integrated into the system and more useful.

Android XR is designed to power not only immersive MR headsets, but smartglasses too. In the near-term, Google envisions Android XR smartglasses as HUD-like companions to your smartphone, rather than full AR.

Prototype Android XR smartglasses | Image courtesy Google

And it’s Gemini that forms the core of Google’s plans for Android XR on smartglasses. The near-term devices for this use-case are compact glasses that can actually pass for regular-looking glasses, offering small displays floating in your field-of-view for HUD-like informational purposes, as well as audio feedback for conversations with Gemini. Demonstrated uses include showing texts, directions, and translations. Similar to Android XR on an MR headset, these smartglasses are almost certain to be equipped with cameras, giving Gemini the ability to see and respond to things you see.

It’s a lot like what Google Glass was doing a decade ago, but sleeker and much smarter.

While no specific smartglasses products have been announced for Android XR yet, Google and Samsung have been collaborating on an MR headset called “Project Moohan,” which Samsung will launch to consumers next year.

When it comes to development, Google is supporting a wide gamut of dev pathways. For devs building with Android Studio, a new Jetpack XR SDK extends that workflow to help developers create spatial versions of their existing flat apps. This includes a new Android XR Emulator for testing Android XR apps without a headset. Unity is also supported through a new Android XR Extension, as well as WebXR and OpenXR.

Google also says it’s bringing new capabilities to OpenXR through vendor extensions, including the following:

  • AI-powered hand mesh, designed to adapt to the shape and size of hands to better represent the diversity of your users
  • Detailed depth textures that allow real world objects to occlude virtual content
  • Sophisticated light estimation, for lighting your digital content to match real-world lighting conditions
  • New trackables that let you bring real world objects like laptops, phones, keyboards, and mice into a virtual environment

On the design side, Google has updated its ‘Material Design’ to include new components and layouts that automatically adapt for spatial apps.

Developers interested in building for Android XR can reach out via this form to express interest in an Android XR Developer Bootcamp coming in 2025.

Filed Under: News, XR Industry News
