
VRSUN

Hot Virtual Reality News


Google & Pico Adoption Further Cements OpenXR as Industry Standard, With One Major Holdout

December 18, 2024 From roadtovr

OpenXR is an open standard made to improve compatibility between XR software and XR headsets. Google—one of the biggest tech companies in the world—is adopting the standard right out of the gate, joining other major firms like Meta and Microsoft. Other players (like ByteDance recently) also support the standard, cementing it as not just an open standard, but an industry standard. And while the vast majority of major XR companies now support OpenXR, a major holdout remains.

Initially announced in 2017, OpenXR is an open standard that makes it easier for developers to build XR applications that can run on a wide range of XR headsets with little to no modifications. While major players in the space like Meta, Microsoft, Valve, HTC, and plenty more all support OpenXR, the industry’s big holdout is—can you guess? Apple.

Apple is somewhat notorious for rejecting industry standards and forging its own path; sometimes the company sticks to its own proprietary formats and other times ends up adopting the industry standard in the end.

Vision Pro not only doesn’t support OpenXR, but it doesn’t have built-in support for motion-tracked controllers (which most existing XR content requires). If Vision Pro supported OpenXR, it would be significantly less work for developers to bring their XR apps to the headset (though the lack of controllers still poses a major hurdle).

As ever, Apple is the odd one out.

Meanwhile, Google wasted no time confirming its newly announced Android XR platform will support OpenXR, making it easier for developers to port XR apps that were built for headsets like Quest.

Google says Android XR is already compatible with OpenXR 1.1, and the company has built out some of its own ‘vendor extensions’, which are new capabilities that extend what OpenXR can do on specific devices. Vendor extensions sometimes go on to become part of future versions of OpenXR.
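
For developers, ‘supporting OpenXR’ boils down to coding against one common API and letting each runtime supply the implementation underneath. As a rough illustration (not something from the article), here is a minimal C sketch that targets the OpenXR 1.1 API and lists whatever extensions, Khronos-ratified or vendor-specific, the installed runtime advertises; the vendor extension name checked in the loop is a made-up placeholder, since Google’s actual extension identifiers aren’t given here.

// Minimal OpenXR sketch: list the runtime's extensions, then target API 1.1.
// Illustrative only; a real app goes on to create a system, session,
// swapchains, and a frame loop. Requires the OpenXR loader and headers.
#include <openxr/openxr.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    // Ask the runtime (Meta, Pico, Android XR, SteamVR, etc.) how many
    // extensions it advertises, then fetch the list.
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(NULL, 0, &count, NULL);

    XrExtensionProperties *exts = calloc(count, sizeof(*exts));
    for (uint32_t i = 0; i < count; i++)
        exts[i].type = XR_TYPE_EXTENSION_PROPERTIES;
    xrEnumerateInstanceExtensionProperties(NULL, count, &count, exts);

    for (uint32_t i = 0; i < count; i++) {
        printf("%s (v%u)\n", exts[i].extensionName, exts[i].extensionVersion);
        // Placeholder name: a real app would check for the vendor extension
        // string documented by the platform before enabling a feature.
        if (strcmp(exts[i].extensionName, "XR_EXAMPLE_vendor_capability") == 0)
            printf("  -> optional vendor capability available\n");
    }
    free(exts);

    // Create an instance targeting the OpenXR 1.1 API mentioned in the article.
    XrInstanceCreateInfo createInfo = { XR_TYPE_INSTANCE_CREATE_INFO };
    strncpy(createInfo.applicationInfo.applicationName, "PortableXRApp",
            XR_MAX_APPLICATION_NAME_SIZE - 1);
    createInfo.applicationInfo.apiVersion = XR_MAKE_VERSION(1, 1, 0);

    XrInstance instance = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateInstance(&createInfo, &instance))) {
        fprintf(stderr, "No OpenXR runtime available\n");
        return 1;
    }

    XrInstanceProperties props = { XR_TYPE_INSTANCE_PROPERTIES };
    if (XR_SUCCEEDED(xrGetInstanceProperties(instance, &props)))
        printf("Runtime: %s\n", props.runtimeName);

    xrDestroyInstance(instance);
    return 0;
}

The same application logic runs unchanged whether the runtime underneath comes from Meta, Pico, Valve, or Android XR, which is exactly the portability the standard is meant to deliver.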

Last month Pico (ByteDance’s XR division) also announced that its runtime is now compliant with the OpenXR 1.1 standard on Pico 4 Ultra, with plans to bring support to Pico 4 and Neo 3 by mid-2025.

Pico also has its own ideas about where the standard should go in the future. The company recently presented a framework for standardizing the way XR applications can run simultaneously, so users can keep multiple XR apps in a shared space. Pico says it’s advocating for this approach to the OpenXR working group, an industry body which guides the evolution of the standard.

With the addition of support from both Google and Pico, OpenXR has truly achieved industry standard status, even if the odds of Apple ever adopting it remain slim.

Filed Under: News, XR Industry News

Hands-on: Samsung’s Android XR Headset is a Curious Combo of Quest & Vision Pro, With One Stand-out Advantage

December 12, 2024 From roadtovr

Samsung is the first partner to formally announce a new MR headset based on the newly announced Android XR. The device, codenamed “Project Moohan,” is planned for consumer release in 2025. We went hands on with an early version.

Note: Samsung and Google aren’t yet sharing any key details for this headset like resolution, weight, field-of-view, or price. During my demo I also wasn’t allowed to capture photos or videos, so we only have an official image for the time being.

If I told you that Project Moohan felt like a mashup between Quest and Vision Pro, you’d probably get the idea that it has a lot of overlapping capabilities. But I’m not just making a rough analogy. Just looking at the headset, it’s clear that it has taken significant design cues from Vision Pro. Everything from the colors to the button placement to the calibration steps makes it clear the company was unmistakably aware of other products on the market.

And then on the software side, if I had told you “please make an OS that mashes together Horizon OS and VisionOS,” and you came back to me with Android XR, I’d say you nailed the assignment.

It’s actually uncanny just how much Project Moohan and Android XR feel like a riff on the two other biggest headset platforms.

But this isn’t a post to say someone stole something from someone else. Tech companies are always borrowing good ideas and good designs from each other—sometimes improving them along the way. So as long as Android XR and Project Moohan got the good parts of others, and avoided the bad parts, that’s a win for developers and users.

And many of the good parts do indeed appear to be there.

Hands-on With Samsung Project Moohan Android XR Headset

Image courtesy Google

Starting from the Project Moohan hardware—it’s a good-looking device, no doubt. It definitely has the ‘goggles’-style look of Vision Pro, as well as a tethered battery pack (not pictured above).

But where Vision Pro has a soft strap (that I find rather uncomfortable without a third-party upgrade), Samsung’s headset has a rigid strap with tightening dial, and an overall ergonomic design that’s pretty close to Quest Pro. That means an open-peripheral design which is great for using the headset for AR. Also like Quest Pro, the headset has some magnetic snap-on blinders for those that want a blocked-out peripheral for fully immersive experiences.

And though the goggles-look and even many of the button placements (and shapes) are strikingly similar to Vision Pro, Project Moohan doesn’t have an external display to show the user’s eyes. Vision Pro’s external ‘EyeSight’ display has been criticized by many, but I maintain it’s a desirable feature, and one that I wish Project Moohan had. Coming from Vision Pro, it’s just kind of awkward to not be able to ‘see’ the person wearing the headset, even though they can see you.

Samsung has been tight-lipped about the headset’s tech details, insisting that it’s still a prototype. However, we have learned the headset is running a Snapdragon XR2+ Gen 2 processor, a more powerful version of the chip in Quest 3 and Quest 3S.

In my hands-on I was able to glean a few details. For one, the headset is using pancake lenses with automatic IPD adjustment (thanks to integrated eye-tracking). The field-of-view feels smaller than Quest 3 or Vision Pro, but before I say that definitively, I first need to try different forehead pad options (confirmed to be included) which may be able to move my eyes closer to the lenses for a wider field-of-view.

From what I got to try however, the field-of-view did feel smaller—albeit still enough to feel immersive—and so did the sweet spot, due to brightness fall-off toward the outer edges of the display. Again, this is something that might improve if the lenses were closer to my eyes, but the vibe I got for now is that, from a lens standpoint, Meta’s Quest 3 is still leading, followed by Vision Pro, with Project Moohan a bit behind.

Although Samsung has confirmed that Project Moohan will have its own controllers, I didn’t get to see or try them yet. I was told they haven’t decided if the controllers will ship with the headset by default or be sold separately.

So it was all hand-tracking and eye-tracking input in my time with the headset. Again, this was a surprisingly similar mashup of both Horizon OS and VisionOS. You can use raycast cursors like Horizon OS or you can use eye+pinch inputs like VisionOS. The Samsung headset also includes downward-facing cameras so pinches can be detected when your hands are comfortably in your lap.

When I actually got to put the headset on, the first thing I noticed was how sharp my hands appeared to be. From memory, the headset’s passthrough cameras appear to have a sharper image than Quest 3 and less motion blur than Vision Pro (but I only got to test in excellent lighting conditions). Considering that my hands seemed sharp while things further away seemed less so, it almost felt like the passthrough cameras might be focused at roughly arm’s-length distance.


Inside Android XR

Anyway, onto Android XR. As said, it’s immediately comparable to a mashup of Horizon OS and VisionOS. You’ll see the same kind of ‘home screen’ as Vision Pro, with app icons on a transparent background. Look and pinch to select one and you get a floating panel (or a few) containing the app. It’s even the same gesture to open the home screen (look at your palm and pinch).

The system windows themselves look closer to those of Horizon OS than VisionOS, with mostly opaque backgrounds and the ability to move the window anywhere by reaching for an invisible frame that wraps around the entire panel.

In addition to flat apps, Android XR can do fully immersive stuff too. I got to see a VR version of Google Maps which felt very similar to Google Earth VR, allowing me to pick anywhere on the globe to visit, including the ability to see locations like major cities modeled in 3D, Street View imagery, and, newly, volumetric captures of interior spaces.

While Street View is monoscopic 360 imagery, the volumetric captures are rendered in real-time and fully explorable. Google said this was a gaussian splat solution, though I’m not clear on whether it was generated from existing interior photography that’s already available on standard Google Maps, or if it required a brand new scan. It wasn’t nearly as sharp as you’d expect from a photogrammetry scan, but not bad either. Google said the capture was running on-device and not streamed, and that sharpness is expected to improve over time.

Google Photos has also been updated for Android XR, including the ability to automatically convert any existing 2D photo or video from your library into 3D. In the brief time I had with it, the conversions looked really impressive; similar in quality to the same feature on Vision Pro.

YouTube is another app Google has updated to take full advantage of Android XR. In addition to watching regular flatscreen content on a large, curved display, you can also watch the platform’s existing library of 180, 360, and 3D content. Not all of it is super high quality, but it’s nice that it’s not being forgotten—and will surely be added to as more headsets are able to view this kind of media.

Google also showed me a YouTube video that was originally shot in 2D but automatically converted to 3D to be viewed on the headset. It looked pretty good, seemingly similar in quality to the Google Photos 3D conversion tech. It wasn’t made clear whether this is something that YouTube creators would need to opt in to have generated, or something YouTube would just do automatically. I’m sure there are more details to come.

The Stand-out Advantage (for now)

Android XR and Project Moohan, both from a hardware and software standpoint, feel very much like a Google-fied version of what’s already on the market. But what the platform clearly does better than any other headset right now is conversational AI.

Google’s AI agent, Gemini (specifically the ‘Project Astra’ variant), can be triggered right from the home screen. Not only can it hear you, but it can see what you see in both the real world and the virtual world—continuously. Its ongoing perception of what you’re saying and what you’re seeing makes it feel smarter, better integrated, and more conversational than the AI agents on contemporary headsets.

Yes, Vision Pro has Siri, but Siri can only hear you and is mostly focused on single-tasks rather than an ongoing conversation.

And Quest has an experimental Meta AI agent that can hear you and see what you’re seeing—but only the real world. It has no sense of what virtual content is in front of you, which creates a weird disconnect. Meta says this will change eventually, but for now that’s how it works. And in order to ‘see’ things, you have to ask it a question about your environment and then stand still while it makes a ‘shutter’ sound, then starts thinking about that image.

Gemini, on the other hand, gets something closer to a low-framerate video feed of what you’re seeing in both the real and virtual worlds, which means no awkward pauses to make sure you’re looking directly at the thing you asked about as a single picture is taken.

Gemini on Android XR also has memory, which gives it a boost when it comes to contextual understanding. Google says it has a rolling 10-minute memory and retains “key details of past conversations,” which means you can refer not only to things you talked about recently, but also things you saw.

I was shown what is by now becoming a common AI demo: you’re in a room filled with stuff and you can ask questions about it. I tried to trip the system up with a few sly questions, and was impressed at its ability to avoid the diversions.

I asked Gemini on Android XR to translate a sign written in Spanish into English. It quickly gave me a translation. Then I asked it to translate another nearby sign into French—knowing full well that this sign was already in French. Gemini had no problem with this, correctly noting, “this sign is already in French, it says [xyz],” and it even said the French words in a French accent.

I moved on to asking about some other objects in the room, and after it had been a few minutes since asking about the signs, I asked it “what did that sign say earlier?” It knew what I was talking about and read the French sign aloud. Then I said “what about the one before that?”….

A few years ago this question—”what about the one before that?”—would have been a wildly challenging question for any AI system (and it still is for many). Answering it correctly requires multiple levels of context from our conversation up to that point, and an understanding of how the thing I had just asked about relates to another thing we had talked about previously.

But it knew exactly what I meant, and quickly read the Spanish sign back to me. Impressive.

Gemini on Android XR can also do more than just answer general questions. It remains to be seen how deep this will be at launch, but Google showed me a few ways that Gemini can actually control the headset.

For one, asking it to “take me to the Eiffel Tower” pulls up an immersive Google Maps view so I can see it in 3D. And since it can see virtual content as well as real, I can continue having a fairly natural conversation, with questions like “how tall is it?” or “when was it built?”

Gemini can also fetch specific YouTube videos that it thinks are the right answer to your query. So saying something like “show a video of the view from the ground,” while looking at the virtual Eiffel tower, will pop up a YouTube video to show what you asked for.

Ostensibly, Gemini on Android XR should also be able to do the usual assistant stuff that most phone AI can do (i.e. sending text messages, composing an email, setting reminders), but it will be interesting to see how deep it will go with XR-specific capabilities.

Gemini on Android XR feels like the best version of an AI agent on a headset yet (including what Meta has right now on their Ray-Ban smartglasses) but Apple and Meta are undoubtedly working toward similar capabilities. How long Google can maintain the lead here remains to be seen.

Gemini on Project Moohan feels like a nice value-add when using the headset for spatial productivity purposes, but its true destiny probably lies on smaller, everyday wearable smartglasses, which I also got to try… but more on that in another article.

Filed Under: Feature, hardware preview, News, XR Industry News

Google Announces Android XR Operating System Alongside Samsung MR Headset

December 12, 2024 From roadtovr

Google today announced Android XR, a new core branch of Android, designed as a spatial operating system for XR headsets and glasses. The company is pitching this as a comprehensive spatial computing platform, and hopes to establish its own territory in the XR landscape against incumbents Meta and Apple.

Google has revealed Android XR and it’s basically what the name implies: a full-blown version of Android that’s been adapted to run on XR headsets, supports the entire existing library of flat Android apps, and opens the door to “spatialized” versions of those apps, as well as completely immersive VR content.

Samsung’s newly announced headset, codenamed Project Moohan, will be the first MR headset to launch with Android XR next year. Check out our early hands-on with the headset.

Samsung Project Moohan | Image courtesy Google

Google tells us that Android apps currently on the Play Store will be available on immersive Android XR headsets by default, with developers able to opt-out if they choose. That means a huge library of existing flat apps will be available on the device on day one—great for giving the headset a baseline level of productivity.

That includes all of Google’s major first-party apps like Chrome, Gmail, Calendar, Drive, and more. Some of Google’s apps have been updated to uniquely take advantage of Android XR (or, as Google says, they have been “spatialized”).

Google TV, for instance, can be watched on a large, curved screen, with info panels popping out of the main window for better use of real-estate.

Google Photos has been redesigned with a layout that’s unique to Android XR, and the app can automatically convert photos and videos to 3D (with pretty impressive results).

YouTube not only supports a large curved screen for viewing but also supports the platform’s existing library of 360, 180, and 3D content.

Chrome supports multiple browser windows for multi-tasking while web-browsing, which pairs nicely with Android XR’s built-in support for bluetooth mice and keyboards.

And Google Maps has a fully immersive view that’s very similar to Google Earth VR, including the ability to view Street View photography and newly added volumetric captures of business interiors and other places (based on gaussian splats).

Functionally, this is all pretty similar to what Apple is doing with VisionOS, but Android flavored.

Where Android XR significantly differentiates itself is through its AI integration. Gemini is built right into Android XR. But this goes far beyond a chat agent. Gemini on Android XR is a conversational agent which allows you to have free-form voice conversations about what you see in both the real world and the virtual world. That means you can ask it for help in an app that’s floating in front of you, or ask it something about things you see around you via passthrough.

Apple has Siri on VisionOS, but it can’t see anything in or out of the headset. Meta has an experimental AI on Horizon OS that can see things in the real world around you, but it can’t see things in the virtual world. Gemini’s ability to consider both real and virtual content makes it feel more seamlessly integrated into the system and more useful.

Android XR is designed to power not only immersive MR headsets, but smartglasses too. In the near-term, Google envisions Android XR smartglasses as HUD-like companions to your smartphone, rather than full AR.

Prototype Android XR smartglasses | Image courtesy Google

And it’s Gemini that forms the core of Google’s plans for Android XR on smartglasses. The near-term devices for this use-case are compact glasses that can actually pass for regular-looking glasses, and offer small displays floating in your field-of-view for HUD-like informational purposes, as well as audio feedback for conversations with Gemini. Demonstrated uses include showing texts, directions, and translations. Similar to Android XR on an MR headset, these smartglasses are almost certain to be equipped with cameras, giving Gemini the ability to see and respond to things you see.

It’s a lot like what Google Glass was doing a decade ago, but sleeker and much smarter.

While no specific smartglasses products have been announced for Android XR yet, Google and Samsung have been collaborating on an MR headset called “Project Moohan,” which Samsung will launch to consumers next year.

When it comes to development, Google is supporting a wide gamut of dev pathways. For devs building with Android Studio, a new Jetpack XR SDK extends that workflow to help developers create spatial versions of their existing flat apps. This includes a new Android XR Emulator for testing Android XR apps without a headset. Unity is also supported through a new Android XR Extension, as well as WebXR and OpenXR.

Google also says it’s bringing new capabilities to OpenXR through vendor extensions, including the following:

  • AI-powered hand mesh, designed to adapt to the shape and size of hands to better represent the diversity of your users
  • Detailed depth textures that allow real world objects to occlude virtual content
  • Sophisticated light estimation, for lighting your digital content to match real-world lighting conditions
  • New trackables that let you bring real world objects like laptops, phones, keyboards, and mice into a virtual environment
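
For a sense of how vendor extensions like these reach developers: OpenXR extension functions aren’t exported by the loader directly, so an app enables an extension by name when creating its instance and then resolves the extension’s entry points at runtime, falling back gracefully on headsets that don’t offer it. The sketch below is illustrative only; the extension string and function name are placeholders, not Google’s real identifiers.

// Illustrative only: resolving a vendor extension's entry point at runtime.
// "XR_EXAMPLE_hand_mesh" and "xrGetHandMeshEXAMPLE" are placeholder names;
// substitute the real identifiers once the platform documents them.
#include <openxr/openxr.h>
#include <stdio.h>

// Placeholder signature for a hypothetical vendor call.
typedef XrResult (XRAPI_PTR *PFN_xrGetHandMeshEXAMPLE)(XrSession session, void *outMesh);

static PFN_xrGetHandMeshEXAMPLE pfnGetHandMesh = NULL;

// Call after creating an XrInstance with the vendor extension listed in
// enabledExtensionNames (and only if the runtime advertised it).
int load_hand_mesh_extension(XrInstance instance) {
    XrResult res = xrGetInstanceProcAddr(instance, "xrGetHandMeshEXAMPLE",
                                         (PFN_xrVoidFunction *)&pfnGetHandMesh);
    if (XR_FAILED(res) || pfnGetHandMesh == NULL) {
        fprintf(stderr, "Vendor hand-mesh extension not available; falling back to standard hand joints\n");
        return 0; // Feature simply stays disabled on runtimes without the extension.
    }
    return 1;
}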

On the design side, Google has updated its ‘Material Design’ to include new components and layouts that automatically adapt for spatial apps.

Developers interested in building for Android XR can reach out via this form to express interest in an Android XR Developer Bootcamp coming in 2025.

Filed Under: News, XR Industry News

Apple Vision Pro Gets Ultrawide Mac Virtual Display in visionOS 2.2 Release

December 12, 2024 From roadtovr

Previously only available in beta, Apple has now pushed its panoramic display feature to all Vision Pro users, bringing the choice of three virtual screen sizes when using Mac Virtual Display.

Mac Virtual Display initially launched with a single virtual screen size back in February, which also allowed users to have multiple app windows, although screen real estate was somewhat limited for a device aiming to be a general computing machine first and an entertainment device second.

Now, in visionOS 2.2, all Vision Pro users have access to two new display formats: ‘Wide’ (21:9) and ‘Ultrawide’ (32:9), the latter of which allows for max resolutions “equivalent to two 4K monitors, side by side,” as Apple said at its unveiling in June. Mac-side dynamic foveated rendering also keeps content “sharp wherever you look,” the company added.
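
For reference, two 4K UHD monitors (3,840 × 2,160 each) placed side by side form a 7,680 × 2,160 canvas, and 7,680:2,160 reduces exactly to 32:9, matching the ‘Ultrawide’ aspect ratio Apple quotes.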

In our hands-on test of the feature, we found it to be a huge value-add to the headset.

The feature requires a Mac computer with macOS Sequoia 15.2, which covers a pretty wide range of devices, including everything from 2017-era iMac Pros to the company’s latest M4 chip MacBooks.

Additionally, the visionOS 2.2 update also includes support for iOS’s Personal Hotspot feature, which the company says now lets you share the cellular data connection of your iPhone or iPad with other devices, including Vision Pro, effectively giving you access to 5G download speeds.

Filed Under: Apple Vision Pro News & Reviews, News

This Modder Hopes to Bring VR Support to Massive ‘Fallout: London’ DLC

December 10, 2024 From roadtovr

Fallout: London is a total conversion mod based on Bethesda’s popular Fallout 4 (2015) for PC. While the team behind the mod hasn’t mentioned whether potential VR support is in the works, an intrepid modder is taking things into their own hands.

As first reported by PC Guide’s Charlie Noon, it’s still early days for this particular VR mod, which hopes to let PC VR players jump into the post-apocalyptic London as envisioned by the FOLON modding team. It’s a complete overhaul, with custom maps, assets, and even voice acting—all built independently from Bethesda.

‘Raezroth Elnheim’, the unaffiliated creator behind the VR mod, says on the Fallout: London Discord that it’s “a passion project right now,” though they note they’re “interested in making it a reality.”

Here’s an early look at the VR mod in action, which also includes support for motion controllers.

Notably, while many flatscreen Fallout 4 mods work when applied to the official VR version by managing mods via the Nexus launcher, Fallout: London presents a few more challenges, it seems.

Raezroth says experimentation started when Fallout: London was released back in July; the VR mod is the result of importing Fallout: London files into Bethesda’s Fallout 4 VR (2017), although it’s clearly not a simple plug-and-play solution.

Just how they did it will be revealed “in time,” Raezroth says. In the meantime, the VR mod is still a solo project—at least for now—as the creator is hoping to get the attention of the mod’s FOLON team to dig in further.

Still, progress looks good thus far, as Raezroth notes the mod can load save files and the mod’s custom assets, saying however “there are missing meshes and such [that] need fixing.” Additionally, some patches are required to make the FOLON UI compatible in VR.

While it’s still too early to download Raezroth’s mod yourself, we’ll be keeping our eye on the project and will let you know as soon as you can.

Filed Under: News, PC VR News & Reviews

Meta Announces Multi-year Exclusive Agreement for Spatial Content with James Cameron’s 3D Studio

December 6, 2024 From roadtovr

Meta announced it’s partnering with Lightstorm Vision, James Cameron’s 3D film studio, to produce spatial content across multiple genres, including live events and full-length entertainment.

The agreement includes the production of live sports and concerts, feature films, and TV series featuring “big-name IP,” Meta says in a recent blog post, noting that Quest will be Lightstorm Vision’s exclusive MR hardware platform.

Meta says the collaboration with Lightstorm Vision will include the co-production of original stereoscopic content, but also be geared towards “improving content creators’ ability to make high-quality stereoscopic content through the use of advanced tooling, including employing AI.”

The multi-year partnership was struck after Meta CTO and Reality Labs head Andrew Bosworth demoed some of the company’s latest hardware to Cameron.

“I was amazed by its transformational potential and power, and what it means for content creators globally,” Cameron says. “I’m convinced we’re at a true, historic inflection point. Navigating that future with Meta will ensure ALL of us have the tools to create, experience, and enjoy new and mind-blowing forms of media.”

Cameron is best known for his writing and directorial work on a slew of box office hits, including The Terminator (1984), Aliens (1986), The Abyss (1989), Terminator 2: Judgment Day (1991), Titanic (1997), as well as Avatar (2009) and its sequels.

The filmmaker is also a long-time supporter of 3D, having helped kickstart the rash of 3D films in the 2010s with the development of the Fusion Camera System, which was used to capture stereoscopic 3D for a number of films, including Avatar, Tron: Legacy (2010), and Life of Pi (2012).

Filed Under: News, XR Industry News

Meta Pushes Back on Reported Outsourcing of XR Headset Designs

December 6, 2024 From roadtovr

A recent report from The Information maintains Meta has started outsourcing some design work for upcoming headsets amid a shift to move part of its production out of China. Now, Meta CTO Andrew Bosworth disputes those specific design claims, saying headsets will continue to be designed “in house.”

According to The Information (non-paywalled via SeekingAlpha), Meta is reportedly planning to move half of its Quest headset manufacturing from China to Vietnam, a move intended to avoid steep import tariffs expected to be levied on Chinese-made goods by US President-elect Donald Trump.

Goertek Vietnam | Image courtesy Vietnam Investment Review

The report additionally alleges Meta is set to shift more of its component design, including lenses and displays, to Goertek, the Chinese original design manufacturer (ODM) known for both creating reference designs and manufacturing devices for companies across the XR industry.

Furthermore, Meta has reportedly tapped Goertek and other manufacturers to eventually develop its headsets by 2030, as the company is said to be transitioning its focus more toward its lucrative software business. Such a joint design manufacturing relationship would allegedly include Meta outlining goals to Goertek, which then proposes multiple options for Meta to choose from.

Andrew Bosworth, Meta CTO and head of the company’s XR-focused Reality Labs, has however disputed those specific design claims in a recent X post.

“[S]omeone is pushing the design rumor hard to multiple outlets, and that aspect remains false,” Bosworth says. “We continue to design our headsets in house as we have and have no plans to change that. We always partner with our manufacturers to some degree but nothing material is changing there.”

In a follow-up post, Bosworth underlined that Meta’s work with Goertek will be business as usual.

“To be clear, Goertek is a great partner and as parts of our stack are more mature and used from headset to headset we’re glad to have them carry the designs across which has always been true. But this isn’t a change from how we’ve done business with them even as we scale it up.”

Citing a Meta employee, The Information additionally reports Goertek has begun designing the outer shell for future versions of Meta’s MR headsets, and is now playing a larger role in R&D for other Meta products, including its Ray-Ban Meta smart glasses.

Earlier this year Goertek injected $280 million into its Vietnamese subsidiary, which according to the Shenzhen Stock Exchange filing is said to specialize in manufacturing consumer electronics products, such as headphones, smartwatches, and VR and AR devices.

Filed Under: Meta Quest 3 News & Reviews, News

Samsung Reportedly Set to Unveil Smart Glasses at Galaxy S25 Event in January

December 3, 2024 From roadtovr

A recent report from South Korea’s Yonhap News maintains Samsung is set to unveil a pair of XR glasses at its annual Unpacked product event, which is expected to take place sometime next month.

Samsung promised back in July we’d be hearing more about its forthcoming “XR platform” before the end of this year, which it’s developing in partnership with Google and Qualcomm.

We still don’t know precisely what form its “XR platform” will take, with previous rumors suggesting work on an Apple Vision Pro competitor in addition to a Ray-Ban Meta smart glasses competitor, as reported in October by The Information.

Now, a Yonhap News report (via Techradar) maintains Samsung is expected to unveil some sort of device on stage, coming in the shape of “regular glasses or sunglasses, and weigh[ing] around 50g.”

The report notes the device is expected to have a payment function, gesture recognition, and facial recognition, and further maintains industry insiders expect the product to launch around Q3 2025. Its Android-powered XR software is also expected to be unveiled sometime this month.

Image courtesy Samsung

While Yonhap calls the device “AR glasses” (machine translated from Korean), the rumored weight and the lack of any mention of built-in displays suggests it may be closer in function to Ray-Ban Meta Smart Glasses instead of something like Meta’s Orion AR glasses prototype.

You can read more about the difference between AR glasses and smart glasses in our handy primer, although here’s the short of it: smart glasses don’t overlay immersive imagery, instead providing the user with access to data you might otherwise use on a smartphone or smartwatch, be it via a visual heads-up display or audio output, as is the case with Ray-Ban Meta. AR headsets, on the other hand, do overlay immersive imagery, like HoloLens 2 or Meta Orion, and are consequently more expensive and difficult to build.

Provided Samsung is indeed releasing a pair of smart glasses and not a full-fledged AR device, it would be in good company. According to Meta, its smart glasses partnership with Ray-Ban has been very successful since the product’s initial release in 2021, prompting Meta to extend its collaboration with Ray-Ban parent company EssilorLuxottica into 2030. China’s Xiaomi is also reportedly preparing such a device with the help of long-standing ODM Goertek, which is reported to “fully benchmark” against Ray-Ban Meta.

Notably, this follows a string of Samsung trademarks ostensibly geared towards the next generation of XR devices. In mid-2023, the South Korean tech giant filed the name ‘Samsung Glasses’ with the UK’s Intellectual Property Office. In early 2024, Samsung filed a similar trademark request with the United States Patent and Trademark Office for ‘Galaxy Glasses’.

Whatever the case, Samsung hasn’t said when Unpacked, which typically takes place in January or February, will kick off. An Android Police report suggests, however, that the date has leaked: January 23rd in San Francisco, California.

Filed Under: News, XR Industry News

‘Shapelab’ 3D Modeling App Launches on Quest in Early Access

December 2, 2024 From roadtovr

Leopoly, the studio behind PC VR modeling software Shapelab (2023), has launched a new version built specifically for Quest headsets.

Called Shapelab Lite, the polygon-based 3D modeling app is now in early access for Quest 3, Quest 3S and Quest Pro.

Shapelab Lite offers up many of the core features of the PC VR app’s toolset, targeting its intuitive sculpting tools at beginners, hobbyists, and professionals seeking a standalone 3D modeling solution.

Key features include a polygon mesh-based framework for precision modeling, dynamic topology for flexible detail adjustments, and user-friendly controls for creating 3D assets like props and characters.

“Shapelab Lite represents a significant step forward in making professional-grade 3D modeling accessible to everyone,” said Daniel Andrassy, Chief Product Officer of Shapelab. “We’re excited to bring the core features of Shapelab PCVR to standalone VR users, empowering a new wave of creators. This is just the beginning of what we envision for the future of Shapelab Lite. As an early-access software, we’re actively listening to user feedback to guide future updates and ensure the app meets the needs of our community, keeping in mind the capabilities and constraints of a device with lower processing power compared to the PC version.”

You can find Shapelab Lite on the Horizon Store for Quest 3/S/Pro, priced at $15.

Filed Under: Meta Quest 3 News & Reviews, News

This $45 Headstrap Makes Apple’s $3,500 Headset Much Better

November 25, 2024 From roadtovr

Vision Pro is an incredible headset in many ways, but its most obvious weak point (after the pricetag) is comfort. Apple’s obsession with aesthetics made a headset that’s striking for those looking at it, but less comfortable than it could be for those actually using it. Luckily, fixing this flaw is quite simple.

Many critiques of Vision Pro’s comfort attribute the issue to the headset’s weight. It’s metal after all! So that must be the issue, right?

Well, Vision Pro actually isn’t much heavier than contemporary headsets. Quest 3’s display housing (the headset without the headstrap or facepad) weighs 394g. Vision Pro’s display housing weighs just 81g more at 475g.

Photo by Road to VR

Weight is a key component of headset comfort, but the way a headstrap distributes that weight is also a massively important factor.

And to be fair, even Quest 3’s default soft strap isn’t particularly comfortable. Clearly recognizing this, Meta offers an after-market ‘Elite Strap’, which adds 183g, bringing Quest up to 642g (including the facepad). That’s actually heavier than Vision Pro, with its default strap and facepad, at 625g.

Quest 3 with its default soft strap is quite uncomfortable for me | Photo by Road to VR

In the case of Meta’s Elite Strap, adding weight actually makes the headset more comfortable.

Like Quest 3’s default strap, Vision Pro’s default ‘Solo Knit Band’ headstrap also isn’t that comfortable.

Solo Knit Band | Image courtesy Apple

Clearly recognizing this, Apple also opted to include a ‘Dual Loop Band’ headstrap with every Vision Pro. It’s better (thanks to a top strap for improved weight distribution) but it’s still not great.

Dual Loop Band | Image courtesy Apple

It’s a real shame because on the one hand, the default Knit Band is actually really awesome. It’s soft, cups the back of your head nicely, and is incredibly easy to adjust with a built-in dial on the side. But if you use it, you forgo the benefit of the top strap that comes with the Dual Loop Band. So you can have one or the other, but not both.

It’s obvious that Apple should have just combined the two. Luckily for us, third-party strap options fix this issue, likely for significantly less than an official Elite Strap-style accessory from Apple would cost—if the company even offered one.

After trying multiple third-party straps for Vision Pro, I’ve finally found one that does exactly what I want: it combines with the excellent Knit Band, allows me to use the headset without the facepad (thereby reducing weight), and gets my eyes closer to the lenses for a wider field-of-view.

Photo by Road to VR

This is the ANNAPRO A2 strap for Vision Pro, and it’s pretty much what Apple should have offered right out of the gate. The $45 pricetag feels very reasonable considering how much better it makes Apple’s $3,500 headset.

I’ve been testing it for a few weeks now and it has made using Vision Pro for long sessions significantly more comfortable. In fact, it’s a huge factor in making the new ultrawide virtual monitor for Vision Pro actually useful. The improved comfort makes Vision Pro much more attractive for day-to-day work.

When the company sent us the headstrap to check out, they also extended a 10% discount code to our readers: be sure to use the code ROADTOVR at checkout on Amazon if you plan to buy one.

The Annapro A2 strap slides easily onto Vision Pro’s struts, and works seamlessly with the Knit Band (it can also work with the Dual Loop if you want even more top-strap support). It includes four different pad sizes (5mm, 12mm, 18mm, and 25mm) in the box, allowing it to fit to different head shapes.

Photo by Road to VR

I found the 5mm pad works best for me, allowing me to wear the headset without the facepad, and bring the lenses as close to my eyes as I comfortably can, resulting in an expanded field-of-view and a more natural AR experience thanks to the open periphery.

Photo by Road to VR

Apple clearly prioritized form-over-function when it came to Vision Pro. They wanted to deliver something that looked no more clunky than a large pair of ski goggles. But that goal led them to compromises on comfort that have become one of the main critiques of the headset.

It’s nice that this can now be fixed thanks to affordable third-party accessories. This particular approach works so well that I wouldn’t be surprised if the next iteration of Vision Pro adopts something similar right out of the gate.

Filed Under: Apple Vision Pro News & Reviews, hardware review, News
