
VRSUN

Hot Virtual Reality News



XR Design & Development

Snap Spectacles Offer a Peek into All-day AR with New Geo-location Platform Update

March 17, 2025 From roadtovr

Snap, the company behind Snapchat, introduced its fifth-gen Spectacles AR glasses six months ago, and now the company is releasing a number of new features that aim to improve geo-located AR experiences.

Released in September 2024, Spectacles are still very much a developer kit—the AR glasses only have 45 minutes of standalone battery power—although Snap is one of the few companies out there actively engaging developers to build the sort of mainstay content you might find on the all-day consumer AR glasses of the near future.

While we’re not there yet, Snap announced that developers can start building Lenses (apps) integrating data from GPS, GNSS, compass heading, and custom locations, essentially giving devs access to geo-location data for better outdoor AR experiences.

Snap has highlighted a few sample Lenses to show off the integration, including Utopia Labs’ NavigatAR, which guides users with Snap Map Tiles, and Path Pioneer, which lets users create AR walking courses.

Geo-location data also helped Niantic bring multiplayer to Peridot Beyond, its AR pet simulator exclusively for Spectacles. The recent update also connects Spectacles with the mobile version of Peridot, allowing progression within the AR glasses experience to carry over to mobile.

Similarly, Snap has also worked with Wabisabi to integrate SnapML, Snap’s machine learning framework, into Doggo Quest, the gamified dog-walking AR app, which overlays digital effects on your pooch while tracking metrics such as routes and step counts.

Today’s update comes with a few more platform features too, including the ability to easily add leaderboards to Lenses, an AR keyboard for hand-tracked text input, and improved ability to open Lens links from messaging threads.

The update also features three new hand-tracking capabilities: a phone detector that identifies when a user is holding a phone, a grab gesture, and refinements to targeting intent that reduce false positives while typing.

Additionally, Snap is kicking off its ‘Spectacles Community Challenges’ on April 1st, which lets teams win cash prizes for submitting new Lenses or updating existing ones, judged on engagement, technical excellence, and Lens quality. The company says that each month it will give out over $20,000 to the top five new Lenses, the top five updated Lenses, and the top open-source Lens.

This follows Snap’s recent bid to bring Spectacles to more than just developers. In January, Snap announced it was making the fifth-gen device more affordable for students and teachers, bringing the price down to $594 for 12 months of subscription-free access, then $49.50 per month afterward for continued use of the headset.

While Snap’s Spectacles remain a developer-focused device, these updates signal the company’s long-term ambition for mainstream AR adoption, where it will notably be competing with companies like Meta, Apple, and Google. Better geo-located experiences are undoubtedly a vital piece of the puzzle in making AR glasses a daily necessity rather than a niche tool.

Filed Under: AR apps, AR Design, AR News, News, XR Design & Development

Global Hackathon for Vision Pro Development, Vision Hack, Kicks Off Next Month

August 1, 2024 From roadtovr

Vision Hack is a forthcoming remote hackathon for developers building apps for visionOS. Open to all skill levels, the event is set to take place September 13–15th.

Guest Article by Cosmo Scharf

Cosmo Scharf is an Emmy-nominated product designer and entrepreneur with a decade of experience in XR. He co-founded Vision Hack and visionOS Dev Partners to support visionOS developers. Previously, Scharf created MovieBot, an AI-powered 3D animation app, and Mindshow, a VR platform for animated content creation. He also started VRLA, one of the world’s largest immersive technology expos.

Imagine building the first iPhone apps in 2008. That’s where we are with visionOS today. Vision Hack is your launchpad into the new frontier of spatial computing.

Over the past decade, I’ve had the privilege of witnessing the virtual reality industry evolve from a niche technology to a transformative medium. From the early days of clunky prototypes to the sleek, powerful devices we have today, the journey has been nothing short of extraordinary. Now, with the introduction of Apple Vision Pro, we’re standing at the threshold of a new era.

As one of the organizers of Vision Hack, I’m thrilled to announce the launch of the first global visionOS hackathon. Scheduled for September 13–15th, this event represents a significant milestone in our industry’s progression. It’s an opportunity to explore and shape the future of spatial computing as Apple Vision Pro continues its global rollout.

Vision Hack is designed to be a truly immersive experience. We’re encouraging teams to communicate in Vision Pro itself using spatial Personas, in addition to Discord. This approach not only showcases the device’s capabilities but also provides participants with authentic, hands-on experience in visionOS development.

Our three-day program caters to both seasoned spatial computing developers and newcomers:

  • Day 1: Workshops and team formation
  • Day 2: Intensive development with mentorship from industry experts
  • Day 3: Development, project presentations, and awards

To foster collaboration while ensuring focused development, we’ve capped team sizes at 5 people each. Understanding the global nature of our community, we’re organizing local meetups in various cities so developers can connect in person.

While we highly recommend access to a Vision Pro for the full experience, it’s not a strict requirement for participation. However, developers will need a Mac with an Apple chip to run the visionOS simulator. This setup will enable meaningful participation even without the physical device.

The organizing team brings extensive experience from major VR expos (VRLA), Metaverse hackathons, and XR startups.

As spatial computing evolves, we believe early developer engagement is crucial in building a robust ecosystem. Vision Hack aims to play a key role in nurturing the visionOS developer community, potentially influencing the trajectory of spatial computing applications.

For developers keen on exploring visionOS, Vision Hack offers a unique opportunity to dive into this emerging platform. There’s a $25 registration fee, which helps us cover some of the event costs and ensures committed participation.

For companies interested in being at the forefront of spatial computing development, we offer various sponsorship opportunities. These partnerships not only support the event but also provide sponsors with direct access to a pool of talented developers working on cutting-edge spatial computing applications.

More details, registration information, and sponsorship opportunities can be found at visionoshackathon.com. We’re excited to see the innovative projects and ideas that will emerge from this event, and we look forward to welcoming you to the next chapter of spatial computing development.

Filed Under: Apple Vision Pro News & Reviews, XR Design & Development

Quest ‘Augments’ Feature for Concurrent AR Apps Needs More Time to Cook, Says Meta CTO

June 18, 2024 From roadtovr

Last year Meta announced the so-called Augments feature, planned for Quest 3, which would allow persistent mini AR apps to live in the world around you. Now, eight months after the headset hit store shelves, Meta’s CTO explains why the feature has yet to ship.

Augments was announced as a framework for developers to build mini AR apps that could not just live persistently in the space around you, but also run concurrently alongside each other—similar to how most apps work on Vision Pro today.

Image courtesy Meta

And though Meta had shown a glimpse of Augments in action when it was announced last year, it seems the company’s vision (and desire to market that vision) got ahead of its execution.

This week Meta CTO Andrew “Boz” Bosworth responded to a question during an Instagram Q&A about when the Augments feature would ship. He indicated the feature as initially shown wasn’t meeting the company’s expectations.

We were playing with [Augments] in January and we decided it wasn’t good enough. It was too held back by some system architecture limitations we had; it ended up feeling more like a toy and it didn’t really have the power that we think it needed to deliver on the promise of what it was.

So we made a tough decision there to go back to the drawing board, and basically [it needed] a completely different technical architecture. Starting from scratch basically. Including actually a much deeper set of changes to the system to enable what we wanted to build there. I think we made the right call—we’re not going to ship something we’re not excited about.

But it did restart the clock, and so [Augments is] going to take longer than we had hoped to deliver. I think it’s worthwhile, I think it’s the right call. But that’s what happened.

We’re only two-and-a-half months out from Meta Connect 2024, which will mark the one-year anniversary of the Augments announcement. That’s where we’re likely to hear more about the feature, but at this point it’s unclear whether it could ship by then.

Filed Under: Meta Quest 3 News & Reviews, News, XR Design & Development, XR Industry News

VisionOS 2 Enables WebXR by Default, Unlocking a Cross-platform Path to Vision Pro

June 12, 2024 From roadtovr

We’ve known for quite some time that Apple planned to support WebXR, but with VisionOS 2, the company is enabling the feature for all users. WebXR allows developers to deliver cross-platform XR experiences directly from the web, with no gatekeepers to approve or reject content.

WebXR is the widely supported web standard that allows developers to deliver AR and VR content from the web.

Just like anyone can host a webpage online without any explicit approval from another party, WebXR allows the same for AR and VR content. And because it’s delivered through the browser, accessing and sharing WebXR experiences is as easy as clicking or sending a link—like this one.

Vision Pro has had initial WebXR support since its launch, but it required users to manually enable the feature by digging into Safari’s settings.

With VisionOS 2—available today as a developer preview, and coming to all users this fall—WebXR will be enabled by default, making it easy for anyone to access WebXR through the headset. Vision Pro thus joins headsets like Quest, HoloLens 2, and Magic Leap 2 in supporting WebXR content.

Though WebXR is “supported” on VisionOS 2, our understanding is that it only supports VR (or ‘fully immersive’) experiences. WebXR is also capable of delivering AR experiences (where virtual content is merged with a view of the real world), but VisionOS 2 doesn’t yet support that portion of the standard.

There are many reasons why developers might want to build experiences with WebXR rather than as native apps distributed through a headset’s proprietary store.

For one, any headset with WebXR support can run any compatible WebXR experience, meaning developers can build one experience that works across many headsets, rather than making multiple builds for multiple headsets and then uploading and managing those builds across multiple platform stores.

Like a webpage, WebXR content can also be updated at any time, allowing developers to tweak and enhance the experience on the fly, without needing to upload new builds to multiple stores, or for users to download a new version.

WebXR also has no gatekeepers. So content that wouldn’t be allowed on, say, Apple or Meta’s app stores—either for technical or content-related reasons—can still reach users on those headsets. That could include adult content that’s explicitly forbidden on most platform app stores.

Filed Under: Apple Vision Pro News & Reviews, News, XR Design & Development, XR Industry News

VisionOS 2 Improvement Targets Key Vision Pro Critique Among Developers

June 12, 2024 From roadtovr

For basic Vision Pro interactions like navigating apps and scrolling web pages, the headset’s look-and-pinch input system works like magic. But if you want to go more ‘hands-on’ with virtual content, the headset’s full hand-tracking leaves much to be desired.

Compared to Quest 3, Vision Pro’s full hand-tracking has notably more latency. That means when moving your hands it takes longer for the headset to register the movement. Especially in interactive content where you directly grab virtual objects, this can make the objects feel like they lag behind your hand.

Changes coming in VisionOS 2 stand to improve hand-tracking. Apple detailed the changes in a developer session at WWDC 2024 this week.

For one, the headset will now report estimated hand positions at 90Hz instead of the previous 30Hz. That means the system can reflect changes in hand position in one-third of the time, also making the movement of the hand smoother thanks to more frequent updates. This only applies to a small portion of the overall latency pipeline (which we previously estimated at a total of 127.7ms) but it could reduce hand-tracking latency by as much as 22ms in the best case scenario.
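
For a rough sense of where that 22ms figure comes from: at 30Hz a fresh hand pose arrives about every 33ms (1 ÷ 30 ≈ 0.033s), while at 90Hz one arrives about every 11ms (1 ÷ 90 ≈ 0.011s), so in the best case a newly captured pose reaches the rest of the pipeline roughly 22ms sooner.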

Here’s a look at that in action:

It’s an improvement, but you can still easily see the latency of the teapot compared to the hand, even with this slow movement.

For a snappier experience, VisionOS 2 will alternatively allow developers to enable hand-tracking prediction, which provides an estimate of the user’s future hand position. While this doesn’t truly reduce latency, it can reduce perceived latency in many cases. Similar prediction techniques are common across various XR tracking systems; it’s quite surprising that Vision Pro wasn’t already employing them, or at least making them available to developers.

Here’s a look at predictions in action:

Now we can see the virtual teapot staying much more aligned to the user’s hand. Granted, this isn’t likely to look quite as good with faster motions.
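
To make the idea concrete, here is a minimal sketch of the general prediction technique—linear extrapolation from the two most recent tracked samples. This is purely illustrative Swift; the types and function below are hypothetical and are not Apple’s visionOS hand-tracking API.

```swift
import Foundation

// Hypothetical types for illustration only—this is not Apple's hand-tracking API.
struct HandSample {
    let position: SIMD3<Float>   // tracked palm position in meters
    let timestamp: TimeInterval  // when the sample was captured
}

// Predict where the hand will be at `targetTime` by estimating velocity from
// the two most recent samples and extrapolating forward. This is the basic
// idea behind the perceived-latency reduction described above.
func predictedPosition(previous: HandSample,
                       latest: HandSample,
                       at targetTime: TimeInterval) -> SIMD3<Float> {
    let dt = Float(latest.timestamp - previous.timestamp)
    guard dt > 0 else { return latest.position }

    let velocity = (latest.position - previous.position) / dt
    let lookAhead = Float(targetTime - latest.timestamp)
    return latest.position + velocity * lookAhead
}
```

Real systems are more careful—smoothing velocity over several frames and capping the look-ahead window—but the prediction that keeps the teapot hugging the hand is essentially this extrapolation.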

We’ll be looking forward to putting Vision Pro’s hand-tracking latency to the test with VisionOS 2 soon!

Filed Under: Apple Vision Pro News & Reviews, News, XR Design & Development

Some of Vision Pro’s Biggest New Development Features Are Restricted to Enterprise

June 11, 2024 From roadtovr

VisionOS 2 is bringing a range of new development features, but some of the most significant are restricted to enterprise applications.

VisionOS 2 will bring some of the top-requested development features to the headset, but Apple says it’s reserving some of them for enterprise applications only.

Developers that want to use the features will need ‘Enterprise’ status, which means having at least 100 employees and being accepted into the Apple Developer Enterprise Program ($300 per year).

Apple says the restriction on the new dev capabilities is to protect privacy and ensure a predictable experience for everyday users.

Here’s a breakdown of the enterprise-only development features coming to VisionOS 2, which Apple detailed in a WWDC session.

Vision Pro Camera Access

Up to this point, developers building apps for Vision Pro and VisionOS couldn’t actually ‘see’ the user’s environment through the headset’s cameras. That limits the ability for developers to create Vision Pro apps that directly detect and interact with the world around the user.

With approval from Apple, developers building Vision Pro enterprise apps can now access the headset’s camera feed. This can be used to detect things in the scene, or to stream the view for use elsewhere. This is popular for ‘see what I see’ use-cases, where a remote person can see the video feed of someone at a work site in order to give them help or instruction.

Developers could also use the headset’s camera feed with a computer vision algorithm to detect things in view. This might be used to automatically identify a part, or verify that something was repaired correctly.
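
As a rough sketch of what that might look like, assuming an enterprise app has already been granted camera access and has a captured frame in hand as a CVPixelBuffer (the frame-acquisition API itself isn’t shown), one could run Apple’s general-purpose Vision framework classifier over the frame:

```swift
import Vision
import CoreVideo

// Run Apple's built-in image classifier over a captured frame, e.g. to check
// whether an expected part is in view. The frame is assumed to come from the
// enterprise camera-access feature; acquiring it is not shown here.
func identifyContents(of frame: CVPixelBuffer) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: frame, options: [:])
    try handler.perform([request])

    // Keep only reasonably confident labels.
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map { ($0.identifier, $0.confidence) }
}
```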

Even with Apple’s blessing to use the feature, enterprise apps will need to explicitly ask the user for camera access each time it is used.

Barcode and QR Code Detection

Image courtesy Apple

Being able to use the headset’s camera feed naturally opens the door for reading QR codes and barcodes, which allow structured data to be transmitted to the headset visually.

Apple is providing a readymade system for developers to detect, track, and read barcodes using Vision Pro.

The company says this could be useful for workers retrieving an item in a warehouse, who can immediately confirm they’ve found the right thing by scanning the barcode on the box, or for pulling up assembly instructions by scanning a code.
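
For a general illustration of what detecting and decoding a code from a captured frame involves, here is a sketch using Apple’s general-purpose Vision framework. This is not the readymade enterprise barcode system Apple describes for Vision Pro, just the same idea expressed with a widely available API; the symbology list is an illustrative assumption.

```swift
import Vision
import CoreVideo

// Detect and decode barcodes in a captured frame using Apple's general-purpose
// Vision framework. This illustrates the concept; it is not the readymade
// enterprise barcode system Apple describes for Vision Pro.
func readBarcodes(in frame: CVPixelBuffer) throws -> [String] {
    let request = VNDetectBarcodesRequest()
    request.symbologies = [.qr, .ean13, .code128]  // illustrative: limit to expected code types

    let handler = VNImageRequestHandler(cvPixelBuffer: frame, options: [:])
    try handler.perform([request])

    // Each observation carries the decoded payload, e.g. a part number or URL.
    return (request.results ?? []).compactMap { $0.payloadStringValue }
}
```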

Neural Engine Access

Enterprise developers will have the option to tap into Vision Pro’s neural processor to accelerate machine learning tasks. Previously developers could only access the compute resources of the headset’s CPU and GPU.
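
For context on what tapping into the neural processor means in practice: with Core ML, an app chooses which compute units a model may use. A minimal sketch, assuming a hypothetical compiled model named PartClassifier bundled with the app:

```swift
import CoreML
import Foundation

// Load a Core ML model and allow it to run on the CPU, GPU, and Neural Engine.
// "PartClassifier" is a hypothetical model name used for illustration.
func loadModel() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all           // CPU + GPU + Neural Engine
    // config.computeUnits = .cpuAndGPU  // roughly the previous situation described above

    guard let url = Bundle.main.url(forResource: "PartClassifier", withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    return try MLModel(contentsOf: url, configuration: config)
}
```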

Object Tracking

Although the new Object Tracking feature is coming to VisionOS 2 more broadly, there are additional enhancements to this feature that will only be available to enterprise developers.

Object Tracking allows apps to include reference models of real-world objects (for instance, a model of a can of soda), which can be detected and tracked once they’re in view of the headset.

Enterprise developers will have greater control over this feature, including the ability to tweak the maximum number of tracked objects, choose whether to track only static or dynamic objects, and change the object detection rate.

Greater Control Over Vision Pro Performance

Enterprise developers working with VisionOS 2 will have more control over the headset’s performance.

Apple explains that, out of the box, Vision Pro is designed to strike a balance between battery life, performance, and fan noise.

But some specific use-cases might need a different balance of those factors.

Enterprise developers will have the option to increase performance by sacrificing battery life and fan noise. Or perhaps stretch battery life by reducing performance, if that’s best for the given use-case.


There are more new developer features coming to Vision Pro in VisionOS 2, but those above will be restricted to enterprise developers only.

Filed Under: Apple Vision Pro News & Reviews, XR Design & Development

VisionOS 2 is Available in Developer Preview Starting Today

June 10, 2024 From roadtovr

Today at WWDC 2024, Apple revealed VisionOS 2, the first major update for its Vision Pro headset. The new version of the software will be available for developers to experiment with starting today.

VisionOS 2 is primarily designed to smooth out some of the rough edges from the headset’s release earlier this year, while adding new features and expanding development capabilities so developers can take greater advantage of the headset.

While it won’t release publicly until the fall, Apple says developers can get their hands on VisionOS 2 starting today. We haven’t spotted a direct link to the developer preview yet, but the company says it will be available through its official developer website.

VisionOS 2 is bringing a range of new features and improvements like 2D-to-3D photo conversion, an ultrawide Mac Virtual Display, new ways to navigate the headset’s core interface, and much more. We’ll be breaking down the full range of features soon, so stay tuned to the front page!

Filed Under: Apple Vision Pro News & Reviews, visionos 2, XR Design & Development
