
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


Google to Publicly Test AR Prototypes Starting in August

July 20, 2022 From roadtovr

Google announced that the company will be conducting real world tests of its early AR prototypes starting next month.

The company says in a blog post that it plans to test AR prototypes in the real world as a way to “better understand how these devices can help people in their everyday lives.”

Some of the key areas Google is emphasizing are things like real-time translation and AR turn-by-turn navigation.

“We’ll begin small-scale testing in public settings with AR prototypes worn by a few dozen Googlers and select trusted testers,” the company says. “These prototypes will include in-lens displays, microphones and cameras — but they’ll have strict limitations on what they can do. For example, our AR prototypes don’t support photography and videography, though image data will be used to enable experiences like translating the menu in front of you or showing you directions to a nearby coffee shop.”

Critically, Google says the research prototypes look like “normal glasses.” This was no doubt partially informed by the company’s rocky experience with Google Glass, which launched in 2013 and spawned the neologism ‘glasshole’ thanks to the device’s relatively high visibility and the perceived privacy implications of wearing a camera. Glass is still around, albeit only for enterprise users.

Google says it wants to take things slow with its AR glasses, with a “strong focus on ensuring the privacy of the testers and those around them.” Although the units will clearly pack camera sensors to do their job, Google says that after translating text or providing turn-by-turn directions, the image data will be deleted unless it is needed for analysis and debugging.

“In that case, the image data is first scrubbed for sensitive content, including faces and license plates. Then it is stored on a secure server, with limited access by a small number of Googlers for analysis and debugging. After 30 days, it is deleted,” the company says in a FAQ on the program.

Testers will also be prohibited from testing in sensitive public places such as government buildings, healthcare locations, places of worship, social service locations, areas meant for children (e.g., schools and playgrounds), emergency response locations, and rallies or protests. Testers are also banned from using AR prototypes while driving, operating heavy machinery, or engaging in sports.

Google’s inclusion of displays in its public prototypes is a step beyond Meta’s Project Aria, which started on-campus testing of AR prototypes in 2020 and notably included everything you’d expect from AR glasses except the displays. We’re still waiting to hear more about Meta’s Project Nazare, however, which is said to be “true augmented reality glasses.”

As for Apple, well, there are only rumors out there for now on specifications and target launch dates for the company’s MR headset and follow-up AR glasses. It’s clear, however, that we’re inching ever closer to a future where the biggest names in established tech will directly compete to become leading forces in what many have hailed as a class of device that will eventually replace your smartphone.

Filed Under: AR glasses, AR News, augmented reality glasses, google, google ar, google ar glasses, google ar testing, Google Glass, News

Report: Meta to Release First AR Glasses to Developers Only & Not Consumers

June 14, 2022 From roadtovr

Meta has been working on what it calls a “fully-fledged AR device” for some time now; however, a recent report from The Information maintains that the first of Meta’s AR glasses will be reserved for developers only, not for enthusiasts as previously thought.

The report alleges that the company won’t be commercially releasing what is now codenamed Project Nazare, which was teased back at Connect 2021 when the company changed its name from Facebook to Meta.

Back then, Meta CEO Mark Zuckerberg had said the “ultimate goal” with Project Nazare was to develop “true augmented reality glasses.”

Meta has shown a mockup of the sort of augmented reality interactions it hopes to provide with its AR glasses.

The Information maintains Nazare was scheduled to launch commercially in 2024; however, Meta has allegedly changed course and scrapped those plans, repositioning its first AR glasses as a sort of demonstration product, or hardware developer kit. A follow-up device, codenamed Artemis, is said to be the first on offer to consumers.

It’s said that the move to reposition Nazare as a developer-only device comes alongside a wider push to downsize Reality Labs, Meta’s AR/VR and emerging tech division responsible for a host of devices. Reality Labs is known for everything from the standalone VR headset Quest 2 to devices such as Project Aria, a sensor-rich pair of glasses which Meta is using to train its AR perception systems and assess public perception of the technology.

A previous report from The Verge in April held that Meta was already internally projecting tepid sales in the low tens of thousands for Nazare. It was also said Nazare would likely test Zuckerberg’s appetite for further hardware subsidies, which may stretch well beyond those of Meta’s $300 Quest 2.

To boot, the company is also reportedly shelving plans to release a smartwatch with a detachable display and two cameras in favor of a design better suited to control a later version of the glasses.

Meta’s Wrist-worn XR controller | Image courtesy Meta, Mark Zuckerberg

Last month, Zuckerberg traveled to Italy to show off Meta’s wrist-worn XR controller prototype to EssilorLuxottica, the Italian parent company behind the Ray-Ban Stories camera glasses and a host of other conventional luxury eyewear brands. Critically, the controller prototype uses electromyography (EMG) sensors to detect electrical signals which control the muscles in your hands, and doesn’t incorporate camera sensors.

All of this may be years out, however Meta is looking forward to the upcoming release of Project Cambria, a VR headset capable of AR interactions thanks to color passthrough camera sensors, aka ‘mixed reality’. That headset, which doesn’t have an official name yet, is undoubtedly meant to serve as a precursor to the company’s AR glasses, as apps developed for Cambria could one day inform a Meta-driven AR app ecosystem.

Whatever the case, Project Cambria is set to be priced “significantly higher” than $800, the company confirmed in May, which puts it squarely in developer/prosumer territory.

Filed Under: AR glasses, Artemis, Meta, meta ar, meta ar glasses, meta vr, News, orion, project aria, project Artemis, project nazare

Qualcomm’s Latest AR Glasses Reference Design Drops the Tether, Keeps the Compute

May 21, 2022 From roadtovr

Qualcomm has revealed its latest AR glasses reference design, which it offers up to other companies as a blueprint for building their own AR devices. The reference design, which gives us a strong hint at the specs and capabilities of upcoming products, continues to lean on a smartphone to do the heavy compute, but this time is based on a wireless design.

Qualcomm’s prior AR glasses reference design was based on the Snapdragon XR1 chip and called for a wired connection between a smartphone and the glasses, allowing the system to split rendering tasks between the two devices.

Now the company’s latest design, based on Snapdragon XR2, takes the wire out of the equation. But instead of going fully standalone, the new reference design continues to rely on the smartphone to handle most of the heavy rendering, but now does so over a wireless connection between the devices.

Image courtesy Qualcomm

In addition to Snapdragon XR2, the AR glasses include Qualcomm’s FastConnect 6900 chip which equips it with Wi-Fi 6E and Bluetooth 5.3. The company says the chip is designed for “ultra-low latency,” and manages less than 3ms of latency between the headset and the smartphone. The company has also announced XR-specific software for controlling its FastConnect 6900, allowing device makers to tune the wireless traffic between the devices to prioritize the most time-critical data in order to reduce instances of lag or jitter due to wireless interference.
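The traffic tuning described above can be pictured as simple priority scheduling. Here is a minimal sketch (purely illustrative, not Qualcomm’s actual software interface) in which time-critical XR traffic, like head-pose updates, is drained from the queue before bulk frame data, so wireless interference and retries hit the least latency-sensitive packets first:

```python
import heapq

# Hypothetical priority classes: lower value = higher priority.
POSE, AUDIO, FRAME = 0, 1, 2

def schedule(packets):
    """Return payloads in transmission order, most time-critical first."""
    heap = [(prio, i, payload) for i, (prio, payload) in enumerate(packets)]
    heapq.heapify(heap)  # the index i breaks ties, preserving arrival order
    order = []
    while heap:
        _, _, payload = heapq.heappop(heap)
        order.append(payload)
    return order

queue = [(FRAME, "frame-slice-0"), (POSE, "head-pose"), (AUDIO, "mic-chunk")]
print(schedule(queue))  # ['head-pose', 'mic-chunk', 'frame-slice-0']
```

Real implementations would map such classes onto Wi-Fi QoS access categories rather than a software queue, but the ordering principle is the same.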

Though a connected smartphone seems like the most obvious use-case, Qualcomm also says the glasses could just as well be paired to a Windows PC or “processing puck.”

Beyond the extra wireless tech, the company says the latest design is 40% thinner than its previous reference design. The latest version has a 1,920 × 1,080 (2MP) per-eye resolution at 90Hz. The microdisplays include a ‘no-motion-blur’ feature—which sounds like a low persistence mode designed to prevent blurring of the image during head movement. A pair of monochrome cameras are used for 6DOF tracking and an RGB camera for video or photo capture. The company didn’t mention the device’s field-of-view, so it’s unlikely to be any larger than the prior reference design at 45° diagonal.
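Some back-of-envelope arithmetic shows why the link speed (and compression) matters: shipping the stated per-eye resolution uncompressed would strain even Wi-Fi 6E. The 24-bit color depth below is an assumption for illustration; a real pipeline would be compressed and likely foveated:

```python
# Uncompressed video bandwidth at the reference design's stated specs.
width, height = 1920, 1080   # per-eye resolution from the reference design
eyes, hz = 2, 90             # two displays at 90Hz
bits_per_px = 24             # assumed RGB color depth (not stated by Qualcomm)

bits_per_second = width * height * eyes * hz * bits_per_px
print(f"{bits_per_second / 1e9:.1f} Gbps uncompressed")  # 9.0 Gbps uncompressed
```

That figure sits near Wi-Fi 6E’s theoretical peak and well above its real-world throughput, which is one reason split rendering relies on compression rather than raw frames.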

Like its many prior reference designs, Qualcomm isn’t actually going to make and sell the AR glasses. Instead, it offers up the design and underlying technology for other companies to use as a blueprint to build their own devices (hopefully using Qualcomm’s chips!). Companies that build on Qualcomm’s blueprint usually introduce their own industrial design and custom software offering; some even customize the hardware itself, like using different displays or optics.

That makes this AR glasses reference design a pretty good snapshot of the current state of AR glasses that can be mass produced, and a glimpse of what some companies will be offering in the near future.

Qualcomm says its latest AR glasses reference design is “available for select partners,” as of today, and plans to make it more widely available “in the coming months.”

Filed Under: AR glasses, ar glasses reference design, AR Headset, News, Qualcomm, snapdragon xr2

Reality Labs Chief Scientist Outlines a New Compute Architecture for True AR Glasses

May 2, 2022 From roadtovr

Speaking at the IEDM conference late last year, Meta Reality Labs’ Chief Scientist Michael Abrash laid out the company’s analysis of how contemporary compute architectures will need to evolve to make possible the AR glasses of our sci-fi conceptualizations.

While there are some AR ‘glasses’ on the market today, none of them are truly the size of a normal pair of glasses (even a bulky pair). The best AR headsets available today—the likes of HoloLens 2 and Magic Leap 2—are still closer to goggles than glasses and are too heavy to be worn all day (not to mention the looks you’d get from the crowd).

If we’re going to build AR glasses that are truly glasses-sized, with all-day battery life and the features needed for compelling AR experiences, it’s going to require a “range of radical improvements—and in some cases paradigm shifts—in both hardware […] and software,” says Michael Abrash, Chief Scientist at Reality Labs, Meta’s XR organization.

That is to say: Meta doesn’t believe that its current technology—or anyone’s for that matter—is capable of delivering those sci-fi glasses that every AR concept video envisions.

But, the company thinks it knows where things need to head in order for that to happen.

Abrash, speaking at the IEDM 2021 conference late last year, laid out the case for a new compute architecture that could meet the needs of truly glasses-sized AR devices.

Follow the Power

The core reason to rethink how computing should be handled on these devices comes from a need to drastically reduce power consumption to meet battery life and heat requirements.

“How can we improve the power efficiency [of mobile computing devices] radically by a factor of 100 or even 1,000?” he asks. “That will require a deep system-level rethinking of the full stack, with end-to-end co-design of hardware and software. And the place to start that rethinking is by looking at where power is going today.”

To that end, Abrash laid out a graph comparing the power consumption of low-level computing operations.

Image courtesy Meta

As the chart highlights, the most energy intensive computing operations are in data transfer. And that doesn’t mean just wireless data transfer, but even transferring data from one chip inside the device to another. What’s more, the chart uses a logarithmic scale; according to the chart, transferring data to RAM uses 12,000 times the power of the base unit (which in this case is adding two numbers together).

Bringing it all together, the circular graphs on the right show that techniques essential to AR—SLAM and hand-tracking—use most of their power simply moving data to and from RAM.

“Clearly, for low power applications [such as in lightweight AR glasses], it is critical to reduce the amount of data transfer as much as possible,” says Abrash.
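Abrash’s point can be made concrete with the chart’s own relative costs. The sketch below treats an add as 1 unit of energy and a RAM access as ~12,000 units, per the figure cited above; the per-pixel granularity is my assumption for illustration:

```python
ADD_COST, RAM_COST = 1, 12_000  # relative energy units from the talk's chart

def step_energy(n_pixels, adds_per_pixel):
    """Energy split for a toy vision step: fetch pixels from RAM, then compute."""
    compute = n_pixels * adds_per_pixel * ADD_COST
    transfer = n_pixels * RAM_COST
    return compute, transfer

# e.g. one VGA frame with 10 adds of arithmetic per pixel
compute, transfer = step_energy(n_pixels=640 * 480, adds_per_pixel=10)
print(transfer // compute)  # transfer outweighs compute 1200-to-1
```

Under these numbers the arithmetic is nearly free; almost the entire energy budget is the act of moving pixels, which matches the SLAM and hand-tracking breakdowns Abrash showed.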

To make that happen, he says a new compute architecture will be required which—rather than shuffling large quantities of data between centralized computing hubs—more broadly distributes the computing operations across the system in order to minimize wasteful data transfer.

Compute Where You Least Expect It

A starting point for a distributed computing architecture, Abrash says, could begin with the many cameras that AR glasses need for sensing the world around the user. This would involve doing some preliminary computation on the camera sensor itself before sending only the most vital data across power hungry data transfer lanes.

Image courtesy Meta

To make that possible Abrash says it’ll take co-designed hardware and software, such that the hardware is designed with a specific algorithm in mind that is essentially hardwired into the camera sensor itself—allowing some operations to be taken care of before any data even leaves the sensor.

Image courtesy Meta
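As a sketch of the principle (illustrative only, not Meta’s design): run cheap feature extraction where the pixels live, and send only keypoint coordinates across the power-hungry bus instead of whole frames. The toy edge detector below stands in for whatever algorithm would be hardwired into the sensor:

```python
import numpy as np

def on_sensor_features(frame, threshold=50):
    """Toy gradient-based detector; returns (row, col) keypoint coordinates."""
    gy, gx = np.gradient(frame.astype(float))
    return np.argwhere(np.abs(gx) + np.abs(gy) > threshold)

# A mostly flat scene with one bright square: features only along its border.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:240, 300:340] = 255

keypoints = on_sensor_features(frame)
full_bytes = frame.nbytes            # shipping the whole frame: 307,200 bytes
kp_bytes = len(keypoints) * 4        # assume 2-byte coords: 4 bytes per point
print(full_bytes // kp_bytes)        # a hundreds-fold reduction in bus traffic
```

The savings are scene-dependent, of course, but this is the co-design idea in miniature: the sensor-side algorithm decides what is “vital data” before anything crosses a transfer lane.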

“The combination of requirements for lowest power, best performance, and smallest possible form-factor make XR sensors the new frontier in the image sensor industry,” Abrash says.


Filed Under: AR glasses, AR Headset, ar industry, iedm 2021, Meta, michael abrash, News, Reality Labs, vr industry

We Tried Snapchat’s AR Glasses And Had Our Minds Blown

March 25, 2022 From vrscout

Snap Inc’s AR Spectacles glasses are nothing short of black magic.

First announced back in 2016, Snap Inc.’s offbeat smart glasses have seen several different iterations over the past five and a half years, each one improving upon the last with new features designed to merge the company’s social media platform with the real world. The latest Spectacles design was the first to include built-in AR (augmented reality) functionality, allowing the wearer to interact with existing Snapchat AR Lenses straight from the glasses in real-time.

The glasses aren’t available for purchase at the moment. Snap has opted instead to distribute hardware to a select group of creators familiar with Lens Studio, the company’s Lens-building software. Unlike the headset, Lens Studio is available to the public and 100% free of charge. While the company is best known for its social media app, Lens Studio has quietly become one of the largest areas of focus for the company. The AR Spectacles feel almost like a conduit, harnessing the power of the platform to deliver some of the most impressive AR experiences I’ve tried to date.

Not too long ago I was invited by Snap to try the Spectacles out for myself in Beverly Hills. Upon arriving, the team gave me a brief rundown of the many different projects built using the Lens Studio platform. Among them was work by Michael Nicoll, founder of BLNK and one of the first Lens Studio creators to experiment with the recently released Custom Landmarker feature, which allows multiple users to scan and interact with famous real-world locations. Nicoll has created numerous music-based filters on the Snapchat platform for a number of high-profile clients, including Megan Thee Stallion.

After the quick introduction, I was presented with a pair of AR Spectacles straight from the box. At first glance, you could easily mistake the Spectacles for a pair of fashion-forward sunglasses. Unlike existing AR headsets such as the Magic Leap 2 or Microsoft HoloLens, the Spectacles feature a discreet, lightweight design more akin to glasses than a headset.

The device features foldable arms that allow for easy storage in the accompanying charging case. Dual waveguide displays provide impressive AR visuals in real-time. The device also features two front-facing cameras for capturing video and images, stereo speakers, touch controls that let you cycle through Lenses as well as scan the environment for Lens suggestions, WiFi connectivity with a paired smartphone app, and a USB-C port for charging.

Trying the device out for myself, I was blown away by the quality of the AR visuals. Each display offers 2,000 nits of brightness, resulting in ultra-clear 3D images that blend seamlessly with the real-world environment, which means no more of those annoying semi-translucent graphics. The only thing more impressive than the graphics was the tracking. In order to get the full experience, the team carved out an area outside for me to stretch my legs.

One Lens had me running for my life as a cartoon zombie chased me throughout the backyard. Despite my rapid head movements, the AR character moved consistently throughout the real world, even when I almost slipped and fell head-first into the pool. Another Lens had me interacting with an AR art project using the device’s built-in hand-tracking capabilities. While I found the experience a bit wonky at times, it was still an impressive use of the technology, one I hope to see expanded upon in future iterations.

Unfortunately, hand-tracking isn’t the only limitation I found with the device. Speaking with the team, I learned that the AR Spectacles have an average battery life of about 30 minutes. Needless to say, this is an incredibly short amount of time, even for an immersive technology device. That said, the accompanying case does allow for quick and easy wireless charging. I also found the overall field of view to be relatively underwhelming. The AR visuals only cover about half of the display, which can easily take you out of the experience.

That said, many of these issues are common among current AR headsets and glasses, and as the technology continues to improve, so too will the overall quality and convenience. The amount of tech Snap has been able to jam into this sleek-looking device is nothing short of amazing. Moving forward, we’ll be keeping an extremely close eye on the company as it continues to advance its already stellar Lens Studio and Spectacles technology.

As previously mentioned, Snap this month made its Custom Landmarkers technology available to all creators via Lens Studio, allowing you to create immersive AR experiences tied directly to real-world locations. Previous creators used the technology to anchor their AR creations to famous structures like the Gateway of India and the Great Sphinx of Giza in Egypt. Now anyone can begin creating AR Lenses for their local areas. Creators can even use the Spectacles to demo new Lenses in real-time.

Image Credit: Snap Inc.

Filed Under: Android, AR, AR glasses, augmented reality, iOS, Lens Studio, News

Copyright © 2022 GenVR, Inc.