
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


AR glasses

Xiaomi Unveils Wireless AR Glasses Prototype, Powered by Same Chipset as Meta Quest Pro

February 27, 2023 From roadtovr

Chinese tech giant Xiaomi today showed off a prototype AR headset at Mobile World Congress (MWC) that wirelessly connects to the user’s smartphone, making for what the company calls its “first wireless AR glasses to utilize distributed computing.”

Called Xiaomi Wireless AR Glass Discovery Edition, the device is built upon the same Qualcomm Snapdragon XR2 Gen 1 chipset as Meta’s recently released Quest Pro VR standalone.

While specs are still thin on the ground, the company did offer some info on headline features. For now, Xiaomi is couching it as a “concept technology achievement,” so it may be a while until we see a full spec sheet.

Packing two microOLED displays, the company is boasting “retina-level” resolution, saying its AR glasses pack in 58 pixels per degree (PPD). For reference, Meta Quest Pro has a PPD of 22, while enterprise headset Varjo XR-3 cites a PPD of 70.
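
As a sanity check on those PPD figures, angular resolution can be roughly approximated as horizontal pixels divided by horizontal field of view. The 1,920-pixel width and 33° FOV below are hypothetical values chosen only to show how a ~58 PPD figure could arise; Xiaomi hasn't published either spec:

```python
# Rough pixels-per-degree (PPD) estimate: PPD ~= horizontal pixels / horizontal FOV.
# Both input values are illustrative assumptions, not figures from Xiaomi.
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Approximate angular resolution, ignoring lens distortion."""
    return horizontal_pixels / horizontal_fov_deg

# e.g. a 1920-pixel-wide display spread across a 33-degree FOV lands near 58 PPD
print(round(pixels_per_degree(1920, 33)))  # 58
```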

The company hasn’t announced the headset’s field of view (FOV), though it says its free-form light-guiding prism design “minimizes light loss and produces clear and bright images with a to-eye brightness of up to 1200nit.”

Electrochromic lenses are also said to adapt the final image to different lighting conditions, even including a full ‘blackout mode’ that ostensibly allows it to work as a VR headset as well.

Image courtesy Xiaomi

As for input, Xiaomi Wireless AR Glass includes onboard hand-tracking in addition to smartphone-based touch controls. Xiaomi says its optical hand-tracking is designed to let users do things like select and open apps, swipe through pages, and exit apps.

As a prototype, there’s no pricing or availability on the table; however, Xiaomi says the lightweight glasses (126g) will be available in a titanium-colored design with support for three sizes of nosepieces. An attachable glasses clip will also be available for near-sighted users.

In an exclusive hands-on, XDA Developers surmised the device felt near production-ready, however one issue noted during an otherwise bump-free demo was battery life; the headset had to be charged in the middle of the 30-minute demo. Xiaomi is apparently incorporating a self-developed silicon-oxygen anode battery that is supposedly smaller than a typical lithium-ion battery. And while there’s an onboard Snapdragon XR2 Gen 1 chipset, XDA Developers notes the device doesn’t offer any storage, making a compatible smartphone required to play AR content.

This isn’t the company’s first stab at XR tech; last summer Xiaomi showed off a pair of consumer smartglasses, called Mijia Glasses Camera, that featured a single heads-up display. Xiaomi’s Wireless AR Glass is however much closer in function to the concept it teased in late 2021, albeit with chunkier free-form light-guiding prisms than the more advanced-looking waveguides teased two years ago.

Xiaomi is working closely with chipmaker Qualcomm to ensure compatibility with Snapdragon Spaces-ready smartphones, which include Xiaomi 13 and OnePlus 11 5G. Lenovo and Motorola, which have also announced intentions to support Snapdragon Spaces, could contribute compatible devices in the future.

Qualcomm announced Snapdragon Spaces in late 2021. The software toolkit, which targets performant, low-power devices, lets developers create head-worn AR experiences from the ground up or add head-worn AR to existing smartphone apps.

Filed Under: AR Development, AR glasses, AR Headset, AR News, mobile world congress 2023, mwc 2023, News, qualcomm ar glasses, xiaomi, xiaomi ar glasses

Qualcomm Reveals Snapdragon AR2 Processor for Glasses-sized AR Devices

November 16, 2022 From roadtovr

Qualcomm today announced Snapdragon AR2, its “purpose-built headworn augmented reality platform.” Differentiating from the company’s existing Snapdragon XR2 chips, Qualcomm says the AR2 architecture is better suited for creating AR glasses with low power consumption and compact form factors.

Today during Qualcomm’s Snapdragon Summit event, the company revealed the Snapdragon AR2 platform which consists of a trio of chips which the company says will help make truly glasses-sized AR devices possible.

Qualcomm was early to the standalone VR space and has been dominant with its Snapdragon XR2 chips, which have found their way into many of the leading standalone headsets on the market; the company says they’re now in more than 60 devices in total.

Aiming to take a similar bite out of the forthcoming AR glasses segment, Qualcomm has created a new Snapdragon AR2 platform with a distributed processing design. The platform consists of three chips:

  • AR processor (for sensor perception and video output)
  • AR co-processor (for sensor fusion and dedicated computer vision tasks)
  • Wi-Fi 7 chip (for communication to a host processing device)

By creating a more distributed workload across a main processor and a co-processor, Qualcomm claims AR2 is up to 50% more power efficient while offering 2.5 times better AI performance, and a more compact form-factor, compared to the single-chip Snapdragon XR2 solution.

Not only will the AR processor and co-processor help share a workload, Qualcomm also sees AR2 devices using the speedy Wi-Fi 7 chip to communicate with a host device like a smartphone or wireless compute puck that will do the heavy lifting like application processing and rendering. Qualcomm claims the Wi-Fi 7 chip (FastConnect 7800) can achieve 5.8 Gbps bandwidth with just 2ms of latency.
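To put those link numbers in perspective, a quick back-of-the-envelope calculation shows how long a single raw frame would occupy a 5.8 Gbps link. The 1080p frame size and 24-bit color depth are assumptions for illustration, not figures Qualcomm has stated:

```python
# Time to move one uncompressed frame over the claimed 5.8 Gbps FastConnect link.
# Frame dimensions and bit depth are illustrative assumptions.
def frame_transfer_ms(width: int, height: int, bits_per_pixel: int, link_gbps: float) -> float:
    bits = width * height * bits_per_pixel
    return bits / (link_gbps * 1e9) * 1e3  # seconds -> milliseconds

ms = frame_transfer_ms(1920, 1080, 24, 5.8)
print(f"{ms:.2f} ms")  # 8.58 ms per raw frame, before any compression
```

This is why compression and careful traffic prioritization matter so much for wireless AR: even at multi-gigabit rates, raw frames alone would eat most of a 90Hz frame budget.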

Using this three-chip framework for distributed processing, Qualcomm claims it will be possible to build compact AR glasses that consume less than one watt of power.

The AR2 platform supports up to nine concurrent cameras for a bevy of head-tracking, environment-sensing, and user-tracking tasks.

“We built Snapdragon AR2 to address the unique challenges of headworn AR and provide industry leading processing, AI and connectivity that can fit inside a stylish form factor,” said Hugo Swart, vice president of XR product management at Qualcomm. “With the technical and physical requirements for VR/MR and AR diverging, Snapdragon AR2 represents another metaverse-defining platform in our XR portfolio to help our OEM partners revolutionize AR glasses.”

There’s no word yet on when the first AR2 devices will hit the market, but Qualcomm lists a handful of partners actively working with the platform: Lenovo, LG, Niantic, Nreal, Oppo, Pico, Qonoq, Rokid, Sharp, TCL, Vuzix, and Xiaomi.

Filed Under: AR glasses, AR Headset, ar industry, AR News, News, Qualcomm, qualcomm ar2, snapdragon ar2

Google to Publicly Test AR Prototypes Starting in August

July 20, 2022 From roadtovr

Google announced that the company will be conducting real world tests of its early AR prototypes starting next month.

The company says in a blog post that it plans to test AR prototypes in the real world as a way to “better understand how these devices can help people in their everyday lives.”

Some of the key areas Google is emphasizing are things like real-time translation and AR turn-by-turn navigation.

“We’ll begin small-scale testing in public settings with AR prototypes worn by a few dozen Googlers and select trusted testers,” the company says. “These prototypes will include in-lens displays, microphones and cameras — but they’ll have strict limitations on what they can do. For example, our AR prototypes don’t support photography and videography, though image data will be used to enable experiences like translating the menu in front of you or showing you directions to a nearby coffee shop.”

Critically, Google says the research prototypes look like “normal glasses.” This was no doubt partially informed by its rocky experience with Google Glass, which launched in 2013 and spawned the neologism ‘glasshole’ thanks to the device’s relatively high visibility and the privacy concerns of wearing a camera. Glass is still around, albeit only for enterprise users.

Google says it wants to take it slow with its AR glasses, with a “strong focus on ensuring the privacy of the testers and those around them.” Although the units will clearly pack camera sensors to do their job, Google says that after translating text or providing turn-by-turn directions, the image data will be deleted unless it’s needed for analysis and debugging.

“In that case, the image data is first scrubbed for sensitive content, including faces and license plates. Then it is stored on a secure server, with limited access by a small number of Googlers for analysis and debugging. After 30 days, it is deleted,” the company says in a FAQ on the program.

Testers will also be prohibited from testing in public places such as schools, government buildings, healthcare locations, places of worship, social service locations, areas meant for children (e.g., schools and playgrounds), emergency response locations, rallies or protests, and other similar places. For navigation, testers are also banned from using AR prototypes while driving, operating heavy machinery, and engaging in sports.

Google’s inclusion of displays in its public prototypes is a step beyond Meta’s Project Aria, which started on-campus testing of AR prototypes in 2020 and notably included everything you’d expect from AR glasses except the displays. We’re still waiting to hear more about Meta’s Project Nazare, however, which is said to be “true augmented reality glasses.”

As for Apple, well, there’s only rumors out there for now on specifications and target launch dates for the company’s MR headset and follow-up AR glasses. It’s clear however we’re inching ever closer to a future where the biggest names in established tech will directly compete to become leading forces in what many have hailed as a class of device which will eventually replace your smartphone.

Filed Under: AR glasses, AR News, augmented reality glasses, google, google ar, google ar glasses, google ar testing, Google Glass, News

Report: Meta to Release First AR Glasses to Developers Only & Not Consumers

June 14, 2022 From roadtovr

Meta has been working on what it calls a “fully-fledged AR device” for some time now, however a recent report from The Information maintains the first in Meta’s line of AR glasses will be reserved for developers only and not for enthusiasts as previously thought.

The report alleges that the company won’t be commercially releasing what is now codenamed Project Nazare, which was teased back at Connect 2021 when the company changed its name from Facebook to Meta.

Back then, Meta CEO Mark Zuckerberg had said the “ultimate goal” with Project Nazare was to develop “true augmented reality glasses.”

Here’s a mockup of the sort of augmented reality interactions Meta hopes to provide with its AR glasses:

The Information maintains Nazare was scheduled to launch commercially in 2024, however Meta has allegedly changed course and scrapped those plans, positioning its first AR glasses as a sort of demonstration product, or hardware developer kit. It’s said that a follow-up device, codenamed Artemis, will be the first offered to consumers.

It’s said that the move to reposition Nazare as a developer-only device comes alongside a wider push to downsize Reality Labs, Meta’s AR/VR and emerging tech division responsible for a host of devices. Reality Labs is known for everything from the standalone VR headset Quest 2 to devices such as Project Aria, a sensor-rich pair of glasses which Meta is using to train its AR perception systems and assess public perception of the technology.

A previous report from The Verge in April held that Meta was already internally expecting tepid sales for Nazare, in the low tens of thousands of units. It was also said Nazare would likely test Zuckerberg’s appetite for hardware subsidies stretching well beyond that of Meta’s $300 Quest 2.

To boot, the company is also reportedly shelving plans to release a smartwatch with a detachable display and two cameras in favor of a design better suited to control a later version of the glasses.

Meta’s Wrist-worn XR controller | Image courtesy Meta, Mark Zuckerberg

Last month, Zuckerberg traveled to Italy to show off Meta’s wrist-worn XR controller prototype to EssilorLuxottica, the Italian parent company behind the Ray-Ban Stories camera glasses and a host of other luxury eyewear brands. Critically, the controller prototype uses electromyography (EMG) sensors to detect the electrical signals that control the muscles in your hands, and doesn’t incorporate camera sensors.

All of this may be years out, however Meta is looking forward to the upcoming release of Project Cambria, a VR headset capable of AR interactions thanks to color passthrough camera sensors, aka ‘mixed reality’. That headset, which doesn’t have an official name yet, is undoubtedly meant to serve as a precursor to the company’s AR glasses, as apps developed for Cambria could one day inform a Meta-driven AR app ecosystem.

Whatever the case, Project Cambria is set to be priced “significantly higher” than $800, the company confirmed in May, which also puts it squarely in the developer/prosumer realm of accessibility.

Filed Under: AR glasses, Artemis, Meta, meta ar, meta ar glasses, meta vr, News, orion, project aria, project Artemis, project nazare

Qualcomm’s Latest AR Glasses Reference Design Drops the Tether, Keeps the Compute

May 21, 2022 From roadtovr

Qualcomm has revealed its latest AR glasses reference design, which it offers up to other companies as a blueprint for building their own AR devices. The reference design, which gives us a strong hint at the specs and capabilities of upcoming products, continues to lean on a smartphone to do the heavy compute, but this time is based on a wireless design.

Qualcomm’s prior AR glasses reference design was based on the Snapdragon XR1 chip and called for a wired connection between a smartphone and the glasses, allowing the system to split rendering tasks between the two devices.

Now the company’s latest design, based on Snapdragon XR2, takes the wire out of the equation. But instead of going fully standalone, the new reference design continues to rely on the smartphone to handle most of the heavy rendering, but now does so over a wireless connection between the devices.

Image courtesy Qualcomm

In addition to Snapdragon XR2, the AR glasses include Qualcomm’s FastConnect 6900 chip which equips it with Wi-Fi 6E and Bluetooth 5.3. The company says the chip is designed for “ultra-low latency,” and manages less than 3ms of latency between the headset and the smartphone. The company has also announced XR-specific software for controlling its FastConnect 6900, allowing device makers to tune the wireless traffic between the devices to prioritize the most time-critical data in order to reduce instances of lag or jitter due to wireless interference.

Though a connected smartphone seems like the most obvious use-case, Qualcomm also says the glasses could just as well be paired to a Windows PC or “processing puck.”

Beyond the extra wireless tech, the company says the latest design is 40% thinner than its previous reference design. The latest version has a 1,920 × 1,080 (2MP) per-eye resolution at 90Hz. The microdisplays include a ‘no-motion-blur’ feature—which sounds like a low persistence mode designed to prevent blurring of the image during head movement. A pair of monochrome cameras are used for 6DOF tracking and an RGB camera for video or photo capture. The company didn’t mention the device’s field-of-view, so it’s unlikely to be any larger than the prior reference design at 45° diagonal.
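For a sense of scale, the raw video bandwidth implied by those display specs can be sketched as follows. The 24-bit color depth and the decision to count both eyes are assumptions; real systems stream compressed frames rather than raw pixels:

```python
# Raw (uncompressed) video bandwidth for the stated per-eye resolution and refresh
# rate. Bit depth and eye count are assumptions for illustration.
def raw_bandwidth_gbps(width: int, height: int, hz: int, eyes: int = 2,
                       bits_per_pixel: int = 24) -> float:
    return width * height * hz * eyes * bits_per_pixel / 1e9

print(f"{raw_bandwidth_gbps(1920, 1080, 90):.1f} Gbps")  # 9.0 Gbps
```

Roughly 9 Gbps of raw pixels comfortably exceeds even Wi-Fi 6E's practical throughput, which is why split rendering over a wireless link depends on aggressive video compression.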

Like its many prior reference designs, Qualcomm isn’t actually going to make and sell the AR glasses. Instead, it offers up the design and underlying technology for other companies to use as a blueprint to build their own devices (hopefully using Qualcomm’s chips!). Companies that build on Qualcomm’s blueprint usually introduce their own industrial design and custom software offering; some even customize the hardware itself, like using different displays or optics.

That makes this AR glasses reference design a pretty good snapshot of the current state of AR glasses that can be mass produced, and a glimpse of what some companies will be offering in the near future.

Qualcomm says its latest AR glasses reference design is “available for select partners,” as of today, and plans to make it more widely available “in the coming months.”

Filed Under: AR glasses, ar glasses reference design, AR Headset, News, Qualcomm, snapdragon xr2

Reality Labs Chief Scientist Outlines a New Compute Architecture for True AR Glasses

May 2, 2022 From roadtovr

Speaking at the IEDM conference late last year, Meta Reality Labs’ Chief Scientist Michael Abrash laid out the company’s analysis of how contemporary compute architectures will need to evolve to make possible the AR glasses of our sci-fi conceptualizations.

While there are some AR ‘glasses’ on the market today, none of them are truly the size of a normal pair of glasses (even a bulky pair). The best AR headsets available today—the likes of HoloLens 2 and Magic Leap 2—are still closer to goggles than glasses and are too heavy to be worn all day (not to mention the looks you’d get from the crowd).

If we’re going to build AR glasses that are truly glasses-sized, with all-day battery life and the features needed for compelling AR experiences, it’s going to require a “range of radical improvements—and in some cases paradigm shifts—in both hardware […] and software,” says Michael Abrash, Chief Scientist at Reality Labs, Meta’s XR organization.

That is to say: Meta doesn’t believe that its current technology—or anyone’s for that matter—is capable of delivering those sci-fi glasses that every AR concept video envisions.

But, the company thinks it knows where things need to head in order for that to happen.

Abrash, speaking at the IEDM 2021 conference late last year, laid out the case for a new compute architecture that could meet the needs of truly glasses-sized AR devices.

Follow the Power

The core reason to rethink how computing should be handled on these devices comes from a need to drastically reduce power consumption to meet battery life and heat requirements.

“How can we improve the power efficiency [of mobile computing devices] radically by a factor of 100 or even 1,000?” he asks. “That will require a deep system-level rethinking of the full stack, with end-to-end co-design of hardware and software. And the place to start that rethinking is by looking at where power is going today.”

To that end, Abrash laid out a graph comparing the power consumption of low-level computing operations.

Image courtesy Meta

As the chart highlights, the most energy intensive computing operations are in data transfer. And that doesn’t mean just wireless data transfer, but even transferring data from one chip inside the device to another. What’s more, the chart uses a logarithmic scale; according to the chart, transferring data to RAM uses 12,000 times the power of the base unit (which in this case is adding two numbers together).

Bringing it all together, the circular graphs on the right show that techniques essential to AR—SLAM and hand-tracking—use most of their power simply moving data to and from RAM.
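A toy calculation makes that imbalance concrete. Using the chart's relative costs (1 unit per add, roughly 12,000 units per RAM transfer), even a small amount of memory traffic dominates a large amount of arithmetic. The operation counts below are hypothetical:

```python
# Relative energy costs taken from the article's chart: an add is the base unit,
# and a transfer to RAM costs ~12,000x as much. Operation counts are hypothetical.
ADD_COST = 1
RAM_TRANSFER_COST = 12_000

def relative_energy(num_adds: int, num_ram_transfers: int) -> int:
    return num_adds * ADD_COST + num_ram_transfers * RAM_TRANSFER_COST

# A million adds vs. a mere thousand RAM transfers:
compute_heavy = relative_energy(num_adds=1_000_000, num_ram_transfers=0)
transfer_heavy = relative_energy(num_adds=0, num_ram_transfers=1_000)
print(transfer_heavy / compute_heavy)  # 12.0 -> 1,000 transfers cost 12x a million adds
```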

“Clearly, for low power applications [such as in lightweight AR glasses], it is critical to reduce the amount of data transfer as much as possible,” says Abrash.

To make that happen, he says a new compute architecture will be required which—rather than shuffling large quantities of data between centralized computing hubs—more broadly distributes the computing operations across the system in order to minimize wasteful data transfer.

Compute Where You Least Expect It

A starting point for a distributed computing architecture, Abrash says, could begin with the many cameras that AR glasses need for sensing the world around the user. This would involve doing some preliminary computation on the camera sensor itself before sending only the most vital data across power hungry data transfer lanes.

Image courtesy Meta

To make that possible Abrash says it’ll take co-designed hardware and software, such that the hardware is designed with a specific algorithm in mind that is essentially hardwired into the camera sensor itself—allowing some operations to be taken care of before any data even leaves the sensor.
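A minimal sketch of that idea, with a toy brightness-threshold "detector" standing in for whatever algorithm would actually be hardwired into the sensor (the frame, threshold, and feature format here are all hypothetical):

```python
# Sketch of on-sensor preprocessing: instead of shipping the full frame off the
# sensor, run a simple detector in place and transmit only compact features.
def on_sensor_features(frame, threshold=200):
    # Toy "detector": (row, col) coordinates of bright pixels, which the sensor
    # would emit in place of raw pixel data.
    return [(y, x) for y, row in enumerate(frame) for x, v in enumerate(row) if v > threshold]

# Hypothetical 4x4 frame, 1 byte per pixel
frame = [
    [10, 250, 12, 9],
    [8, 7, 230, 11],
    [5, 6, 7, 8],
    [9, 255, 10, 12],
]
raw_bytes = sum(len(row) for row in frame)   # 16 bytes if the whole frame is shipped
feats = on_sensor_features(frame)            # [(0, 1), (1, 2), (3, 1)]
feat_bytes = len(feats) * 2                  # 2 bytes per (row, col) pair
print(raw_bytes, feat_bytes)  # 16 6
```

Scaled up to megapixel sensors running SLAM and hand-tracking, shrinking each transfer this way is exactly where the architecture Abrash describes hopes to claw back power.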

Image courtesy Meta

“The combination of requirements for lowest power, best performance, and smallest possible form-factor, make XR sensors the new frontier in the image sensor industry,” Abrash says.

Continue on Page 2: Domain Specific Sensors »

Filed Under: AR glasses, AR Headset, ar industry, iedm 2021, Meta, michael abrash, News, Reality Labs, vr industry