
VRSUN

Hot Virtual Reality News



Reality Labs

Meta Reality Labs Latest Revenue & Operating Cost Figures Aren’t Going to Make Investors Happy

October 26, 2022 From roadtovr

In Meta’s most recent quarterly earnings call the company shared the latest revenue figures of Reality Labs, the company’s XR division. In the third quarter of the year the division hit new milestones… unfortunately not the kind investors like to see.

Meta has been clear about its plan to spend aggressively on its XR initiatives over the next several years. So while it isn't a surprise to see the company's latest operating costs for Reality Labs reach an all-time high, seeing that record alongside an all-time low in quarterly revenue isn't a great look.

Meta has only been sharing Reality Labs revenue and operating cost figures since Q4 2020, so these Q3 2022 milestones are records only within the roughly two years of data the company has disclosed; earlier periods almost certainly saw even less revenue, and possibly even more spending.

The lower revenue and higher spending likely have more to do with timing than anything else. As of Q3 2022, Meta hasn't launched a new headset in two years, which has probably meant slowing sales of Quest 2, especially once the company confirmed a year ago that it was working on its next headset, giving some buyers a reason to wait. The company also raised the price of Quest 2 earlier this year; while on its face that should mean more revenue, it may also have reduced demand for the headset.

It won’t be until the Q4 earnings call that we see the impact Quest Pro will have on the Reality Labs bottom line.

Meta CEO Mark Zuckerberg has warned shareholders that the company’s XR investments may not flourish until 2030, but the company still needs to tread carefully to maintain faith among its investors.

Filed Under: Meta, meta revenue, News, q3 2022, Reality Labs, reality labs operating costs, reality labs revenue, vr industry

Meta Connect Dev Conference to Be Held Virtually on October 11th

September 6, 2022 From roadtovr

Meta today announced that its next installment of Connect, the company’s annual XR developer conference, is again going digital this year. The event, which promises to share updates and looks into the “near and far future” of Meta, will be held on October 11th, 2022.

Like last year's all-digital Connect, this year's dev conference will include livestreamed keynotes and developer sessions, which the company says will be available on the Reality Labs Facebook page. And it's all being boiled down into one day.

There's no schedule out yet, but the company says its one-day virtual event will explore "the building of the metaverse and the future of augmented and virtual reality." You can sign up for updates over at metaconnect.com for any upcoming livestreams.

We'd expect that much, but more specifically we're hoping to learn a few key things. Topping the list is information on its next VR headset, Project Cambria, which is rumored to launch as Meta Quest Pro.

Image courtesy Meta, Mark Zuckerberg

On the Joe Rogan Experience Podcast last week, Meta CEO Mark Zuckerberg let it slip that its “next device [is] coming out in October,” which suggests a Cambria launch during Connect. That would also mean we get the full info drop on the so-called Quest Pro then. Reminder: at more than $800 Cambria is likely meant for prosumers and developers looking to get their hands on a VR headset capable of AR interactions (aka ‘mixed reality’) thanks to color passthrough and built-in eye tracking.

Whatever we see, though, will most certainly need to top last year's big name change if the company wants to make good on its complete rebranding, which essentially unplugged the Facebook and Oculus brand names and pivoted the company to focus more on building the metaverse.

Some other bits we’d expect in Connect 2022: Meta’s social VR space Horizon Worlds demonstrating a more definite direction, which we’d hope includes better-looking avatars and the sort of baked-in social features its VR hardware has been missing since, well, forever.

Now that Oculus…er…Meta Quest 2 has seen a price bump and the company still hasn’t delivered on its ambitious metaverse concepts, Connect 2022 may be just the place for the company to set expectations for Meta moving forward.

Filed Under: cambria, connect 2022, connect 22, Meta, meta connect, meta project cambria, meta quest 2, meta quest pro, News, Oculus Connect, project cambria, Quest, quest pro, Reality Labs

Meta Reveals VR Headset Prototypes Designed to Make VR ‘Indistinguishable From Reality’

June 20, 2022 From roadtovr

Meta says its ultimate goal with its VR hardware is to make a comfortable, compact headset with visual fidelity that's 'indistinguishable from reality'. Today the company revealed its latest VR headset prototypes which it says represent steps toward that goal.

Meta has made no secret that it's pouring tens of billions of dollars into its XR efforts, much of which is going to long-term R&D through its Reality Labs Research division. Apparently in an effort to shine a bit of light on what that money is actually accomplishing, the company invited a group of press to sit down for a look at its latest accomplishments in VR hardware R&D.

Reaching the Bar

To start, Meta CEO Mark Zuckerberg spoke alongside Reality Labs Chief Scientist Michael Abrash to explain that the company’s ultimate goal is to build VR hardware that meets all the visual requirements to be accepted as “real” by your visual system.

VR headsets today are impressively immersive, but there’s still no question that what you’re looking at is, well… virtual.

Inside Meta's Reality Labs Research division, the company uses the term 'visual Turing Test' to represent the bar that needs to be met to convince your visual system that what's inside the headset is actually real. The name is borrowed from the original Turing Test, which an artificial intelligence passes when a human can no longer reliably tell it apart from another human.

In Meta's view, only a headset that passes that 'visual Turing Test' can completely convince your visual system that what it's showing is real.

Four Challenges

Zuckerberg and Abrash outlined what they see as four key visual challenges that VR headsets need to solve before the visual Turing Test can be passed: varifocal, distortion, retina resolution, and HDR.

Briefly, here’s what those mean:

  • Varifocal: the ability to focus on arbitrary depths of the virtual scene, with both essential focus functions of the eyes (vergence and accommodation)
  • Distortion: lenses inherently distort the light that passes through them, often creating artifacts like color separation and pupil swim that make the existence of the lens obvious.
  • Retina resolution: having enough resolution in the display to meet or exceed the resolving power of the human eye, such that there’s no evidence of underlying pixels
  • HDR: also known as high dynamic range, which describes the range of darkness and brightness that we experience in the real world (which almost no display today can properly emulate).

The Display Systems Research team at Reality Labs has built prototypes that function as proof-of-concepts for potential solutions to these challenges.

Varifocal

Image courtesy Meta

To address varifocal, the team developed a series of prototypes which it called ‘Half Dome’. In that series the company first explored a varifocal design which used a mechanically moving display to change the distance between the display and the lens, thus changing the focal depth of the image. Later the team moved to a solid-state electronic system which resulted in varifocal optics that were significantly more compact, reliable, and silent. We’ve covered the Half Dome prototypes in greater detail here if you want to know more.
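
For a rough sense of the optics at play, here's a generic thin-lens illustration of the mechanically varifocal idea, not Meta's actual design; the focal length and display offsets are made-up example values.

```python
# A generic thin-lens illustration of the mechanically varifocal idea behind
# the first Half Dome prototype; the focal length and display offsets below
# are made-up example values, not Meta's actual optics.

def virtual_image_distance_m(display_to_lens_m: float, focal_length_m: float) -> float:
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image distance.

    With the display inside the focal length (d_o < f) the image is virtual,
    appearing at |d_i| in front of the lens, which is the distance the eye
    must accommodate to.
    """
    d_o = display_to_lens_m
    d_i = 1.0 / (1.0 / focal_length_m - 1.0 / d_o)  # negative => virtual image
    return abs(d_i)

FOCAL_LENGTH_M = 0.040  # a 40 mm lens, chosen arbitrarily for illustration
for display_mm in (35.0, 37.0, 39.0):
    focus_m = virtual_image_distance_m(display_mm / 1000.0, FOCAL_LENGTH_M)
    print(f"display at {display_mm:.0f} mm -> image focused ~{focus_m:.2f} m away")
```

Under these example numbers, shifting the panel just a few millimeters sweeps the focal plane from arm's length out to across the room, which is why a small mechanical actuator (or a solid-state equivalent) is enough to do the job.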

Virtual Reality… For Lenses

As for distortion, Abrash explained that experimenting with lens designs and distortion-correction algorithms that are specific to those lens designs is a cumbersome process. Novel lenses can’t be made quickly, he said, and once they are made they still need to be carefully integrated into a headset.

To allow the Display Systems Research team to work more quickly on the issue, the team built a ‘distortion simulator’, which actually emulates a VR headset using a 3DTV, and simulates lenses (and their corresponding distortion-correction algorithms) in-software.

Image courtesy Meta

Doing so has allowed the team to iterate on the problem more quickly. The key challenge is to dynamically correct lens distortions as the eye moves, rather than only correcting for what is seen when the eye looks directly through the center of the lens.
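
For context on what a software lens model might look like, here's a minimal sketch of a generic radial distortion and its pre-correction; the polynomial form and coefficients are textbook-style assumptions, not Meta's lens data, and the hard part described above is making the correction track the eye rather than staying fixed for the lens center.

```python
# A generic radial (Brown-Conrady-style) lens model plus a simple software
# pre-correction, of the kind a distortion simulator could emulate. The
# coefficients are arbitrary illustrative values, not Meta's lens data; a real
# system would also vary the correction with gaze direction as the eye moves.

def distort(x: float, y: float, k1: float = 0.12, k2: float = 0.03) -> tuple[float, float]:
    """Map an undistorted, lens-centered coordinate to where the lens bends it."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def predistort(x: float, y: float, iterations: int = 10) -> tuple[float, float]:
    """Find the coordinate to render at so that, after the lens's distortion,
    it lands on the intended point (x, y). Simple fixed-point inversion."""
    xu, yu = x, y
    for _ in range(iterations):
        xd, yd = distort(xu, yu)
        xu += x - xd
        yu += y - yd
    return xu, yu

# Example: a point near the edge of the view must be rendered noticeably
# closer to the center so it appears in the right place through the lens.
print(predistort(0.8, 0.0))   # -> roughly (0.74, 0.0) with these coefficients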

Retina Resolution

Image courtesy Meta

On the retina resolution front, Meta revealed a previously unseen headset prototype called Butterscotch, which the company says achieves a retina resolution of 60 pixels per degree, allowing for 20/20 vision. To do so, they used extremely pixel-dense displays and reduced the field-of-view—in order to concentrate the pixels over a smaller area—to about half the size of Quest 2. The company says it also developed a “hybrid lens” that would “fully resolve” the increased resolution, and it shared through-the-lens comparisons between the original Rift, Quest 2, and the Butterscotch prototype.

Image courtesy Meta

There are already headsets on the market that offer retina resolution, like Varjo's VR-3, but only a small area in the middle of the view (27° × 27°) hits the 60 PPD mark; everything outside that area drops to 30 PPD or lower. Meta's Butterscotch prototype ostensibly delivers 60 PPD across its entire field-of-view, though the company didn't explain to what extent resolution is reduced toward the edges of the lens.
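
For reference, pixels per degree is just panel resolution divided by field-of-view, which makes the trade-off Butterscotch exploits easy to see; the panel and FOV figures below are rough, commonly cited approximations used only for illustration.

```python
# Back-of-envelope PPD arithmetic. Panel and FOV figures are rough public
# approximations used only for illustration; real optics spread pixels
# non-uniformly, so the center of the lens typically sees more than this average.

def average_ppd(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average horizontal pixels per degree across the field of view."""
    return horizontal_pixels / horizontal_fov_deg

def pixels_for(target_ppd: float, horizontal_fov_deg: float) -> int:
    """Horizontal panel pixels needed to average a target PPD over a given FOV."""
    return round(target_ppd * horizontal_fov_deg)

print(f"Quest 2 (~1832 px over ~96 deg): ~{average_ppd(1832, 96):.0f} PPD")
print(f"Pixels for 60 PPD over ~48 deg (half the FOV): ~{pixels_for(60, 48)}")
```

Halving the field-of-view halves the pixel count needed to hit 60 PPD, which is why shrinking the FOV is the shortcut to retina resolution while sufficiently dense panels remain hard to build.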

Continue on Page 2: High Dynamic Range, Downsizing »

Filed Under: butterscotch, Feature, half dome, holocake 2, mark zuckerberg, Meta, meta reality labs, meta reality labs research, michael abrash, News, Reality Labs, reality labs display systems research, starburst, vr hdr, VR Headset, vr headset prototypes, VR Research

Reality Labs Chief Scientist Outlines a New Compute Architecture for True AR Glasses

May 2, 2022 From roadtovr

Speaking at the IEDM conference late last year, Meta Reality Labs’ Chief Scientist Michael Abrash laid out the company’s analysis of how contemporary compute architectures will need to evolve to make possible the AR glasses of our sci-fi conceptualizations.

While there are some AR 'glasses' on the market today, none of them are truly the size of a normal pair of glasses (even a bulky pair). The best AR headsets available today—the likes of HoloLens 2 and Magic Leap 2—are still closer to goggles than glasses and are too heavy to be worn all day (not to mention the looks you'd get from the crowd).

If we're going to build AR glasses that are truly glasses-sized, with all-day battery life and the features needed for compelling AR experiences, it's going to require a "range of radical improvements—and in some cases paradigm shifts—in both hardware […] and software," says Michael Abrash, Chief Scientist at Reality Labs, Meta's XR organization.

That is to say: Meta doesn’t believe that its current technology—or anyone’s for that matter—is capable of delivering those sci-fi glasses that every AR concept video envisions.

But, the company thinks it knows where things need to head in order for that to happen.

Abrash, speaking at the IEDM 2021 conference late last year, laid out the case for a new compute architecture that could meet the needs of truly glasses-sized AR devices.

Follow the Power

The core reason to rethink how computing should be handled on these devices comes from a need to drastically reduce power consumption to meet battery life and heat requirements.

“How can we improve the power efficiency [of mobile computing devices] radically by a factor of 100 or even 1,000?” he asks. “That will require a deep system-level rethinking of the full stack, with end-to-end co-design of hardware and software. And the place to start that rethinking is by looking at where power is going today.”

To that end, Abrash laid out a graph comparing the power consumption of low-level computing operations.

Image courtesy Meta

As the chart highlights, the most energy-intensive computing operations are data transfers. And that doesn't just mean wireless data transfer; even moving data from one chip inside the device to another is costly. What's more, the chart uses a logarithmic scale: transferring data to RAM uses 12,000 times the power of the base unit (which in this case is adding two numbers together).

Bringing it all together, the circular graphs on the right show that techniques essential to AR—SLAM and hand-tracking—use most of their power simply moving data to and from RAM.
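
To put that in perspective, here's a back-of-envelope sketch; only the roughly 12,000x transfer-versus-add ratio comes from the chart above, while the absolute add energy and the camera frame format are assumptions chosen purely for illustration.

```python
# Back-of-envelope sketch: only the ~12,000x RAM-transfer-vs-add ratio comes
# from the chart described above; the absolute add energy (1 pJ) and the
# camera frame format are assumptions chosen purely for illustration.

ADD_ENERGY_PJ = 1.0                          # assumed base unit: one addition
RAM_WORD_ENERGY_PJ = 12_000 * ADD_ENERGY_PJ  # moving one word to/from RAM

PIXELS = 640 * 480                           # an assumed low-res tracking camera frame
WORDS = PIXELS // 4                          # 8-bit pixels packed four per 32-bit word

frame_transfer_pj = WORDS * RAM_WORD_ENERGY_PJ
print(f"Moving one frame to RAM costs as much as ~{frame_transfer_pj / ADD_ENERGY_PJ:,.0f} additions")
print(f"At the assumed 1 pJ per add, that's ~{frame_transfer_pj / 1e6:.0f} microjoules per frame")
```

Under these assumed numbers, streaming a single such camera at 30 frames per second burns roughly 28 mW on data movement alone, before any actual tracking math happens, and AR glasses need several cameras running constantly.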

“Clearly, for low power applications [such as in lightweight AR glasses], it is critical to reduce the amount of data transfer as much as possible,” says Abrash.

To make that happen, he says a new compute architecture will be required which—rather than shuffling large quantities of data between centralized computing hubs—more broadly distributes the computing operations across the system in order to minimize wasteful data transfer.

Compute Where You Least Expect It

A starting point for a distributed computing architecture, Abrash says, could begin with the many cameras that AR glasses need for sensing the world around the user. This would involve doing some preliminary computation on the camera sensor itself before sending only the most vital data across power-hungry data transfer lanes.

Image courtesy Meta

To make that possible, Abrash says it'll take co-designed hardware and software, such that the hardware is designed with a specific algorithm in mind that is essentially hardwired into the camera sensor itself—allowing some operations to be taken care of before any data even leaves the sensor.
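
As a hypothetical sketch of that pattern (the toy detector, threshold, and data format below are invented for illustration and are not Meta's design), the point is that logic baked in next to the pixel array ships compact features rather than whole frames:

```python
# Hypothetical sketch of on-sensor preprocessing: run a fixed, simple feature
# detector right next to the pixel array and ship only its compact output.
# The toy detector, threshold, and Feature format are invented for
# illustration; a real design would hard-wire a specific algorithm in silicon.

from dataclasses import dataclass

@dataclass
class Feature:
    x: int
    y: int
    score: int   # a handful of bytes per feature vs. ~300 KB for a raw VGA frame

def on_sensor_features(frame: list[list[int]], threshold: int = 40) -> list[Feature]:
    """Toy gradient-based detector standing in for hard-wired sensor-side logic."""
    features = []
    for y in range(1, len(frame) - 1):
        for x in range(1, len(frame[0]) - 1):
            gx = abs(frame[y][x + 1] - frame[y][x - 1])   # horizontal gradient
            gy = abs(frame[y + 1][x] - frame[y - 1][x])   # vertical gradient
            if gx + gy > threshold:
                features.append(Feature(x, y, gx + gy))
    return features

# Downstream SLAM or hand tracking then consumes only the feature list, so the
# power-hungry hop off the sensor carries kilobytes instead of megabytes.
```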

Image courtesy Meta

“The combination of requirements for lowest power, best performance, and smallest possible form-factor, make XR sensors the new frontier in the image sensor industry,” Abrash says.

Continue on Page 2: Domain Specific Sensors »

Filed Under: AR glasses, AR Headset, ar industry, iedm 2021, Meta, michael abrash, News, Reality Labs, vr industry
