Apple

Ming-Chi Kuo: Apple Likely to Release Mixed Reality Headset in January 2023

June 24, 2022 From roadtovr

Industry analyst Ming-Chi Kuo, a respected figure in all things Apple supply chain leaks, says the Cupertino tech giant is likely preparing to launch its long-rumored mixed reality headset early next year.

In a Medium post, Kuo outlines a few key points based on where he gathers the industry is headed.

In short, Kuo posits that Meta is slowing down investment in VR hardware due to a looming economic recession, but this will give others an opportunity to play catch-up as market share shifts away from Meta to companies such as Sony, Valve, Pico, and HTC. It’s not VR that’s taking a hit, it’s Meta’s core business.

Kuo says there’s still a “vast” potential demand for VR headsets in China which could be filled by companies with ready access to the Chinese market, such as ByteDance subsidiary Pico Interactive and Taiwan’s HTC.

Apple is also tapped to fill growing demand. Codenamed N301, Apple’s MR headset will “likely release in January 2023,” Kuo maintains, and is set to “favor the continued rapid growth of the headset sector,” adding that it’s “the most complicated product Apple has ever designed.”

“Although Apple has repeatedly reiterated its focus on AR, I believe Apple AR/MR supporting video see-thru could also offer an excellent immersive experience,” Kuo says. “Therefore, the launch of Apple AR/MR will further boost the demand for immersive gaming/multimedia entertainment.”

N301 is said to combine VR displays with passthrough cameras for both VR and AR applications. Check out the roundup below for all of the rumors surrounding Apple’s MR headset:

What We (think we) Know About N301 Mixed Reality Headset

Filed Under: Apple, apple ar, apple mr, apple n301, apple vr, n301, News

Apple Quietly Released One of The Most Impressive AR Room-mapping Tools

June 22, 2022 From roadtovr

Apple has barely mentioned augmented or virtual reality in its big keynotes lately, but at WWDC 2022 earlier this month the company quietly released one of the best 3D room-mapping tools for mobile AR yet.

Called RoomPlan, the ARKit Swift API uses the camera and LiDAR scanner on recent iPhones and iPads to create a 3D floor plan of a room, including key characteristics such as dimensions and types of furniture.

It’s not for consumers (yet), though. Apple says it’s aiming to appeal to professionals like architects and interior designers for conceptual exploration and planning, as well as developers of real estate, e-commerce, or hospitality apps; developers can integrate RoomPlan directly into their AR-capable apps.
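
Apple also ships a higher-level RoomCaptureView for drop-in UI, but the basic flow is a capture session that hands back scan data, which RoomBuilder turns into a parametric room model. Here's a minimal sketch of what an integration might look like; the view controller and its method names are our own placeholders, and the exact RoomPlan signatures may differ slightly from this simplification, so check Apple's documentation.

```swift
import UIKit
import RoomPlan

// Hypothetical minimal integration: start a RoomCaptureSession, then
// build a parametric CapturedRoom once the scan finishes.
final class RoomScanViewController: UIViewController, RoomCaptureSessionDelegate {
    private let captureSession = RoomCaptureSession()
    private let roomBuilder = RoomBuilder(options: [.beautifyObjects])

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.delegate = self
        // Begin scanning with the default configuration.
        captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Call this (e.g. from a "Done" button) to finish the scan.
    func finishScan() {
        captureSession.stop()
    }

    // Delegate callback: raw scan data arrives here when the session ends.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData,
                        error: Error?) {
        guard error == nil else { return }
        Task {
            // RoomBuilder produces the final parametric model: walls, doors,
            // windows, openings, and recognized furniture.
            let room = try await roomBuilder.capturedRoom(from: data)
            print("Walls: \(room.walls.count), objects: \(room.objects.count)")
        }
    }
}
```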

When it was released earlier this month, Jonathan Stephens, Chief Evangelist at spatial computing company EveryPoint, took RoomPlan for a test drive to see what it could do. The results are pretty surprising.

Follow along as I do a series of structured @Apple RoomPlan tests and share my findings/notes in this thread.

First up, I tried tricking RoomPlan with a large mirror. Surprisingly it wasn’t fooled! Also, it was way off on french doors height.#WWDC22 #AR #ARKit #AI @Scobleizer pic.twitter.com/R4hJbO57Km

— Jonathan Stephens (@jonstephens85) June 7, 2022

RoomPlan seems to be able to deal with a number of traditionally difficult situations, including the mirror seen above, but also messy spaces, open and closed doors, windows, and generally complex architecture. Still, Stephens’ house isn’t just a bunch of cube-shaped rooms, so there are a few bits that just didn’t match up.

Test #2 – vaulted ceilings. I noticed that the wall shapes have to be rectangular. It could not follow the slant angle of the ceiling line. This made parts of my walls much taller than in reality.

It did a great job at picking out the desks and bedroom furniture. pic.twitter.com/fbu5B9L3Ds

— Jonathan Stephens (@jonstephens85) June 7, 2022

Vaulted ceilings, wall openings, and multi-floor areas like you might find in foyers were all a bit too difficult for RoomPlan to correctly digest. Although not perfect, it seems to at least auto-correct to some degree based on assumptions about how things might best fit together.

Here is probably the coolest find so far. When I look top down, the walls correct themselves based on some assumptions from Apple. pic.twitter.com/KblqeLYm5x

— Jonathan Stephens (@jonstephens85) June 7, 2022

RoomPlan isn’t just for app integrations though. Apple says it outputs in USD or USDZ file formats which include dimensions of each component recognized in the room, such as walls or cabinets, as well as the type of furniture detected.

If you’re looking to fine-tune the scan, the dimensions and placement of each individual component can be adjusted once exported into various USDZ-compatible tools, such as Cinema 4D, Shapr3D, or AutoCAD, Apple says.
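
As a rough illustration of that workflow, the snippet below (our own hypothetical helper, not Apple sample code) exports a finished CapturedRoom to USDZ and prints the bounding-box dimensions RoomPlan reports for each recognized object; exact property names should be verified against Apple's documentation.

```swift
import Foundation
import simd
import RoomPlan

// Hypothetical helper: save the scan as USDZ and inspect detected furniture.
func exportAndInspect(room: CapturedRoom) throws {
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent("Room.usdz")
    try room.export(to: destination)
    print("Exported scan to \(destination.path)")

    // Each recognized object carries a category (table, bed, storage, ...)
    // and a width/height/depth bounding box, in meters.
    for object in room.objects {
        let size = object.dimensions
        print("\(object.category): \(size.x) x \(size.y) x \(size.z) m")
    }
}
```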

We’re still no closer to learning when the company plans to release its rumored mixed reality headset or its full-fledged AR glasses; however, either device would need extremely robust space-mapping capabilities. Seeing Apple make these sorts of strides using its existing platforms certainly shows it’s on the right track.

If you haven’t been following along with the Apple rumor mill, check out some of the links below regarding the company’s mixed reality headset, codenamed N301:

What We (think we) Know About N301 Mixed Reality Headset


A special thanks to Hrafn Thorisson for pointing us to the news!

Filed Under: Apple, apple ar, apple arkit, apple room plan, apple roomplan, AR News, ARKit, News

Report: Apple Recruits Hollywood Directors For VR/AR Headset

June 6, 2022 From vrscout

Rumor has it the company may also be working on VR FaceTime.

According to The New York Times, Apple is enlisting the help of the Hollywood elite to bolster its catalog of immersive content for its long-rumored VR/AR headset. This will supposedly include mysterious video content from celebrated director Jon Favreau (The Lion King, Iron Man, The Mandalorian).

Citing three people close to the project, the publication claims that Favreau is developing original content for the mixed reality device based on Apple TV+’s new “documentary” series, Prehistoric Planet, for which Favreau serves as executive producer. According to sources, this mixed reality content will be available when Apple’s VR/AR headset launches sometime in 2023.

Credit: John Salangsang / Shutterstock

While Apple has yet to confirm or deny these reports, it’s worth noting that Favreau has a history of using immersive technology as part of his filmmaking process. Back in 2019, we talked about how the star director used VR technology to create and preview virtual environments for his 2019 remake of The Lion King. That being said, it would make sense for Apple to seek the help of Favreau in developing immersive content.

According to Mark Gurman, Bloomberg’s resident Apple reporter, the company’s VR/AR headset and RealityOS platform are just the tip of the metaphorical iceberg. In addition to the long-rumored hardware, Gurman claims that the company is also working on VR integration for FaceTime, Apple Maps, and a variety of other apps.

Director Jon Favreau and his team immersed within their virtual set. / Credit: Wired

“While third-party apps are a key ingredient of this project, Apple is also planning a slew of its own apps,” says Gurman in his report. “That includes a VR version of FaceTime that can scan a person’s face to replicate their movements in a Memoji, a new VR version of Maps, and rOS variants of core Apple apps like Notes and Calendar. Also in the works is a way for the headset to extend a Mac’s display, bringing it into 3D.”

This past week rumors began circulating about the potential reveal of Apple’s combination VR/AR headset and “Reality Operating System” during the company’s annual WWDC developer conference. Unfortunately, the event came and went with no mention of VR or AR, let alone the highly-anticipated mixed reality device. That said, we did learn about a new MacBook Air as well as several exciting software updates for a variety of iOS devices.

Here’s hoping we learn more about Apple’s VR/AR device in the coming months.

Filed Under: Apple, AR, augmented reality, iOS, News

Apple’s iPhone Will Soon Scan Your Ear to Solve a Big Problem with Spatial Audio

June 6, 2022 From roadtovr

Today during Apple’s WWDC 2022 keynote, the company announced that iOS 16 will allow users of modern iPhones to scan the shape of their ears to create more accurate spatial audio. The feature is likely implemented as an HRTF; creating custom HRTFs for consumers was once impractical due to the need for sophisticated equipment, but advances in computer vision are making the technology much more accessible.

When it comes to digital spatial audio, there’s a limit to how accurate the sense of ‘position’ or ‘3D’ can be without taking into account the unique shape of the user’s head and ears.

Because everybody has a uniquely shaped head, and especially ears, elements of incoming sound from the real world bounce off your head and into your ears in different and very subtle ways. For instance, when a sound is behind you, the precise geometry of the folds in your ear reflect sound from that angle in a unique way. And when you hear sound coming to your ear in that particular way, you’re attuned to understand that the source of the sound is behind you.

To create a highly accurate sense of digital spatial audio, you need a model which accounts for these factors, such that the audio is mixed with the correct cues that are created by the unique shape of your head and ears.

Audiologists describe this phenomenon mathematically in a model known as a head-related transfer function (HRTF). Using an HRTF, digital audio can be modified to replicate the spatial audio cues that are unique to an individual’s ear.
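
As a purely conceptual illustration (this is not Apple's implementation), applying an HRTF in the time domain boils down to convolving a mono source with a measured left-ear and right-ear impulse response (HRIR) for the sound's direction; the per-user ear scan is what would let those impulse responses be personalized. A minimal sketch:

```swift
// Naive time-domain convolution: output length is N + M - 1.
func convolve(_ signal: [Float], with impulseResponse: [Float]) -> [Float] {
    guard !signal.isEmpty, !impulseResponse.isEmpty else { return [] }
    var output = [Float](repeating: 0, count: signal.count + impulseResponse.count - 1)
    for (i, s) in signal.enumerated() {
        for (j, h) in impulseResponse.enumerated() {
            output[i + j] += s * h
        }
    }
    return output
}

// hrirLeft / hrirRight stand in for whatever personalized ear profile the
// system derives from a scan; they are placeholders here.
func spatialize(mono: [Float],
                hrirLeft: [Float],
                hrirRight: [Float]) -> (left: [Float], right: [Float]) {
    (convolve(mono, with: hrirLeft), convolve(mono, with: hrirRight))
}
```

Real implementations use fast (FFT-based) convolution and interpolate between measured directions, but the principle is the same: the filter shape encodes how your particular head and ears color sound arriving from each direction.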

So while the math is well studied and the technology to apply an HRTF in real-time is readily available today, there’s still one big problem: every person needs their own custom HRTF. This involves accurately measuring each ear of each person, which isn’t easy without specialized equipment.

But now Apple says it will make use of advanced sensors in its latest iPhones to allow anyone to scan their head and ears, and create a custom spatial audio profile from that data.

Apple isn’t the first company to offer custom HRTFs based on a computer-vision model of the ear, but having it built into iOS will certainly make the technology much more widespread than it ever has been.

During the WWDC 2022 keynote, Apple announced the feature as part of the forthcoming iOS 16 update, which is due out later this year. It will work on iPhones with the TrueDepth camera system, which includes the iPhone X and beyond.

But just having an accurate model of the ear isn’t enough. Apple will need to have developed an automated process to simulate the way that real sound would interact with the unique geometry of the ear. The company hasn’t specifically said this will be based on an HRTF implementation, but it seems highly likely as it’s a known quantity in the spatial audio field.

Ultimately this should result in more accurate digital spatial audio on iPhones (and very likely future Apple XR headsets). That means a sound 10 feet from your left ear will sound more like it should at that distance, making it easier to distinguish it from, say, a sound 2 feet from your left ear.

This will pair well with the existing spatial audio capabilities of Apple products, especially when used with AirPods, which can track the movement of your head for a head-tracked spatial audio experience. Apple’s iOS and macOS both support spatial audio out of the box, which can take standard audio and make it sound as if it’s coming from speakers in your room (instead of in your head), and can accurately play back sound that’s specially authored for spatial audio, such as Dolby Atmos tracks on Apple Music.

And there’s another potential upside to this feature, too. If Apple makes it possible for users to download their own custom HRTF profile, they may be able to take it and use it on other devices (like a VR headset, for instance).

Filed Under: Apple, apple hrtf, apple spatial audio, hrtf scan, ios hrtf, iphone custom spatial audio, iphone hrtf, iphone spatial audio, News

Apple May Announce AR/VR Operating System at WWDC Next Week, Trademark Suggests

May 30, 2022 From roadtovr

Recent reports maintain we may be seeing a mixed reality headset from Apple sometime soon, and now it appears the Cupertino tech giant has filed a global trademark for realityOS, its alleged XR operating system, which could suggest we’ll learn more about it and its XR device(s) as early as next week.

The trademark was discovered by Parker Ortolani, a brand licensing manager at Vox Media. As first reported by The Verge, Ortolani’s investigative work points to a possible reveal or mention of realityOS at WWDC’s keynote, taking place June 6th.

The trademark for realityOS, which is supposed to be used with “wearable computer hardware”, wasn’t filed by Apple directly. Instead, it was filed back in December 2021 by an entity called “Realityo Systems LLC”, which has all the hallmarks of a shell company created specifically to obfuscate the actual trademark holder. In the past, Apple has used similar shells to register its successive macOS update names, including Yosemite, Big Sur, and Monterey.

As Ortolani points out in a Twitter thread, the trademark was initially filed just two months before “realityOS” began showing up in Apple source code. The June 8th deadline to renew the filing is conveniently slated to fall only two days after the upcoming WWDC keynote.

“Apple typically files trademarks for products announced at WWDC a day or two after the keynote. This would be one helluva coincidence,” Ortolani concludes.

Image courtesy USPTO via Parker Ortolani

Unless someone is looking for a trademark dispute they’ll surely lose against Apple, it’s possible we’re seeing the dominoes fall into place for the company to formally announce realityOS, and possibly allude to its first XR headset.

Earlier this month, a report from The Information alleged that Apple showed off a host of AR/VR prototypes to its board as far back as 2016. The report maintained that, more recently, project lead Mike Rockwell and then-Apple hardware designer Jony Ive found themselves in a bit of a tussle when it came to just how the company’s first immersive headset would function. It’s said the headset, codenamed N301, was in the end set to become a standalone headset with VR displays and passthrough AR camera sensors, making it a ‘mixed reality’ headset.

We’ve assembled some of the key takeaways from reports past. Like all things Apple, we’re unable to verify any of the claims below, so please take them with a big grain of salt:

What We (think we) Know About N301 Mixed Reality Headset

Filed Under: Apple, apple ar, apple glass, apple glasses, apple headset, apple mr, apple vr, apple xr, News

Report Details Apple MR Headset Design Challenges & Internal Hurdles

May 18, 2022 From roadtovr

Apple is a notorious black box when it comes to internal projects, although sometimes details based on supply chain rumors shed a sliver of light on what might be happening with the company’s AR/VR headset behind closed doors. Direct internal leaks from Apple are much less common; however, a report from The Information, citing 10 people on Apple’s mixed reality headset project team, details some of the past design challenges and the possible direction the headset may take moving forward.

The report (via 9to5Mac) details some anecdotes reaching back as far as 2016, when the company allegedly first showed off a number of AR and VR prototypes to industry leaders and Apple elite.

Former Vice President Al Gore, then–Disney CEO Bob Iger and other Apple board members walked from room to room, trying out prototype augmented and virtual reality devices and software. One of the gadgets made a tiny digital rhinoceros appear on a table in the room. The creature then grew into a life-size version of itself, according to two people familiar with the meeting. In the same demo, the drab surroundings of the room transformed into a lush forest, showing how users could seamlessly transition from AR, in which they can still view the physical world around them, to the more immersive experience of VR—a combination known as mixed reality.

It was more of a conceptual showcase at the time, the report maintains, as some prototypes ran on Windows while others were based on the original HTC Vive. Like ‘The Sword of Damocles’ built in the late ’60s by Ivan Sutherland—the founding father of virtual reality—one such prototype was also supposedly so heavy it was “suspended by a small crane so the Apple board members could wear it without straining their necks.”

None of that’s particularly uncommon practice when it comes to hardware development—just ask Magic Leap insiders from the early days—however the report notes the company’s MR headset hasn’t gained the same support from Apple’s current CEO, Tim Cook, that Steve Jobs had for iPhone’s development. The report says Cook “rarely visits the group at its offices away from the main Apple campus.”

There’s also allegedly been some political infighting that has stymied development, which we heard about in a previous report from 2019 alleging that Apple was pumping the brakes on the headset due to discord between then-Apple hardware designer Jony Ive and project lead Mike Rockwell. Ive departed the company in 2019 to pursue his own design company, LoveFrom.

Rockwell, Meier and Rothkopf soon encountered pushback from Ive’s team. The three men had initially wanted to build a VR headset, but Ive’s group had concerns about the technology, said three people who worked on the project. They believed VR alienated users from other people by cutting them off from the outside world, made users look unfashionable and lacked practical uses. Apple’s industrial designers were unconvinced that consumers would be willing to wear headsets for long periods of time, two of the people said.

While the teams proposed adding passthrough cameras to the front of the headset, codenamed N301, Apple industrial designers were decidedly more intrigued with a concept for what sources tell The Information was an “outward-facing screen on the headset. The screen could display video images of the eyes and facial expressions of the person wearing the headset to other people in the room.”

The report doesn’t go any further than 2019; however, The Information’s Wayne Ma is supposedly publishing a piece soon that covers a “pivotal moment for the Apple headset.”

Like we said, Apple is a black box, which means it doesn’t comment on ongoing projects or respond meaningfully to media requests for clarity. Looking back at previous reports, however, may provide a rough picture of what to expect. The information below is based on reports, so please take it with a grain of salt.

What We (think we) Know About N301 Mixed Reality Headset

Filed Under: Apple, apple ar, apple glasses, apple mixed reality, apple mr, apple vr, apple vr headset, News, VR Headset

The Apple Car Could Feature VR Technology And No Windows

May 17, 2022 From vrscout

Who needs windows when you have VR?

It was back in 2014 that Apple first began development of its own vehicle. After years of radio silence, reports began circulating that the company had switched gears and was working on a self-driving vehicle that requires no input from its passengers. As such, this autonomous car would lack any and all driver controls, including a steering wheel and foot pedals.

Since then Apple has filed several car-related patents teasing a number of potential features. That said, the most recent patent might be the company’s most interesting one yet.

On May 3rd, 2022, Apple filed a patent with the United States Patent & Trademark Office for an in-car VR entertainment system that utilizes the motion of the vehicle to further immerse passengers in their in-headset experiences. VR content is synchronized with the movement and acceleration of the autonomous vehicle as it travels to the desired location, offering a unique location-based experience that changes based on your commute.

In addition to entertainment, the patent details how the technology referenced could be used to reduce motion sickness. Instead of conventional windows, passengers would view the outside world by using their VR headset to access cameras mounted on the outside of the vehicle. The technology could also be used to watch videos and read books in a stabilized environment as well as conduct virtual meetings while on the road.

According to past reports, Apple is expected to launch its long-awaited self-driving car sometime around 2025, though we’ve yet to receive any form of official confirmation from the company. Apple has filed a number of outrageous patents in the past that have yet to see the light of day. That said, if anyone can effectively sell a windowless self-driving vehicle with no steering wheel to the general public, it’s Apple.

For more information take a look at the patent filing here.

Image Credit: Letsgo Digital

Filed Under: Apple, In-Car VR, News

Apple Says Steep ‘Horizon Worlds’ Creator Fees Show Meta’s “Hypocrisy”

April 14, 2022 From roadtovr

Earlier this week Meta announced that it would begin testing tools to let creators sell things for real money in Horizon Worlds and would charge a fee of 47.5% of their earnings. The fee structure seemed at odds with prior comments from Meta which have criticized app store fees from the likes of Apple and Google. Now Apple is accusing the company of hypocrisy.

Following the news this week that Meta planned to take nearly half of a creator’s earnings in Horizon Worlds, Apple didn’t miss the chance to point out that this was coming from a company which has on multiple occasions criticized Apple’s App Store fee of 30% (15% for developers earning under $1 million annually).

Speaking to MarketWatch, Apple spokesman Fred Sainz had this to say:

Meta has repeatedly taken aim at Apple for charging developers a 30% commission for in-app purchases in the App Store—and have used small businesses and creators as a scapegoat at every turn. Now, Meta seeks to charge those same creators significantly more than any other platform. [Meta’s] announcement lays bare Meta’s hypocrisy. It goes to show that while they seek to use Apple’s platform for free, they happily take from the creators and small businesses that use their own.

And, well… he isn’t wrong. Just last year Meta CEO Mark Zuckerberg not-so-subtly said in a very widely viewed keynote that being subject to the app store fees of Apple and Google had changed the way he viewed the industry, going on to say that he wants his company to take “a different approach” when it comes to its creator platforms.

The last few years have been humbling for me and our company in a lot of ways. One of the main lessons that I’ve learned is that building products isn’t enough. We also need to help build ecosystems so that millions of people can have a stake in the future, can be rewarded for their work, and benefit as the tide rises, not just as consumers but as creators and developers.

But this period has also been humbling because as big of a company as we are, we’ve also learned what it is like to build for other platforms. And living under their rules has profoundly shaped my views on the tech industry. Most of all, I’ve come to believe that the lack of choice and high fees are stifling innovation, stopping people from building new things, and holding back the entire internet economy.

We’ve tried to take a different approach. We want to serve as many people as possible, which means working to make our services cost less, not more. Our mobile apps are free. Our ads business model is an auction, which guarantees every business the most competitive price possible. We offer our creator and commerce tools either at cost or with modest fees to enable as much creation and commerce as possible.

Indeed, those words seem to fly in the face of Meta’s announcement that it would charge creators a fee of 47.5% of their earnings for anything sold through Horizon Worlds. Not to mention that the company has also levied a 30% fee (the same that Apple and others charge) against developers since the very beginning of its VR app store.
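
For a concrete sense of the gap, here's the arithmetic on a hypothetical $100 sale using only the percentages cited above:

```swift
// Illustrative only: creator take-home on a $100 item under each fee rate.
func takeHome(sale: Double, feeRate: Double) -> Double {
    sale * (1.0 - feeRate)
}

let inHorizonWorlds = takeHome(sale: 100, feeRate: 0.475)      // $52.50
let underAThirtyPercentStore = takeHome(sale: 100, feeRate: 0.30) // $70.00
```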

Meta’s strongest defense, perhaps, is that Horizon World creator fees aren’t entirely out of line with similar platforms available today, but one must ask why the company wouldn’t want to set a better precedent given its public statements criticizing others for similar behavior.

Filed Under: Apple, horizon worlds, horizon worlds creator fees, horizon worlds creator share, horizon worlds revenue split, Meta, News, vr industry

Apple vs. Meta: Who Will Offer the Better Metaverse Experience?

January 21, 2022 From vrfocus

As we edge closer towards Web 3, it seems like every company wants a piece of the pie. After rebranding to Meta and laying out its plans to dominate the metaverse, Facebook has made significant waves within the last few months. Other Big Tech giants such as Microsoft, Samsung and Sony have also sunk their teeth into the metaverse space, with offerings such as collaborative software, better connectivity and more immersive user experiences.  


The spotlight has long been on Apple, with many analysts and experts waiting for one of tech’s biggest trailblazers to introduce its own ‘mixed reality’ headset. However, recent reports suggest that Apple has no short-term plans to enter the metaverse with its much-awaited device, which is set to be announced later this year. Instead, the company is allegedly focusing only on providing access to gaming, communications and entertainment content for the time being.

With the metaverse being an inevitable prospect, will Apple eventually enter the market with a Web 3-compatible device? First, let’s take a look at what’s in store for both Apple and Meta’s next headset releases in 2022. We’ll then review what both Facebook and Apple are best at doing — and why we think that Apple won’t necessarily stay behind the curve.

Facebook’s first high-end headset under the Meta moniker is due for release sometime later in 2022 — though an exact timeframe has yet to be confirmed. 

Dubbed ‘Project Cambria’, Meta’s latest device was initially referenced last year at the company’s virtual Connect conference. The headset is promised to be the successor to the popular Oculus Quest 2, packed with immersive features unseen in previous headset releases.

Notable features include lifelike facial communication capabilities, the ability to track users’ facial expressions, reconstruction of mixed reality objects, a special avatar personalisation engine and other advancements that are in line with bringing CEO Mark Zuckerberg’s promise of an ‘embodied’ metaverse experience to life.

In terms of its design, several defining assets were also revealed in Meta’s Connect 2021 demo. Some of the most notable ones include:

  • A more ergonomic design: In its current prototype form, Project Cambria will come in a sleek, all-black structure that is lighter, more compact and equipped with a much slimmer strap than its Oculus predecessors.
  • Tracked controllers: Project Cambria is also expected to feature full-body tracking capabilities, giving users a better sense and level of control over their virtual surroundings.
  • More advanced sensors and reconstruction algorithms: Project Cambria is also set to feature superior sensors and reconstruction algorithms, with the ability to represent physical objects in the real world with impeccable perspective and depth. The sensors will also accommodate various skin tones and facial features, making users’ experiences more immersive and lifelike.

According to Meta analyst Noelle Martin, the company: “aims to be able to simulate you down to every skin pore, every strand of hair, every micromovement […] the objective is to create 3D replicas of people, places and things, so hyper-realistic and tactile that they’re indistinguishable from what’s real.”

So far, Meta’s project appears to be off to a smooth start. Since its rebranding, the company’s share price has risen by about 5%. Meta’s plans involve hiring at least 10,000 new staff members to build out its metaverse space. And while this news hasn’t exactly been hailed across the board, Meta has even started poaching staff members from both Microsoft and Apple, recruiting them to join its mission.

What do we know about Apple’s upcoming ‘mixed reality’ headset?

While multiple sources initially claimed that Apple’s upcoming headset would be set to launch in 2022, Bloomberg now suggests that we will more likely see the announcement of the new headset closer to the end of this year. 

Some features that are projected to be featured in Apple’s first XR offering include:

  • Turbo-fast processing: Apple’s headset release is expected to wield the same level of power as the M1 processor currently found in its latest MacBook Pro lineup, with a 96W USB-C power adapter at its helm. It’s also reported to feature a lower-end processor, which will power up any sensor-related computing.
  • Tracking cameras: Apple’s headset will apparently feature two tracking cameras, with the ability to relay information to two 8K displays located in front of the user’s eyes.
  • LiDAR sensors: These sensors have been cited as a possibility for Apple’s first headset — with lasers to measure distance, allowing for the fast and accurate gathering of a space’s area. This would allow for better placing of objects in AR.

Despite ample predictions that Apple would join the likes of Meta, Microsoft and other tech leaders in creating a metaverse-compatible device, it appears that they won’t be in the ranks just yet. According to Bloomberg’s Mark Gurman, known to be a reliable Apple analyst: “The idea of a completely virtual world where users can escape to — like they can in Meta Platforms/Facebook’s vision of the future — is off-limits from Apple.” Instead, he has said that the upcoming mixed-reality headset will allow users to perform shorter activity sessions — such as gaming, communications and entertainment consumption.

With Web 3 clearly on the horizon, Apple’s refusal to enter the metaverse space has prompted reactions of shock and disappointment from spectators. This news also places both Meta and Apple in very different areas of the playing field, with Apple’s upcoming vision feeling like a sharp contrast to that of Meta’s — a brand that has completely centralised its new positioning around creating a metaverse space in Web 3.

If we shift our focus back to Meta, we’re left with an important question — what kind of advantage do they have in this race? Has Facebook’s success and business model laid down the right foundation for Meta to rightfully take off?

What Facebook has done best: connecting people

From its earliest days, Facebook was created with one primary mission: to bring people closer together.

Then a sophomore at Harvard, a young Mark Zuckerberg launched The Facebook — a social media website built to forge better connections between Harvard students. This force of connectivity was then used to help students across different institutions connect with each other. Eventually, the Facebook universe would completely revolutionise how the rest of the world would connect, communicate and share personal information across a centralised database.

Photo by © Wachiwit – Shutterstock.com

Now as Meta, the company’s goal is to enhance the user experience and make these virtual connections more immersive. According to Mark Zuckerberg: “the defining quality of the metaverse will be a feeling of presence — like you are right there with another person or in another place.” Moreover, he describes the objective of allowing users to feel truly present with one another as: “the ultimate dream of social technology.”

Today, it can be argued that Meta is the only Big Tech corporation with the scale and capital to create a metaverse space — with a user base of 3.5 billion people and a total of $86 billion in revenue generated within the last year. With an unparalleled number of users at its fingertips, Meta already houses the largest web of interconnected people in all of social media history.

However, will the expansiveness of Meta’s ecosystem continue to foster a safe and equitable space for users to freely connect and share information? Despite Facebook’s long history of controversies, Zuckerberg seems to have a fairly egalitarian vision of the metaverse — promising greater interoperability and lower fees for developers. But with the advent of virtual land on decentralised platforms such as Decentraland and Somnium Space, questions have now arisen about how Meta will govern its new internet medium, and about where communities may find ways to connect more freely in Web 3.

With this taken into account, it’s also easy to wonder: should Meta be forced to share the metaverse with these newer, blockchain-powered platforms, will the Project Cambria headset offer fair access? Or will this one day be offered by another, potentially more mainstream and user-friendly device?

What Apple has done best: innovation

Apple is often credited for revolutionising some of our most widely-used product innovations. Well-known examples include the iPod, the iPhone and, of course, the Apple Macintosh — one of the very first machines that helped make personal computing ubiquitous. To illustrate an example, let’s jump into a time machine and backtrack to the very early days of computing. 

Steve Jobs, a then-aspiring tech mogul, paid a visit to Xerox’s PARC (Palo Alto Research Center) laboratory back in 1979. At the time, Xerox was the first company to have produced a computer with an operating system based on a graphical user interface (GUI) — a remarkable device called the Xerox Alto.

However, the Alto would never see a commercial release. It carried a price tag of $32,000 USD (equivalent to roughly $114,105 USD in today’s market), and Xerox’s managers saw nothing but an overly complicated workstation that was far too expensive to mass-produce. Steve Jobs, on the other hand, saw much more than that. He was amazed by the GUI and believed that the Alto was the ideal blueprint for how all computers should operate.

Photo by © Wachiwit – Shutterstock.com

Most analysts agree that the Xerox Alto was far ahead of its time. Before any other machine in computing history, it featured the same type of keyboard and mouse interface we still use today. It also, incredibly enough, featured now-universal concepts such as email, event reminders and word processing. 

Wanting a piece of the innovation for himself, Jobs sold shares of Apple to Xerox in exchange for access to the Alto’s technology. Apple would then use what it learned to create a more refined, user-friendly and affordable home computing device.

The same logic can be applied to the creation of the iPhone. While Apple wasn’t the pioneer of the mobile smartphone, they were able to reinvent the handset concept and turn it into the closest thing we then had to a pocket-sized computer. To date, the iPhone’s build has served as a de facto blueprint for how future touch-screen devices would be constructed and integrated into our everyday lives.

Throughout the course of tech history, Apple has mastered the art of taking existing technology and making it better. And while Steve Jobs may no longer be at the forefront of Apple’s empire, their continued efforts (such as the M1 processor in today’s lightweight, industry-standard MacBooks, or the highly expansive App Store library) have proven that the tech giant hasn’t lost its innovation edge.

So, how does this all relate to our current technological paradigm, which is Web 3? 

Well, it’s a prime example of what Apple does best: innovation. And while it might be too soon to tell, decades of Apple’s design-first trends suggest that we could very well see history repeat itself once the tech giant decides to create an innovative, metaverse-ready device. Like the iPhone or the Macintosh, it just might be the one that finds its way into the households of the masses.

So, what’s next?

With neither tech giant having released their dedicated XR headset yet, it’s still far too early to tell which path either will take. Recent reports have revealed that Meta plans to enter the NFT marketplace, though no evidence yet suggests that the company has any plans to embrace a more decentralised business model.

When we look back at the history of computing, however, one thing is clear: computers — or in this case, headsets — have never been the end goal. They’re not the thing, per se — they’re the thing that gets us to the next thing. And when it comes to getting closer to Web 3, the company that brings us towards the better, more ubiquitous user experience will win.

To keep learning more about Apple, Meta and other industry trends related to the metaverse and Web 3, stay tuned for more updates on gmw3.

Filed Under: Apple, Facebook, Features, Meta, Metaverse, project cambria, XR News

HoloLens Optics Chief Joins Google Amid Reported Push for Upcoming Google AR Headset

January 21, 2022 From roadtovr

Bernard Kress, principal optical architect on Microsoft’s HoloLens team, has left the company to take on the role of Director of XR Engineering at the recently formed Google Labs. A report by The Verge maintains Google is also now gearing up to produce an AR headset that could directly compete with similar offerings from the likes of Apple and Meta.

Before joining Microsoft in 2015, Kress worked as principal optical architect behind Google Glass, the company’s smartglasses that found marked success in the enterprise sector after a rocky reception by consumers in 2013.

At Microsoft, Kress continued his work—principally focused on micro-optics, wafer scale optics, holography and nanophotonics—as partner optical architect on the HoloLens team, overseeing the release of both HoloLens and HoloLens 2.

Now Kress is back in Mountain View working on Google’s next AR headset. According to his LinkedIn, Kress has been leading the Optical Engineering department at Google Labs since November 2021 — right as Google shook things up by creating its AR/VR division.

And there’s no doubt about it: Kress says he’s focusing on creating consumer AR hardware at Google.

“Google is best positioned along the key players in this market to effectively address the burgeoning consumer AR market by matching its existing and acclaimed digital services and products to next generation optics, displays and sensors technologies, providing a seamless and unparalleled experience to the user,” said Kress.

Hot on the heels of the strategic hire, a report from The Verge maintains Google is now gearing up to produce its own AR headset, which is allegedly codenamed Project Iris.

According to people familiar with the matter, Project Iris is said to ship sometime in 2024, although that date may simply be wishful thinking given the early stage of the project.

The prototype is said to be ski goggle-esque, providing a standalone experience with onboard power, computing, and outward-facing cameras for world sensing capabilities—similar in description and function to headsets like HoloLens or Magic Leap.

The standalone AR headset is said to use a custom Google processor running on either a version of Android or Google’s own Augmented Reality OS, which according to a recent job listing is currently in development.

Around 300 people are purportedly working on Project Iris, however Google plans to expand by “hundreds more.” Veteran AR/VR Google exec Clay Bavor is heading up the project, reporting directly to CEO Sundar Pichai.

Bavor is known for his work on Project Starline, an experimental light field display system created to be a more natural way of chatting at a distance than conventional video conferencing apps. Bavor also oversaw the 2016 launch of Google’s Daydream VR platform (subsequently abandoned in 2019) and the development of ARCore, the software development kit for smartphone-based AR.

This comes as Apple is supposedly preparing to release a VR headset with passthrough AR capabilities (sometimes called ‘mixed reality’), which reports maintain will come at some point in 2023 as a precursor to a dedicated Apple AR headset sometime afterwards.

Meta (formerly Facebook) is also working on its own VR headset with AR passthrough, codenamed Project Cambria, which may be positioned as direct competition to Apple’s own when the time comes.

Filed Under: Apple, AR Headset, AR News, bernard kress, google, google ar, google ar glasses, google ar headset, google ar vr, google glasses, google labs, google project iris, kress, Meta, News, Project Iris
