
VRSUN

Hot Virtual Reality News


XR Industry News

Japan’s TDK Acquires Smart Glasses Maker SoftEye to Advance Wearable AI Tech

June 20, 2025 From roadtovr

TDK, the Japan-based electronic component and software company, announced it’s acquired SoftEye, the US-based smart glasses hardware and software maker.

TDK calls the acquisition of SoftEye a “key milestone in the development of TDK’s contribution to the entire AI ecosystem and reinforces the business portfolio to establish a leadership position in this critical market.”

While financial terms have not been made public, a Reuters report claims the deal is worth “less than $100 million,” according to a source familiar with the matter.

SoftEye, a San Diego-based company, develops custom chips, cameras, and algorithms for use in AI-linked smart glasses, something TDK calls “a critical element in delivering a complete AR/VR display system and will also create a new Human Machine Interface (HMI) for interacting with AI through eye movement.”

“We are building technologies for AI glasses connecting the user with generative AI, which fits directly in line with TDK strategy for smart glasses which can connect people with AI for a more intuitive and compelling user experience,” says SoftEye CEO Te-Won Lee. “SoftEye’s novel, low power eye intent system unlocks a new type of Human Machine Interface that allows the user to communicate with AI simply through their eye movements. Together, we believe we can deliver even more advanced integrated solutions – spanning systems, software and machine learning and custom chips.”

Widely known for its major position in the cassette tape and CD-R disc industry in the ’90s and early 2000s, TDK has since focused on electronic components, including sensors, transformers, capacitors, and application-specific ICs (ASICs).

In recent years, the company has also heavily invested in AI infrastructure—from neuromorphic “spin memristors” to reduce power consumption for AI applications, to ultra‑fast spin photo detectors to increase data speeds for AR/VR applications and data centers.

TDK’s SoftEye acquisition follows increased market interest in smart glasses, with industry veteran Vuzix recently securing a $5 million investment from Quanta Computer, the Taiwan-based ODM and major Apple assembler.

Meanwhile, familiar names in the consumer-focused XR segment are preparing what could shape up to be strong competitors to the field’s leader, Ray-Ban Meta Smart Glasses.

Ray-Ban Meta Glasses, Image courtesy Meta, EssilorLuxottica

Google announced last month it’s working with eyewear firms Warby Parker and Gentle Monster to release a line of fashionable smart glasses running the company’s forthcoming Android XR operating system—expected to release sometime after 2025.

Rumors additionally suggest that both Samsung and Apple are aiming to release their own smart glasses at some point, with reports claiming Samsung could release a device this year, and Apple as soon as 2026.

Meanwhile, Meta recently confirmed it’s expanding its partnership with Ray-Ban Meta-maker EssilorLuxottica to create Oakley-branded smart glasses, expected to launch today, June 20th.

Filed Under: AR Investment, Investment, News, VR Investment, XR Industry News

Vuzix Secures $5M Investment as Veteran Smart Glasses Maker Sets Sights on Consumers

June 17, 2025 From roadtovr

Vuzix, the veteran smart glasses maker, announced it’s secured a $5 million investment from Quanta Computer, the Taiwan-based ODM and major Apple assembler.

The latest investment was the second tranche following an initial $10 million investment made by Quanta in September 2024, which included the purchase of Vuzix common stock at $1.30 per share. At the time, Vuzix anticipated a total of $20 million from Quanta.

Paul Travers, President and CEO of Vuzix, notes the funding will be used to enhance Vuzix’s waveguide manufacturing capabilities, something he says will help Vuzix deliver “the world’s most affordable, lightweight, and performance-driven AI smart glasses for mass-market adoption.”

Additionally, Travers says the investment “marks another important milestone in strengthening our partnership with Quanta and expanding the capabilities of our cutting-edge waveguide production facility.”

Vuzix Z100 Smart Glasses | Image courtesy Vuzix

Founded in 1997, Vuzix has largely served enterprise customers with its evolving slate of smart glasses, which have typically targeted industrial roles in fields such as healthcare, manufacturing, and warehousing.

The company also produces its own waveguides for both in-house use and licensing. In the past, Vuzix has worked to integrate its waveguide tech with Garmin, Avegant, an unnamed US Fortune 50 tech company, and an unnamed U.S. defense supplier.

The company made a few early consumer devices in the 2010s, including the V920 video eyewear and the STAR 1200 AR headset. In November 2024, Vuzix introduced the Z100 smart glasses, its first pair of sleek, AI‑assisted smart glasses, priced at $500.

The Z100 includes a 640 × 480 monochrome green microLED waveguide, and was designed to pair with smartphones to display notifications, fitness metrics, and maps, targeting everyday consumers and enterprise customers alike.

Notably, the investment also coincides with greater market interest in smart glasses on the whole. Google announced last month it’s partnering with eyewear companies Warby Parker and Gentle Monster to release a line of fashionable smart glasses running Android XR.

Meta also recently confirmed it’s expanding its partnership with Ray-Ban Meta-maker EssilorLuxottica to create Oakley-branded smart glasses, expected to launch on June 20th, 2025.

Meanwhile, rumors suggest that both Samsung and Apple are aiming to release their own smart glasses in the near future, with reports maintaining that Samsung could release a device this year, and Apple as soon as next year.

Filed Under: AR Development, ar industry, AR Investment, News, XR Industry News

What We Know So Far About Anduril’s ‘Eagle Eye’ Military XR Headset and Founder’s Reunion With Meta

June 16, 2025 From roadtovr

Palmer Luckey’s military tech company Anduril recently announced a partnership with Meta to build “the world’s best AR and VR systems for the US military.” In two recent public conversations, Luckey offered up some details on the XR helmet his company is building for the military and how this unlikely partnership arose years after his VR company Oculus was acquired by Meta, followed by his unceremonious firing.

Following the announcement, Luckey spoke to host Ashlee Vance on an episode of the Core Memory podcast, and on stage with author and creative technologist Stephanie Riggs during a conversation at the AWE USA 2025 conference. From these conversations, we’ve detailed the most interesting information about Anduril’s upcoming military XR headset.

Eagle Eye

Luckey said that Anduril’s upcoming military XR device is codenamed ‘Eagle Eye’. The goal is to build a complete helmet replacement (with built-in XR capabilities) for soldiers, rather than merely an add-on device that would be worn or attached to standard-issue helmets.

“Eagle Eye is not just a head mounted display. It’s a fully-integrated ballistic shell, with hearing protection, vision protection, head protection, on-board compute, on-board networking, radios… and also vision augmentation systems… sensor systems that enhance your perception,” Luckey said on Core Memory. “And what we’re doing is working with Meta to take the building blocks that they’ve invested enormous amounts of money and expertise in, and we’re able to use those building blocks in Eagle Eye without having to recreate them from scratch ourselves.”

More specifically, he explained at AWE that, “Eagle Eye is not one head mounted display. It’s actually a platform for building vision augmentation systems. We’re building different versions because you have different people who have different roles. The guy who is a front-line infantryman being shot at has a different job than the guy who’s a logistician, or aircraft maintainer, or somebody who works in a warehouse. The field-of-view they need, the level of ballistic rating they need—it’s very very different. So Eagle Eye is actually a platform for hosting multiple vision augmentation systems.”

While not many technical specifics have been shared thus far, Luckey mentioned the headset uses multiple microdisplays per eye. That suggests the headset could be a passthrough AR headset rather than one with transparent optics. That might seem surprising (considering the need for battlefield awareness), but he repeatedly emphasized the goal of the helmet offering soldiers greater perception through augmentation, not less.

Luckey admitted that the multi-microdisplay layout results in a visible seam in the peripheral image (which reminds me of an old ultrawide field-of-view headset prototype from Panasonic).

He said the seam wouldn’t be acceptable for the consumer market, but because the headset is being built as a tool to keep people alive, the tradeoff is worth it.

“One of the things we’re doing with eagle eye is using multiple microdisplays per-eye, with a tiled seam. And so you end up with this small little kind of distorted seam that’s living out in your peripheral view. And you can see it really easily. It’s there. It doesn’t bother you. It doesn’t make you sick. But it’s definitely there,” he told host Ashlee Vance. “Apple [for example] can’t make something like that [because it wouldn’t be acceptable to the consumer market]. They can’t make a thing where there’s a seamless magical experience, except for this weird distorted bubble seam down both sides of your vision in your periphery. But for a tool [like Eagle Eye] you can do that… it’s not actually a problem.”

As for cost, at AWE Luckey suggested that the headset could cost in excess of $10,000.

“[The US military] would rather have something that is significantly more performant even if it’s somewhat more expensive. Now I’m not saying we should charge the government some obscene price, but if they can choose between a $1,000 sensor that lets them see things that are twice as far, or a $100 sensor that has half the range, every time they’re going to make the choice for the $1,000 sensor, because the cost of losing that soldier or failing the mission is so much higher than the cost of that headset,” he said. “So what’s fun for me—from a tech perspective—is we’re able to build a headset that costs tens of thousands of dollars to make. We can load it with image sensors that are nicer than even Apple would put in something like the Vision Pro. We can afford to put extremely high-end displays in it that are far beyond what the consumer market would reasonably bear today.”

Without a consumer cost restriction, Luckey said Eagle Eye will have some specs that are significantly beyond anything that’s available on the consumer market today.

“Eagle Eye is gonna be the best AR and VR device that’s ever been made; it’s not even close. We’re running at an extraordinarily high framerate and extraordinarily high resolution. I’d tell you the specs but unfortunately the customer doesn’t want me to at this point,” Luckey told Stephanie Riggs at AWE. “But I will tell you it’s several times higher resolution in capability than even Apple Vision Pro. There’s nothing in the consumer market that’s going to be able to meet it where it is, because I have a different set of requirements. I’m not making an entertainment device you buy at Best Buy, I’m building a tool that keeps you alive. And that’s something the Army is willing to pay for.”

He also emphasized not just the helmet’s XR tech but also its integration of artificial intelligence, likening the end goal to something “in the vein of Cortana,” the artificially intelligent sidekick of Master Chief (the hero of the Halo franchise).

“[…talking about Iron Man’s sci-fi armor suit] it wasn’t just the suit right? It was also the augmented vision paired with [some] kind of AI guardian angel in the form of Jarvis; that is what we were building. Eagle Eye has an onboard AI guardian angel, maybe less in the style of Jarvis and more in the vein of Cortana from Halo, but this idea of having this ever-present companion who can operate systems, who can communicate with others, that you can offload tasks onto, that is looking out for you with more eyes than you could ever look out for yourself, right there in your helmet—that is such a powerful thing to make real.”

One of the key capabilities of the headset involves threat detection, Luckey said at AWE.

“Eagle Eye has a 360° threat awareness system… that is able to detect drone threats, vehicular threats, threats on foot, and automatically categorize ‘what is a threat and what is not’ and then present that to you.”

Further, he spoke of the AI as a way to make all of the helmet’s capabilities easy to use without overwhelming the wearer.

“You shouldn’t be toggling between 10 different sensor menus. You should just see seamless view that’s built by kind of an AI interpolator that looks out into the world and says ‘ok well I know he probably wants to see all of the hot human signatures, I know he probably wants to see all the drones…’ you can build technology that is transparent to the user,” said Luckey. “[…] maybe I’m not the guy to argue that the tech is easy to use because I’m a hardcore technohead from birth and I can operate wacky stuff. But you can put it on a normal person… they can look out into the world and do things and see things with zero training that they never would have been able to do otherwise. I’m not concerned about information overload because I’m [confident in our ability to build the right tool for the job].”

Regarding manufacturing, Luckey said the Eagle Eye XR helmet will be built in the US or with US allies, with “no Chinese parts,” as a matter of operational security. He expects the first prototypes of Eagle Eye this year, and says the company already has working prototypes.

“We’re gonna be delivering the first prototypes to the army this year. That’s the intent anyway, if all goes according to plan in the way that I hope,” he told Vance. “But we’ve been working on the technology that underpins Eagle Eye for years. And we’ve been making a really serious hardware effort for over a year at this point. And so actually there’s an Eagle Eye sitting on my desk back at my office right now.”

Reunion with Meta and Zuckerberg

But how did Luckey go from having his VR startup (Oculus) acquired by Meta, getting fired from Meta amid political backlash, founding a military technology company (Anduril), and raising it to a multi-billion-dollar valuation, to partnering once again with the company that had booted him out?

Well, by Luckey’s telling, it started last year when Meta CEO Mark Zuckerberg offered a quote to an article about Luckey that was surprisingly conciliatory. That openness from Zuckerberg (and outright apology from Meta CTO Andrew “Boz” Bosworth) opened the door to a renewed relationship.

“We ended up reconnecting [after the article], talking about some of the problems that are going on with America, some of the inefficiencies that exist for terrible reasons… how there are people who are dying needlessly because of barriers between our technology industry and our national security community,” Luckey said on the Core Memory podcast. “We ended up deciding that this was something that we needed to work on together. Meta’s been doing a lot more on the national security front; they’ve been doing a lot more work with the government.”

Luckey says he’s moved on from any anger he harbored over his firing by Meta, saying that it’s a different company than it was nine years ago. Not only has the culture changed, but many of the people who advocated for his ousting no longer work at Meta.

Luckey sees the partnership as a win for Anduril (as it doesn’t need to rebuild key XR technology), while saving the American taxpayer from paying for tech that already exists in the private sector.

“[…] there’s a lot of things in Meta that I invented, my team invented, before they acquired [Oculus]. There’s other things that I invented, that the team invented, while I was at Facebook (now Meta). And there was a bunch of technology that was invented after I was fired,” he explained to Vance. “And this partnership is about taking that entire base of technology and IP—around hardware, software, in AI, VR, AR space—and applying it to solving our military’s most pressing challenges. It’s taking a lot of the people who have been working on these technologies for consumer applications and adapting their work to solve national security problems at a very low cost to the taxpayer.”

Luckey says the partnership will allow Anduril to build “the world’s best” XR tech for the US government and allies.

On the other hand, he said that the details of the partnership with the likes of Meta and Qualcomm mean that future innovations will hopefully trickle back to the consumer side.

“The way I see this is: the tech that we’re building—working with partners like Qualcomm and Meta—they’re going to be able to bring back into their consumer devices. And that’s the way our licensing agreement works,” he told Riggs. “The tech that we co-develop together… I’m the guy who is going to be deploying it to the military; they’re going to be the people taking it back into the consumer realm.”

It’ll be some time yet until we know more about what Eagle Eye actually looks like and how it works, but there may well be some overlap with Microsoft’s prototype IVAS system, as that’s the helmet that Eagle Eye is being built to replace.

Filed Under: News, XR Industry News

Meta Teases Oakley Partnership for Sportier Smart Glasses, Reportedly Releasing This Year

June 16, 2025 From roadtovr

Meta officially confirmed the expansion of its EssilorLuxottica partnership to include a pair of Oakley smart glasses—possibly arriving soon.

Earlier this year, Bloomberg’s Mark Gurman reported that Meta was looking to expand its line of smart glasses beyond Ray-Ban Meta, which would include two possible new devices: a sportier Oakley-branded model, and a high-end model with built-in display—the latter has yet to be announced.

Now, Meta CTO Andrew ‘Boz’ Bosworth has confirmed in an X post that ‘Oakley Meta’ smart glasses are coming, showing a graphic of the two brands merging and linking to a new @oakleymeta profile.

🤘🏼 @oakleymeta pic.twitter.com/lRL6oimgMR

— Boz (@boztank) June 16, 2025

Details remain scarce; however, Gurman’s January report maintained the Oakley smart glasses would be designed for athletes and could launch sometime this year.

Meta’s EssilorLuxottica partnership has been growing steadily since the release of the first-gen Facebook Ray-Ban Stories in 2021, prompting the company to offer a second-gen version in 2023, Ray-Ban Meta, which introduced updated styles, improved audio and cameras, and on-board AI features.

In late 2024, Meta announced it was expanding its smart glasses partnership with EssilorLuxottica into 2030. At the time, Meta CEO Mark Zuckerberg described its long-term roadmap as giving the companies “the opportunity to turn glasses into the next major technology platform, and make it fashionable in the process.”

In addition to Ray-Ban and Oakley, the French-Italian luxury eyewear company owns other major brands, including Persol, Oliver Peoples, and Vogue Eyewear, along with eyewear retailers LensCrafters, Pearle Vision, and Sunglass Hut.

Filed Under: News, XR Industry News

Google’s First ‘Beam’ Videoconferencing Device is ‘HP Dimension’, Coming Late 2025 at $25,000

June 13, 2025 From roadtovr

HP announced last year that it would be the first to offer hardware based on Google Beam (formerly ‘Project Starline’), the light field-based 3D videoconferencing platform. Now HP has unveiled ‘Dimension’, which is being pitched to enterprises at $25,000 a pop.

HP Dimension with Google Beam is said to use six cameras and “state of the art AI” to create a realistic 3D video of each participant, displayed on a special 65-inch light field display with realistic size, depth, color, and eye contact.

HP says the device, which will be sold to select partners starting in late 2025, will be priced at $25,000. This notably doesn’t come with the Google Beam license, which is sold separately.

Image courtesy Google, HP

As an enterprise-first device, HP Dimension is slated to support Zoom Rooms and Google Meet, covering both immersive 3D chats and traditional 2D group meetings, and it integrates cloud-based video services such as Teams and Webex.

“We believe that meaningful collaboration thrives on authentic human connections, which is why we partnered with Google to bring HP Dimension with Google Beam out of the lab and into the enterprise,” said Helen Sheirbon, SVP and President of Hybrid Systems, HP Inc. “HP Dimension with Google Beam bridges the gap between the virtual and physical worlds to create lifelike virtual communication experiences that brings us closer together.”

First introduced in 2021, Google Beam (formerly ‘Project Starline’) uses a light field display to show natural 3D depth without the need for an XR headset or glasses of any sort—essentially simulating a face-to-face chat between two people.

In its testing, HP says Beam makes for 39% more non-verbal behaviors noticed, 37% more users noting better turn-taking, and 28% noticing an increase in memory recall over traditional videoconferencing platforms.

Filed Under: News, XR Industry News

Report: Samsung’s Project Moohan XR Headset May Get a Launch Date at Unpacked Next Month

June 12, 2025 From roadtovr

Samsung Unpacked is expected to kick off next month with the usual slate of hardware announcements, which this year could include the company’s latest foldable smartphones, Galaxy Z Flip 7 and Fold 7, and its latest Galaxy Watch 8. Rumors suggest though the company is also looking to put its upcoming XR headset, Project Moohan, in the spotlight too.

Project Moohan, which will be the first device to run Google’s upcoming XR operating system, was announced alongside Android XR back in December 2024. Samsung has said in the past that consumers should expect Project Moohan to launch sometime this year, although the headset still doesn’t have a specific date or official name.

Now, Samsung serial leaker ‘Panda Flash‘ reports the company’s upcoming mixed reality headset could finally get a release date at the event.

While we were initially expecting to hear something about Project Moohan at Google I/O last month (we didn’t), Samsung might be keeping the device a little closer to home than initially thought.

Samsung Project Moohan | Image courtesy The Verge

Panda Flash, who has been following Galaxy Z Flip 7 and Fold 7 leaks and supply chain rumors, additionally reports the headset will launch first in South Korea, and then gradually launch globally sometime afterwards—essentially mirroring Apple’s US-first launch of Vision Pro before heading into other markets.

Samsung has shown its supposed Vision Pro competitor at a number of events over the past year, which includes our opportunity to go hands-on with Project Moohan in December, although the company has largely stayed mum on revealing the XR headset’s full spec sheet.

So far, we know the Android XR headset packs in a Qualcomm Snapdragon XR2+ Gen 2, Sony-sourced micro‑OLED panels (resolution still TBA), pancake lenses, automatic interpupillary distance (IPD) adjustment, support for eye and hand-tracking, an optional magnetically-attached light shield, and a removable external battery pack. It also supports VR motion controllers of some sort, although we haven’t seen those either.

We’re also hoping to learn more about the company’s smart glasses efforts; Samsung is reportedly working on a pair of smart glasses that could launch sometime this year—ostensibly looking to serve up competition to Ray-Ban Meta Smart Glasses.

Whatever the case, we’ll be looking out for official dates for Samsung Unpacked, which is expected to take place sometime early next month in New York City.

Filed Under: News, XR Industry News

Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year

June 10, 2025 From roadtovr

Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today at AWE 2025, Snap announced it’s launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will be “a much smaller form factor, at a fraction of the weight, with a ton more capability.”

There’s no pricing or availability yet beyond the 2026 launch window. To boot, we haven’t even seen the device in question, although we’re betting they aren’t as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that Snap’s four million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.
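For a rough sense of what those optics figures imply, pixels per degree multiplied by field of view approximates the pixel count across the display. Here is a back-of-the-envelope sketch, assuming purely for illustration that the 37 PPD figure holds uniformly across the full 46° field of view (real lens designs only approximate this):

```python
# Back-of-the-envelope estimate, not an official Snap spec:
# if angular resolution were uniform across the field of view,
# pixel count ≈ pixels-per-degree × field-of-view (degrees).
fov_deg = 46   # Spectacles '24 field of view, per Snap
ppd = 37       # pixels per degree, per Snap
pixels_across_fov = ppd * fov_deg
print(pixels_across_fov)  # 1702, i.e. roughly 1,700 pixels across the view
```

In practice, distortion and per-eye display geometry mean the true panel resolution differs, but the figure gives a feel for how far these glasses sit below, say, a 4K-per-eye headset.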

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change on the next version, although we’re certainly hoping for more than the original’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

And as the company is gearing up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including Depth Module API, which anchors AR content in 3D space, and Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.

For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.

Filed Under: AR Development, News, XR Industry News

Vision Pro is Getting a Major Visual Upgrade to Its ‘Persona’ Avatars

June 9, 2025 From roadtovr

Apple ‘Personas’ on Vision Pro are already the most lifelike real-time avatars you can find on any headset today, but in the next version of visionOS, they’re taking another step forward.

Apple today announced that its Persona avatars for Vision Pro will get a major visual upgrade with the launch of visionOS 26, due out later this year.

Personas on Vision Pro are generated on-device after users take a short scan of their face using the headset. Once generated, the avatar is used for social experiences like FaceTime.

Although Personas impressively capture subtle motion from the user, they have always felt somewhat blurry or ghostly.

VisionOS 26 promises a big visual update that will greatly reduce that ghostly look, and present a more complete view of the user’s head, including a “full side profile view.” Apple is also promising more realistic hair and lashes, and more than 1,000 variations of glasses, so glasses-wearers can find something that looks just right.


Although visionOS 26 will be available as a developer beta starting today, it isn’t yet clear if the Personas upgrade will be available in the first version, or roll out in later versions of the beta.

Beyond the visual upgrade to Personas, visionOS 26 will also make improvements to how social experiences work on the headset. New developer tools will allow for the creation of co-located virtual experiences; meaning two headset users in the same physical space will be able to see a shared virtual experience that’s visually anchored in the same space for both. That same system will allow for remote participants to join as Persona avatars, making for a mixture of in-person headset users and remote participants in the same virtual experience.

Filed Under: Apple Vision Pro News & Reviews, XR Industry News

AWE 2025 Preview: 4 Companies Building XR’s Future

June 6, 2025 From roadtovr

AWE USA 2025, one of the XR industry’s largest annual conferences, kicks off next week. We got a preview of what four interesting companies attending the event will be showing.

As far as industry events go, AWE USA has become our must-attend XR event of the year. It kicks off next week on June 10–12 in Long Beach, CA. As the Premier Media Partner of this year’s event, our exclusive 20% discount on tickets is still available.

We’ll be on site at the event, reporting on the most important developments. Ahead of AWE though we asked four interesting companies for a preview of what they’re bringing to the show.

CREAL

At AWE 2025, CREAL will showcase its Clarity light field display. Since releasing it at the beginning of the year, CREAL has continuously improved the image quality by innovating on the spatial light modulator. Visitors will be able to experience the new display technology through both a headset and a tabletop pair of glasses.

Both prototypes feature CREAL’s Clarity display, which includes the light field optical engine and holographic lenses. Beyond the display, the headset prototype integrates off-the-shelf components to enable full-scale demonstrations of our technology, while the glasses prototype is designed with custom components to showcase our ultimate form factor. | Image courtesy CREAL

XREAL

At AWE, XREAL will be demoing the ultra-popular XREAL One Series AR glasses with spatial computing capabilities. Also available for demo will be the XREAL EYE, a modular camera attachment for the One Series. XREAL will also be unveiling an exciting new accessory and showing it off in person for the very first time.

Image courtesy XREAL

ForgeFX

At AWE 2025, ForgeFX Simulations will unveil VORTEX, a next-generation XR training platform engineered for high-risk, high-consequence environments where traditional training methods fall short. Built on the proprietary ForgeSIM framework, VORTEX delivers immersive, AI-enhanced, scenario-based mission rehearsal through photorealistic LiDAR environments, GIS-enabled sand tables, voice-activated SMEs, and real-time performance analytics—already piloted by JPEO-CBRND for CBRN response.

ForgeFX is also debuting an enhanced Horizontal Directional Drill (HDD) Simulator for Meta Quest 3 PCVR, co-developed with Vermeer Corporation, featuring authentic drill controls and a new Auto Rod Exchange module that trains on a previously unsimulated, safety-critical task.

At Booth #346, attendees can experience six interactive demos, including the JLG Access Ready XR trainer, Somero S-22EZ Laser Screed simulator, CBRND HoloTrainer, Trumpf Laser Cutting simulator, ForgeFX Excavator trainer, and Ocuweld welding VR simulator, each showcasing ForgeFX’s leadership in immersive, equipment-integrated training solutions.

Image courtesy ForgeFX

PICO

At AWE USA 2025, PICO will showcase the PICO 4 Ultra Enterprise, its latest all-in-one mixed reality headset designed for enterprise applications. Equipped with advanced MR capabilities and the PICOMotion Tracker for full-body and object tracking, the headset empowers industries to deliver highly immersive, practical solutions. PICO has successfully expanded into education, training and location-based entertainment (LBE), and visitors to the booth will have the opportunity to experience a selection of these real-world use cases firsthand. A private meeting space will also be available for deeper conversations about how PICO’s solutions can accelerate business strategies. PICO will also host two featured speaking sessions: ‘Unlocking the Potential of LBE: Scaling with PICO’s XR Solutions’ and ‘Superpowers for Spatial Developers: WebSpatial and SpatialML.’

Image courtesy PICO

What are you hoping to see from AWE 2025? Let us know in the comments below.

Filed Under: News, XR Industry News

A Look Inside Meta’s ‘Aria’ Research Glasses Shows What Tech Could Come to Future AR Glasses

June 5, 2025 From roadtovr

Earlier this year, Meta unveiled Aria Gen 2, the next iteration of its research glasses. At the time, Meta was pretty sparse with details; now, however, the company is gearing up to release the device to third-party researchers sometime next year, and in the process is showing what might come to AR glasses in the future.

Meta revealed more about Aria Gen 2 in a recent blog post, filling in some details about the research glasses’ form factor, audio, cameras, sensors, and on-device compute.

Although Aria Gen 2 can’t do the full range of augmented reality tasks since it lacks any sort of display, much of what goes into Meta’s latest high-tech specs is leading the way for AR glasses of the future.

Better Computer Vision Capabilities

One of the biggest features all-day-wearable AR glasses of the future will undoubtedly need is robust computer vision (CV), such as mapping an indoor space and recognizing objects.

In terms of computer vision, Meta says Aria Gen 2 doubles the number of CV cameras (now four) over Gen 1, features a 120 dB HDR global shutter, an expanded field of view, and 80° stereo overlap—dramatically enhancing 3D tracking and depth perception.

To boot, Meta showed off the glasses in action inside of a room as it performed simultaneous localization and mapping (SLAM):

New Sensors & Smarter Compute

Other features include sensor upgrades, such as a calibrated ambient light sensor, a contact microphone embedded in the nosepad for clearer audio in noisy environments, and a heart rate sensor (PPG) for physiological data.

Additionally, Meta says Aria Gen 2’s on-device compute has also seen a leap over Gen 1, with real-time machine perception running on Meta’s custom coprocessor, including:

  • Visual-Inertial Odometry (VIO) for 6DOF spatial tracking
  • Advanced eye tracking (gaze, vergence, blink, pupil size, etc.)
  • 3D hand tracking for precise motion data and annotation
  • SubGHz radio tech for sub-millisecond time alignment between devices, crucial for multi-device setups

And It’s Light

Aria Gen 2 may contain the latest advancements in computer vision, machine learning, and sensor technology, but the glasses are also remarkably light at just 74-76g. For reference, a typical pair of eyeglasses can weigh anywhere from 20-50g, depending on the materials used and lens thickness.

Aria Gen 2 | Image courtesy Meta

The device’s 2g weight variation is due to Meta offering eight size variants, which the company says will help users get the right fit for head and nose bridge size. And like regular glasses, they also fold for easy storage and transport.

Notably, the company hasn’t openly spoken about battery life, although the glasses do feature a USB-C port on the right arm, which could be used to tether to a battery pack.

Human Perception Meets Machine Vision

Essentially, Aria Gen 2 tracks and analyzes not only the user’s environment, but also the user’s physical perception of that environment, like the user preparing a coffee in the image below.

Image courtesy Meta

While the device tracks a user’s eye gaze and heart rate—both of which could indicate reaction to stimulus—it also captures the relative position and movement through the environment, which is informed by its CV cameras, magnetometer, two inertial measurement units (IMUs) and barometer.

That makes for a mountain of useful data for human-centric research projects, but also the sort of info AR glasses will need (and likely collect) in the future.

The Road to AR Glasses

According to Meta, Aria Gen 2 glasses will “pave the way for future innovations that will define the next computing platform,” which is undoubtedly set to be AR. That said, supplanting smartphones in any meaningful way is probably still years away.

Meta’s Orion AR Glasses Prototype | Image courtesy Meta

While some early consumer AR glasses, such as XREAL One Pro, are already out there, packing thin displays, powerful processors, and enough battery for all-day use into a glasses form factor isn’t a trivial feat, something Meta is trying to address both with Aria and with its Orion AR prototype, which tethers to a wireless compute unit.

Still, Meta CTO and Reality Labs chief Andrew Bosworth says an AR device based on Orion is coming this decade, and will likely shoot for a price point somewhere north of a smartphone.

We’re likely to learn more about Aria Gen 2 soon. Meta says it’s showcasing the device at CVPR 2025 in Nashville, which will include interactive demos. We’ll have our eyes out for more to come from CVPR, which is taking place June 11th – 15th, 2025 at the Music City Center in Nashville TN.

Filed Under: AR Development, ar industry, News, XR Industry News
