
VRSUN

Hot Virtual Reality News

HOTTEST VR NEWS OF THE DAY


NVIDIA

Manufacturing in the Metaverse: What Might it Look Like?

January 21, 2022 From vrfocus

Manufacturing is a highly complex process, as well as one of the most important parts of supply chain management. Several components affect the manufacturing production process, such as the availability of raw materials, labour costs, inventory costs and overall marketplace demand.

Since the start of the Industrial Revolution, the effective marriage of systems and machines has allowed us to speed up production, reduce product costs and find new ways of organising work. Within the last 50 years, digital transformation has continued this trend, enabling us to better understand physical operations through digital ones.

Even so, the physical has taken precedence over the digital for most of modern history. The rise of the metaverse will reverse this dichotomy, giving us access to a primarily digital space. In the case of the manufacturing industry, we will be able to translate this digital space onto the physical world, rather than merely enhancing it.

Let’s look at some of the key ways in which we can expect the manufacturing industry to change within the metaverse.


An entrance into the creator economy

The metaverse will provide users with easier access to digital materials — a major shift that may very well encourage more creators and consumers to pursue industrial design. This will inevitably create new industry demands and completely change how products are made. 

3D content creation tools will also become more widely available in the metaverse. This will add manufacturing to the creator economy, giving the general public more tools to render and simulate 3D prototypes at their own convenience.

As with gaming platforms, streaming services and various other forms of online content creation, we can expect the same kind of growth within manufacturing and supply chain management. According to analyst firm TrendForce, industrial metaverse revenue is set to reach $540 billion by 2025.

Easier collaboration on product development

The metaverse will also make collaboration on all aspects of product development much easier. Because it can serve as a communal space for every stakeholder in a project, multiple processes, such as product design, sharing with manufacturers and iterating based on feedback, will be able to run more rapidly and in parallel.

NVIDIA’s VR-based collaboration tool Omniverse has experienced a successful launch in the enterprise sphere. As a multi-GPU, real-time development platform for design teamwork and 3D simulation, it has become a staple for those working in the industrial sector or for those who specialise in the creation of digital twin applications. 

To date, Omniverse has been downloaded by over 50,000 creators, and NVIDIA recently launched a platform subscription to widen its reach. The platform has already seen tremendous growth, with integrations from popular design tools (such as Blender and Adobe) available for developers to use from any location. These integrations have positioned NVIDIA as a leader in collaborative product development in the metaverse.

Workplace changes due to the pandemic have also led to a rise in collaborative XR solutions within the enterprise sector. SkyReal, an aerospace-focused software company, began by helping companies approach their various stages of manufacturing collaboratively, from conception and industrialisation through to training and marketing. Now, SkyReal helps aerospace teams work on CAD files in real time, offering them an immersive experience that enables even better collaboration.

More streamlined processes through digital twins

Digital twins are virtual representations that serve as real-time replicas of physical objects. From gaming companies to automotive manufacturers, many industries have already started using digital twins to collect real-time data and predict how objects will perform before they are manufactured and sold.

The digital twin market has been projected to grow to an incredible $86 billion by 2025. This level of growth is largely being fueled by an increase in demand for things such as predictive maintenance, industrial IoT solutions and a smarter and more energy-efficient infrastructure.

Digital twins also provide real-time data for users, allowing them to gain better insights into overall production processes. For example, automotive manufacturers are already using digital twins to pinpoint equipment failures and ensure that all parts meet quality standards before being delivered to customers.
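As a loose sketch (the class, sensor names and tolerances below are hypothetical, not any vendor's API), the core of such a twin is an object that mirrors a live sensor feed and flags readings that drift outside tolerance:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Mirrors a physical asset by replaying its sensor feed."""
    asset_id: str
    tolerance: dict                       # sensor name -> (min, max) acceptable range
    latest: dict = field(default_factory=dict)

    def ingest(self, sensor: str, value: float) -> bool:
        """Record a reading; return True if it is within tolerance."""
        self.latest[sensor] = value
        lo, hi = self.tolerance.get(sensor, (float("-inf"), float("inf")))
        return lo <= value <= hi

# A hypothetical press-line spindle whose temperature must stay below 90 C
twin = DigitalTwin("press-07", {"temp_c": (0.0, 90.0)})
ok = twin.ingest("temp_c", 72.5)     # within range
alert = twin.ingest("temp_c", 96.1)  # out of range: predictive-maintenance trigger
```

A real deployment would feed `ingest` from industrial IoT telemetry rather than hard-coded values, but the pattern is the same: the twin always holds the latest known state of its physical counterpart.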

Photo by © Alexander Tolstykh – Shutterstock.com

BMW has already started using a simulated system to better streamline its production process. A version of the company’s Regensburg-based production line exists solely within a computer simulation, serving as a digital twin to its physical counterpart. Before any parts enter the production line, the entire manufacturing process runs in a hyper-realistic virtual iteration of the factory. By adopting this technology, managers can now plan their production process in greater detail.

Other large companies that have adopted the use of digital twins include Microsoft, Unilever, Boeing, Siemens Energy and Ericsson. With Azure Digital Twins, Microsoft has created a leading IoT platform that features a live execution environment, allowing users to create digital representations of real-life things, people, places and processes.
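Azure Digital Twins describes such representations with models written in the Digital Twins Definition Language (DTDL). As a rough, illustrative sketch (the model ID and fields are hypothetical), a DTDL v2 interface is just structured JSON, which can be assembled in Python before being uploaded to the service:

```python
import json

# A minimal DTDL v2 interface for a hypothetical factory conveyor.
# "@context" and "@id" follow DTDL's DTMI naming scheme.
conveyor_model = {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:example:factory:Conveyor;1",
    "@type": "Interface",
    "displayName": "Conveyor Belt",
    "contents": [
        {"@type": "Telemetry", "name": "beltSpeed", "schema": "double"},
        {"@type": "Property", "name": "serialNumber", "schema": "string"},
    ],
}

payload = json.dumps(conveyor_model, indent=2)  # body for a model-upload request
```

Once a model like this is registered, each live twin instance reports telemetry (here, `beltSpeed`) against it, which is what gives the platform its live execution environment.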

In all, digital twins will be an integral building block of the metaverse. They will provide us with lifelike representations of objects from our physical world, equipped with live feeds from every sensor and component they contain.

Shorter lead times

The collaborative approach offered by working in the metaverse will certainly shorten project life cycles. More robust virtual spaces will also allow manufacturers to quickly see how moving assets around can impact a production cycle. By simulating real physics and identifying potential errors, this approach gives manufacturers greater efficiency and faster turnaround times.

Microsoft HoloLens 2

Down the road, greater interoperability initiatives will also make product designs easier and faster to implement. Designers and creators will no longer have to jump through as many hoops to complete their designs and get them into the hands of manufacturers. This will result in shorter lead times, as well as a sharp increase in the number of product designs they can complete.

Supply chain transparency

In recent years, demand for supply chain transparency has been on the rise. According to the MIT Sloan School of Management, consumers are reportedly willing to pay between 2% and 10% more for products that offer greater supply chain transparency.

What we can deduce from this data is that consumers find value in the fair treatment of workers in a supply chain, as well as in a company’s efforts to provide decent working conditions. Ethical concerns, such as slave labour or deforestation, have made consumers increasingly averse to purchasing products that don’t meet these standards.

That said, supply chains were not originally designed to be transparent. However, access to the supply chain, or to digital twin management in the metaverse, could resolve this issue for good.

Working in the metaverse will also provide far better project visibility, for staff and consumers alike. Since multiple collaborators will be able to work within the same space regardless of their physical location, all parties will have access to 3D representations of how products are designed, built, sold and distributed. Customers may even grow used to tracking their orders through the entire cycle, from raw materials to finished product, gaining full transparency into the production process.

Greater supply chain transparency will also give customers greater visibility of lead times. This will offer them a better sense of real-time shipping costs and allow them to better prepare for potential pitfalls (such as shipping delays).

Final thoughts

The metaverse will pave the way towards a digital-first approach to manufacturing. This shift will be driven both by consumer preferences and by the kinds of actions needed to operate inside a virtual world.

There are valuable steps that manufacturers can take to bring us closer to an ideal metaverse system. For starters, it is critical that they work on harvesting data from their processes — and also that they implement the best interoperability protocols for connecting said data across the entire supply chain.

Recent innovations — such as NVIDIA’s CloudXR platform (which has been configured to work with Google Cloud) — have begun enabling organizations to securely access their data through cloud-based solutions. This will allow creators to access their work and collaborate on projects from anywhere in the world, all while doing so through the lens of an immersive, high-quality user experience.

All of these areas are being worked on today to disrupt and reshape the concept of supply chains for good. This is an exciting, innovative time for manufacturing technology, and we look forward to tracking the paradigm shift to come.

Filed Under: Collaboration, Enterprise, Metaverse, NVIDIA, SkyReal, Technology

NVIDIA’s New AI-Powered Avatars Are Pretty Incredible

November 10, 2021 From vrscout

NVIDIA’s CEO wants everyone to have a digital twin.

During this week’s NVIDIA GTC Conference keynote speech, company CEO Jensen Huang introduced 65 new and updated SDKs that will impact everything from self-driving vehicles and cybersecurity to cloud computing, robotics and interactive conversational AI avatars.

First up is NVIDIA Omniverse Avatar, a platform for creating real-time conversational AI avatars that can see, speak, converse on a wide range of subjects and understand your intent as you speak to them. NVIDIA sees its Omniverse Avatar platform assisting industries such as food, banking and retail, to name a few.



“The dawn of intelligent virtual assistants has arrived,” said Huang, adding, “Omniverse Avatar combines NVIDIA’s foundational graphics, simulation and AI technologies to make some of the most complex real-time applications ever created. The use cases of collaborative robots and virtual assistants are incredible and far reaching.”

The company also took time during the conference to unveil NVIDIA Omniverse Replicator, a powerful synthetic-data-generation engine that can produce physically simulated virtual worlds, perfect for training purposes. This technology could, for example, be used by major companies to safely train employees who work in potentially dangerous conditions.

In an official press release, Rev Lebaredian, vice president of simulation technology and Omniverse engineering at NVIDIA, said, “Omniverse Replicator allows us to create diverse, massive, accurate datasets to build high-quality, high-performing and safe datasets, which is essential for AI. While we have built two domain-specific data-generation engines ourselves, we can imagine many companies building their own with Omniverse Replicator.”

Image Credit: NVIDIA

Part of Huang’s keynote focused on how these new technologies will help transform multi-billion-dollar industries around the globe through Project Maxine, a GPU-accelerated SDK with state-of-the-art AI features that developers can use to build lifelike video and audio effects, along with powerful AR experiences for work, entertainment, education and social situations.

He also showed how Maxine can be integrated with computer vision and conversational AI technology, such as NVIDIA’s Riva speech AI, to create real-time, multi-language conversational avatars. To drive the point home, Huang showcased how different divisions of NVIDIA are using the technology to build out experiences.

NVIDIA’s Metropolis team, for example, used Maxine to create a talking kiosk named Tokkio that greets you in any language and helps with your food order. NVIDIA’s Drive engineers created Concierge, an AI assistant intended for self-driving vehicles. They also used Maxine to create a virtual toy version of Huang that uses speech synthesis and AI to engage in “real” conversation. In the video below, an actual human chats with Huang’s AI-powered avatar in real time; the pair talk about everything from climate change and space exploration to proteins. Again, this isn’t Huang talking; it’s a virtual avatar powered by artificial intelligence.



Beyond the new tools, NVIDIA’s CEO also talked about the importance of having a digital twin, and how virtual worlds will matter for brands in every industry as we grow closer to a legitimate metaverse.

To view Huang’s keynote, click here. NVIDIA’s GTC Conference began on Monday and runs until November 11th.

Feature Image Credit: NVIDIA

Filed Under: Artificial Intelligence, avatars, Metaverse, News, NVIDIA

VMware Integrates NVIDIA’s CloudXR Into its Collab Platform

October 5, 2021 From vrfocus

NVIDIA and VMware have a long-running partnership when it comes to working together on XR solutions. As part of VMworld 2021, which starts today, the companies have announced that VMware’s Workspace ONE XR Hub will utilise CloudXR so the business solution can run on all-in-one (AIO) headsets.

Currently in open beta, Workspace ONE XR Hub is VMware’s immersive expansion of its Workspace ONE Intelligent Hub, a solution designed to improve employee engagement and remote productivity. Use cases like immersive training and design visualisation can demand high-end PCs and tethered headsets, which aren’t always practical and are often expensive. AIO headsets are becoming far more commonplace, yet they lack the processing power for these high-end enterprise applications.

That’s why VMware is utilising NVIDIA CloudXR: all the complex environments, scenes and simulations companies want to run can be rendered remotely on RTX GPUs and RTX Virtual Workstation software, which a standalone headset can then tap into.

“From immersive training to immersive design solutions, our customers need the highest fidelity experience with the greatest mobility. Running VR applications on VMware vSphere with NVIDIA vGPU, combined with NVIDIA CloudXR to stream content to a mobile VR headset, is a great way to solve that challenge,” said Matt Coppinger, director of XR at VMware in a statement. “We’ve also been developing Workspace ONE XR Hub, which will provide simple and secure access to native and remote VR applications. CloudXR integrated in VMware Workspace ONE will provide the ultimate enterprise experience for users and IT.”


This is being showcased at NVIDIA’s booth during VMworld this week, combining Autodesk VRED, Workspace ONE XR Hub, and CloudXR. Attendees will be able to view real-time renderings of digital models on VR headsets, handheld phones and tablets.

NVIDIA unveiled its streaming solution in early 2020, with a beta programme allowing a select number of applicants to register and implement CloudXR. Its most recent advancements took place over the summer, adding support for bidirectional audio as well as Google Cloud integration. As further improvements are made to CloudXR, VRFocus will keep you updated.

Filed Under: News, NVIDIA, NVIDIA CloudXR, VMware, Workspace ONE XR Hub

NVIDIA CloudXR and Google Cloud to Collaborate on Immersive Streaming

August 11, 2021 From vrfocus

With SIGGRAPH taking place this week, computing specialist NVIDIA has released several new updates regarding its virtual reality (VR) compatible tech. The latest focuses on the NVIDIA CloudXR platform, revealing that it’s now collaborating with Google Cloud to provide high-quality XR streaming.


The announcement sees NVIDIA CloudXR brought to NVIDIA RTX Virtual Workstation instances on Google Cloud, enabling organisations and XR users to securely access their data and work with others inside virtual, augmented or mixed reality (VR/AR/MR) experiences.

To showcase the tech, NVIDIA has teamed up with Masterpiece Studio and its 3D creation platform Masterpiece Studio Pro, which works with a variety of PC-based VR headsets such as the HTC Vive and Valve Index. Leveraging CloudXR and Google Cloud, artists can stream and collaborate on character creation.

“Creators should have the freedom of working from anywhere, without needing to be physically tethered to a workstation to work on characters or 3D models in VR,” said Jonathan Gagne, CEO at Masterpiece Studio in a statement. “With NVIDIA CloudXR, our customers will be able to power their creative workflows in high-quality immersive environments, from any location, on any device.”

Masterpiece Studio Pro

“NVIDIA CloudXR technology delivered via Google Cloud’s global private fiber optic network provides an optimized, high-quality user experience for remotely streamed VR experiences. This unique combination unlocks the ability to easily stream work from anywhere using NVIDIA RTX Virtual Workstation,” said Rob Martin, Chief Architect for Gaming at Google. “With NVIDIA CloudXR on Google Cloud, the future of VR workflows can be more collaborative, intuitive, and productive.”

That all sounds great, but it isn’t available just yet. NVIDIA CloudXR is currently in early access for developers to sign up to, with the Google Cloud feature rolling out later this year. For those interested, a private beta will be available soon. VRFocus will continue its coverage of NVIDIA, reporting back with further updates.

Filed Under: Google Cloud, Masterpiece Studio, News, NVIDIA, NVIDIA CloudXR, VR Streaming

NVIDIA Opens Omniverse Platform to Millions More Users With Blender Integration

August 11, 2021 From vrfocus

NVIDIA’s virtual reality (VR) compatible, enterprise-ready collaboration tool NVIDIA Omniverse saw an early access launch last year after debuting back in 2017 at the company’s GPU Technology Conference (GTC). This week, as part of SIGGRAPH 2021, NVIDIA has announced that the platform is expanding with new integrations for Blender and Adobe.


NVIDIA has revealed that since the arrival of its open beta in December, Omniverse has been downloaded by over 50,000 creators. These new integrations could open the platform up to millions of designers and artists who use Blender and Adobe software and wish to collaborate.

Blender is an open-source 3D animation tool used around the world; it now gains Universal Scene Description (USD) support, enabling access to Omniverse’s production pipelines. On the Adobe side, the two companies are collaborating on a Substance 3D plugin that will eventually bring Substance material support to Omniverse, unlocking new material-editing abilities for users.

“NVIDIA Omniverse connects worlds by enabling the vision of the metaverse to become a reality,” said Richard Kerris, vice president of the Omniverse development platform at NVIDIA in a blog post. “With input from developers, partners and customers, we’re advancing this revolutionary platform so that everyone from individuals to large enterprises can work with others to build amazing virtual worlds that look, feel and behave just as the physical world does.”


“Lots of companies are, and have been, talking about the metaverse, but NVIDIA is one of the few that is actually delivering on its promise,” said Jon Peddie, noted author, consultant and founder of Jon Peddie Research. “NVIDIA brings a broader understanding of the needs of designers of all types and has many of the tools they can use — for free. The NVIDIA Omniverse platform has the potential to transform nearly every industry by making truly collaborative, creative innovation possible in a common virtual space for the first time.”

Designed for global and smaller enterprise customers alike, NVIDIA Omniverse is currently being evaluated by the likes of SHoP Architects, South Park and Lockheed Martin. Artists can use the platform to design buildings and create new products, viewing them on a flatscreen or in VR.

Whilst NVIDIA Omniverse is currently in early access, NVIDIA plans to officially launch it later this year on a subscription basis. For continued updates on Omniverse, keep reading VRFocus.

Filed Under: Collaboration, News, NVIDIA, NVIDIA Omniverse, SIGGRAPH 2021

Copyright © 2022 GenVR, Inc.