Nvidia has announced that its Omniverse, a virtual environment the company describes as a “metaverse” for engineers, will be available as an enterprise service later this year.
CEO Jensen Huang showed a demo of the Omniverse, where engineers can work on designs in a virtual environment, as part of the keynote talk at Nvidia’s GPU Technology Conference, a virtual event being held this week. I also moderated a panel on the plumbing for the metaverse with a number of enterprise participants.
Huang said that the Omniverse is built on Nvidia’s entire body of work, letting people simulate shared virtual 3D worlds that obey the laws of physics.
“The science-fiction metaverse is near,” he said in a keynote speech. “One of the most important parts of Omniverse is that it obeys the laws of physics.”
The Omniverse is a virtual tool that allows engineers to collaborate. It was inspired by the science fiction concept of the metaverse, the universe of virtual worlds that are all interconnected, as in novels such as Snow Crash and Ready Player One. The project started years ago as a proprietary Nvidia project called Holodeck, named after the virtual reality simulation in Star Trek. But it morphed into a more ambitious industry-wide effort based on the plumbing made possible by Universal Scene Description (USD), the technology Pixar created for making its movies. Nvidia has spent years and hundreds of millions of dollars on it, said Richard Kerris, Nvidia’s media and entertainment general manager, in a press briefing.
Omniverse debuted in beta form in December. More than 17,000 users have tested it since then, and now the company is making the Omniverse available as a subscription service for enterprises. It’s just the kind of thing that engineers need during the pandemic to work on complex projects remotely.
BMW Group, Ericsson, Foster + Partners, and WPP are using Omniverse. It has application support from Bentley Systems, Adobe, Autodesk, Epic Games, ESRI, Graphisoft, Trimble, McNeel & Associates, Blender, Marvelous Designer, Reallusion and Wrnch.
And hardware support comes from the likes of Asus, Boxx Technologies, Cisco, Dell Technologies, HP, Lenovo, and Supermicro. More than 400 enterprises are going to use the new version starting this summer. It comes with full enterprise support, Kerris said.
What the Omniverse can do
The Omniverse, which was previously available only in early access mode, enables photorealistic 3D simulation and collaboration. It’s a metaverse that obeys the laws of physics, and so it enables companies and individuals to simulate things from the real world that can’t be tested easily in the real world, like self-driving cars, which can be dangerous to pedestrians if they aren’t perfected.
Mattias Wikenmalm, technical specialist at Volvo, said on the panel that it’s necessary to simulate not just the car but the context around the car like a city environment.
“The foundation is still the data, and this is the first time we can be data native, where we don’t have to focus on moving data between different systems. In this case, data is a first-class citizen,” Wikenmalm said. “It’s so nice we can just focus on the data and borrow our data for different applications and transform that data. Exchanging data between systems has been complex. If we can get that out of the way, we can start building a proper metaverse.”
BMW is using Omniverse to simulate a full car factory before it builds it. And there’s no limit to the testing. If someone wanted to create an entire city, or build a simulation of the entire United States as a self-driving car testing ground, it would be possible.
It is intended for tens of millions of designers, engineers, architects, and other creators to use at the same time. The designers can work on the same parts of their designs at the same time without overwriting each other, with changes offered as options for others to accept. That makes it ideal for large teams to work together.
Susanna Holt, vice president of engineering for Autodesk, said on the panel that being able to understand someone else’s data is important, as it means you don’t have to be locked into a single tool or workflow.
“We need the bits to talk to one another, and that’s been so hard until now,” she said. “It is still hard, as you have to import and export data. With USD, it’s the beginning of a new future.”
The Omniverse uses Nvidia’s RTX 3D simulation tech to enable engineers to do things like work on a car’s design inside a simulation while virtually walking around it or sitting inside it and interacting with it in real time.
Martha Tsigkari, partner at architectural firm Foster + Partners, said on the panel that the architecture and construction industries really need the ability to transfer data easily from one site to the next.
“Being able to do that in an easy way without having to think about how we change that information is really important,” Tsigkari said. “In order to run really difficult simulations, or understand how buildings perform, we need to use all kinds of software to do this. Working in these processes right now can be painful, and we need to create all of these bespoke tools to do this. A future where this becomes a seamless process and opens to all kinds of industries is a fantastic opportunity that we need to grasp and go for.”
Engineers on remote teams will be able to work alongside architects, 3D animators, and other people working on 3D buildings simultaneously, as if they were jointly editing a Google Doc, Kerris said. He added, “The Omniverse was built for our own needs in development.”
USD’s roots at Pixar
Pixar’s Universal Scene Description (USD) is the HTML of 3D, and it’s the foundation for sharing different kinds of images from multiple parties in Omniverse, said Kerris.
“We felt that with the entire community starting to move towards this open platform for exchanging 3D information including the objects, scenes, materials and everything, it was the best place for us to start with the foundation for what this platform would become,” Kerris said.
Pixar’s USD standard came from over a decade of film production.
Guido Quaroni is director of engineering for 3D and immersive at Adobe, and before that he was at Pixar, where he was responsible for open sourcing USD. In a panel at GTC, he said the idea emerged at Pixar in 2010 as the company was dealing with multiple libraries that handled large scenes in its movies.
“Some of the ideas in USD go back 20 years to Toy Story 2, but the idea was to formalize it and write it in a way that we could eventually open source it,” Quaroni said.
He worked with Sebastian “Spiff” Grassia, head of the team that built USD at Pixar.
“We knew that every studio kind of had something like it,” Quaroni said. “And we wanted to see if we could offer something that became the standard because for us the biggest problem was the plugins and integrations with third parties. Why not give it to the world?”
The problem was that they needed to be able, at any point in the film pipeline, to extract an asset, massage it with a third-party tool, and stick it back into the production process without losing information, said Michael Kass, distinguished engineer at Nvidia and software architect of the Omniverse, in an interview.
Grassia said USD is an interchange format for data.
“It represents decades of Pixar’s experience in building software that supports collaborative filmmaking,” Grassia said. “It’s for collaborative authoring and viewing for a very large 3D scene. It handles combining, assembling, overriding, and animating the assets that you have created in a non-destructive way. That allows for multiple artists to work on the same scene concurrently.”
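To make that concrete, here is a minimal sketch of USD’s non-destructive layering using Pixar’s open source Python bindings (the pxr module, installable as usd-core). The API calls are standard USD; the file names, prim paths, and color values are invented for illustration.

```python
# A minimal sketch of USD's non-destructive layering, using Pixar's
# open source Python bindings (pip install usd-core). File names and
# prim paths are invented for illustration.
from pxr import Usd, UsdGeom

# One artist authors the base asset in its own layer.
base = Usd.Stage.CreateNew("car_base.usda")
body = UsdGeom.Cube.Define(base, "/Car/Body")
body.GetDisplayColorAttr().Set([(0.8, 0.0, 0.0)])  # red
base.GetRootLayer().Save()

# A second artist sublayers the base file into a new, stronger layer
# and overrides the color there, without touching the original file.
shot = Usd.Stage.CreateNew("car_shot.usda")
shot.GetRootLayer().subLayerPaths.append("car_base.usda")
override = UsdGeom.Cube(shot.GetPrimAtPath("/Car/Body"))
override.GetDisplayColorAttr().Set([(0.0, 0.0, 0.8)])  # blue
shot.GetRootLayer().Save()

# Composition resolves opinions by layer strength: the shot layer's
# blue wins, while car_base.usda still says red, so both artists'
# work survives side by side.
print(UsdGeom.Cube(shot.GetPrimAtPath("/Car/Body")).GetDisplayColorAttr().Get())
```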
Before USD, artists had to check out a piece of digital art, work on it, and check it back in. With USD, Nvidia has enabled sharing across all applications and different ways of viewing the art. The changes are transmitted back and forth. A large number of people can view and work on the same thing, Kass said. A feature dubbed Nucleus serves as a traffic cop that communicates what is changing in a 3D scene.
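Nvidia hasn’t published Nucleus’s wire protocol, but the traffic-cop idea can be illustrated with a toy publish/subscribe loop: clients send small change operations to a hub, which records them and fans them out to every other subscriber, so only deltas travel rather than whole scene files. This is a hypothetical sketch of the concept, not Nucleus’s actual API.

```python
# Toy illustration of the "traffic cop" idea behind live scene sync.
# Hypothetical sketch for illustration only; not Nucleus's protocol.
from dataclasses import dataclass, field

@dataclass
class ChangeOp:
    prim_path: str   # which object in the scene changed
    attribute: str   # which property of it
    value: object    # the new value

@dataclass
class Client:
    name: str
    local_view: dict = field(default_factory=dict)

    def apply(self, op: ChangeOp):
        self.local_view[(op.prim_path, op.attribute)] = op.value

@dataclass
class SceneHub:
    state: dict = field(default_factory=dict)
    subscribers: list = field(default_factory=list)

    def subscribe(self, client: Client):
        self.subscribers.append(client)

    def publish(self, sender: Client, op: ChangeOp):
        # Record the authoritative state, then fan the delta out to
        # everyone except the sender; only the change travels, not
        # the whole multi-gigabyte scene.
        self.state[(op.prim_path, op.attribute)] = op.value
        for client in self.subscribers:
            if client is not sender:
                client.apply(op)

hub = SceneHub()
maya_user = Client("maya")
revit_user = Client("revit")
hub.subscribe(maya_user)
hub.subscribe(revit_user)
hub.publish(maya_user, ChangeOp("/Factory/Robot1", "position", (1.0, 0.0, 2.5)))
print(revit_user.local_view)  # the second user sees the first user's edit
```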
Early on, Pixar tried to create tools itself, but it found there were tools like Maya, 3D Studio Max, Unreal Engine, or Blender that were more advanced at particular tasks. And rather than try to get those vendors to continuously update their tools for Pixar’s needs, Pixar made USD available as an open standard.
What Nvidia added
The platform also uses Nvidia technology, such as real-time photorealistic rendering, physics, materials, and interactive workflows between industry-leading 3D software products.
Pixar built Hydra, a renderer and data visualization engine. It was designed so that other data sources, such as a Maya scene, could hook into it, letting artists work with large datasets without the vendor having to translate everything into its own native representation.
Kass and his colleagues at Nvidia found that USD was a “golden nugget” that let them represent data in a way that could be used for all sorts of different purposes.
“We decided to put USD at the center of our virtual worlds, but at Pixar, most of the collaboration was not real time. So we added on top of USD the ability to synchronize with different users,” Kass said.
The real test has been making sure that USD can be useful beyond the media and entertainment applications. Omniverse enables collaboration and simulation that could become essential for Nvidia customers working in robotics, automotive, architecture, engineering, construction, and manufacturing.
“There really isn’t anything else like it,” Kerris said. “Pixar built the standard, and we saw the potential in it. This is a demand and a need that everybody has. Can you imagine the internet without a standard way of describing a web page? It used to be that way. With 3D, no two applications use the same language today. That needs to change, or else we really can’t build the metaverse.”
Nvidia extended USD, which was built for Pixar’s needs, and added what is necessary for the metaverse, Kass said.
“We got to stand on top of giants but we are pushing it forward in a direction they weren’t envisioning when they started,” he added.
Nvidia built a tool called Omniverse Create, which accelerates scene composition and allows users to interactively assemble, light, simulate, and render scenes in real time. It also built Omniverse View, which powers seamless collaborative design and visualization of architectural and engineering projects with photorealistic rendering. Nvidia RTX Virtual Workstation software gives collaborators the freedom to run their graphics-intensive 3D applications from anywhere.
Omniverse Enterprise is a new platform that includes the Nvidia Omniverse Nucleus server, which manages the database shared among clients, and Nvidia Omniverse Connectors, which are plug-ins to industry-leading design applications.
With all of the applications working live, artists don’t have to go through a laborious exporting or importing process.
“Omniverse is an important tool for industrial design — especially with human-robot interactions,” said Kevin Krewell, an analyst at Tirias Research, in an email. “Simulation is a big new market for GPU cloud services.”
Big problems
The Omniverse and USD aren’t going to lead to the metaverse overnight.
Tsigkari said that getting so many creative industries to work together has been a huge challenge, particularly for architecture firms that have to pull together so many different disciplines to get work done from conception to completion.
“You need a way to allow for the creative people to quickly pass things directly from engineers to consultants so they can do their analysis and pass it on to the manufacturers,” she said. “In the simplest way, this doesn’t exist.”
At the same time, different industries work on different timetables, from long cycles to real time.
“For us, this has been really crucial to be able to do this in a seamless way where you don’t have to think about the in-between space,” she said.
Holt at Autodesk said she would like to see USD progress forward in dealing with huge datasets, on the level of modeling cities for construction purposes.
“It’s not up to that yet,” she said. “Some changes would be needed as we take it into other areas like construction.”
Grassia said there are features that allow for “lazy loading,” with different levels of detail becoming visible as a huge dataset loads.
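USD’s mechanism for this is the payload: a reference to heavy data that a client can defer loading until it actually asks for it. Here is a minimal sketch with the open source Python bindings; the file names and scene paths are invented for illustration.

```python
# A minimal sketch of lazy loading via USD payloads, using Pixar's
# open source Python bindings (pip install usd-core). File names and
# paths are invented for illustration.
from pxr import Usd

# Stand-in for a heavy, multi-gigabyte district file.
heavy = Usd.Stage.CreateNew("downtown_heavy.usda")
heavy.SetDefaultPrim(heavy.DefinePrim("/Downtown"))
heavy.GetRootLayer().Save()

# The lightweight "city" file pulls the district in as a payload.
city = Usd.Stage.CreateNew("city.usda")
city.DefinePrim("/City/Downtown").GetPayloads().AddPayload("downtown_heavy.usda")
city.GetRootLayer().Save()

# A client can open the whole city without loading any payloads...
stage = Usd.Stage.Open("city.usda", Usd.Stage.LoadNone)
print(stage.GetPrimAtPath("/City/Downtown").IsLoaded())  # False

# ...and then load only the district it is actually working on.
stage.Load("/City/Downtown")
print(stage.GetPrimAtPath("/City/Downtown").IsLoaded())  # True
```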
Lori Hufford, vice president of applications integration at Bentley Systems, said on a panel that her team has had good results so far working on large models.
“I’m really excited about the open nature of USD,” said Hufford. “We’ve been very impressed with the scale we have been able to achieve with USD.”
The Omniverse today
The enterprise version will support Windows and Linux machines, and it is coming later this year.
What can you do in this engineer’s metaverse? You can simulate the creation of robots through a tool dubbed Isaac. That lets engineers create variations of robots and see how they would work with realistic physics. So they can simulate what a robot would do in the real world by first making the robot in a virtual world. There are also Omniverse Connectors, which are plugins that connect third-party tools to the platform. That allows the Omniverse to be customized for different vertical markets.
BMW is using Omniverse to simulate the exact details of a car factory as a complete physical space. The company calls the factory a “digital twin.” The factory has enough detail to include 300 cars in it at a given time, and each car has about 10 gigabytes of data, which works out to roughly 3 terabytes of car data alone.
Thousands of planners, product engineers, facility managers and lean experts within the global production network are able to collaborate in a single virtual environment to design, plan, engineer, simulate and optimize extremely complex manufacturing systems before a factory is actually built or a new product is integrated.
Milan Nedeljkovic, member of the board of management of BMW AG, said in a statement that the innovations will lead to a planning process that is 30% more efficient than before. Eventually, Omniverse will enable BMW to simulate all 31 of its factories.
Volvo is designing cars inside Omniverse before committing to physical designs, while Ericsson is simulating future 5G wireless networks. Industrial Light & Magic has been evaluating Omniverse for a broad range of possible workflows, but particularly for bringing together content created across multiple traditional applications, and facilitating simultaneous collaboration across teams that are distributed all over the world.
Foster + Partners, the United Kingdom architectural design and engineering firm, is implementing Omniverse to give teams spread across 14 countries seamless collaborative design and visualization capabilities.
Activision Publishing is exploring Omniverse’s AI-search capabilities for its games to allow artists, game developers and designers to search intuitively through massive databases of untagged 3D assets using text or images.
WPP, the world’s largest marketing services organization, is using the Omniverse to reinvent the way advertising content is made by replacing traditional on-location production methods with entirely virtual production.
Perry Nightingale, senior vice president at marketing services firm WPP, said on a panel that he is seeing collaboration on an enormous scale with multiple companies working together.
“I’m excited how far that could go, with governments doing it for city planning and other sorts of grand scale collaboration around USD,” Nightingale said.
Nvidia will use Omniverse to enable Drive Sim 2.0, which lets carmakers test their self-driving cars inside Omniverse. It uses USD as Nvidia transitions from game engines to a true simulation engine for Omniverse, said Danny Shapiro, senior director of automotive at Nvidia. Nvidia’s own developers will now be able to support new hardware technologies earlier than they could in the past.
“We initially built it for our own needs, so that when technologies were being developed in different groups, they could share immediately, rather than have to wait for the development of it into their particular area,” Kerris said. “The same holds true with our developers. It used to be if we brought a technology out, we would then work with our developers, and it would take a period of time for them to support it. However, by building this platform that crosses over these, we have the ability now to bring out new technologies that they can take advantage of day one.”
The metaverse of the future
One question is how well Omniverse will be able to deal with latency, or interaction delays across the cloud. That would be important for game developers, who have to create games that operate in real time. Scenes built with Omniverse can be rendered at 30, 60, or 120 frames per second as needed for a real-time application like a game.
Kerris said in an earlier chat that most of what you’re looking at doesn’t have to be constantly refreshed on everybody’s screen, making the real-time updating of the Omniverse more efficient. Nvidia’s Nucleus tech is a kind of traffic cop that communicates what is changing in a scene as multiple parties work on it at once.
As for viewing the Omniverse, gamers could access it using a high-end PC with a single Nvidia RTX graphics card.
Huang said in his speech, “The metaverse is coming. Future worlds will be photorealistic, obey the laws of physics or not, and inhabited by human avatars and AI beings.”
He said that games like Fortnite or Minecraft or Roblox are like the early versions of the metaverse. But he said the metaverse is not only a place to play games. It’s a place to simulate the future.
“We are building cities because we need to simulate these virtual worlds for our autonomous vehicles,” Kerris said. “We need a world in which we can train them and test them. Our goal is to scale it so you could drive a virtual car continuously from Los Angeles to New York, in real time, using the actual hardware that’s going to be inside the car, and give it a virtual reality experience: plug into its sensory inputs the output of our simulator and fool it into thinking it’s in the real world. And for that, it has to be an extremely large world. We’re not quite there yet. But that is what we are moving towards.”
For game companies, I can foresee game publishers eventually trading around their cities, as one might build a replica of Paris while another might build New York. After all, if everyone works with USD technology, there might not be a need to rebuild every city from scratch for simulations like games.
Ivar Dahlberg, technical artist at Embark Studios, a game studio in Stockholm, said it is tantalizing to think about trading cities back and forth between game developers who are working on city-level games.
“Traditionally, developers have focused on a world for someone else to experience,” he said. “But now it seems there are lots more opportunities for developers to create something together with the inhabitants of that world. You can share the tools with everybody who is playing. That ties in quite nicely to the idea of a metaverse. USD is definitely a step in that direction.”
Tsigkari said, “That is an experience that may not be very far out. It won’t matter if one company builds Paris, London, or New York. It will be more about what you are doing with those assets. What is the experience that you offer to the user with those assets?”
As I saw recently in the film A Glitch in the Matrix, it will be easier to believe in the future that we’re all living in a simulation. I expect that Nvidia will be able to fake a moon landing for us next.