Physics-Based Simulation and the Future of the Metaverse
Some of the world’s biggest companies are going all-in on the metaverse. One you may not know about is Ansys, a US public company that makes engineering simulation software and has been around since 1970. Dr. Prith Banerjee is its Chief Technology Officer, and I spoke to him last week about his vision for the metaverse — and specifically, why he thinks the metaverse can’t reach its full potential without “optimum physics-based modeling and simulation.”
Ansys, it turns out, already has a number of partnerships with companies building the metaverse — including global telecoms companies, microchip and GPU manufacturers, data center and storage companies, and “all the cloud providers,” according to Banerjee. He said that Ansys provides a mix of hardware and software expertise to these customers: everything from hardware design to structural and electromagnetic systems.
Banerjee himself has had a distinguished career as a specialist in electronic design automation, including founding a couple of companies in the 2000s and leading HP Labs from 2007 to 2012. Earlier, he was a Professor of Electrical and Computer Engineering at the University of Illinois during the 1980s and ’90s (indeed, he was there when the Mosaic web browser was released in 1993).
Banerjee harkened back to those early days of the web in describing how he views the metaverse. The largely text-based internet of the Mosaic era was the first generation, he said. The second generation is based on 2D images and 2D video, which he said “is the world we live in now.” He thinks the metaverse will be the third generation of the internet since it will enable you to “interact with things in 3D.”
Physics Needed for Consumer Metaverse, Too
Surprisingly, he is adamant that we will need what he terms “optimum physics-based modeling and simulation” not just for the enterprise metaverse, but for the type of consumer internet that Mark Zuckerberg and others are espousing.
I can understand why physics modeling is important in creating digital twins, one of the key concepts of the enterprise metaverse according to Microsoft, Nvidia, and others. It’s critical that a 3D “digital twin” object, such as a digital car or a twin of a piece of medical equipment, mimics its real-world counterpart precisely. But what does it matter if, say, a 3D equivalent of Facebook doesn’t have precise physics? Indeed, the 3D avatars from Horizon Worlds, a VR social app by Meta, are currently far from realistic — they’ve been mocked for being cartoonish and anatomically deficient.
“I would say that the physics-based simulation is more important for enterprise customers today, which is most of our customers […] but consumers will need this in the future,” said Banerjee. He pointed out that movie animation has become very realistic, down to the strands of hair on an animated character. Physics-based computer simulation will eventually bring that same level of detail to the metaverse, although Banerjee gives it a timeframe of about 50 years.
Web Technology Will Drive Frontend
Given Banerjee’s framing of the metaverse as the third generation of the internet, I asked him what role web technologies will play in the metaverse. After all, the web was what brought the first generation into existence and (along with the mobile internet) it played a major role in the second.
“We will rely on web display technologies, like React and so on, but we will feed the 3D data for the simulations into that,” he replied. He referenced specifically the open web technologies managed by The Khronos Group (a non-profit industry consortium of which Ansys is a member). That means open standards like WebGL, OpenXR, glTF, and more.
So essentially, he’s saying that the frontend of the metaverse will be web-based. I noted that currently, native game engines like Unity and Unreal Engine are preferred for high-end graphics. So will the web be able to compete with that?
“The frontend will be the web tools — all the things like React, Angular, and so on — but these frontends will work with a backend,” said Banerjee. “And our tools will run on the backend, on the cloud. It’s not like this fast simulation will run using the web on your phone, right? The compute capabilities on your phone are not good enough. So essentially […] we’ll have a frontend connection to the backend, the compute will happen in the backend on the cloud — on Azure, or whatever — and it will be visualized on the frontend with the web.”
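The split Banerjee describes can be sketched as a toy pipeline. This is only an illustration of the architecture, not an Ansys or web API — every function name here is hypothetical. The heavy solve and the data reduction happen server-side, and only a render-ready slice crosses the network to the client:

```python
# Hypothetical sketch of the cloud-backend / web-frontend split.
# None of these functions correspond to a real Ansys product API.

def backend_simulate(mesh_size: int) -> list[float]:
    """Stands in for a cloud-hosted solver crunching a large matrix problem."""
    return [i * 0.5 for i in range(mesh_size)]

def backend_downsample(field: list[float], budget: int) -> list[float]:
    """Reduce the full result to something a phone-class client can render."""
    step = max(1, len(field) // budget)
    return field[::step][:budget]

def frontend_render(points: list[float]) -> str:
    """The web frontend only visualizes what the backend sends it."""
    return f"rendering {len(points)} points"

full = backend_simulate(1_000_000)      # too big to ship to the client
view = backend_downsample(full, 1_000)  # what actually crosses the network
print(frontend_render(view))            # prints "rendering 1000 points"
```

The design point is that the million-element field never leaves the cloud; the phone only ever sees the thousand-point view.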
Another key part of the technical infrastructure of the metaverse will be connectivity — specifically the ability of mobile networks to deliver high-fidelity graphics content. Ansys thinks the metaverse will “require unprecedented levels of connectivity in a 5G and eventually 6G world.” I asked Banerjee how difficult it will be to reach that level of connectivity, given that much of the world is still on 4G.
He replied that it is a significant hurdle, since “in simulation, we are solving matrix problems” that involve millions or billions of data points. To tame this complexity, AI and ML techniques like “reduced order models” can be used to shrink the data processed from, say, 1 billion points to 1,000.
There are also encoding techniques to help. Banerjee asked me to imagine a 3D frame with 1 billion points. “Now, in the next frame, are all the things changing? No, it’s only that left part of that scene,” he said. “So, using very smart encoding, you can actually mimic as if you’re showing all billion points, but you’re only sending the deltas, which is 1,000.”
All 5 Senses in the Metaverse
Towards the end of the interview, Banerjee told me he thinks the metaverse will — in 30-40 years’ time — be able to simulate all five human senses. This is something that some of us have written science fiction books about, but Banerjee is a distinguished computer science professor and the CTO of a large engineering company. He thinks both smell and taste will be a part of the metaverse, joining the touch, sight and hearing that we have today in VR.
“I was at the University of Illinois [and] we had this virtual reality lab,” Banerjee said. “We were actually testing all of these senses. Now, they have not come to market, but in 50 years they will.”
Eventually, Ansys wants to open up its simulation technology for everyday people to use. Currently, Banerjee said, only a select few highly skilled engineers can use the type of advanced simulation tools that Ansys makes — such as Ansys Fluent, its fluid simulation software that has “advanced physics modeling capabilities.” The ultimate goal, he continued, is “democratizing simulation.”
According to Banerjee, “all 9 billion people on the planet will use simulation to do whatever things that they want. So that’s like a grand plan, which is beyond the metaverse.”