Gaming Giant Unity Wants to Digitally Clone the World

In video games, non-playable characters can be somewhat clueless. An NPC might wander across a city block, face-plant into a streetlamp, then vanish a block later. NPCs leap into player-characters’ punches or commit to kicking a wall 400 times, never learning that the wall won’t kick back.

Unity Technologies is in the business of NPCs. Founded in 2004, Unity makes an eponymous game engine that provides the architecture for hundreds of video games using its real-time 3D computer graphics technology. Unity also provides a broad suite of tools integrated with that game engine, including AI tools. In the Unity game engine, developers design their 3D city blocks and streetlamps; model their NPCs; animate their punches; and maybe, through Unity’s AI technology, teach them when to stop kicking.
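Unity’s route to that last step is its ML-Agents toolkit, which exposes a running Unity scene to reinforcement-learning code over a Python API. A minimal sketch of that loop might look like the following; the “WallKicker” build and its single behavior are hypothetical, and the calls assume the mlagents_envs package’s low-level API:

    # Drive a Unity scene from Python via ML-Agents' low-level API.
    # "WallKicker" is a hypothetical Unity build invented for this sketch.
    from mlagents_envs.environment import UnityEnvironment

    env = UnityEnvironment(file_name="WallKicker")
    env.reset()  # behavior specs are populated after the first reset
    behavior_name = list(env.behavior_specs)[0]
    spec = env.behavior_specs[behavior_name]

    for episode in range(10):
        env.reset()
        decision_steps, terminal_steps = env.get_steps(behavior_name)
        while len(terminal_steps) == 0:
            # A trained policy would choose actions from observations;
            # random actions stand in for it here.
            action = spec.action_spec.random_action(len(decision_steps))
            env.set_actions(behavior_name, action)
            env.step()
            decision_steps, terminal_steps = env.get_steps(behavior_name)

    env.close()

The division of labor is the point: the game engine supplies the physics, the rendering, and the wall; the Python side supplies the learning.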

Five years ago, Unity’s executives had a realization: In the real world, there are a lot of situations that would enormously benefit from NPCs. Think about designing a roller coaster. Engineers can’t ask humans to stand up on a roller coaster ahead of a hairpin turn to test whether they’d fly off. And they definitely can’t ask them to do it 100 or 1,000 times, just to make sure. But if an NPC had all the pertinent qualities of a human being—weight, movement, even a bit of impulsiveness—the engineer could whip them around that bend 100,000 times, like a crazed kid playing RollerCoaster Tycoon, to discern under which circumstances they’d be ejected. The roller coaster, of course, would be digital too, with its metal bending over time and the speed of its cars sinking and rising depending on the number of passengers.
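To make that thought experiment concrete, here is what the naive core of such a test could look like as a Monte Carlo loop in Python. Every number in it, from the turn radius to the tolerance thresholds, is an invented stand-in for what a real physics simulation would compute:

    import random

    def simulate_riders(n_trials=100_000):
        """Monte Carlo sketch: how often would a rider who stands up
        ahead of a hairpin turn be thrown from the car?"""
        g = 9.81            # m/s^2
        turn_radius = 8.0   # meters -- assumed track geometry
        ejections = 0
        for _ in range(n_trials):
            # Car speed drops as passenger load rises (heavier trains
            # run slower in this toy model), plus a little noise.
            passengers = random.randint(4, 24)
            speed = 16.0 - 0.25 * passengers + random.gauss(0, 0.5)  # m/s
            # Lateral acceleration through the turn, in g's.
            lateral_g = speed**2 / (turn_radius * g)
            # An "impulsive" rider stands up a small fraction of the time;
            # standing riders tolerate far less lateral force.
            standing = random.random() < 0.05
            tolerance = 1.2 if standing else 3.0  # g's, assumed thresholds
            if lateral_g > tolerance:
                ejections += 1
        return ejections / n_trials

    print(f"estimated ejection rate: {simulate_riders():.4%}")

Run enough trials and the rare, dangerous combinations of load, speed, and impulsiveness surface on their own, which is exactly what no physical test with humans could ever be allowed to find.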

Unity spun that idea into an arm of its business and is now leveraging its game engine technology to help clients make “digital twins” of real-life objects, environments, and, recently, people. “The real world is so freaking limited,” said Danny Lange, Unity’s senior vice president of artificial intelligence, at Unity’s San Francisco headquarters last October. Speaking with WIRED in 2020, he told me, “In a synthetic world, you can basically recreate a world that is better than the real world for training systems. And I can create many more scenarios with that data in Unity.”

Digital twins are virtual clones of real-life things, acting and reacting in virtual space the same way their physical counterparts do. Or at least, that’s what the term implies. The word “twin” does a lot of heavy lifting. It will be a long time before simulations boast one-to-one verisimilitude with the things they mirror, and these “twins” take a mountain of human labor to create. Right now, though, dozens of companies are using Unity to create digital twins of robots, manufacturing lines, buildings, and even wind turbines to virtually design, operate, monitor, optimize, and train them. These twins rust in the rain and quicken with lubricant. They learn to avoid a lump or identify a broken gear. With an accurate enough digital twin, Lange says, Unity’s game engine can even collect “synthetic data” off the simulation to better understand and advance its IRL double.
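The idea of pulling “synthetic data” off a twin is easier to see in miniature. The sketch below invents a toy gearbox twin; its wear model, rates, and thresholds are all assumptions, not anything from Unity. Rolling it forward mass-produces the kind of labeled telemetry a fault detector could train on without touching a physical machine:

    import random
    from dataclasses import dataclass

    @dataclass
    class GearboxTwin:
        """Toy digital twin of a turbine gearbox: it ages a wear state
        over simulated hours and reports what a real sensor would."""
        wear: float = 0.0  # 0 = new, 1 = failed

        def step(self, hours: float, raining: bool) -> dict:
            # Assumed wear model: rain ("rust") ages the part faster.
            rate = 1e-4 * (3.0 if raining else 1.0)
            self.wear = min(1.0, self.wear + rate * hours)
            vibration = 0.5 + 2.0 * self.wear + random.gauss(0, 0.05)
            return {"vibration_mm_s": vibration, "broken": self.wear > 0.9}

    # Roll the twin forward day by day to mass-produce labeled
    # synthetic telemetry -- no physical turbine required.
    twin = GearboxTwin()
    dataset = [twin.step(hours=24, raining=random.random() < 0.3)
               for _ in range(5_000)]
    print(sum(s["broken"] for s in dataset), "of", len(dataset),
          "samples labeled broken")

A production twin replaces every invented constant here with calibrated physics, which is where the mountain of human labor comes in.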

“We’re actually a massive data company,” says Lange. “We early on realized that at the end of the day, real-time 3D is about data and nothing but data.” Unity’s major digital twin clients are in the industrial machinery world, where they can use digital simulations in place of more expensive physical models. Unity executives believe that the company’s real-time 3D technology and AI capabilities position them to compete with the many other firms entering the $3.2 billion space, including IBM, Oracle, and Microsoft. David Rhodes, Unity’s senior vice president in charge of digital twins, says his goal is for Unity to one day host “a digital twin of the world.”
