After a bit of investigation, Neumann realized that Bing Maps’ data set essentially covered the entire planet. The only problem? It was all in 2D. After using some of that data to build a flyable 3D version of Seattle, Neumann turned to the Azure team to craft a machine learning method for converting the entire planet into a giant 3D model.
“AI has just tremendously grown in the last few years,” said Eric Boyd, CVP of Azure AI, in an interview. “It’s really driven by the massive amounts of data that are now available, combined with the massive amounts of compute that exist in the cloud … The results you can see are really pretty spectacular where you can come up with algorithms that now look at literally every square kilometer of the planet to identify the individual trees, grass and water, and then use that to build 3D models.”
Azure’s integration goes beyond the shape of the world. It also powers the flight controllers’ voices, which are generated with AI speech synthesis and sound almost indistinguishable from human speech. The effect is so natural that many players may assume Microsoft is relying solely on voice actors.
Since the company began exploring ways to bring Azure AI into the game in 2016, the capabilities of machine learning have also evolved dramatically, according to Boyd. “The AI algorithm space has really grown in the last several years,” he said. “And so vision algorithms, which is what’s heavily used to identify all these different trees and buildings and classify them exactly, those have come a tremendous way.”
Since it leans so heavily on the cloud, Flight Simulator is a “living game” in the truest sense, Neumann said. All of the machine learning algorithms the game relies on will steadily improve over time, as the company irons out bugs and optimizes the engine. (And perhaps becomes more aware of potential issues, like the typo that created a 212-story tower in Melbourne.) But he points out that the algorithms can only be as good as the source data, so Microsoft is working to refine that as well.