An artificial intelligence program has beaten the world's best players at the popular PlayStation racing game Gran Turismo Sport, and in doing so may point the way toward better autonomous vehicles in the real world, according to one expert.
The latest development caps an eventful couple of decades of A.I. playing games.
It began with chess, when world champion Garry Kasparov lost a match to IBM's Deep Blue in 1997. Then came Go, when DeepMind's AlphaGo beat South Korean champion Lee Sedol in 2016. And by 2019, an A.I. program ranked higher than 99.8% of players worldwide in the wildly popular real-time strategy game StarCraft II.
Now, an A.I. program has dethroned the best human players in the professional esports world of Gran Turismo Sport.
In a paper published recently in the journal Nature, a team of researchers led by Sony AI detailed how they created a program called Gran Turismo Sophy, which won a race against top human players in Tokyo last October.
Peter Wurman, who heads the GT Sophy project, said the team didn't manually program the A.I. to be good at racing. Instead, they trained it on race after race, running many simulations of the game at once on a computer system connected to roughly 1,000 PlayStation 4 consoles.
"It doesn't know what any of its controls do," Wurman said. "And through trial and error, it learns that the accelerator makes it go forward and the steering wheel turns left and right ... and if it's doing the right thing by going forward, then it gets a little bit of a reward."
"It takes about an hour for the agent to learn to drive around a track. It takes about four hours to become about as good as the average human driver. And it takes 24 to 48 hours to be as good as the top 1% of the drivers who play the game."
And after another 10 days of training, it can finally go toe-to-toe with the very best humanity has to offer.
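For readers curious what that trial-and-error process looks like in code, here is a minimal sketch in Python of one of the simplest reinforcement learning methods, tabular Q-learning, applied to a toy one-dimensional "track." Every detail in it (the track, the two controls, the reward values) is invented for illustration and bears no resemblance to the scale or sophistication of Sony AI's actual system; it only demonstrates the loop Wurman describes: act, observe a reward, and adjust.

    import random

    # Toy stand-in for trial-and-error learning with rewards:
    # tabular Q-learning on a made-up 1-D "track."

    TRACK_LENGTH = 10            # positions 0..9; reaching 9 ends the "race"
    ACTIONS = [-1, +1]           # two unlabeled controls the agent must figure out
    ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

    # The value table starts at zero: the agent "doesn't know what any
    # of its controls do" until rewards teach it.
    Q = [[0.0, 0.0] for _ in range(TRACK_LENGTH)]

    for episode in range(500):
        pos = 0
        while pos < TRACK_LENGTH - 1:
            # Mostly exploit what it has learned; occasionally explore.
            if random.random() < EPSILON:
                a = random.randrange(2)
            else:
                a = max(range(2), key=lambda i: Q[pos][i])

            nxt = max(0, min(TRACK_LENGTH - 1, pos + ACTIONS[a]))
            # "If it's doing the right thing by going forward,
            # then it gets a little bit of a reward."
            reward = 1.0 if nxt > pos else -1.0

            # Standard Q-learning update toward reward + discounted future value.
            Q[pos][a] += ALPHA * (reward + GAMMA * max(Q[nxt]) - Q[pos][a])
            pos = nxt

    # After training, the learned policy should choose +1 (forward) everywhere.
    print([ACTIONS[max(range(2), key=lambda i: Q[p][i])] for p in range(TRACK_LENGTH - 1)])

Even this toy agent starts out with no idea which control moves it forward, and converges on the right answer within a few hundred simulated races; GT Sophy had to learn steering, throttle, braking and racecraft the same basic way, just at vastly greater scale.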
After finishing behind two bots controlled by Gran Turismo Sophy at the race in Tokyo, champion player Takuma Miyazono said it was actually a rewarding experience.
"I learned a lot from the A.I. agent," Miyazono said. "In order to drive faster, the A.I. drives in a way that we would have never come up with, which actually made sense when I saw its maneuvers."
Chris Gerdes, a professor of mechanical engineering at Stanford, reviewed the team's findings as part of the paper's publication process at Nature. He also specializes in physics and drives race cars himself.
He said he spent a lot of time watching GT Sophy in action, trying to figure out if the A.I. was actually doing something intelligent or just learning a faster path around the same track through repetition.
"And it turns out that Sophy actually is doing things that race car drivers would consider to be very intelligent, making maneuvers that it would take a human race car driver a career to be able to pull off ... out of their repertoire at just the right moment," he said.
What's more, Gerdes said this work could have even greater implications.
"I think you can take the lessons that you learned from Sophy and think about how those work into the development, for instance, of autonomous vehicles," he said.
Gerdes should know: He researches and designs autonomous vehicles.
"It's not as if you can simply take the results of this paper and say, 'Great, I'm going to try it on an autonomous vehicle tomorrow,'" Gerdes said. "But I really do think it's an eye opener for people who develop autonomous vehicles to just sit back and say, well, maybe we need to keep an open mind about the extent of possibilities here with A.I. and neural networks."
Wurman and Gerdes both said that taking this work to cars in the real world could still be a long way off.
But in the short term, Wurman's team is working with the developers of Gran Turismo to create a more engaging A.I. for normal players to race against in the next game in the series.
So in the near future, we could try our hand at racing it, too.