Sony's AI Drives a Race Car Like a Champ

The company built GT Sophy to master the game Gran Turismo, but it may help the development of real self-driving cars.
Two race cars on a speedway in a still from Gran Turismo. Photograph: Clive Rose/Gran Turismo/Getty Images

Takuma Miyazono began driving virtual race cars at age 4, when his father brought home the highly realistic motorsport game Gran Turismo 4. Sixteen years later, in 2020, Miyazono became the Gran Turismo world champion, winning an unprecedented “triple crown” of esports motor racing events. But he had never faced a Gran Turismo driver quite like GT Sophy, an artificial intelligence developed by Sony and Polyphony Digital, the studio behind the Gran Turismo franchise.

“Sophy is very fast, with lap times better than expected for the best drivers,” he says via a translator. “But watching Sophy, there were certain moves that I only believed were possible afterward.”

Video games have become an important sandbox for AI research in recent years, with computers mastering a growing array of titles. But Gran Turismo represents a significant new challenge for a machine.

In contrast to board games that AI has mastered, such as chess or Go, Gran Turismo requires continuous judgments and high-speed reflexes. And unlike complex strategy games such as Starcraft or Dota, its challenge lies in executing difficult driving maneuvers. A Gran Turismo ace must push a virtual car to its limits, wrestling with friction, aerodynamics, and precise driving lines, while managing the subtle dance of overtaking an opponent without unfairly blocking their line and incurring a penalty.

“Outracing human drivers so skillfully in a head-to-head competition represents a landmark achievement for AI,” said Chris Gerdes, a professor at Stanford who studies autonomous driving, in an article published on Wednesday alongside the Sony research in the journal Nature.

Gerdes says the techniques used to develop GT Sophy could aid the development of autonomous cars. Currently, self-driving cars use the kind of neural network algorithm that GT Sophy employed only for perception, to keep track of road markings and to spot other vehicles and obstacles; the software that actually controls the car is handwritten. “GT Sophy’s success on the track suggests that neural networks might one day have a larger role in the software of automated vehicles than they do today,” Gerdes writes.
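That division of labor can be sketched in a few lines of code. The example below is a hypothetical illustration, not drawn from any production autonomy stack: the function names, state fields, and thresholds are all invented. A learned network handles perception, while plain handwritten rules turn its output into steering and braking commands.

```python
import numpy as np

def perceive(camera_frame: np.ndarray) -> dict:
    """Stand-in for a trained neural network: a real stack would run
    deep-net inference here to find lane markings and obstacles."""
    return {"lane_offset_m": 0.3, "obstacle_distance_m": 42.0}

def handwritten_controller(state: dict) -> dict:
    """Hand-coded control rules, the part that today is not learned."""
    steer = -0.5 * state["lane_offset_m"]        # proportional lane centering
    safe = state["obstacle_distance_m"] > 20.0   # crude following-distance rule
    return {
        "steer": steer,
        "throttle": 0.4 if safe else 0.0,
        "brake": 0.0 if safe else 0.8,
    }

frame = np.zeros((480, 640, 3))   # placeholder camera image
print(handwritten_controller(perceive(frame)))
```

Gerdes’ point is that the second function, not just the first, might eventually be replaced by a learned policy like GT Sophy’s.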

Sony announced in 2020 that it was developing a prototype electric car with advanced driver-assistance features. But the company says there are no plans, as yet, to use GT Sophy in its automotive efforts.

GT Sophy also shows how important simulated environments have become for real-world AI systems. Many companies developing self-driving technology use sophisticated computer simulations to generate training data for their algorithms. For instance, Waymo, the self-driving car company owned by Alphabet, says its vehicles have traveled the equivalent of 20 billion miles in simulation.

"The use of machine learning and autonomous control for racing is exciting," says Avinash Balachandran, senior manager for Human Centric Driving Research at the Toyota Research Institute, which is testing self-driving cars capable of operating at extreme speeds. He says Toyota is working on “human amplification, in which technologies that leverage expert learnings from motorsport can one day improve active safety systems.”

Bruno Castro da Silva, a professor at the University of Massachusetts Amherst who studies reinforcement learning, calls GT Sophy "an impressive achievement" and an important step toward training AI systems for autonomous vehicles. But da Silva says going from Gran Turismo to the real world will be challenging, because it is difficult for reinforcement learning algorithms like the one behind GT Sophy to account for the long-term implications of their decisions, and because it is hard to guarantee the safety or reliability of such algorithms.

"Safety guarantees are paramount if we wish such AI systems to be deployed in real life," da Silva says. "A lack of safety guarantees is one of the main reasons why machine learning-based robots are not yet widely used in factories and warehouses."

The type of AI algorithm developed for GT Sophy might also prove useful for other kinds of machines, including drones and robots that work alongside or assist humans, says Hiroaki Kitano, CEO of Sony AI, which led the development of the algorithm. “This can be applied to any physical system that interacts with a human,” Kitano says.

GT Sophy mastered Gran Turismo through hours of practice. As with other recent feats of AI gameplay, this involved training an algorithm known as a neural network to operate the game’s controls, using positive and negative feedback to steer it toward better performance. The approach, known as reinforcement learning, is inspired by the way animals respond to success and failure in the real world. Although it is decades old, the method has come to the fore in recent years thanks to more sophisticated algorithms, more powerful computers, and more copious training data.
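To make that concrete, here is a minimal sketch of reinforcement learning in its simplest tabular form: Q-learning on a toy one-dimensional track. It illustrates the general technique only; GT Sophy itself trained a deep neural network inside Gran Turismo’s full physics simulation, at vastly larger scale.

```python
import random

# Minimal tabular Q-learning: an agent on a one-dimensional "track"
# learns, from reward feedback alone, to drive toward the goal cell.

N_STATES = 10                     # track cells 0..9; cell 9 is the goal
ACTIONS = (-1, +1)                # step left or step right
GOAL = N_STATES - 1
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def greedy(s):
    """Pick the action the agent currently believes is best."""
    return max(ACTIONS, key=lambda a: Q[(s, a)])

for episode in range(500):
    s = 0
    while s != GOAL:
        # Mostly exploit current knowledge, occasionally explore.
        a = random.choice(ACTIONS) if random.random() < EPSILON else greedy(s)
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == GOAL else -0.01   # positive feedback at the goal
        # Core update: nudge the value estimate toward the reward plus
        # the discounted value of the best follow-up action.
        best_next = max(Q[(s_next, a2)] for a2 in ACTIONS)
        Q[(s, a)] += ALPHA * (reward + GAMMA * best_next - Q[(s, a)])
        s = s_next

# After training, the greedy policy should step right from every cell.
print([greedy(s) for s in range(N_STATES)])
```

The principle is identical at any scale: the agent’s only teacher is the reward signal.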

GT Sophy is the first AI capable of beating professional esports drivers in such a realistic, high-speed game. In a series of races held in July and October 2021, Sophy beat the best Gran Turismo drivers, including Miyazono.

Peter Wurman, director of Sony AI America, says mastering Gran Turismo is as important an AI landmark as dominating chess or Go. Wurman cofounded Kiva Systems, which developed the shelf-moving robots that transformed Amazon’s warehouses. Because a Gran Turismo driver must understand how to beat other drivers without incurring penalties for unfair play, Wurman says GT Sophy points the way to robots that learn how to interact with humans in more sophisticated social settings. “With board games, you have a long time to decide what you're going to do between each move,” Wurman says. “Real-time interaction is what we do every day.”

The GT Sophy project also points to potential changes in game design. In-game characters typically follow simple rules. AI players that learn for themselves could be far more lifelike and engaging to play against and alongside.

Kazunori Yamauchi, the creator of Gran Turismo and a real-life race car driver, says GT Sophy’s ability to drive without incurring penalties is perhaps most impressive. He says the technology will become a part of future versions of the game and predicts that it will help teach both novice and expert drivers to improve their skills.

“Sophy takes some racing lines that a human driver would never think of,” he says. “I think a lot of the textbooks regarding driving skills will be rewritten.”

Updated, 2-18-22, 6:30pm ET: An earlier version of this article said Gran Turismo is far more complex than Starcraft or Dota.

Updated, 2-9-22, 3:40pm ET: An earlier version of this article misspelled the name of Avinash Balachandran, senior manager for Human Centric Driving Research at the Toyota Research Institute.

