While AI still drives imperfectly on real roads, in video games it excels. Sony AI developed Gran Turismo Sophy, a program that drives cars in the world of Gran Turismo and managed to surprise even the game's best players.

Gran Turismo is a video game that simulates car racing with striking realism. Through a series of test races, Sony AI trained the program until it could beat the game's champions under a variety of conditions. In one race, it beat multiple-time race winner Emily Jones by 1.5 seconds. That's an eternity in a game where everything is decided by milliseconds.

But speed alone was not enough to win. GT Sophy lapped an empty track at record pace, but in group races against several human drivers, where racecraft was required, it lost at first. At times the program drove too aggressively and collected penalties; at other times it was too timid.

Sony analyzed the mistakes, retrained the program, and then organized a rematch. This time, GT Sophy left everyone behind with ease. What had changed? The program was given what Sony’s AI chief calls “etiquette”: the ability to balance aggression with caution and choose the behavior most appropriate to the situation.

This makes GT Sophy relevant beyond the Gran Turismo tracks. Etiquette between drivers on a track is a concrete example of dynamic, context-dependent behavior, exactly what we expect from robots that interact with people. Knowing when to take risks and when to hold back will be useful for self-driving cars, in manufacturing, and in household chores.

How did GT Sophy win?

Instead of reading pixels from the screen like human players, the program received updates during the race about the position of its car on the track and of the cars around it. Updates arrived 10 times per second, on par with human reaction time.

GT Sophy also received data about the virtual physical forces acting on the car. Gran Turismo models car aerodynamics, tire friction against the track surface, and other physical phenomena in detail. The program learned to drive at the very edge of what is possible, pulling off maneuvers human drivers would not dare attempt, such as dropping a wheel onto the grass at the edge of the track while sliding through a turn.
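The article does not spell out the exact interface, but the two paragraphs above suggest a structured state, not raw pixels, delivered to the agent at a fixed 10 Hz rate. Here is a minimal Python sketch under that assumption; the field names and the game accessors are hypothetical, not Sony's real API:

```python
import time
from dataclasses import dataclass, field

@dataclass
class CarObservation:
    """One state snapshot; all fields are illustrative, not Sony's schema."""
    position: tuple[float, float, float]    # car position on the track
    velocity: tuple[float, float, float]    # linear velocity vector
    tire_slip: list[float] = field(default_factory=list)    # per-wheel slip, a proxy for friction limits
    aero_load: float = 0.0                  # aerodynamic downforce on the chassis
    nearby_cars: list[tuple] = field(default_factory=list)  # relative positions of surrounding cars

def run_agent(game, policy, hz: float = 10.0) -> None:
    """Poll the simulator and act at a fixed rate (10 Hz, per the article)."""
    period = 1.0 / hz
    while not game.race_finished():      # assumed method on the game object
        obs = game.get_observation()     # assumed accessor returning a CarObservation
        action = policy(obs)             # steering, throttle, brake
        game.apply(action)               # assumed method sending controls back
        time.sleep(period)               # a real agent would use a fixed-step clock
```

The fixed rate matters: at 10 Hz the agent sees the world about as often as a human can react to it, so its advantage comes from what it does with each update, not from a superhuman refresh rate.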

“You don’t want to do that because you’ll make a mistake. It’s like a controlled crash,” Emily Jones says. “I could maybe do that one in a hundred times.”

The program quickly learned the physics of the game, so in the second stage Sony's task was to teach the AI to avoid penalties for dangerous driving. The developers succeeded in this as well.
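The article does not say how penalty avoidance was trained, but in reinforcement learning this kind of behavior is commonly achieved through reward shaping: subtracting penalty terms from the reward whenever the agent drives dangerously. A minimal sketch under that assumption, with illustrative weights and hypothetical event flags:

```python
def shaped_reward(progress_m: float,
                  collided: bool,
                  off_track: bool,
                  blocked_opponent: bool) -> float:
    """Reward for one time step: track progress minus driving penalties.

    The penalty events and weights below are illustrative; they are not
    the values Sony used, only a sketch of the general technique.
    """
    reward = progress_m       # base reward: meters gained along the racing line
    if collided:
        reward -= 5.0         # contact with another car
    if off_track:
        reward -= 2.0         # leaving the circuit
    if blocked_opponent:
        reward -= 1.0         # unsporting blocking
    return reward
```

Tuning such weights is exactly the balance the article calls “etiquette”: make the penalties too large and the agent turns timid, too small and it drives aggressively and collects penalties again.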