Pro gamers beware — AI is coming for your job

A Mercedes Benz car leading a race in Gran Turismo Sport.
(Image credit: Sony)

If you’re reading this and happen to be an eSports player, all I can say is… you had a good run. The show’s over now, though, because technology has advanced to the point where AI can hand a pro gamer their keester in a certain legendary driving sim series. 

Enter GT Sophy. Developed by Sony AI in New York, it was trained using deep reinforcement learning, which is a form of machine learning that teaches an AI system’s neural network through rewards and reprimands (thanks, New Scientist). 
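To get a feel for the rewards-and-reprimands idea, here's a toy sketch. GT Sophy actually uses deep reinforcement learning with neural networks; this illustrative example (the function name and learning rate are made up for the demo) just shows how good outcomes nudge an action's estimated value up and bad outcomes nudge it down.

```python
# Illustrative sketch only — GT Sophy's real training uses deep RL,
# not this toy tabular update.

def update_value(value, reward, learning_rate=0.1):
    """Nudge an action's estimated value toward the reward received."""
    return value + learning_rate * (reward - value)

# The agent tries an action repeatedly; rewards (+1) raise its estimate
# of that action, reprimands (-1) lower it.
v = 0.0
for outcome in [1, 1, -1, 1]:  # a mix of rewards and reprimands
    v = update_value(v, outcome)
```

After a few mostly positive outcomes, the value estimate drifts above zero, so the agent favors that action — the same carrot-and-stick loop, scaled up enormously, is what shaped GT Sophy's driving.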

Using both the carrot and the stick has clearly worked: after just a few days of training, GT Sophy managed to beat 95% of players on Gran Turismo Sport’s online leaderboards.

How was GT Sophy trained?

GT Sophy

(Image credit: Sony AI)

Of course, the AI enjoys certain advantages human players can’t lean on. GT Sophy knows the position of every rival car in the PS4 driving simulator during a race, and it also has the data for the next six seconds of track at any given moment. The flipside is that Sony only taught GT Sophy to drive with automatic shifting, whereas the best Gran Turismo players can shift gears manually.

Despite those innate advantages, getting GT Sophy to the point where it could complete certain tracks faster than any human on the planet took around 45,000 hours of virtual driving. There’s a not insignificant hardware aspect at play too, with GT Sophy training across 20 different PS4s at a time.

Another growing pain the AI encountered was learning how to overtake rival racers. The team at Sony solved this by rewarding GT Sophy when it overtook a car while also penalizing it for being overtaken.
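That overtaking incentive can be sketched as a simple reward-shaping function. To be clear, this is a hypothetical illustration of the idea described above, not Sony AI’s actual code — the function name and weights are invented for the example.

```python
# Hypothetical reward shaping for overtaking: reward positions gained,
# penalize positions lost (a lower position number means further ahead).

def overtake_reward(prev_position, new_position, weight=1.0):
    """Return a positive reward for each car overtaken,
    a negative reward for each position lost."""
    gained = prev_position - new_position
    return weight * gained  # positive when overtaking, negative when overtaken
```

So moving from 3rd to 2nd yields a positive reward, while dropping from 2nd to 4th yields a negative one — exactly the push-and-pull that taught the AI to pass cars without letting itself be passed.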

A sporting chance

A black Honda racing car racing a yellow Peugeot hatchback in Gran Turismo Sport.

(Image credit: Sony)

The researchers at Sony AI also teamed up with a competitive Gran Turismo player, then started to focus on teaching GT Sophy how to master the trickiest sections of laps from the game’s selection of courses.

At this point, the AI was ready to take on some of the world’s best players, so the researchers pitted four copies of Sophy against the four top competitive GT drivers over a series of races. The end result? GT Sophy comfortably beat the finest human racers by 104 to 54, with points being totaled according to a car’s final position. 

The moral of the story doesn’t quite end at “All hail Skynet”, thankfully. The deep reinforcement learning method that taught GT Sophy could be used to give in-game squadmates or NPCs smarter behavior for players to interact with, according to Igor Babuschkin from OpenAI in San Francisco: “The results suggest that it could be possible for game developers to use deep reinforcement learning to design and test their games, and to produce interesting opponents and teammates for human players.”

In the meantime, if I were a Gran Turismo 7 player, I’d be practicing my turns on Deep Forest Raceway for at least a couple of hours every day before GT Sophy steps up to shame the entire PS5 racing community.

Dave Meikleham
UK Computing Editor

Dave is a computing editor at Tom’s Guide and covers everything from cutting edge laptops to ultrawide monitors. When he’s not worrying about dead pixels, Dave enjoys regularly rebuilding his PC for absolutely no reason at all. In a previous life, he worked as a video game journalist for 15 years, with bylines across GamesRadar+, PC Gamer and TechRadar. Despite owning a graphics card that costs roughly the same as your average used car, he still enjoys gaming on the go and is regularly glued to his Switch. Away from tech, most of Dave’s time is taken up by walking his husky, buying new TVs at an embarrassing rate and obsessing over his beloved Arsenal.