You may be familiar with the Infinite Monkey Theorem, an oft-cited (and often misquoted) claim that thousands of monkeys banging on thousands of typewriters would eventually produce the complete works of William Shakespeare. (Yes, Simpsons did it.)
This week, Nvidia confirmed that it has taken this theory quite seriously with its own twist: an army of AI routines, dubbed GameGAN (the "GAN" is short for "generative adversarial network"), trained to build a playable video game from scratch. More precisely, the company chose one of the industry's biggest, most recognizable games, which celebrates its 40th anniversary today: Pac-Man.
If you've seen other farms of computers trained on existing games, they have usually been taught to play the game in question. After watching thousands of hours of a particular game and tracking the most successful moves and reactions over the course of a versus match, these AI routines can then control games, repeat and juggle thousands of strategies, and battle humans. (Sometimes the results go well for the computers, but not always.)