Google’s computer masters some Atari games through artificial intelligence

A computer system developed by Google to test and advance artificial intelligence has successfully mastered several Atari games. The researchers, led by DeepMind co-founder Demis Hassabis, developed the deep neural network at what is now Google DeepMind.

One of the more remarkable things about this feat is that Google’s computer had no initial knowledge of any of the games. It saw only the pixels on the screen and the score; it knew nothing about bats, balls, or lasers.

Through trial and error, the computer slowly learned about each game and its features. By playing the games continuously, it improved, and eventually went on to master some of them.
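The trial-and-error process described above is reinforcement learning. DeepMind's system used a deep neural network over raw pixels; the sketch below instead uses a plain dictionary as the value store, and all names in it are illustrative rather than DeepMind's actual code, but the core update rule (Q-learning) is the same idea: nudge the estimated value of an action toward the score it earned plus the best value expected afterward.

```python
import random

def q_learning_update(q, state, action, reward, next_state, actions,
                      alpha=0.1, gamma=0.99):
    """Nudge Q(state, action) toward reward + discounted best future value.

    q is a dict mapping (state, action) -> estimated value; unseen
    pairs default to 0. alpha is the learning rate, gamma the discount.
    """
    best_next = max(q.get((next_state, a), 0.0) for a in actions)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)

def choose_action(q, state, actions, epsilon=0.1):
    """Epsilon-greedy: usually pick the best-known move, sometimes a random one."""
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: q.get((state, a), 0.0))
```

Repeated over millions of frames, with the game score as the only reward signal, updates like this are what let the agent discover which moves are worth making.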

One of the games it has mastered is Space Invaders. The computer has learned to play it so well that Technology Review says it performs at a “superhuman level,” better than any expert human player.

A little more than a year ago, an earlier version of the DeepMind software could outperform humans in only three Atari games. The new version, improved by Hassabis’ team, has been tested on 49 Atari games. It now beats humans in 22 of them, matches human ability in another seven, and falls considerably short of humans in the remaining 20.

The computer learns by replaying a limited store of its recent game experiences, a technique known as experience replay. This is loosely similar to the way the human brain works: during periods of rest, it rapidly replays recent experiences. The computer’s view of the game at any moment, meanwhile, is limited to the last four video frames on the screen.
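The replay mechanism described above can be sketched as a fixed-size buffer of past moves that the learner samples from at random. This is an illustrative sketch, not DeepMind's code: the class and parameter names are my own, and the real system pairs a much larger buffer with a neural network. Sampling randomly, rather than replaying frames in order, breaks up the strong correlations between consecutive moments of play.

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-capacity store of past (state, action, reward, next_state, done)
    transitions; the oldest memories fall out as new ones arrive."""

    def __init__(self, capacity=100_000):
        self.buffer = deque(maxlen=capacity)

    def add(self, state, action, reward, next_state, done):
        self.buffer.append((state, action, reward, next_state, done))

    def sample(self, batch_size):
        """Draw a random minibatch of stored experiences to learn from."""
        return random.sample(self.buffer, batch_size)

    def __len__(self):
        return len(self.buffer)
```

During play, the agent adds each transition to the buffer and periodically trains on a random sample, so each stored experience can be learned from many times.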

The next goals for the AI are to expand its memory span and attention ability. This would let it explore a game more systematically, rather than by merely making random moves, and allow it to learn and master much more complex environments, such as those found in Super Nintendo games.

The Google researchers are already talking about using the artificial intelligence software in the real world before long. As an example, they said that once it learns to drive a racing car in a game, controlling a real car is only a step or two beyond that, though they caution not to expect it for a while.
