Plan for 100,000 Examples When Training an AI

December 19, 2017

Just what is the magic number when it comes to the amount of data needed to train an AI? See VentureBeat’s article, “Google Brain Chief: Deep Learning Takes at Least 100,000 Examples,” for an answer. Reporter Blair Hanley Frank cites Jeff Dean, a Google senior fellow, who spoke at this year’s VB Summit. Dean’s rule of thumb: around 100,000 examples gives a deep learning system enough data to handle most types of problems. Frank writes:

Dean knows a thing or two about deep learning — he’s head of the Google Brain team, a group of researchers focused on a wide-ranging set of problems in computer science and artificial intelligence. He’s been working with neural networks since the 1990s, when he wrote his undergraduate thesis on artificial neural networks. In his view, machine learning techniques have an opportunity to impact virtually every industry, though the rate at which that happens will depend on the specific industry. There are still plenty of hurdles that humans need to tackle before they can take the data they have and turn it into machine intelligence. In order to be useful for machine learning, data needs to be processed, which can take time and require (at least at first) significant human intervention. ‘There’s a lot of work in machine learning systems that is not actually machine learning,’ Dean said.

Perhaps poetically, Google is using machine learning to explore how best to perform this non-machine-learning work. The article also points to encouraging projects such as Google DeepMind’s AlphaGo, which appears to have mastered the ancient game of Go simply by playing against itself.

Cynthia Murrell, December 19, 2017
