OpenAI’s Jeff Clune on deep learning’s Achilles’ heel and a faster path to AGI

Posted on: February 25, 2020

People learn differently from neural networks. If a human comes back to a sport after years away, they might be rusty, but they will still remember much of what they learned decades ago. A typical neural network trained on a sequence of tasks, by contrast, retains only the most recent task and overwrites what it learned before. Virtually all neural networks today suffer from this problem, called “catastrophic forgetting.”
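The effect is easy to demonstrate. The following minimal sketch (an illustration only, assuming PyTorch and two synthetic, unrelated classification tasks that have nothing to do with the paper) trains a small network on task A, then on task B; its accuracy on task A typically collapses toward chance once training on B finishes.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(seed, dim=20, n=2000):
    """Synthetic binary task: which side of a random hyperplane does a point lie on?"""
    g = torch.Generator().manual_seed(seed)
    w = torch.randn(dim, generator=g)          # a different random hyperplane per task
    x = torch.randn(n, dim, generator=g)
    y = (x @ w > 0).long()
    return x, y

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

def train(x, y, steps=300):
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def accuracy(x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

task_a, task_b = make_task(seed=1), make_task(seed=2)

train(*task_a)
print("Task A accuracy after training on A:", accuracy(*task_a))

train(*task_b)  # sequential training on a new task repurposes the same weights...
print("Task A accuracy after training on B:", accuracy(*task_a))  # ...so accuracy on A typically drops sharply
```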

It’s the Achilles’ heel of machine learning, OpenAI research scientist Jeff Clune told VentureBeat, because it prevents “continual learning,” the ability to keep learning new tasks without forgetting previous ones. But some systems can be taught to remember.

Before joining OpenAI last month to lead its multi-agent team, Clune worked with researchers from Uber AI Labs and the University of Vermont. This week, they shared ANML (A Neuromodulated Meta-Learning algorithm), which can learn 600 sequential tasks with minimal catastrophic forgetting.

“This is relatively unheard of in machine learning. To my knowledge, it’s the longest sequence of tasks an AI system has been able to learn, and at the end of it, it’s still pretty good at all the tasks that it saw,” Clune said. “I think that these sorts of advances will be used in almost every situation where we use AI. It will just make AI better.”
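The mechanism behind ANML is neuromodulation: a second network looks at the same input and emits a gate that selectively switches parts of the prediction network’s representation on or off. Because the gate multiplies activations, it also scales the gradients that reach the gated features, so it shapes which weights change as well as which fire. The sketch below shows only that gating forward pass; the MLP encoders and layer sizes are illustrative assumptions, not the paper’s actual convolutional architecture or its meta-training procedure.

```python
import torch
import torch.nn as nn

class GatedPredictor(nn.Module):
    """Illustrative forward pass of a neuromodulated predictor (sizes are assumptions)."""
    def __init__(self, in_dim=784, feat_dim=256, n_classes=10):
        super().__init__()
        # Prediction network: extracts features, then classifies.
        self.features = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, n_classes)
        # Neuromodulatory network: sees the same input, emits a per-feature gate in (0, 1).
        self.neuromod = nn.Sequential(
            nn.Linear(in_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, feat_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        gate = self.neuromod(x)       # selective activation: one gate value per feature
        h = self.features(x) * gate   # gated representation for this particular input
        return self.classifier(h)     # the gate also scales gradients, giving selective plasticity

model = GatedPredictor()
logits = model(torch.randn(4, 784))   # a batch of 4 flattened 28x28 inputs
print(logits.shape)                   # torch.Size([4, 10])
```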

Clune helped cofound Uber AI Labs in 2017, following the acquisition of Geometric Intelligence, and is one of seven coauthors of a paper called “Learning to Continually Learn” published Monday on arXiv.

Teaching AI systems to learn and remember thousands of tasks is what the paper’s coauthors call a long-standing grand challenge of AI. That capability would allow AI systems to take on and remember a wide range of tasks, and Clune believes approaches like ANML are key to a faster path toward the field’s greatest challenge: artificial general intelligence (AGI).

In another paper written before he joined OpenAI, a startup with billions in funding aimed at creating the world’s first AGI, Clune argued that a faster path to AGI can be achieved by improving meta-learning architectures, the meta-learning algorithms themselves, and the automatic generation of training environments.