Google DeepMind Researchers Mimic Human Learning to Unlock New AI Abilities

A major hurdle in artificial intelligence up to this point has been the inability to retain previously learned skills – much like an Alzheimer’s patient forgetting how to use a microwave even though they used it last week. Artificial intelligence experts call this problem “catastrophic forgetting”: when a neural network is trained on a new task, the training tends to overwrite the very weights that encoded the old ones. Previous attempts to remedy this forgetfulness have included keeping the data for every past task available to the agent throughout learning, or training each task in a separate network and integrating the skills afterwards. Both methods are impractical, however, because they require either multiple disparate networks or a network with total access to an unmanageable amount of data. Continual learning with deep neural networks has therefore remained only a dream – until now.

Google’s DeepMind researchers have been studying mouse brain activity for insight into how mammals avoid this “catastrophic forgetting” conundrum. Luckily for us, mammalian brains strengthen and build up synapses after a task is learned so that, even a year from now, we can deploy previously learned skills with total ease. Neuroscientists have found that mice retain and call upon past skills through synaptic consolidation: the brain reduces the plasticity of – and thereby locks in – the neuronal pathways that have proved themselves continually useful.

This is the key insight behind the researchers’ novel algorithm, elastic weight consolidation (EWC). Just as you would guide a child to retain the ability to tie their shoes with careful supervision and repeated reinforcement, the DeepMind researchers ‘coach’ their agent to slow down learning on the connections that mattered for previously seen tasks and are therefore likely to be important (see the sketch below for how this penalty works). This gives the agent the ability to build and retain skills sequentially (i.e. learn as you go), without needing total access to all previous data or multiple networks – the roadblocks that previously stalled continual learning. In addition, an agent with EWC can consolidate its processes, exploiting the commonalities between older and developing skills instead of learning them redundantly.

Overall, these findings offer a mirror for understanding how our own brains function. The interplay between plasticity and rigidity (memory retention) is what allows us to learn new tasks while still accessing previously learned skills. Interestingly, this DeepMind continual learning model can currently only become more rigid over time: it cannot forget the skills it has learned and will eventually reach maximum capacity for new ones. Mammalian brains, on the other hand, are constantly losing and gaining skills as needed to stay agile and efficient. Only time will tell how closely AI will come to mimicking human processes; on its current trajectory, though, it may be only a matter of time before artificial intelligence rises to comparable – and potentially uncomfortably superior – continual learning abilities.
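For the technically curious: EWC implements this ‘slowing down’ as a quadratic penalty added to the new task’s loss. In the paper’s notation, after learning task A the loss for task B becomes L(θ) = L_B(θ) + Σᵢ (λ/2) Fᵢ (θᵢ − θ*_A,i)², where Fᵢ is the diagonal of the Fisher information matrix (an estimate of how important weight i was to task A), θ*_A,i is that weight’s value after task A, and λ sets how strongly old skills are protected. Below is a minimal PyTorch sketch of that penalty; the function and variable names (fisher_diagonal, ewc_penalty, fisher_A, params_A) are illustrative, not taken from the paper’s code.

```python
import torch

def fisher_diagonal(model, data_loader, loss_fn):
    """Estimate the diagonal Fisher information on task-A data:
    the average squared gradient of the loss w.r.t. each weight.
    Large values flag weights that task A depends on heavily."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for inputs, targets in data_loader:
        model.zero_grad()
        loss_fn(model(inputs), targets).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(data_loader), 1) for n, f in fisher.items()}

def ewc_penalty(model, fisher, star_params, lam=1000.0):
    """Quadratic EWC penalty: (lambda / 2) * sum_i F_i * (theta_i - theta*_i)^2.
    `star_params` holds each weight's value after task A, so weights
    the Fisher estimate marked as important are pulled back toward
    their old values while unimportant ones stay free to change."""
    penalty = 0.0
    for n, p in model.named_parameters():
        penalty = penalty + (fisher[n] * (p - star_params[n]) ** 2).sum()
    return (lam / 2.0) * penalty

# While training on task B, the total loss anchors the weights that
# mattered for task A:
#   loss = loss_fn(model(x), y) + ewc_penalty(model, fisher_A, params_A)
```

The design choice worth noting is that the anchor is per-weight: λ controls the overall trade-off between remembering task A and learning task B, while Fᵢ decides which individual weights are worth protecting.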

Source: Kirkpatrick, J., Pascanu, R., Rabinowitz, N., Veness, J., Desjardins, G., Rusu, A. A., . . . Hadsell, R. (2017). Overcoming catastrophic forgetting in neural networks. Proceedings of the National Academy of Sciences, 114(13), 3521–3526. doi:10.1073/pnas.1611835114


