
Google's DeepMind Gives AI a Memory Boost That Lets It Navigate London's Underground

Google’s DeepMind artificial intelligence lab does more than just develop computer programs capable of beating the world’s best human players in the ancient game of Go.

But neural networks face huge challenges when they try to rely solely on pattern recognition without having an external memory to store and retrieve information. To improve deep learning’s capabilities, Google DeepMind created a “differentiable neural computer” (DNC) that gives neural networks an external memory for storing information for later use.
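In rough terms, that pairing can be sketched in a few lines of numpy: a controller network emits a query key, and a differentiable content-based lookup over an external memory matrix returns a read vector. This is a minimal illustration under assumed sizes, not DeepMind's implementation; the names and dimensions here are hypothetical.

```python
import numpy as np

N, W = 16, 8                      # memory slots x word size (illustrative sizes)
memory = np.random.randn(N, W)    # the external memory matrix

def content_read(memory, key, beta=5.0):
    """Read from memory by cosine similarity to `key`, sharpened by `beta`."""
    sims = memory @ key / (np.linalg.norm(memory, axis=1)
                           * np.linalg.norm(key) + 1e-8)
    weights = np.exp(beta * sims)
    weights /= weights.sum()      # softmax over slots: a differentiable "address"
    return weights @ memory       # a blend of slots, so gradients can flow through

key = np.random.randn(W)          # in a real DNC the controller emits this key
read_vector = content_read(memory, key)
print(read_vector.shape)          # (8,)
```

Because the read is a smooth weighting rather than a hard lookup, the whole system stays differentiable and can be trained end to end with gradient descent.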

“We once relied on our physical address books and Rolodexes; now, of course, we rely on the read-write storage capabilities of regular computers,” says McClelland, a cognitive scientist who served as one of several independent peer reviewers for the Google DeepMind paper that describes the development of this improved deep learning system.

The DeepMind team found that the DNC system’s combination of a neural network and external memory did much better than a neural network alone in tackling the complex relationships between data points in so-called “graph tasks.” For example, they asked their system either to simply take any path between points A and B or to find the shortest travel routes based on a symbolic map of the London Underground subway.

An unaided neural network could not even finish the first level of training, based on traveling between two subway stations without trying to find the shortest route. It achieved an average accuracy of just 37 percent after going through almost two million training examples.
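For concreteness, here is a hedged sketch of the task itself rather than of the network: the Underground is presented to the model as a symbolic graph, and the ground-truth "shortest route" the DNC must learn to reproduce is what an ordinary breadth-first search returns. The station subset below is illustrative.

```python
from collections import deque

# A tiny symbolic map: station -> directly connected stations.
edges = {
    "Oxford Circus":        ["Bond Street", "Tottenham Court Road", "Green Park"],
    "Bond Street":          ["Oxford Circus", "Baker Street"],
    "Green Park":           ["Oxford Circus", "Victoria", "Westminster"],
    "Victoria":             ["Green Park"],
    "Westminster":          ["Green Park", "Waterloo"],
    "Waterloo":             ["Westminster"],
    "Baker Street":         ["Bond Street"],
    "Tottenham Court Road": ["Oxford Circus"],
}

def shortest_path(start, goal):
    """Breadth-first search: the supervised target the DNC must learn to produce."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in edges[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

print(shortest_path("Victoria", "Waterloo"))
# ['Victoria', 'Green Park', 'Westminster', 'Waterloo']
```

The point of the benchmark is that the network receives only the list of edges as input and must discover a procedure like this on its own, rather than having the search hard-coded.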

But Herbert Jaeger, professor for computational science at Jacobs University Bremen in Germany, sees the DeepMind team’s work as a “passing snapshot in a fast evolution sequence of novel neural learning architectures.” In fact, he’s confident that the DeepMind team already has something better than the DNC system described in the Nature paper.

(Keep in mind that the paper was submitted back in January 2016.) DeepMind’s work is also part of a bigger trend in deep learning, Jaeger says. The leading deep learning teams at Google and other companies are racing to build new AI architectures with many different functional modules, among them attentional control and working memory.

Differentiable neural computer

Some experts see promise that DNCs can be trained to perform complex, structured tasks[1][2] and address big-data applications that require some sort of rational reasoning, such as generating video commentaries or semantic text analysis.[3][4] A DNC can be trained to navigate a variety of rapid transit systems, and what it learns can then be applied, for example, to get around on the London Underground.

On graph traversal and sequence-processing tasks with supervised learning, DNCs performed better than alternatives such as long short-term memory or a Neural Turing Machine.[5] With a reinforcement learning approach to a block puzzle problem inspired by SHRDLU, the DNC was trained via curriculum learning and learned to make a plan.
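As a rough illustration of that training regime, curriculum learning just means presenting easy instances first and raising the difficulty only once the current level is mastered. Everything in this sketch (the toy model, the level-as-difficulty convention) is a hypothetical stand-in to make the control flow concrete, not DeepMind's code.

```python
class ToyModel:
    """A trivial stand-in whose accuracy improves with practice."""
    def __init__(self):
        self.skill = 0.0

    def train_on(self, batch, level):
        self.skill += 0.05                        # practice improves skill
        return min(1.0, self.skill / level)       # harder levels need more skill

def curriculum_train(model, max_level=5, threshold=0.9, batch_size=32):
    level = 1
    while level <= max_level:
        batch = [("puzzle", level)] * batch_size  # stand-in for real puzzle instances
        accuracy = model.train_on(batch, level)
        if accuracy >= threshold:                 # promote only after mastery
            level += 1
    return model

curriculum_train(ToyModel())
```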

DNC memory access can be made sparse and more scalable by using an approximate nearest neighbors algorithm, such as locality-sensitive hashing, or a random k-d tree like the Fast Library for Approximate Nearest Neighbors (FLANN) from UBC.[9] Adding Adaptive Computation Time (ACT) decouples computation time from data time, exploiting the fact that problem length and problem difficulty are not always the same.[10] Training using synthetic gradients performs considerably better than backpropagation through time (BPTT).[11]
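A hedged sketch of the sparse-memory idea: rather than attending over every slot, retrieve only the k nearest memory rows. The cited work uses structures like locality-sensitive hashing or FLANN; SciPy's cKDTree stands in here, and the sizes are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

N, W, k = 100_000, 64, 8
memory = np.random.randn(N, W).astype(np.float32)
tree = cKDTree(memory)                         # index built over memory rows

query = np.random.randn(W).astype(np.float32)
dists, idx = tree.query(query, k=k, eps=0.5)   # eps > 0 allows approximate matches

weights = np.exp(-dists)                       # closer rows get larger read weights
weights /= weights.sum()
read_vector = weights @ memory[idx]            # cost scales with k, not with N
```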

Google's AI Can Now Learn From Its Own Memory Independently

The artificial intelligence (AI) being developed by DeepMind, which is owned by Google's parent company, Alphabet, can now intelligently build on what's already inside its memory, the system's programmers have announced.

Their new hybrid system, called a Differentiable Neural Computer (DNC), pairs a neural network with the vast data storage of conventional computers, and the AI is smart enough to navigate and learn from this external data bank. What the DNC is doing is effectively combining external memory (like the external hard drive where all your photos get stored) with the neural network approach of AI, in which a massive number of interconnected nodes work dynamically to simulate a brain.

Take a family tree: after being told about certain relationships, the DNC was able to figure out other family connections on its own – writing, rewriting, and optimising its memory along the way to pull out the correct information at the right time.

Of course, any smartphone mapping app can tell you the quickest way from one tube station to another, but the difference is that the DNC isn't pulling this information out of a pre-programmed timetable – it's working out the information on its own, and juggling a lot of data in its memory all at once.

Google’s New AI Gets Smarter Thanks to a Working Memory

Now, the DeepMind team is back with an updated deep neural net dubbed the “differentiable neural computer (DNC).” Taking inspiration from plasticity mechanisms in the hippocampus, our brain’s memory storage system, the team has added a memory module to a deep learning neural network that allows it to quickly store and access learned bits of knowledge when needed.

With training, the algorithm can flexibly solve difficult reasoning problems that stump conventional neural networks — for example, navigating the London Underground subway system or reasoning about interpersonal relationships based on a family tree.

Given deep learning’s superior ability at extracting structure from big data, by adding a dynamic knowledge base to these already powerful algorithms, DNCs could eventually be capable of “intuiting the variable structure and scale of the world within a single, generic model.” The team recently published their work in Nature.

Each piece of knowledge — say, a number, a list, a tree or a map — is represented by a variable stored in memory that links to other variables, making each piece of related knowledge easily accessible.

“When we designed DNCs, we wanted machines that could learn to form and navigate complex data structures on their own,” DeepMind researchers wrote in a detailed blog post explaining their work.

At the heart of the DNC is a neural network dubbed the controller, which takes in training data, communicates with the memory module and outputs answers — somewhat like a conventional computer CPU, but without the need for prior programming.

When the team feeds the DNC sets of training data — for example, a map of the London Underground — the controller chooses if and where to store each bit of information in the memory module and links associated pieces in the order in which they were stored.
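That write-order bookkeeping is kept in what the Nature paper calls a temporal link matrix. Below is a simplified numpy sketch of the update rules as reported in the paper: each write updates a precedence vector (which slots were written most recently) and a link matrix recording write order, so reads can later follow those links forwards or backwards in time. The one-hot writes and sizes are illustrative, not the full soft-attention machinery.

```python
import numpy as np

N = 8                                  # number of memory slots (illustrative)
link = np.zeros((N, N))                # link[i, j] ~ "slot i was written after slot j"
precedence = np.zeros(N)               # how recently each slot was written

def record_write(write_w, link, precedence):
    """Update temporal-link bookkeeping after one write weighting `write_w`."""
    # Weaken old links touching the slots being written now...
    outer = np.add.outer(write_w, write_w)
    link = (1 - outer) * link
    # ...and link the newly written slots to the previously written ones.
    link += np.outer(write_w, precedence)
    np.fill_diagonal(link, 0.0)        # a slot is never its own successor
    precedence = (1 - write_w.sum()) * precedence + write_w
    return link, precedence

for t in range(3):                     # simulate three writes, one slot at a time
    w = np.zeros(N); w[t] = 1.0        # a hard one-hot write for clarity
    link, precedence = record_write(w, link, precedence)

# Following the links forward from slot 0 recovers the write order 0 -> 1 -> 2:
print(np.argmax(link[:, 0]), np.argmax(link[:, 1]))   # 1 2
```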

When given a family tree that only describes parent, child, and sibling relationships, the network could correctly answer questions such as “Who is A’s maternal great uncle?”, a kind of generalization far beyond the ability of conventional neural networks.
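To see why that query takes multiple reasoning steps, here is the task in symbolic form with a hypothetical family: the network only ever sees individual parent and sibling facts, yet must compose them, since a maternal great uncle is a brother of one's mother's parent.

```python
# Hypothetical family facts, not DeepMind's data.
mother  = {"A": "B"}                 # B is A's mother
parents = {"B": ["C", "D"]}          # C and D are B's parents
sibling = {"C": ["E"], "D": []}      # E is C's sibling
male    = {"E"}                      # E is male

def maternal_great_uncles(person):
    """Compose three stored relations into one inferred relation."""
    mom = mother[person]
    grandparents = parents[mom]
    return [s for gp in grandparents
              for s in sibling.get(gp, [])
              if s in male]

print(maternal_great_uncles("A"))    # ['E']
```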

However, Jaeger quickly pointed out that older programs were “handcrafted by humans and do not learn from examples.” The promise of DNCs and their future successors is big: a flexible and extensible neural network equipped with memory could allow us to harness deep learning to solve big-data problems with a rational reasoning component, for example, automatically generating video commentaries or semantic text analysis.

Differentiable neural computer family tree inference task

This animation shows a differentiable neural computer solving an inference problem on a family tree task (artistic interpretation).
