
An Illustrated Explanation of Using SkipGram To Encode The Structure of A Graph (DeepWalk)

In this article, I will walk through an example of applying DeepWalk to create embeddings of a graph.

Suppose we have the following graph. It is clear that this graph has two communities in it (call them community 1 and community 2). Our goal is to come up with good vector representations for each node.
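Since the original figure is not reproduced here, the sketch below reconstructs a plausible version of the example graph as an adjacency list: two fully connected six-node communities joined through a bridge node 7. The exact edge set is an assumption for illustration.

```python
# A plausible reconstruction of the example graph (the original figure is
# not shown): two fully connected communities of six nodes each, joined
# through bridge node 7. The exact edges are an assumption.
edges = (
    [(i, j) for i in range(1, 7) for j in range(i + 1, 7)]      # community 1: nodes 1-6
    + [(i, j) for i in range(8, 14) for j in range(i + 1, 14)]  # community 2: nodes 8-13
    + [(6, 7), (7, 8)]                                          # bridge via node 7
)

# Build an undirected adjacency list.
adj = {n: [] for n in range(1, 14)}
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

print(sorted(adj[7]))  # node 7 is the only link between the two communities
```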

In this example, the vector representations are good if the vectors for nodes 1, 2, 3, 4, 5, 6 and nodes 8, 9, 10, 11, 12, 13 form distinct clusters.

For example, here’s the result of performing one random walk of length 20 from each node in our example graph, presented as a 13 x 20 matrix: the i-th row is the random walk that started at node i.

For example, the 8-th row reads: the random walk began at node 8, then walked to node 9, then to node 10, back to node 9, back to node 10 again, then to node 11, and so on until it ended at node 9.
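The walk-collection step described above can be sketched as follows. The adjacency list is a reconstruction of the example graph (two fully connected six-node communities bridged by node 7, an assumption since the original figure is not shown); the walk procedure itself is exactly as described: one uniform random walk of length 20 per node.

```python
import random

random.seed(0)

# Reconstructed example graph: two fully connected six-node communities
# bridged by node 7 (an assumption; the original figure is not shown).
adj = {n: [] for n in range(1, 14)}
for u, v in ([(i, j) for i in range(1, 7) for j in range(i + 1, 7)]
             + [(i, j) for i in range(8, 14) for j in range(i + 1, 14)]
             + [(6, 7), (7, 8)]):
    adj[u].append(v)
    adj[v].append(u)

def random_walk(adj, start, length):
    """Uniform random walk of `length` nodes, starting at `start`."""
    walk = [start]
    while len(walk) < length:
        walk.append(random.choice(adj[walk[-1]]))  # step to a uniform random neighbor
    return walk

# One walk of length 20 per node -> a 13 x 20 matrix of node ids,
# where row i is the walk started at node i.
walks = [random_walk(adj, node, 20) for node in sorted(adj)]
```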

To understand why the resulting embeddings will encode the graph’s community structure, suppose for simplicity’s sake that we create the embeddings using a window of size 1, i.e. each node in a walk is trained to predict only the node immediately before it and the node immediately after it. Because a random walk rarely crosses between the two communities, nodes from the same community co-occur in these windows far more often than nodes from different communities, so Skip-gram pulls their embeddings together.
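A minimal sketch of how (center, context) training pairs are extracted from a walk with a window of size 1. The helper name `skipgram_pairs` and the short example walk (taken from the walk described above: 8 → 9 → 10 → 9 → 10 → 11) are illustrative.

```python
def skipgram_pairs(walk, window=1):
    """Extract (center, context) Skip-gram training pairs from one walk."""
    pairs = []
    for i, center in enumerate(walk):
        # With window=1, only the immediate predecessor and successor qualify.
        for j in range(max(0, i - window), min(len(walk), i + window + 1)):
            if j != i:
                pairs.append((center, walk[j]))
    return pairs

# Prefix of the walk described above (row 8): 8 -> 9 -> 10 -> 9 -> 10 -> 11.
# Every resulting pair stays inside community 2, which is why Skip-gram
# learns similar vectors for nodes of the same community.
walk = [8, 9, 10, 9, 10, 11]
print(skipgram_pairs(walk))
```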

The following result is generated by running DeepWalk with the hyperparameters shown. The diagram above plots the embedding of each node after using PCA (with default arguments) to reduce the embedding dimension from 128 to 2.

Notice that the embeddings clearly capture the community structure of the graph since nodes 1, 2, 3, 4, 5, 6 together form a distinct cluster from nodes 8, 9, 10, 11, 12, 13.
