word2vec, node2vec, graph2vec, X2vec: Towards a Theory of Vector Embeddings of Structured Data: NOTES

Below are the key points I noted from the 2020 paper by Martin Grohe:

  • 1-WL (color refinement) distinguishes almost all graphs, in a probabilistic sense (see the sketch after this list)
  • The classical WL algorithm is the 2-dimensional Weisfeiler-Leman algorithm (2-WL)
  • DeepWL is an "unlimited" version of WL, not restricted to a fixed dimension, that still runs in polynomial time
  • Knowledge graphs are essentially graphs with vertex/edge attributes
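
Since the first bullet hinges on 1-WL, here is a minimal Python sketch of one-dimensional Weisfeiler-Leman (color refinement), assuming graphs are given as adjacency dicts; `color_refine` and the example graphs are my own illustration, not code from the paper:

```python
from collections import Counter

def color_refine(adj):
    """1-WL / color refinement on an undirected graph.

    adj: dict mapping each vertex to a list of its neighbors.
    Returns a stable coloring as a dict vertex -> integer color.
    """
    colors = {v: 0 for v in adj}  # uniform initial coloring (no vertex attributes)
    for _ in range(len(adj)):     # stabilizes after at most |V| rounds
        # A vertex's new color is its old color together with the
        # multiset of its neighbors' colors.
        signatures = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                      for v in adj}
        # Compress signatures to small integer ids.
        ids = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        new_colors = {v: ids[signatures[v]] for v in adj}
        if len(set(new_colors.values())) == len(set(colors.values())):
            break  # partition did not refine: coloring is stable
        colors = new_colors
    return colors

# 1-WL distinguishes two graphs whenever their stable color histograms differ.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(Counter(color_refine(triangle).values()))  # Counter({0: 3})
print(Counter(color_refine(path).values()))      # Counter({0: 2, 1: 1})
```

The histograms above differ, so 1-WL separates the two graphs; the paper's "probabilistic" claim is that this simple test already succeeds for almost all graphs.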

ABSTRACT:

Vector representations of graphs and relational structures, whether handcrafted feature vectors or learned representations, enable us to apply standard data analysis and machine learning techniques to the structures. A wide range of methods for generating such embeddings have been studied in the machine learning and knowledge representation literature. However, vector embeddings have received relatively little attention from a theoretical point of view.

Starting with a survey of embedding techniques that have been used in practice, in this paper we propose two theoretical approaches that we see as central for understanding the foundations of vector embeddings. We draw connections between the various approaches and suggest directions for future research.
