
# RAND-WALK: A Latent Variable Model Approach to Word Embeddings

Abstract: Semantic word embeddings represent the meaning of a word via a vector, and are created by diverse methods, including Latent Semantic Analysis (LSA), generative text models such as topic models, matrix factorization, neural nets, and energy-based models. Many methods apply nonlinear operations, such as Pointwise Mutual Information (PMI), to co-occurrence statistics, and rely on hand-tuned hyperparameters and reweightings.
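As an illustration of the kind of nonlinear operation the abstract refers to, the sketch below computes a PMI matrix from a toy co-occurrence table. The counts and vocabulary size are invented for the example, not taken from the paper:

```python
import numpy as np

# Hypothetical toy co-occurrence counts for a 3-word vocabulary;
# entry (i, j) counts how often word i appears near word j.
counts = np.array([
    [10.0, 4.0, 1.0],
    [ 4.0, 8.0, 2.0],
    [ 1.0, 2.0, 6.0],
])

total = counts.sum()
p_xy = counts / total                  # joint probabilities p(w, w')
p_x = p_xy.sum(axis=1, keepdims=True)  # marginal p(w), as a column
p_y = p_xy.sum(axis=0, keepdims=True)  # marginal p(w'), as a row

# PMI(w, w') = log [ p(w, w') / (p(w) p(w')) ]
pmi = np.log(p_xy / (p_x * p_y))
```

Embedding methods in the PMI family then look for low-dimensional vectors whose inner products approximate entries of this matrix.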
Often a *generative model* can help provide theoretical insight into such modeling choices, but there appears to be no such model to "explain" the above nonlinear models. For example, we know of no generative model for which the correct solution is the usual (dimension-restricted) PMI model.
This paper gives a new generative model, a dynamic version of the log-linear topic model of Mnih and Hinton (2007). The methodological novelty is to use the prior to compute *closed-form* expressions for word statistics. These provide an explanation for nonlinear models like PMI, **word2vec**, and GloVe, as well as some hyperparameter choices.
Experimental support is provided for the generative model assumptions, the most important of which is that latent word vectors are fairly uniformly dispersed ("isotropic") in space.
The model also helps explain why low-dimensional semantic embeddings contain linear algebraic structure that allows solution of word analogies, as shown by Mikolov et al. (2013) and many subsequent papers.
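The linear algebraic structure referred to here is the familiar vector-arithmetic recipe for analogies. A minimal sketch, using hypothetical toy vectors chosen so the analogy works out exactly (trained embeddings would only approximate this):

```python
import numpy as np

# Hypothetical toy embeddings; in practice these come from trained word vectors.
vecs = {
    "king":  np.array([0.9, 0.1, 0.3]),
    "queen": np.array([0.9, 0.9, 0.3]),
    "man":   np.array([0.2, 0.1, 0.1]),
    "woman": np.array([0.2, 0.9, 0.1]),
}

def analogy(a, b, c, vocab):
    """Solve a : b :: c : ? by finding the word whose vector has the
    highest cosine similarity to vocab[b] - vocab[a] + vocab[c]."""
    target = vocab[b] - vocab[a] + vocab[c]
    best, best_sim = None, -np.inf
    for word, v in vocab.items():
        if word in (a, b, c):  # exclude the query words themselves
            continue
        sim = v @ target / (np.linalg.norm(v) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best
```

With these toy vectors, `analogy("man", "woman", "king", vecs)` recovers `"queen"`, since `woman - man + king` equals the `queen` vector exactly by construction.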
Comments: Refinement of the theory with a tighter bound on the errors, and a fix for a bug in the proof of the earlier version.
Subjects: Machine Learning (cs.LG); Computation and Language (cs.CL); Machine Learning (stat.ML)
Cite as: arXiv:1502.03520 [cs.LG] (or arXiv:1502.03520v6 [cs.LG] for this version)

## Submission history

From: Tengyu Ma
[v1] Thu, 12 Feb 2015 02:50:08 UTC (1,281 KB)
[v2] Tue, 3 Mar 2015 00:49:42 UTC (3,735 KB)
[v3] Tue, 21 Apr 2015 21:14:43 UTC (22,970 KB)
[v4] Wed, 22 Jul 2015 23:51:24 UTC (810 KB)
[v5] Wed, 14 Oct 2015 04:27:00 UTC (1,519 KB)
[v6] Wed, 24 Feb 2016 01:28:03 UTC (1,655 KB)
[v7] Fri, 22 Jul 2016 20:09:25 UTC (5,699 KB)
[v8] Wed, 19 Jun 2019 21:54:20 UTC (5,699 KB)