This project is about moment matching algorithms for topic models. Thanks to their computational speed and theoretical guarantees, these algorithms have recently emerged as strong competitors to approximate inference techniques for graphical models, such as variational inference and Gibbs sampling.
In brief, moment matching algorithms construct moment / cumulant tensors from the data and match them to the corresponding theoretical expressions in order to learn model parameters such as the topic matrix. Designing such an algorithm involves two steps: (a) derivation of the moment / cumulant tensors and (b) construction of a joint-diagonalization-type algorithm for matching the tensors. The two steps can be carried out independently and then combined. In our recent paper
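To illustrate the moment matching idea behind step (a), here is a minimal Python sketch (not from the paper; it uses a simplified single-topic model rather than LDA or DICA, and the toy topic matrix `D` and weights `w` are made up for illustration). It builds the empirical pairwise word co-occurrence moment from synthetic documents and checks it against its theoretical expression `D diag(w) D^T`:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "topic matrix": 2 topics over a 4-word vocabulary (columns sum to 1).
D = np.array([[0.5, 0.0],
              [0.5, 0.0],
              [0.0, 0.5],
              [0.0, 0.5]])
w = np.array([0.6, 0.4])   # topic proportions (mixing weights)
V, K = D.shape

# Theoretical pairwise moment of the single-topic model:
# M2 = E[x1 x2^T] = D diag(w) D^T, where x1, x2 are two words of a document.
M2_theory = D @ np.diag(w) @ D.T

# Empirical estimate: draw documents, each with one topic and L words, and
# use the unbiased pair-count estimator (c c^T - diag(c)) / (L (L - 1)).
L, n_docs = 20, 20000
M2_hat = np.zeros((V, V))
for _ in range(n_docs):
    k = rng.choice(K, p=w)           # the document's topic
    c = rng.multinomial(L, D[:, k])  # word counts in the document
    M2_hat += (np.outer(c, c) - np.diag(c)) / (L * (L - 1))
M2_hat /= n_docs

print(np.max(np.abs(M2_hat - M2_theory)))  # small estimation error
```

Matching `M2_hat` (and its third-order analogue) to the theoretical form is what lets one solve for the parameters `D` and `w`.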
we analyze both steps. For the tensors step, we focus on the latent Dirichlet allocation (LDA) and discrete independent component analysis (DICA) models. Importantly, we show that the DICA model is closely related, and sometimes equivalent, to LDA, and that its cumulants potentially have better sample complexity than the LDA moments. For the algorithmic step, we demonstrate that the joint diagonalization algorithm outperforms the tensor power method and the spectral algorithm, which were previously considered for this problem. Although we deal with tensors, we obtain fast algorithms by projecting these tensors onto vectors.
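To give a flavor of the projection idea, the following sketch (again on a simplified single-topic model with made-up `D` and `w`, not the LDA/DICA cumulants, and using a single whitened eigendecomposition in the style of the spectral algorithm rather than the joint diagonalization algorithm itself) projects the exact third-order moment tensor onto a vector, so that only matrix operations are needed to recover the topics:

```python
import numpy as np

# Toy topic matrix D (columns sum to 1) and topic weights w.
D = np.array([[0.6, 0.1],
              [0.3, 0.2],
              [0.1, 0.7]])
w = np.array([0.5, 0.5])
V, K = D.shape

# Exact population moments of the single-topic model.
M2 = D @ np.diag(w) @ D.T                 # second-order moment


def M3_proj(v):
    # Third-order moment projected onto v: sum_k w_k (d_k^T v) d_k d_k^T.
    return D @ np.diag(w * (D.T @ v)) @ D.T


# Whitening: W M2 W^T = I_K, from the top-K eigendecomposition of M2.
s, U = np.linalg.eigh(M2)
s, U = s[-K:], U[:, -K:]
W = np.diag(1.0 / np.sqrt(s)) @ U.T       # K x V

# After projection and whitening, W M3(v) W^T = B diag(D^T v) B^T with B
# orthogonal, so one eigendecomposition recovers the topics (up to
# permutation and sign). v must make the eigenvalues d_k^T v distinct.
v = np.array([1.0, 0.0, -1.0])
lam, B = np.linalg.eigh(W @ M3_proj(v) @ W.T)

D_rec = np.linalg.pinv(W) @ B                  # unwhiten the eigenvectors
D_rec /= D_rec.sum(axis=0, keepdims=True)      # renormalize columns

print(np.round(D_rec, 3))  # columns of D, up to permutation
```

Because the tensor only ever appears through its projections `M3_proj(v)`, it is never formed explicitly, which is what makes such algorithms fast.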
Our code is available. It contains implementations of the joint diagonalization algorithm (Matlab/C++), the tensor power method (Matlab), and the spectral algorithm (Matlab), for both the LDA moments and the DICA cumulants. If you are interested in reproducing our experiments and/or obtaining our datasets, check this repo. We used this Matlab/MEX/C implementation of LDA. If you are only interested in our Matlab/MEX/C++ implementation of the joint diagonalization algorithm, it is available here.