Supervising topic models with Gaussian processes

Authors:

Highlights:

• We propose the first model that can supervise Latent Dirichlet Allocation (LDA) by Gaussian Processes (GPs).

• LDA and GP are jointly trained by a novel variational inference algorithm that adopts ideas from Deep GPs.

• Unlike Supervised LDA (sLDA), our model learns non-linear mappings from topic activations to document classes.

• By virtue of this non-linearity, our model outperforms sLDA, as well as a disjointly trained cascade of LDA and GP, on three real-world data sets from two different domains.
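The disjointly trained LDA-then-GP cascade mentioned in the last highlight can be sketched briefly. The following is a minimal illustration using scikit-learn, not the authors' jointly trained model: the synthetic corpus, topic count, and all hyperparameters are assumptions made purely for demonstration.

```python
# Sketch of the *disjoint* baseline: LDA is fit without label information,
# then a GP classifier is trained separately on the resulting topic activations.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)

# Synthetic bag-of-words counts: 100 "documents" over a 50-word vocabulary,
# with two word-usage regimes defining the two document classes.
n_docs, n_vocab = 100, 50
y = rng.integers(0, 2, size=n_docs)            # document class labels
rates = np.where(y[:, None] == 0,
                 np.linspace(5, 1, n_vocab),   # class 0 favors early words
                 np.linspace(1, 5, n_vocab))   # class 1 favors late words
X_counts = rng.poisson(rates)

# Stage 1: unsupervised LDA yields per-document topic activations.
lda = LatentDirichletAllocation(n_components=5, random_state=0)
theta = lda.fit_transform(X_counts)            # shape (n_docs, 5)

# Stage 2: a GP classifier maps topic activations to classes, trained
# with no feedback into the LDA stage (hence "disjoint").
gpc = GaussianProcessClassifier(random_state=0).fit(theta, y)
print("cascade training accuracy:", gpc.score(theta, y))
```

Because the topics in stage 1 are learned without any knowledge of the labels, they may not be discriminative; the paper's contribution is to couple the two stages in a single variational objective instead.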

Keywords: Latent Dirichlet allocation, Nonparametric Bayesian inference, Gaussian processes, Variational inference, Supervised topic models

Article history: Received 24 November 2016, Revised 8 December 2017, Accepted 30 December 2017, Available online 30 December 2017, Version of Record 9 January 2018.

DOI: https://doi.org/10.1016/j.patcog.2017.12.019