VBNets: Learning Entity Representations via Variational Bayesian Networks
In this talk, we present Variational Bayesian Networks (VBNets), a novel scalable Bayesian hierarchical model that utilizes both implicit and explicit relations for learning entity representations. VBNets are designed for Microsoft Store and Xbox services that handle around a billion users worldwide. Unlike point-estimate solutions, which map entities to vectors and are usually overconfident, VBNets map entities to densities and hence model uncertainty. VBNets are based on analytical approximations of the intractable posterior over entities and of the posterior predictive distribution of the data. We demonstrate the effectiveness of VBNets on linguistic, recommendation, and medical informatics tasks, where they are shown to outperform alternative methods that facilitate Bayesian modeling with or without semantic priors. In addition, we show that VBNets produce superior representations for rare words and cold items.
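To make the density-versus-point-estimate distinction concrete, here is a minimal, hypothetical sketch (not the VBNets model itself, whose parameterization is not given in this abstract): each entity is a diagonal Gaussian with a mean vector and per-dimension variances, and entities are compared with a KL divergence rather than a plain dot product of means. The entity names and the high variance assigned to the cold item are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: each entity is a diagonal Gaussian
# (mean vector + per-dimension variance), not a single point vector.
DIM = 8
entities = {
    name: {"mu": rng.normal(size=DIM), "var": np.full(DIM, 1.0)}
    for name in ["user_a", "item_x", "cold_item"]
}

# A cold/rare entity keeps high variance (high uncertainty); a frequently
# observed entity would shrink its variance as evidence accumulates.
entities["cold_item"]["var"][:] = 5.0

def kl_diag_gaussians(p, q):
    """KL(p || q) between diagonal Gaussians: one common way to compare
    density embeddings, sensitive to uncertainty as well as location."""
    mu_p, var_p = p["mu"], p["var"]
    mu_q, var_q = q["mu"], q["var"]
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )

# KL is zero for identical densities and grows with dissimilarity,
# so uncertain (high-variance) entities are penalized less harshly.
print(kl_diag_gaussians(entities["user_a"], entities["item_x"]))
```

A point-estimate model would collapse each `var` to zero and lose the ability to say "I don't know yet" about `cold_item`; keeping the density around is what lets a Bayesian model handle rare words and cold items more gracefully.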
If time permits, we will give a brief overview of several recent deep learning works in the domains of deep neural attention mechanisms, multi-view representation learning, and inverse problems, with applications to natural language understanding, recommender systems, computer vision, sound synthesis, and biometrics.