Talk Announcement: Bayesian Inference with Max-margin Posterior Regularization

Associate Professor Jun Zhu of the Department of Computer Science and Technology, Tsinghua University, will give a talk at BUPT on the morning of November 23. Interested students and faculty are welcome to attend.
 
Title: Bayesian Inference with Max-margin Posterior Regularization
Speaker: Jun Zhu, Associate Professor, Intelligent Media Group, State Key Laboratory of Intelligent Technology and Systems, Tsinghua University
Time: 10:00-11:30, November 23, 2012
Venue: Conference Room 810, Teaching Building 3
 
Abstract: Existing Bayesian models, especially nonparametric Bayesian methods, rely heavily on specially conceived priors to incorporate domain knowledge for discovering improved latent representations. While priors can affect posterior distributions through Bayes’ theorem, imposing posterior regularization is arguably more direct and in some cases more natural and easier. In this talk, I will present regularized Bayesian inference (RegBayes), a computational framework for performing posterior inference with a convex regularization on the desired post-data posterior distributions. When the convex regularization is induced from a linear operator on the posterior distributions, RegBayes can be solved using convex analysis. Furthermore, I will present some concrete examples, including MedLDA for learning discriminative topic representations and infinite latent support vector machines for learning discriminative latent features for classification. All these models explore the large-margin idea in combination with a (nonparametric) Bayesian model for discovering predictive latent representations. I will discuss both variational and Monte Carlo methods for approximate inference.
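As a rough orientation for the abstract above, here is a minimal sketch of the variational formulation in which RegBayes-style posterior regularization is usually stated; the symbols (model M, data D, prior \pi, regularizer U, slack \xi, feasible set \mathcal{P}_{\mathrm{post}}) are introduced here for illustration only and do not appear in the announcement itself:

\inf_{q(M),\,\xi \ge 0} \; \mathrm{KL}\!\left( q(M) \,\|\, \pi(M) \right) \;-\; \mathbb{E}_{q(M)}\!\left[ \log p(\mathcal{D} \mid M) \right] \;+\; U(\xi)
\qquad \text{s.t.} \quad q(M) \in \mathcal{P}_{\mathrm{post}}(\xi).

With U \equiv 0 and no constraint, the minimizer is the ordinary Bayesian posterior p(M \mid \mathcal{D}), so Bayes’ theorem is recovered as the unregularized special case; when \mathcal{P}_{\mathrm{post}}(\xi) is defined by linear (e.g. max-margin expectation) constraints on q, the problem remains convex in q, which is the convex-analysis setting the abstract refers to.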
 
Speaker Bio: Dr. Jun Zhu is an associate professor in the Department of Computer Science and Technology at Tsinghua University. His principal research interests lie in the development of statistical machine learning methods for solving scientific and engineering problems arising from artificial and biological learning, reasoning, and decision-making in high-dimensional and dynamic worlds. Prof. Zhu received his Ph.D. in Computer Science from Tsinghua University under the supervision of Prof. Bo Zhang, and did post-doctoral research with Prof. Eric P. Xing in the Machine Learning Department at Carnegie Mellon University. His current work involves both the foundations of statistical learning, including theory and algorithms for probabilistic latent variable models, sparse learning in high dimensions, Bayesian nonparametrics, and large-margin learning; and the application of statistical learning to social network analysis, data mining, and multimedia data analysis.
