Ph.D. candidate
Institute for Advanced Study (IAS)
Tsinghua University (THU)
Contact: daib09physics@gmail.com
I am currently a Ph.D. student at the Institute for Advanced Study (IAS), Tsinghua University (THU), enrolled in the joint Ph.D. program between THU and Microsoft Research Asia (MSRA). My supervisors are Baining Guo and David Wipf.
My research interests include generative models, sparse learning, and network compression. I am currently working on projects analyzing the properties of the variational autoencoder (VAE), a popular generative model, both theoretically and empirically.
I completed my B.S. in physics at Tsinghua University in 2013. From 2011 to 2014, I worked with Douglas Lin on the theory of planet formation and evolution. In the first year of my Ph.D., I switched my research field to computer science. I worked with Baoyuan Wang on action recognition and highlight detection, and was then supervised by Gang Hua on projects in unsupervised learning and clustering.
Bin Dai, Yu Wang, Gang Hua, John Aston, and David Wipf, “Connections with Robust PCA and the Role of Emergent Sparsity in Variational Autoencoder Models,” Journal of Machine Learning Research (JMLR), 2018 (to appear).
Bin Dai, Chen Zhu, Baining Guo, and David Wipf, “Compressing Neural Networks Using the Variational Information Bottleneck,” International Conference on Machine Learning (ICML), 2018. [poster] [slides] [code] [video]
Bin Dai, Yu Wang, Gang Hua, John Aston, and David Wipf, “Veiled Attributes of the Variational Autoencoder,” arXiv:1706.05148, 2017.
Yu Wang, Bin Dai, Gang Hua, John Aston, and David Wipf, “Green Generative Modeling: Recycling Dirty Data using Recurrent Variational Autoencoders,” Uncertainty in Artificial Intelligence (UAI), 2017.