My name is Thanh Nguyen, and I am a fourth-year Ph.D. student at Iowa State University. Since my second year, starting in Fall 2016, I have been working with Dr. Chinmay Hegde (ECE) and Dr. Raymond Wong (Statistics, Texas A&M). I spent my first year in the software mining group under the supervision of Dr. Tien Nguyen.
I received a B.Sc. in 2012 from the Honors Program at Hanoi University of Science and Technology in Vietnam. I then worked as a research engineer at Viettel R&D for two years before joining ISU in Summer 2015.
I currently focus on theoretical understanding of, and provable algorithms for, unsupervised feature learning based on sparse coding and autoencoders. More broadly, I am interested in statistical learning, learning theory, and optimization.
- Dec 24, 2018 » Our paper on the dynamics of gradient descent for autoencoders got accepted to AISTATS-2019
- June 15, 2018 » Our paper on learning shallow autoencoders got accepted to the ICML Workshop on Theoretical Foundations and Applications of Deep Generative Models
- June 12, 2018 » Good news! Our paper on learning mixed API and word embeddings from API documentation got accepted to ESEC/FSE-2018
- June 04, 2018 » A software engineering paper on Java template synthesis from English text got accepted to IEEE ICSME-2018
- May 21, 2018 » Started the summer at Yahoo! Research, NYC
- May 11, 2018 » A paper on dictionary learning from incomplete samples got accepted to ICML-2018
- Feb 05, 2018 » Presented our paper at AAAI-2018
- December 01, 2017 » PhD qualifier oral defense
- November 09, 2017 » Our paper on provable double-sparse coding got accepted to AAAI-2018
- July 11, 2017 » My second talk in the reading group, on Maximal Sparsity with Deep Nets
- July 10-14, 2017 » Attended the Midwest Big Data Summer School at ISU
- July 6, 2017 » My first talk, on trainable sparse coding (LISTA and FacNet), in our summer reading group. Handwritten notes are available.
- June 19-20, 2017 » Presented a poster on a neurally plausible algorithm for dictionary learning at the Midwest Machine Learning Symposium
- June-August 2017 » Our Data Science Lab has been organizing a summer reading group on the theoretical foundations of deep learning. Check it out here.