Yung-Kyun Noh
Title: Ph.D.
School of Mechanical and Aerospace Engineering at Seoul National University
The theoretical study of nearest neighbors goes back to the work of T. Cover and P. Hart in the 1960s, which analyzed the asymptotic behavior of nearest neighbor classification as the amount of data grows. Their best-known contribution is the asymptotic upper bound on the nearest neighbor error, which is twice the Bayes error, along with the idea of connecting nearest neighbor information to the underlying probability density functions. More recently, studies on nearest neighbors have produced techniques useful for many contemporary machine learning algorithms, showing from a theoretical perspective how nearest neighbors can be used more effectively. In this talk, I will present some of our work that exploits these recent theoretical findings. First, metric learning methods will be introduced that minimize the finite-sampling effect, which biases results away from the asymptotic behavior. Applications include nearest neighbor classification and the estimation of various information-theoretic measures. Second, metric learning in the finite-sample setting is extended to other nonparametric methods such as Nadaraya-Watson (NW) regression. In particular, a proof will be given that, for any Gaussian data, there exists an NW regression achieving the theoretically minimum mean square error. All of the analysis in this talk assumes large amounts of data, and I will briefly give my view on deep learning with large-scale data in comparison with the proposed methods.
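As background for the nonparametric setting the abstract refers to, the following is a minimal sketch of Nadaraya-Watson regression with a Gaussian kernel. The bandwidth `h` and the toy sine data are illustrative assumptions, not the speaker's experimental setup.

```python
import numpy as np

def nw_regression(X_train, y_train, X_query, h=0.5):
    """Nadaraya-Watson estimator with a Gaussian kernel of bandwidth h.

    The prediction at each query point is a locally weighted average of
    the training targets, with weights decaying in squared distance.
    """
    # Pairwise squared distances: shape (n_query, n_train)
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2 * h ** 2))          # Gaussian kernel weights
    return (W @ y_train) / W.sum(axis=1)    # normalized weighted average

# Toy 1-D example (hypothetical data): a noisy sine curve
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

Xq = np.linspace(0, 2 * np.pi, 50)[:, None]
yq = nw_regression(X, y, Xq, h=0.3)
print(yq.shape)  # (50,)
```

In this form the kernel and bandwidth are fixed by hand; the metric learning discussed in the talk can be thought of as replacing the isotropic squared distance with a learned one.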
Yung-Kyun Noh is currently a BK Assistant Professor in the School of Mechanical and Aerospace Engineering at Seoul National University in Korea. His research interests are metric learning and dimensionality reduction in machine learning, and he is especially interested in applying the statistical theory of nearest neighbors to real, large-scale datasets. He received his B.S. in Physics from POSTECH, Korea, and his Ph.D. in Computer Science from Seoul National University. He worked in the GRASP Robotics Laboratory at the University of Pennsylvania, USA, conducting machine learning research on nonparametric methods.