Joo-Kyung Kim
Title: Ph.D.
Amazon Alexa
Many neural network models require a large number of labeled training examples to be trained sufficiently. This talk focuses on transfer learning methods, which improve performance on target tasks in such low-resource situations by leveraging resources or models from other tasks. Specifically, we introduce transfer learning methods that enrich the word or sentence vector representations of neural network models by transferring linguistic knowledge. For word-level knowledge transfer, we show that enriching word embeddings with semantic lexicons such as thesauri and semantic intensity orders can improve the performance of both word-level and sentence-level NLP tasks. For sentence-level knowledge transfer, we introduce cross-domain and cross-lingual transfer learning methods that utilize common/private representations, adversarial training, and other auxiliary objectives to improve the performance of sequence tagging tasks without domain- or language-specific resources.
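To give a flavor of word-level knowledge transfer, the sketch below shows one well-known way to enrich word embeddings with a semantic lexicon: iteratively pulling each word vector toward the vectors of its lexicon neighbors while anchoring it to its original position, in the spirit of retrofitting (Faruqui et al., 2015). This is only an illustrative sketch, not the speaker's exact method; the function name, `alpha`, and the update rule are assumptions for the example.

```python
import numpy as np

def retrofit(embeddings, lexicon, iters=10, alpha=1.0):
    """Pull word vectors toward their lexicon neighbors.

    embeddings: dict word -> np.ndarray (original vectors, kept as anchors)
    lexicon:    dict word -> list of related words (e.g., from a thesaurus)
    """
    new_vecs = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iters):
        for word, neighbors in lexicon.items():
            nbrs = [n for n in neighbors if n in new_vecs]
            if word not in new_vecs or not nbrs:
                continue
            # Convex combination of the original vector (weighted by alpha)
            # and the current vectors of the lexicon neighbors.
            total = alpha * embeddings[word] + sum(new_vecs[n] for n in nbrs)
            new_vecs[word] = total / (alpha + len(nbrs))
    return new_vecs

# Toy usage: "good" and "great" are pulled together by the lexicon.
toy = {"good": np.array([1.0, 0.0]),
       "great": np.array([0.0, 1.0]),
       "bad": np.array([-1.0, 0.0])}
lex = {"good": ["great"], "great": ["good"]}
enriched = retrofit(toy, lex)
```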
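For the sentence-level side, a common way to combine common/private representations with adversarial training is a shared encoder whose features are made domain-invariant by a domain classifier behind a gradient reversal layer, alongside a private, domain-specific encoder. The PyTorch sketch below illustrates that general pattern under assumed names and hyperparameters (`SharedPrivateTagger`, `GradReverse`, LSTM sizes); it is not the specific architecture from the talk.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates (and scales) gradients backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

class SharedPrivateTagger(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden, n_tags, n_domains):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.shared = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.private = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.tagger = nn.Linear(4 * hidden, n_tags)
        self.domain_clf = nn.Linear(2 * hidden, n_domains)

    def forward(self, tokens, lam=1.0):
        x = self.emb(tokens)
        h_shared, _ = self.shared(x)   # common (domain-invariant) features
        h_private, _ = self.private(x) # domain-specific features
        # Tag classifier sees both representations.
        tag_logits = self.tagger(torch.cat([h_shared, h_private], dim=-1))
        # Gradient reversal: the shared encoder is trained to fool the
        # domain classifier, pushing its features to be domain-invariant.
        dom_logits = self.domain_clf(GradReverse.apply(h_shared.mean(dim=1), lam))
        return tag_logits, dom_logits
```

In training, a tagging loss on the tag logits would be combined with a domain-classification loss on the domain logits; the reversal layer turns the latter into an adversarial objective for the shared encoder.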
Joo-Kyung Kim is an applied scientist at Amazon Alexa, working on deep learning for large-scale dialog systems. He received his Ph.D. from The Ohio State University, advised by Eric Fosler-Lussier. During his Ph.D., he interned at Microsoft, NEC Laboratories, and Nuance, working on deep learning for natural language understanding. He obtained an M.S. from Seoul National University, advised by Byoung-Tak Zhang, and a B.S. from Sogang University.