Hye Won Chung, Associate Professor in the School of Electrical Engineering at KAIST (joint affiliation: School of Computing and Graduate School of AI)
Distinguished Lecturer of the IEEE Information Theory Society (ITSOC) for 2025-2026
Curriculum Vitae: [CV]
Email: hwchung@kaist.ac.kr
Office: N1 Building Room 206, KAIST
Phone: +82-42-350-7441
Short bio
I am an Associate Professor in the School of Electrical Engineering at KAIST, jointly affiliated with the School of Computing. I completed my Ph.D. in Electrical Engineering and Computer Science (EECS) at MIT in 2014. From 2014 to 2017, I worked at the University of Michigan as a research fellow. I received my M.S. from MIT and my B.S. (summa cum laude) from KAIST in Korea, all in EECS.
My research interests include data science, information theory, statistical inference, machine learning, and quantum information. I aim to provide a theoretical framework for data science using tools from information theory, statistical inference, and machine learning. I also aim to develop efficient algorithmic tools for extracting and exploiting information in statistical inference problems.
Selected recent papers
Algorithms and Theory for Data Science and Machine Learning
Rethinking Self-Distillation: Label Averaging and Enhanced Soft Label Refinement with Partial Labels, ICLR 2025
Exact Matching in Correlated Networks with Node Attributes for Improved Community Recovery, IEEE Trans. Information Theory 2025
Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization, NeurIPS 2023
Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing, ICML 2023
A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits, IEEE Trans. Information Theory 2024
Binary Classification with XOR Queries: Fundamental Limits and An Efficient Algorithm, IEEE Trans. Information Theory 2021
Detection of Signal in the Spiked Rectangular Models, ICML 2021
Efficient Deep Learning, Robust and Trustworthy AI
CovMatch: Cross-Covariance Guided Multimodal Dataset Distillation with Trainable Text Encoder, NeurIPS 2025
VIPAMIN: Visual Prompt Initialization via Embedding Selection and Subspace Expansion, NeurIPS 2025
BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges, ICML 2024
Data Valuation without Training of a Model, ICLR 2023
Test-Time Adaptation via Self-Training with Nearest Neighbor Information, ICLR 2023
Self-Diagnosing GAN: Diagnosing Underrepresented Samples in Generative Adversarial Networks, NeurIPS 2021
News
(Nov. 2025) I will be serving as an Area Chair for ICML 2026.
(Sep. 2025) Three papers have been accepted to NeurIPS 2025. Congratulations to my co-authors!
CovMatch: Cross-Covariance Guided Multimodal Dataset Distillation with Trainable Text Encoder
VIPAMIN: Visual Prompt Initialization via Embedding Selection and Subspace Expansion
SNAP: Low-Latency Test-Time Adaptation with Sparse Updates
(July 2025) One paper has been accepted to IEEE Transactions on Information Theory.
Exact Matching in Correlated Networks with Node Attributes for Improved Community Recovery [arXiv]
(Feb. 2025) Our lab's research on data-efficient learning has been selected for the NRF Mid-Career Research Funding (Leap Research Program), a newly established initiative that supports researchers with outstanding results in continuing their work seamlessly. This achievement was highlighted as an exemplary case in the Republic of Korea Policy Briefing by the Ministry of Science and ICT [link].
(Jan. 2025) One paper has been accepted to TMLR.
Label Distribution Shift-Aware Prediction Refinement for Test Time Adaptation [arXiv]
(Jan. 2025) One paper has been accepted to ICLR 2025.
Rethinking Self-Distillation: Label Averaging and Enhanced Soft Label Refinement with Partial Labels
(Jan. 2025) I have been selected as a Distinguished Lecturer of the IEEE Information Theory Society (ITSOC) for 2025-2026.
(Jan. 2025) I will serve as a TPC member for ISIT 2025.