Inference and Information for Data Science Lab
Hye Won Chung, Associate Professor in the School of Electrical Engineering, jointly affiliated with the School of Computing, at KAIST
Curriculum Vitae: [CV]
Email: hwchung@kaist.ac.kr
Office: N1 Building Room 206, KAIST
Phone: +82-42-350-7441
Short bio
I am an Associate Professor in the School of Electrical Engineering at KAIST, jointly affiliated with the School of Computing. I completed my Ph.D. in Electrical Engineering and Computer Science (EECS) at MIT in 2014. From 2014 to 2017, I worked at the University of Michigan as a research fellow. I received my M.S. from MIT and my B.S. (summa cum laude) from KAIST in Korea, both in EECS.
My research interests include data science, information theory, statistical inference, machine learning, and quantum information. I aim to provide a theoretical framework for data science using tools from information theory, statistical inference, and machine learning. I also aim to develop efficient algorithmic tools for extracting and exploiting information in statistical inference problems.
Recent papers
Algorithms and Theory for Data Science and Machine Learning
Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization, NeurIPS 2023
Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing, ICML 2023
A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits, IEEE Trans. Information Theory 2024
Binary Classification with XOR Queries: Fundamental Limits and An Efficient Algorithm, IEEE Trans. Information Theory 2021
Detection of Signal in the Spiked Rectangular Models, ICML 2021
Robust Hypergraph Clustering via Convex Relaxation of Truncated MLE, IEEE JSAIT 2020
Efficient Deep Learning, Robust and Trustworthy AI
BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges, ICML 2024
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching, ICML 2024
Data Valuation without Training of a Model, ICLR 2023
Test-Time Adaptation via Self-Training with Nearest Neighbor Information, ICLR 2023
Self-Diagnosing GAN: Diagnosing Underrepresented Samples in Generative Adversarial Networks, NeurIPS 2021
News
(Aug. 2024) PI for Basic Research Lab (BRL) on the topic of Theoretical Framework for Foundation Models
Jointly with Prof. Jaekyun Moon (KAIST), Prof. Jy-yong Sohn (Yonsei), and Prof. Dohyun Kwon (Univ. of Seoul), our group will develop a theoretical framework and algorithms for foundation models. Many thanks to the National Research Foundation (NRF) of Korea.
(Aug. 2024) A new paper is accepted to TMLR
Representation Norm Amplification for Out-of-Distribution Detection in Long-Tail Learning [paper]
(July 2024) Jointly with Lele Wang (UBC), I delivered a tutorial at ISIT 2024 on the topic of "Graph Matching: Fundamental Limits and Efficient Algorithms." Here are the slides (Part I, Part II).
(Apr. 2024) Two papers are accepted to ICML 2024
BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges [paper] [video]
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching [paper] [video]
(Apr. 2024) I will give a tutorial at IEEE ISIT 2024 on the topic of "Graph Matching: Fundamental Limits and Efficient Algorithms"
(Apr. 2024) A new paper is accepted to IEEE ISIT 2024
Exact Graph Matching in Correlated Gaussian-Attributed Erdős-Rényi Model
(Mar. 2024) A new paper is accepted to IEEE TIT
Detection Problems in the Spiked Random Matrix Models [arXiv]
(Mar. 2024) A new paper is published in IEEE TIT
A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits [arXiv]
(Feb. 2024) Delivered an invited talk at the UCSD ITA workshop on understanding self-distillation in multi-class classification
(Jan. 2024) I will be serving on the organizing committee of ISIT 2024