Inference and Information for Data Science Lab
Hye Won Chung, Associate Professor in the School of Electrical Engineering and the School of Computing at KAIST
Curriculum Vitae: [CV]
Email: hwchung@kaist.ac.kr
Office: N1 Building Room 206, KAIST
Phone: +82-42-350-7441
Short bio
I am an Associate Professor in the School of Electrical Engineering at KAIST. I completed my Ph.D. in Electrical Engineering and Computer Science (EECS) at MIT in 2014. From 2014 to 2017, I worked at the University of Michigan as a research fellow. I received my M.S. from MIT, and my B.S. (summa cum laude) from KAIST in Korea, all in EECS.
My research interests include data science, information theory, statistical inference, machine learning, and quantum information. I aim to provide a theoretical framework for data science using tools from information theory, statistical inference, and machine learning. I also aim to develop efficient algorithmic tools for extracting and exploiting information in statistical inference problems.
Recent papers
Algorithms and Theory for Data Science
Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization, NeurIPS 2023
Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing, ICML 2023
A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits, IEEE Trans. Information Theory 2024
Binary Classification with XOR Queries: Fundamental Limits and An Efficient Algorithm, IEEE Trans. Information Theory 2021
Detection of Signal in the Spiked Rectangular Models, ICML 2021
Robust Hypergraph Clustering via Convex Relaxation of Truncated MLE, IEEE JSAIT 2020
Efficient Deep Learning, Robust and Trustworthy AI
BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges, ICML 2024
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching, ICML 2024
Data Valuation without Training of a Model, ICLR 2023
Test-Time Adaptation via Self-Training with Nearest Neighbor Information, ICLR 2023
Self-Diagnosing GAN: Diagnosing Underrepresented Samples in Generative Adversarial Networks, NeurIPS 2021
News
(Apr. 2024) Two papers accepted to ICML 2024
BWS: Best Window Selection Based on Sample Scores for Data Pruning across Broad Ranges
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching
(Apr. 2024) I will give a tutorial at IEEE ISIT 2024 on the topic of "Graph Matching: Fundamental Limits and Efficient Algorithms"
(Apr. 2024) A new paper accepted to IEEE ISIT 2024
Exact Graph Matching in Correlated Gaussian-Attributed Erdős-Rényi Model
(Mar. 2024) A new paper published in IEEE Trans. Information Theory
A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits [arXiv]
(Feb. 2024) Delivered an invited talk at UCSD ITA on the topic of understanding self-distillation in multi-class classification
(Jan. 2024) I will be serving on the organizing committee of ISIT 2024
(Sep. 2023) A new paper accepted to NeurIPS 2023
Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization [arXiv]
(Aug. 2023) A paper accepted to Allerton 2023
Graph Matching in Correlated Stochastic Block Models for Improved Graph Clustering
(Apr. 2023) Two papers accepted to ICML 2023
Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing [arXiv]
Efficient Algorithms for Exact Graph Matching on Correlated Stochastic Block Models with Constant Correlation [arXiv]
(Mar. 2023) Received the Departmental Outstanding Teaching Award for EE623 Information Theory.
(Feb. 2023) Delivered an invited talk at UCSD ITA on the topic of "Data Valuation without Training of a Model".
(Feb. 2023) The first two PhD graduates from our lab, Daesung Kim and Doyeon Kim. Congrats!
(Feb. 2023) Daesung has been awarded the College of Engineering PhD Dissertation Award. Congrats!
(Jan. 2023) Two papers accepted to ICLR 2023