Inference and Information for Data Science Lab
Hye Won Chung, Associate Professor, School of Electrical Engineering, KAIST
Curriculum Vitae: [CV]
Email: hwchung@kaist.ac.kr
Office: N1 Building Room 206, KAIST
Phone: +82-42-350-7441
Short bio
I am an Associate Professor in the School of Electrical Engineering at KAIST. I completed my Ph.D. in Electrical Engineering and Computer Science (EECS) at MIT in 2014. From 2014 to 2017, I was a research fellow at the University of Michigan. I received my M.S. from MIT and my B.S. (summa cum laude) from KAIST in Korea, all in EECS.
My research interests include data science, information theory, statistical inference, machine learning, and quantum information. I aim to provide a theoretical framework for data science using tools from information theory, statistical inference, and machine learning, and to develop efficient algorithms for extracting and exploiting information in statistical inference problems.
Recent papers
Algorithms and Theory for Data Science
Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization, NeurIPS 2023
Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing, ICML 2023
A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits, IEEE Trans. Information Theory 2023
Binary Classification with XOR Queries: Fundamental Limits and An Efficient Algorithm, IEEE Trans. Information Theory 2021
Detection of Signal in the Spiked Rectangular Models, ICML 2021
Robust Hypergraph Clustering via Convex Relaxation of Truncated MLE, IEEE JSAIT 2020
Efficient Deep Learning, Robust and Trustworthy AI
News
(Sep. 2023) A new paper has been accepted to NeurIPS 2023.
Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization [arXiv]
(Sep. 2023) A paper has been accepted to IEEE Transactions on Information Theory.
A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits [arXiv]
(Aug. 2023) A paper has been accepted to Allerton 2023.
Graph Matching in Correlated Stochastic Block Models for Improved Graph Clustering
(Apr. 2023) Two papers have been accepted to ICML 2023.
Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing [arXiv]
Efficient Algorithms for Exact Graph Matching on Correlated Stochastic Block Models with Constant Correlation [arXiv]
(Mar. 2023) Received the Departmental Outstanding Teaching Award for EE623 Information Theory.
(Feb. 2023) Delivered an invited talk at UCSD ITA on the topic of "Data Valuation without Training of a Model".
(Feb. 2023) Our lab's first two Ph.D. students, Daesung Kim and Doyeon Kim, have graduated. Congrats!
(Feb. 2023) Daesung has been awarded the College of Engineering Ph.D. Dissertation Award. Congrats!
(Jan. 2023) Two papers have been accepted to ICLR 2023.
Data Valuation without Training of a Model [arXiv]
Test-Time Adaptation via Self-Training with Nearest Neighbor Information [paper]
(Jan. 2023) Two new papers are available on arXiv.