Research
My research interests span probabilistic machine learning, representation learning, and their applications.
My recent research focuses on self-supervised representation learning and representation distillation methods based on contrastive learning.
Explaining Visual Biases as Words by Generating Captions
Younghyun Kim, Sangwoo Mo, Minkyu Kim, Kyungmin Lee, Jaeho Lee, Jinwoo Shin
Preprint
Code
STUNT: Few-shot Tabular Learning with Self-generated Tasks from Unlabeled Tables
Jaehyun Nam, Jihoon Tack, Kyungmin Lee, Hankook Lee, Jinwoo Shin
ICLR, 2023   (Spotlight Presentation)
Workshop on Table Representation Learning (NeurIPSW-TRL) 2022
Bronze Prize, Samsung Humantech Paper Awards 2023
Code | Openreview
RényiCL: Contrastive Representation Learning with Skew Rényi Divergence
Kyungmin Lee, Jinwoo Shin
NeurIPS, 2022
Code | Openreview
GCISG: Guided Causal Invariant Learning for Improved Syn-to-Real Generalization
Gilhyun Nam, Gyeongjai Choi, Kyungmin Lee
ECCV, 2022
Pseudo-spherical Knowledge Distillation
Kyungmin Lee, Hyeongkeun Lee
IJCAI-ECAI, 2022
Representation Distillation by Prototypical Contrastive Predictive Coding
Kyungmin Lee
ICLR, 2022
Preliminary version presented at NeurIPS 2021 Workshop on Self-supervised Learning: Theory and Practice.
Openreview
Provable Defense by Denoised Smoothing with Learned Score Function
Kyungmin Lee
ICLR Workshop on Security and Safety in Machine Learning Systems, 2021   (Travel Award)