My research centers on the theoretical understanding of statistical machine learning, particularly from the following perspectives.
- Learning theory of loss functions, through which I study robustness to adversarial attacks (COLT2020) and class imbalance (AISTATS2020, AISTATS2021).
- Evaluation metrics for predictions and representations. Recently, I have been interested in how good representations can be learned via similarity in light of a downstream task (ICML2018, preprint).
I am always glad to discuss with those who share these interests! You may have a look at the slides of my past talks, such as this one, to get a sense of my research taste.
- Jan 23, 2021: Our paper “Fenchel-Young Losses with Skewed Entropies for Class-posterior Probability Estimation” has been accepted by AISTATS2021!
- Jan 8, 2021: Our presentation at IBIS2020 received the best presentation award (1st place out of 116 presentations)!
- Nov 18, 2020: Our paper on similarity classification was accepted by Neural Computation!
- Oct 20, 2020: We are going to hold an online talk event on learning theory and loss functions, with Jessie and Yutong. Registration and further information are available here.
- Jun 12, 2020: Our preprint “Similarity-based Classification: Connecting Similarity Learning to Binary Classification” has been published. We revealed a simple connection between binary classification and similarity learning, which enables provably training a classifier by learning similarity.