My research centers on the theoretical understanding of statistical machine learning, particularly from the following perspectives.
- Learning theory of loss functions, through which I study robustness to adversarial attacks (COLT2020) and class imbalance (AISTATS2020, AISTATS2021).
- Evaluation metrics for predictions and representations. Recently, I have been interested in how good representations can be learned from similarity information in light of a downstream task (ICML2018, preprint).
I am always glad to discuss with those who share these interests! You may have a look at the slides of my past talks, such as this, to get a sense of my tastes.
- Jun 21, 2021: Our paper “Learning from Noisy Similar and Dissimilar Data” has been accepted by ECMLPKDD2021.
- May 17, 2021: We have published a corrigendum to our COLT2020 paper. The definition of calibrated losses is corrected, and the proofs of our main results are modified accordingly.
- Jan 23, 2021: Our paper “Fenchel-Young Losses with Skewed Entropies for Class-posterior Probability Estimation” has been accepted by AISTATS2021!
- Jan 8, 2021: Our presentation at IBIS2020 received the best presentation award (1st place out of 116 presentations)!
- Nov 18, 2020: Our paper on similarity-based classification was accepted by Neural Computation!
- Oct 20, 2020: We are going to hold an online talk event on learning theory and loss functions, with Jessie and Yutong. Registration and further information are available here.