About me
🔎 For students
I'm open to accepting PhD students and interns at ISM. Feel free to reach out to me. Here are some of my favorite recent papers, though I'm more broadly interested in learning theory.
Research interests
- Learning theory
- classification-calibrated losses, proper scoring rules, property elicitation
- class probability estimation, probability calibration
- learning dynamics, gradient descent
- convex analysis
- Representation learning
- contrastive learning
- Online convex optimization
News
- Mar 28, 2025: My grant proposal to JST-BOOST (a Japanese governmental five-year research funding program for early-career researchers in the AI field) has been accepted (official info).
- Feb 25, 2025: I moved to the Institute of Statistical Mathematics as an associate professor.
- Jan 23, 2025: Three of our new papers were accepted to AISTATS 2025: (1) a unified understanding of online inverse optimization via the Fenchel-Young loss, (2) a non-principal-centric model of inverse optimization via prediction markets, and (3) a new loss class extending proper losses to incorporate the focal loss. Additionally, two of our papers were accepted to ICLR 2025: (1) hippocampus-inspired self-supervised learning and (2) scheduled knowledge distillation for language modeling.
- (archived)
Upcoming travels
- Aug 28-29: Sapporo (KAKENHI meeting)
- Nov 12-15: Okinawa (IBIS)
- Dec 2-7: San Diego (NeurIPS, TBD)
- (archived)