---
title: "Sharp-MAML: Sharpness-Aware Model-Agnostic Meta Learning"
booktitle: Proceedings of the 39th International Conference on Machine Learning
abstract: "Model-agnostic meta learning (MAML) is currently one of the dominant approaches for few-shot meta-learning. Despite its effectiveness, the training of MAML can be challenging due to the innate bilevel problem structure. Specifically, the loss landscape of MAML is much more complex, with possibly many more saddle points and local minima than its empirical risk minimization counterpart. To address this challenge, we leverage the recently proposed sharpness-aware minimization and develop a sharpness-aware MAML approach that we term Sharp-MAML. We empirically demonstrate that Sharp-MAML and its computationally efficient variant can outperform popular existing MAML baselines (e.g., +12% accuracy on Mini-Imagenet). We complement the empirical study with a convergence analysis and a generalization bound for Sharp-MAML. To the best of our knowledge, this is the first empirical and theoretical study of sharpness-aware minimization in the context of bilevel optimization."
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: abbas22b
month: 0
tex_title: "Sharp-{MAML}: Sharpness-Aware Model-Agnostic Meta Learning"
firstpage: 10
lastpage: 32
page: 10-32
order: 10
cycles: false
bibtex_author: Abbas, Momin and Xiao, Quan and Chen, Lisha and Chen, Pin-Yu and Chen, Tianyi
author:
- given: Momin
  family: Abbas
- given: Quan
  family: Xiao
- given: Lisha
  family: Chen
- given: Pin-Yu
  family: Chen
- given: Tianyi
  family: Chen
date: 2022-06-28
address:
container-title: Proceedings of the 39th International Conference on Machine Learning
volume: '162'
genre: inproceedings
issued:
  date-parts:
  - 2022
  - 6
  - 28
pdf:
extras:
---
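The abstract above describes applying sharpness-aware minimization (SAM) to the bilevel MAML objective. As a rough illustration only, the sketch below shows a SAM-style perturbation applied to a first-order MAML meta-update on toy quadratic tasks; the function names, the toy loss, and the choice to perturb only the outer loop are assumptions for the example, not the paper's actual Sharp-MAML algorithm.

```python
import numpy as np

# Hypothetical toy sketch of a SAM-style meta-update in the spirit of
# Sharp-MAML; first-order MAML on quadratic tasks, with the SAM ascent
# step applied to the outer (meta) gradient.

def task_loss_grad(w, target):
    # Quadratic task loss L(w) = 0.5 * ||w - target||^2 and its gradient.
    return 0.5 * np.sum((w - target) ** 2), w - target

def maml_sam_step(w, tasks, inner_lr=0.1, outer_lr=0.05, rho=0.01):
    def meta_grad(w):
        g = np.zeros_like(w)
        for t in tasks:
            _, gi = task_loss_grad(w, t)
            w_adapted = w - inner_lr * gi          # inner adaptation step
            _, ga = task_loss_grad(w_adapted, t)   # loss after adaptation
            g += ga                                # first-order MAML approx.
        return g / len(tasks)

    g = meta_grad(w)
    # SAM: ascend to a nearby "sharp" point, then descend using the
    # gradient evaluated there.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    g_sharp = meta_grad(w + eps)
    return w - outer_lr * g_sharp

tasks = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
w = np.zeros(2)
for _ in range(200):
    w = maml_sam_step(w, tasks)
print(w)  # approaches the task mean [0.5, 0.5]
```

For these symmetric quadratic tasks the meta-parameters settle near the task mean; the SAM perturbation only changes where the outer gradient is evaluated, which is the mechanism the paper analyzes in the bilevel setting.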