Smoothed Embeddings for Certified Few-Shot Learning
Mikhail Pautov,
Olesya Kuznetsova,
Nurislam Tursynbek,
Aleksandr Petiushko,
Ivan Oseledets
December 2022
Abstract
Randomized smoothing is considered to be the state-of-the-art provable defense against adversarial perturbations. However, it heavily exploits the fact that classifiers map input objects to class probabilities, and it does not apply to models that instead learn a metric space in which classification is performed by computing distances to the embeddings of class prototypes. In this work, we extend randomized smoothing to few-shot learning models that map inputs to normalized embeddings. We provide an analysis of the Lipschitz continuity of such models and derive a robustness certificate against ℓ2-bounded perturbations that may be useful in few-shot learning scenarios. Our theoretical results are confirmed by experiments on different datasets.
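The abstract's core idea, smoothing the embedding map itself rather than class probabilities and then classifying by distance to class prototypes, can be sketched as follows. This is a minimal illustrative Monte Carlo approximation, not the paper's implementation: the encoder `f`, the noise level `sigma`, and all names here are assumptions chosen for the example.

```python
import numpy as np

def smoothed_embedding(f, x, sigma=0.25, n_samples=1000, seed=0):
    """Monte Carlo estimate of the Gaussian-smoothed embedding
    g(x) = E_{eps ~ N(0, sigma^2 I)}[ f(x + eps) ]."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, sigma, size=(n_samples,) + x.shape)
    embs = np.stack([f(x + e) for e in noise])  # embed each noisy copy
    return embs.mean(axis=0)

def classify_by_prototype(emb, prototypes):
    """Few-shot classification: pick the nearest class prototype
    in embedding space (Euclidean distance)."""
    dists = np.linalg.norm(prototypes - emb, axis=1)
    return int(np.argmin(dists)), dists

# Toy base encoder: a fixed random linear map followed by L2 normalization,
# standing in for a trained few-shot backbone (purely illustrative).
rng = np.random.default_rng(1)
W = rng.normal(size=(8, 16))

def f(x):
    z = W @ x
    return z / np.linalg.norm(z)

# Prototypes are (smoothed) embeddings of per-class support examples.
prototypes = np.stack([f(rng.normal(size=16)) for _ in range(3)])

x = rng.normal(size=16)
emb = smoothed_embedding(f, x)
label, dists = classify_by_prototype(emb, prototypes)
```

In this setting, a certified radius would follow from the Lipschitz continuity of the smoothed map `g` together with the margin between the distance to the predicted prototype and the distance to the runner-up; the exact certificate is derived in the paper and is not reproduced in this sketch.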
Publication
In Advances in Neural Information Processing Systems 35 (NeurIPS 2022)