Smoothed Embeddings for Certified Few-Shot Learning

Abstract

Randomized smoothing is considered to be the state-of-the-art provable defense against adversarial perturbations. However, it heavily exploits the fact that classifiers map input objects to class probabilities and does not focus on models that learn a metric space in which classification is performed by computing distances to embeddings of class prototypes. In this work, we extend randomized smoothing to few-shot learning models that map inputs to normalized embeddings. We provide an analysis of the Lipschitz continuity of such models and derive a robustness certificate against $l_2$-bounded perturbations that may be useful in few-shot learning scenarios. Our theoretical results are confirmed by experiments on different datasets.
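
The sketch below illustrates the general idea described in the abstract, not the paper's exact certification procedure: an embedding is smoothed by Monte Carlo averaging the encoder's output over Gaussian input noise, and classification is done by distance to class prototypes. The names `encoder`, `prototypes`, `sigma`, and `n_samples` are illustrative assumptions.

```python
# Minimal sketch of few-shot classification with a smoothed, normalized embedding.
# Assumes `encoder` maps a batch of inputs to a batch of d-dimensional embeddings
# and `prototypes` is an (n_classes, d) tensor of class prototype embeddings.
import torch
import torch.nn.functional as F

def smoothed_embedding(encoder, x, sigma=0.25, n_samples=100):
    """Monte Carlo estimate of E[f(x + eps)], eps ~ N(0, sigma^2 I),
    projected back onto the unit sphere."""
    # Replicate the input and perturb each copy with i.i.d. Gaussian noise.
    xs = x.unsqueeze(0).repeat(n_samples, *[1] * x.dim())
    xs = xs + sigma * torch.randn_like(xs)
    with torch.no_grad():
        z = encoder(xs)              # (n_samples, d) embeddings
        z = F.normalize(z, dim=-1)   # the model outputs normalized embeddings
    return F.normalize(z.mean(dim=0), dim=0)  # average, then re-normalize

def predict(encoder, x, prototypes, sigma=0.25, n_samples=100):
    """Assign x to the class whose prototype is closest (in l2)
    to the smoothed embedding."""
    z = smoothed_embedding(encoder, x, sigma, n_samples)
    dists = torch.cdist(z.unsqueeze(0), prototypes)  # (1, n_classes)
    return dists.argmin(dim=-1).item()
```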

Publication
In Advances in Neural Information Processing Systems 35 (NeurIPS 2022)
Aleksandr Petiushko
Sr. Director, Head of AI Research / Adjunct Professor / PhD

Principal R&D Researcher (15+ years of experience), R&D Technical Leader (10+ years of experience), and R&D Manager (8+ years of experience). Running and managing industrial research and academic collaboration (35+ publications, 30+ patents). Hiring and transforming AI/ML teams. Inspired by theoretical computer science and how it changes the world.