Meta-Learning for Semi-Supervised Few-Shot Classification

Eleni Triantafillou
Jake Snell
Josh Tenenbaum
Mengye Ren
Richard Zemel
Sachin Ravi
ICLR (2018)

Abstract

In few-shot classification, we are interested in learning algorithms that train a classifier from only a handful of labeled examples. Recent progress in few-shot classification has featured meta-learning, in which a parameterized model for a learning algorithm is defined and trained on episodes representing different classification problems, each with a small labeled training set and its corresponding test set. In this work, we advance this few-shot classification paradigm towards a scenario where unlabeled examples are also available within each episode. We consider two situations: one where all unlabeled examples are assumed to belong to the same set of classes as the labeled examples of the episode, and a more realistic one where examples from other distractor classes are also provided. To address this paradigm, we propose novel extensions of prototypical networks (Snell et al., 2017) that are augmented with the ability to use unlabeled examples when producing prototypes. These models are trained end-to-end on episodes, so that they learn to leverage the unlabeled examples successfully. We evaluate these methods on versions of the Omniglot and miniImageNet benchmarks, adapted to this new framework augmented with unlabeled examples. We also propose a new split of ImageNet, called tieredImageNet. Our experiments confirm that our prototypical networks can learn to use unlabeled examples to improve their predictions, much as a semi-supervised algorithm would.
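To make the prototype-refinement idea concrete, below is a minimal NumPy sketch of one of the extensions described in the paper: a single soft k-means step in which each unlabeled embedding contributes to every class prototype in proportion to its soft assignment. It assumes the no-distractor setting, that embeddings have already been computed by the embedding network, and that a single refinement step is taken; the function names and shapes are illustrative, not the authors' code.

```python
import numpy as np

def compute_prototypes(support_emb, support_labels, num_classes):
    """Per-class mean of support embeddings (standard prototypical networks).

    support_emb: (num_support, dim) array of embedded labeled examples.
    support_labels: (num_support,) integer class labels in [0, num_classes).
    """
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(num_classes)])

def refine_prototypes(prototypes, support_emb, support_labels, unlabeled_emb):
    """One soft k-means refinement step using the unlabeled embeddings.

    Each unlabeled example is softly assigned to every prototype via a
    softmax over negative squared Euclidean distances; the refined
    prototype is the weighted mean of its labeled members plus the
    softly-assigned unlabeled examples.
    """
    num_classes = prototypes.shape[0]
    # Squared Euclidean distances, shape (num_unlabeled, num_classes).
    d = ((unlabeled_emb[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    logits = -d
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    z = np.exp(logits)
    z /= z.sum(axis=1, keepdims=True)             # soft assignments, rows sum to 1

    refined = np.empty_like(prototypes)
    for c in range(num_classes):
        labeled = support_emb[support_labels == c]
        weight_sum = labeled.shape[0] + z[:, c].sum()
        refined[c] = (labeled.sum(axis=0)
                      + (z[:, c:c + 1] * unlabeled_emb).sum(axis=0)) / weight_sum
    return refined

# Toy usage on a 5-way, 2-shot episode with 20 unlabeled examples.
rng = np.random.default_rng(0)
support_emb = rng.normal(size=(10, 64)).astype(np.float32)
support_labels = np.repeat(np.arange(5), 2)
unlabeled_emb = rng.normal(size=(20, 64)).astype(np.float32)

protos = compute_prototypes(support_emb, support_labels, num_classes=5)
refined = refine_prototypes(protos, support_emb, support_labels, unlabeled_emb)
print(refined.shape)  # (5, 64)
```

In the paper, a step like this is differentiated through during episodic training, so the embedding network itself learns to place unlabeled examples where they refine the prototypes usefully; the distractor setting additionally requires down-weighting unlabeled examples that belong to none of the episode's classes.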