BaselineST(+) 2022

BaselineST is a novel machine learning model that incorporates the SoftTriple loss into existing distance metric learning techniques to elevate the state-of-the-art baseline performance in few-shot classification.

In this model, a feature extractor and a Baseline++ classifier are trained from scratch with a hybrid loss function that combines the SoftTriple loss with the standard cross-entropy loss. The goal is to improve model performance by capturing the hidden distribution of the dataset and reducing intra-class variance: each class is assigned multiple representative centers, rather than the single center implied by the conventional SoftMax loss function. In the fine-tuning stage, the general structure of Baseline++ is retained, and a new classifier is trained and fine-tuned on the labeled examples from the support set. The model is evaluated on tasks such as image classification (with the CUB-200-2011 dataset) and cross-domain character recognition (Omniglot -> EMNIST).
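To make the multi-center idea concrete, here is a minimal pure-Python sketch of the SoftTriple similarity and loss. It assumes the standard formulation: each class holds K centers, the class similarity is a softmax-weighted sum of the per-center similarities (temperature `gamma`), and the loss is a scaled cross-entropy with a margin `delta` subtracted from the true class. The hyperparameter values and the omission of the center-merging regularizer are simplifications, not the paper's exact configuration.

```python
import math

def softmax(v):
    """Numerically stable softmax over a list of floats."""
    m = max(v)
    e = [math.exp(t - m) for t in v]
    z = sum(e)
    return [t / z for t in e]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softtriple_logits(x, centers, gamma=0.1):
    """Relaxed class similarities: each class has several centers, and the
    similarity to a class is a softmax-weighted sum over its centers."""
    logits = []
    for class_centers in centers:  # one list of K center vectors per class
        sims = [dot(x, w) for w in class_centers]
        weights = softmax([s / gamma for s in sims])
        logits.append(sum(p * s for p, s in zip(weights, sims)))
    return logits

def softtriple_loss(x, y, centers, lam=20.0, delta=0.01, gamma=0.1):
    """SoftTriple loss: cross-entropy over scaled class similarities,
    with a margin delta subtracted from the true-class similarity.
    (Simplified sketch: the regularizer on center count is omitted.)"""
    s = softtriple_logits(x, centers, gamma)
    adjusted = [lam * (sc - (delta if c == y else 0.0))
                for c, sc in enumerate(s)]
    return -math.log(softmax(adjusted)[y])
```

Because each class keeps multiple centers, an embedding close to any one mode of a class scores highly, which is what lets the loss model multi-modal intra-class structure.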

An alternative model, denoted BaselineST+, is also investigated in the Appendix section of the paper. In BaselineST+, the classifier is built with the SoftTriple loss directly: the resulting weights of all centroids belonging to the same class are summed, and the per-class sums are fed through a SoftMax function to produce the final probability for each class.
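One plausible reading of that aggregation step can be sketched as follows. The code assumes the per-centroid "weight" is a similarity score between the embedding and the centroid; whether the paper uses raw dot products, cosine similarities, or softmax-normalized weights is not specified here, so this is an illustrative assumption.

```python
import math

def softmax(v):
    """Numerically stable softmax over a list of floats."""
    m = max(v)
    e = [math.exp(t - m) for t in v]
    z = sum(e)
    return [t / z for t in e]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def baseline_st_plus_probs(x, centers):
    """Sketch of the BaselineST+ classifier head: sum the similarity of
    the embedding to every centroid of a class (assumed to be dot
    products), then apply SoftMax over the per-class sums."""
    class_scores = [sum(dot(x, w) for w in class_centers)
                    for class_centers in centers]
    return softmax(class_scores)
```

The resulting vector is a proper probability distribution over classes, so the head can be trained or evaluated exactly like a standard SoftMax classifier.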

    Core Features
  • Incorporated SoftTriple loss into existing distance metric learning techniques
  • Trained a feature extractor and a classifier from scratch and fine-tuned the model on the support set
  • Evaluated model performance on image classification and cross-domain character recognition
  • Reproduced and re-evaluated the experimental results of Baseline++
  • Investigated an alternative model that performed few-shot classification based on the SoftTriple loss only
© 2022 Yuzhe Y. All Rights Reserved.