
This repository contains the experiments conducted in the ICLR 2022 spotlight paper "On the Importance of Firth Bias Reduction in Few-Shot Classification".



Firth Bias Reduction in Few-shot Classification

This repository contains all the experiments conducted in the On the Importance of Firth Bias Reduction in Few-Shot Classification paper. For a concise and informal description of our work, check out our paper's website: https://ehsansaleh.github.io/firthfsl

To clone this repo with all three of its sub-modules, run:

git clone --recursive https://github.com/ehsansaleh/firth_bias_reduction.git

The Paper in Pictures

  • The MLE Bias in Few-shot Classification

    Here is a visualization to help you get the overall context of the typical loss-minimization (MLE) bias when only a few samples are available.

  • Firth Bias Reduction in Few Words
    • For 1-Layer Logistic and Cosine Classifiers with the Cross-Entropy Loss

      All you need to do is replace

      $$\hat{\beta} = \operatorname*{argmin}_{\beta} \ \ell_{\mathrm{CE}}(\beta)$$

      with

      $$\hat{\beta} = \operatorname*{argmin}_{\beta} \ \Big[\ell_{\mathrm{CE}}(\beta) + \lambda \cdot \ell_{\mathrm{CE}}(U)\Big],$$

      where $U$ is the uniform distribution over the classes, and $\lambda$ is a positive constant. The CE term with the uniform distribution, $\ell_{\mathrm{CE}}(U)$, is basically the (negative) sum of the prediction log-probabilities over all data points and classes.

    • General Firth Bias Reduction Form

      Add a log-det of the Fisher Information Matrix (FIM) term to your loss minimization problem. That is, replace

      $$\hat{\beta} = \operatorname*{argmin}_{\beta} \ \ell(\beta)$$

      with

      $$\hat{\beta} = \operatorname*{argmin}_{\beta} \ \Big[\ell(\beta) - \lambda \log \det F(\beta)\Big],$$

      where $F(\beta)$ is the Fisher information matrix. This penalty was proven (Firth, 1993) to reduce the bias of the estimated parameters.

  • Firth Bias Reduction in a Geometric Experiment

    Here is a simple example showcasing the average bias of the MLE from the true parameters in a geometric experiment with a fair coin, and the slow rate at which this bias disappears.


  • Firth Bias Reduction Improvements in Few-shot Classification Tasks

    Here is the effect of Firth bias reduction compared to typical L2 regularization in 16-way few-shot classification tasks using basic feature backbones and 3-layer logistic classifiers.


    Below is the effect of Firth bias reduction on cosine classifiers and S2M2R features.

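To make the penalized cross-entropy recipe for 1-layer logistic and cosine classifiers concrete, here is a minimal NumPy sketch; the function names and the mean-based averaging convention are my own assumptions, not the paper's reference code:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def firth_ce_loss(logits, labels, lam):
    """Cross-entropy plus lambda times the uniform-distribution CE term.

    The added term is the (negative) mean of the prediction
    log-probabilities over all data points and classes.
    """
    probs = softmax(logits)
    n = logits.shape[0]
    ce = -np.log(probs[np.arange(n), labels]).mean()
    uniform_ce = -np.log(probs).mean()  # averages over samples and classes
    return ce + lam * uniform_ce
```

With `lam = 0` this reduces to the plain cross-entropy loss, so the penalty can be toggled with a single hyper-parameter.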
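The general log-det form can likewise be sketched for binary logistic regression, where the Fisher information matrix has the closed form F(beta) = X^T W X with W = diag(p_i (1 - p_i)). This is a hedged illustration under those standard formulas, not the repository's implementation:

```python
import numpy as np

def firth_penalized_nll(beta, X, y, lam):
    """Negative log-likelihood minus lam * log det F(beta) for logistic regression."""
    logits = X @ beta
    p = 1.0 / (1.0 + np.exp(-logits))
    nll = -(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()
    W = p * (1 - p)                # diagonal of the weight matrix
    F = X.T @ (W[:, None] * X)     # Fisher information: X^T W X
    sign, logdet = np.linalg.slogdet(F)
    return nll - lam * logdet
```

Firth's original choice corresponds to `lam = 1/2`, i.e. penalizing by the Jeffreys prior.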
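The geometric-experiment bias is also easy to reproduce in simulation: the MLE for a geometric distribution is the reciprocal of the sample mean, which overestimates p on average for small sample sizes. Below is a quick Monte Carlo sketch (the sample sizes, trial counts, and function names are my assumptions about the figure's setup, not taken from it):

```python
import random

def sample_geometric(rng, p):
    """Number of coin flips until the first success (support 1, 2, ...)."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

def mle_bias(n, p=0.5, trials=20000, seed=0):
    """Monte Carlo estimate of E[p_hat] - p, where p_hat = 1 / (sample mean)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [sample_geometric(rng, p) for _ in range(n)]
        total += n / sum(xs)  # MLE: reciprocal of the sample mean
    return total / trials - p
```

For a fair coin, the bias is clearly positive at small n and shrinks only slowly (roughly like 1/n) as the sample size grows.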

The Repository Structure

References

@inproceedings{ghaffari2022fslfirth,
    title={On the Importance of Firth Bias Reduction in Few-Shot Classification},
    author={Saba Ghaffari and Ehsan Saleh and David Forsyth and Yu-Xiong Wang},
    booktitle={International Conference on Learning Representations},
    year={2022},
    url={https://openreview.net/forum?id=DNRADop4ksB}
}
