Resources for Neural Network Compression

To Be Checked & Evaluated

Good papers but not applicable

https://arxiv.org/pdf/2210.14991.pdf

Papers

  1. A Survey of Neural Network Compression [arxiv]

    Transformer-based architectures that are commonly used in NLP and CV have millions of parameters for each fully-connected layer.
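To make that scale concrete, here is a quick back-of-the-envelope parameter count in Python for a single transformer feed-forward block, assuming BERT-base dimensions (hidden size 768, inner FFN size 3072); exact sizes vary by architecture.

```python
# Parameter count for one transformer feed-forward (FFN) block,
# assuming BERT-base dimensions: hidden size 768, inner size 3072.
d_model, d_ff = 768, 3072

# An FFN block is two fully-connected layers (d_model -> d_ff -> d_model),
# each with a bias vector.
ffn_params = (d_model * d_ff + d_ff) + (d_ff * d_model + d_model)
print(f"parameters per FFN block: {ffn_params:,}")   # 4,722,432
print(f"across 12 layers: {12 * ffn_params:,}")      # ~56.7M in FFNs alone
```

Roughly 4.7M parameters per block before counting the attention projections, which is the kind of redundancy the compression methods surveyed aim to remove.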

Papers & Code

  1. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks [arxiv]

    We use neural architecture search to design a new baseline network and scale it up to obtain a family of models, called EfficientNets, which achieve much better accuracy and efficiency than previous ConvNets. In particular, our EfficientNet-B7 achieves state-of-the-art 84.3% top-1 accuracy on ImageNet, while being 8.4x smaller and 6.1x faster on inference than the best existing ConvNet. Our EfficientNets also transfer well and achieve state-of-the-art accuracy on CIFAR-100 (91.7%), Flowers (98.8%), and 3 other transfer learning datasets, with an order of magnitude fewer parameters.
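The compound scaling rule at the core of the paper is simple to state: a single coefficient φ scales network depth by α^φ, width by β^φ, and input resolution by γ^φ, with α·β²·γ² ≈ 2 so that FLOPs roughly double per unit of φ. Below is a minimal sketch using the constants reported in the paper; note the released B1–B7 models round the resulting values to practical layer counts and input resolutions.

```python
# A minimal sketch of the compound scaling rule from the EfficientNet paper:
# a single coefficient phi scales depth by alpha**phi, width by beta**phi,
# and input resolution by gamma**phi, with alpha * beta**2 * gamma**2 ~= 2
# so that FLOPs roughly double for each unit increase in phi.
alpha, beta, gamma = 1.2, 1.1, 1.15  # constants grid-searched in the paper

def compound_multipliers(phi):
    """Return (depth, width, resolution) multipliers for scaling coefficient phi."""
    return alpha ** phi, beta ** phi, gamma ** phi

for phi in range(4):
    d, w, r = compound_multipliers(phi)
    print(f"phi={phi}: depth x{d:.2f}, width x{w:.2f}, resolution x{r:.2f}")
```

Pretrained EfficientNet variants are available in common frameworks (e.g., torchvision.models.efficientnet_b0 through efficientnet_b7), so the models can be used without re-running the architecture search.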
