@sshleifer
Last active March 24, 2020 14:40

BartModel (@sshleifer)

Bart is one of the first Seq2Seq models in the library, and it achieves state-of-the-art results on text generation tasks such as abstractive summarization. Three sets of pretrained weights are released:

  • bart-large: the pretrained base model.
  • bart-large-cnn: the base model finetuned on the CNN/Daily Mail abstractive summarization task.
  • bart-large-mnli: the base model finetuned on the MNLI classification task.

Related:

Big thanks to the original authors, especially Mike Lewis, Yinhan Liu, and Naman Goyal, who helped answer our questions!
