
@guan-yuan
Created April 11, 2018 14:42
Stanford Sentiment Treebank


The Stanford Sentiment Treebank is an extension of the Movie Review (MR) dataset, but with train/dev/test splits provided and fine-grained labels (very positive, positive, neutral, negative, very negative), re-labeled by Socher et al. (2013).


The Stanford Sentiment Treebank is the first corpus with fully labeled parse trees that allows for a complete analysis of the compositional effects of sentiment in language.

Dataset


The corpus is based on the dataset introduced by Pang and Lee (2005) and consists of 11,855 single sentences extracted from movie reviews. It was parsed with the Stanford parser (Klein and Manning, 2003) and includes a total of 215,154 unique phrases from those parse trees, each annotated by 3 human judges. This new dataset allows us to analyze the intricacies of sentiment and to capture complex linguistic phenomena.
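Every node of every parse tree carries its own sentiment label, which is how 11,855 sentences yield 215,154 labeled phrases. A minimal sketch of reading one tree in the SST serialization (each node written as `(label children...)`, labels 0–4 from very negative to very positive; the example sentence is invented for illustration):

```python
# Minimal sketch: parse one SST-style labeled tree string and
# enumerate its annotated phrases. Each subtree is a labeled
# phrase, so a single sentence contributes many training examples.

def parse_sst(s):
    """Parse an SST tree string into a (label, phrase, children) tuple."""
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()
    pos = 0

    def parse():
        nonlocal pos
        assert tokens[pos] == "("          # every node starts with "("
        pos += 1
        label = int(tokens[pos])           # sentiment label 0..4
        pos += 1
        children, words = [], []
        while tokens[pos] != ")":
            if tokens[pos] == "(":
                child = parse()            # recurse into a subtree
                children.append(child)
                words.append(child[1])     # child's phrase text
            else:
                words.append(tokens[pos])  # leaf word
                pos += 1
        pos += 1                           # consume ")"
        return (label, " ".join(words), children)

    return parse()

def all_phrases(node):
    """Yield every (label, phrase) pair in the tree, root included."""
    yield (node[0], node[1])
    for child in node[2]:
        yield from all_phrases(child)

tree = parse_sst("(4 (2 A) (4 (3 (3 warm) (2 ,)) (3 funny)))")
print(tree[0], tree[1])                 # 4 A warm , funny
print(len(list(all_phrases(tree))))     # 7 labeled phrases
```

This one-sentence tree produces seven labeled phrases, one per node, which is why the phrase count dwarfs the sentence count.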

Results

SST-1 Dataset

| Reference                  | Method             | Accuracy (%) |
|----------------------------|--------------------|--------------|
| Socher et al. (2011)       | RAE                | 43.2         |
| Socher et al. (2012)       | MV-RNN             | 44.4         |
| Socher et al. (2013)       | RNTN               | 45.7         |
| Kalchbrenner et al. (2014) | DCNN               | 48.5         |
| Irsoy et al. (2014)        | Deep Recursive NNs | 49.8         |
| Le et al. (2014)           | Paragraph-Vec      | 48.7         |
| Kim (2014)                 | CNN-multichannel   | 48.0         |
| Zhu et al. (2015)          | LSTM on tree       | 48.0         |
| Ma et al. (2015)           | DCNNs              | 49.5         |

SST-2 Dataset

| Reference                  | Method           | Accuracy (%) |
|----------------------------|------------------|--------------|
| Socher et al. (2011)       | RAE              | 82.4         |
| Socher et al. (2012)       | MV-RNN           | 82.9         |
| Socher et al. (2013)       | RNTN             | 85.4         |
| Kalchbrenner et al. (2014) | DCNN             | 86.8         |
| Le et al. (2014)           | Paragraph-Vec    | 87.8         |
| Kim (2014)                 | CNN-multichannel | 88.1         |
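SST-2 is the binary version of the same corpus: neutral sentences are removed and the remaining fine-grained labels are collapsed to negative/positive, which is why SST-2 accuracies run much higher than SST-1. A minimal sketch of that mapping (assuming the usual convention of labels 0–1 as negative, 2 as neutral, 3–4 as positive):

```python
# Hedged sketch: collapsing SST fine-grained labels (0-4) into the
# binary SST-2 scheme. Assumed convention: 0-1 -> negative (0),
# 2 (neutral) dropped, 3-4 -> positive (1).

def to_binary(fine_label):
    """Map a 5-class SST label to binary; returns None for neutral."""
    if fine_label <= 1:
        return 0      # very negative / negative
    if fine_label >= 3:
        return 1      # positive / very positive
    return None       # neutral: excluded from SST-2

fine_labels = [0, 1, 2, 3, 4]
binary = [b for b in map(to_binary, fine_labels) if b is not None]
print(binary)  # [0, 0, 1, 1]
```

Note that dropping neutrals shrinks the sentence set, so SST-1 and SST-2 results are not directly comparable even though they come from the same treebank.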

References

  • Dependency-based Convolutional Neural Networks for Sentence Embedding (ACL'15), M Ma et al. [pdf]
  • Long Short-Term Memory Over Tree Structures (ICML'15), X Zhu et al. [pdf]
  • A Convolutional Neural Network for Modelling Sentences (ACL'14), N Kalchbrenner et al. [pdf]
  • Distributed Representations of Sentences and Documents (ICML'14), Q Le et al. [pdf]
  • Deep Recursive Neural Networks for Compositionality in Language (NIPS'14), O Irsoy et al. [pdf]
  • Convolutional Neural Networks for Sentence Classification (EMNLP'14), Y Kim. [pdf]
  • Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank (EMNLP'13), R Socher et al. [pdf]
  • Semantic Compositionality through Recursive Matrix-Vector Spaces (EMNLP'12), R Socher et al. [pdf]
  • Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions (EMNLP'11), R Socher et al. [pdf]
