Semi-supervised recursive autoencoders for predicting sentiment distributions

From Brede Wiki
Conference paper
Authors: Richard Socher, Jeffrey Pennington, Eric H. Huang, Andrew Y. Ng, Christopher D. Manning
Citation: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing: 151-161. July 2011
Editors:
Publisher: Association for Computational Linguistics
Meeting: 2011 Conference on Empirical Methods in Natural Language Processing
Link(s): http://www.socher.org/uploads/Main/SocherPenningtonHuangNgManning_EMNLP2011.pdf

Semi-supervised recursive autoencoders for predicting sentiment distributions describes a method for sentiment analysis using a recursive autoencoder.

Each word is projected into a 100-dimensional vector space. Neighboring vectors in this space are combined by an autoencoder: the entire sentence is represented hierarchically as a binary tree in which each new latent (parent) vector encodes either two word vectors, one word vector and one latent vector, or two latent vectors. The autoencoder weights are tied, i.e., the same at every node of the tree. The pair of neighbors to merge first is chosen greedily as the pair with the lowest reconstruction error; a minimal sketch of this construction is given below.
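
The greedy tree construction can be sketched as follows in Python. This is not the authors' implementation; the variable names, the random initialization, and the omission of the paper's length normalization of parent vectors are simplifying assumptions.

import numpy as np

d = 100                                             # word vector dimension (100, as in the paper)
rng = np.random.default_rng(0)
W_e = rng.normal(scale=0.01, size=(d, 2 * d))       # shared (tied) encoder weights
b_e = np.zeros(d)
W_d = rng.normal(scale=0.01, size=(2 * d, d))       # shared (tied) decoder weights
b_d = np.zeros(2 * d)

def encode(c1, c2):
    # Merge two child vectors (words or latent nodes) into one parent vector.
    return np.tanh(W_e @ np.concatenate([c1, c2]) + b_e)

def reconstruction_error(c1, c2):
    # Squared error between the children and their reconstruction from the parent.
    p = encode(c1, c2)
    rec = np.tanh(W_d @ p + b_d)
    return np.sum((rec - np.concatenate([c1, c2])) ** 2), p

def greedy_tree(word_vectors):
    # Repeatedly merge the adjacent pair with the lowest reconstruction error.
    nodes = list(word_vectors)
    while len(nodes) > 1:
        candidates = [reconstruction_error(nodes[i], nodes[i + 1])
                      for i in range(len(nodes) - 1)]
        best = int(np.argmin([err for err, _ in candidates]))
        nodes[best:best + 2] = [candidates[best][1]]  # replace the pair with its parent
    return nodes[0]                                   # latent vector for the whole sentence

sentence = [rng.normal(size=d) for _ in range(5)]     # five stand-in "word" vectors
root = greedy_tree(sentence)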

Learning is by backpropagation through the tree structure, jointly minimizing the reconstruction error and the prediction error of a softmax sentiment classifier applied to the node vectors; this supervised term is what makes the model semi-supervised.
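
As an illustration of the training signal at a single merge, the sketch below uses PyTorch autograd in place of hand-written backpropagation. The weighting constant alpha, the two-class label setup, and the initialization are illustrative assumptions, and bias terms are omitted.

import torch
import torch.nn.functional as F

d, n_labels, alpha = 100, 2, 0.2                       # alpha trades off reconstruction vs. prediction
W_e  = (0.01 * torch.randn(d, 2 * d)).requires_grad_()      # tied encoder
W_d  = (0.01 * torch.randn(2 * d, d)).requires_grad_()      # tied decoder
W_cl = (0.01 * torch.randn(n_labels, d)).requires_grad_()   # softmax sentiment classifier on node vectors

def node_loss(c1, c2, label):
    # Joint loss at one merge: reconstruct the children and predict the sentiment label.
    p = torch.tanh(W_e @ torch.cat([c1, c2]))          # parent vector
    rec = torch.tanh(W_d @ p)                          # reconstruction of the two children
    e_rec = torch.sum((rec - torch.cat([c1, c2])) ** 2)
    e_ce = F.cross_entropy((W_cl @ p).unsqueeze(0), label.unsqueeze(0))
    return alpha * e_rec + (1 - alpha) * e_ce

c1, c2 = torch.randn(d), torch.randn(d)                # two child vectors
label = torch.tensor(1)                                # e.g. the "positive" class
loss = node_loss(c1, c2, label)
loss.backward()                                        # gradients for all tied parameters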

Data

  1. Experience Project (Potts, 2010); the data set is available at http://www.socher.org/index.php/Main/Semi-SupervisedRecursiveAutoencodersForPredictingSentimentDistributions
  2. Movie reviews
  3. MPQA

Related papers

  1. Recursive deep models for semantic compositionality over a sentiment treebank
  2. Sentic patterns: dependency-based rules for concept-level sentiment analysis