Recursive AE and Noise Contrastive Estimation

Q1 – What are recursive autoencoders? What makes them more general than the recurrent neural networks they generalize?

Q2 – The review paper refers to something called “noise contrastive estimation”. What is it?


1 Response to “Recursive AE and Noise Contrastive Estimation”


  1. Gabriel Bernier-Colborne, April 8, 2013 at 16:38

    Q1: To illustrate how recursive AEs work, consider how they have been used in natural language processing. An AE can learn a representation of a sequence of two words, and because its parameters are shared, the same AE can be applied to any pair of words. It can likewise combine a single word with the representation of a two-word sequence. Applying the AE recursively in this way builds a tree whose root is a representation of an arbitrarily long sequence of words, and these representations can then be used for many NLP tasks. The recurrent neural network is a special case of this recursive architecture in which the tree structure is a simple chain.
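    The mechanism above can be sketched in a few lines of NumPy. This is a toy illustration only: the dimension, the greedy merging of adjacent nodes by reconstruction error, and the `tanh` encoder/decoder are illustrative assumptions, not the exact model from any particular paper. The key point it shows is that one shared set of parameters is reused at every node of the tree.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    d = 4  # illustrative embedding dimension (assumption)

    # One shared set of parameters, reused at every node of the tree.
    W_enc = rng.standard_normal((d, 2 * d)) * 0.1   # encoder: [c1; c2] -> parent
    b_enc = np.zeros(d)
    W_dec = rng.standard_normal((2 * d, d)) * 0.1   # decoder: parent -> [c1'; c2']
    b_dec = np.zeros(2 * d)

    def encode(c1, c2):
        """Combine two child representations into one parent representation."""
        return np.tanh(W_enc @ np.concatenate([c1, c2]) + b_enc)

    def reconstruction_error(c1, c2):
        """AE loss at one node: decode the parent and compare to the children."""
        recon = np.tanh(W_dec @ encode(c1, c2) + b_dec)
        return np.sum((recon - np.concatenate([c1, c2])) ** 2)

    # Four "word vectors"; greedily merge the adjacent pair with the lowest
    # reconstruction error, reusing the same AE at every level, until a single
    # vector represents the whole phrase.
    nodes = [rng.standard_normal(d) for _ in range(4)]
    while len(nodes) > 1:
        i = min(range(len(nodes) - 1),
                key=lambda k: reconstruction_error(nodes[k], nodes[k + 1]))
        nodes[i:i + 2] = [encode(nodes[i], nodes[i + 1])]

    phrase_vec = nodes[0]  # fixed-size representation of the 4-word sequence
    ```

    Feeding the words strictly left to right, so that each parent combines the next word with the running representation, recovers the chain structure of a recurrent network.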

    Q2: Noise contrastive estimation is a method for estimating the parameters of an energy-based (unnormalized) model without having to compute its partition function. Basically, it draws negative examples from a noise distribution that is much flatter than the true data distribution and trains a classifier to discriminate between positive (data) and negative (noise) examples; fitting that classifier recovers the model's parameters, including its normalizing constant.
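    A minimal sketch of this idea, under illustrative assumptions: the unnormalized model is a 1-D Gaussian with unknown mean `mu` and a learned log-normalizer `c`, the noise is a flat zero-mean Gaussian, and the parameters are fit by plain gradient ascent on the NCE classification objective. None of these choices come from the review paper; they are just the simplest setting in which the method can be shown end to end.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Data from N(2, 1); the model is the UNNORMALIZED log-density
    # log p_model(x) = -(x - mu)^2 / 2 + c, where c absorbs the unknown
    # log partition function and is learned like any other parameter.
    x_data = rng.normal(2.0, 1.0, size=2000)

    # Noise: a much flatter Gaussian N(0, 3^2) that covers the data.
    sigma_n = 3.0
    x_noise = rng.normal(0.0, sigma_n, size=2000)

    def log_noise(x):
        return -0.5 * (x / sigma_n) ** 2 - np.log(sigma_n * np.sqrt(2 * np.pi))

    def G(x, mu, c):
        """Log-ratio log p_model - log p_noise; sigmoid(G) is the posterior
        probability that x is a data point rather than a noise point."""
        return (-0.5 * (x - mu) ** 2 + c) - log_noise(x)

    mu, c = 0.0, 0.0
    lr = 0.1
    for _ in range(2000):
        s_d = 1.0 / (1.0 + np.exp(G(x_data, mu, c)))     # 1 - sigmoid(G) on data
        s_n = 1.0 / (1.0 + np.exp(-G(x_noise, mu, c)))   # sigmoid(G) on noise
        # Gradient of the NCE objective
        #   E_data[log sigmoid(G)] + E_noise[log(1 - sigmoid(G))]
        # w.r.t. mu (dG/dmu = x - mu) and c (dG/dc = 1).
        mu += lr * (np.mean(s_d * (x_data - mu)) - np.mean(s_n * (x_noise - mu)))
        c += lr * (np.mean(s_d) - np.mean(s_n))
    ```

    After training, `mu` approaches the true mean and `exp(c)` approaches the reciprocal of the true normalizing constant, even though the likelihood was never normalized: the binary classification problem alone identifies the parameters, provided the noise distribution covers the data.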

