Q1) Weight constraints are used in both convolutional nets and RNNs. What are their advantages and limitations? Hinton talks about linear constraints. Does that mean the error surfaces of the constrained variables are the same? If not, wouldn't tying them together prevent each from reaching its own optimal value? Are there other ways of constraining the weights that are commonly used?

Q2) In RNNs, we do back-propagation through time. Does this make RNNs less useful for long sequences, considering all the updates that are required?

Q3) In video 7b, Hinton talks about ways of specifying the targets of an RNN. One is to specify the desired final state; another is to specify attractors, i.e. desired outputs for the states before the final state. In which cases is each of these used, and how do they differ in the path they take on the error surface?


1 Response


  1. Sina Honari March 11, 2013 at 15:43

    Q1) A weight constraint can be thought of as having one variable with two names, or as two variables forced to keep the same value. Since it is technically a single variable, it moves on a single error surface. However, the two copies receive different gradients because they see different inputs, so the gradients are combined (summed or averaged) and the same update is applied to both, which is what keeps their values equal.
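
    As a minimal illustration (my own sketch in numpy, not code from the lecture), here is how two tied weights can be trained: each copy receives its own gradient from its own input, but both are updated with the combined gradient, so they remain equal throughout training.

```python
import numpy as np

# Weight tying: two connections share one weight w. Each connection
# sees a different input and therefore gets a different gradient, but
# the single shared value is updated with the sum of both gradients.

rng = np.random.default_rng(0)
w = 0.5                         # the shared value behind both "copies"
lr = 0.05

for step in range(200):
    x1, x2 = rng.normal(size=2)         # the two copies see different inputs
    target = 1.0 * (x1 + x2)            # data generated with true weight 1.0
    y = w * x1 + w * x2                 # both connections use the same weight
    err = y - target
    g1 = err * x1                       # gradient w.r.t. the first copy
    g2 = err * x2                       # gradient w.r.t. the second copy
    w -= lr * (g1 + g2)                 # one combined update keeps them equal

print(f"learned shared weight: {w:.3f}")  # converges toward 1.0
```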

    Q2) Yes, back-propagating through many time steps makes the network harder to optimize, because the gradients shrink or grow exponentially as they are propagated back (the vanishing/exploding-gradient problem), and techniques such as gated units (e.g. LSTMs) are used to mitigate it.
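
    A toy demonstration of why this happens (my own sketch, using an assumed orthogonal recurrent weight matrix scaled to control its singular values): the backpropagated gradient is multiplied by the recurrent Jacobian once per time step, so its norm changes exponentially with sequence length.

```python
import numpy as np

# Why BPTT struggles over long runs: the backpropagated gradient is
# repeatedly multiplied by the recurrent Jacobian, so its norm shrinks
# or grows exponentially with the number of time steps.

rng = np.random.default_rng(0)
n, T = 20, 50
grad = rng.normal(size=n)                # gradient arriving at the last step

for scale in (0.5, 1.5):                 # singular values of the recurrent weights
    W = scale * np.linalg.qr(rng.normal(size=(n, n)))[0]  # scaled orthogonal matrix
    g = grad.copy()
    for t in range(T):
        g = W.T @ g                      # one step of backprop through time
    print(f"scale={scale}: |grad| after {T} steps = {np.linalg.norm(g):.3e}")
    # scale=0.5 -> the gradient vanishes; scale=1.5 -> it explodes
```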

    Q3) If you have desired values for the intermediate states during training, it's better to use RNNs with attractors to guide the learning algorithm. Otherwise, if you only have the final answer, it is better to train the RNN toward the desired final state.
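
    To make the contrast concrete, here is a hypothetical sketch (the squared-error losses and function names are my own, not from the lecture) of the two ways of specifying targets: a loss on the final state only, versus an attractor-style loss that pulls every state toward its own desired value.

```python
import numpy as np

def final_state_loss(states, target_final):
    # Only the last hidden state is pulled toward a target; earlier
    # steps get gradient only indirectly, through the recurrence.
    return 0.5 * np.sum((states[-1] - target_final) ** 2)

def attractor_loss(states, targets):
    # Every state is pulled toward its own desired value, so gradient
    # signal reaches earlier time steps directly.
    return 0.5 * sum(np.sum((s - t) ** 2) for s, t in zip(states, targets))

# Example: three hidden states of a 2-unit RNN (made-up numbers).
states = [np.array([0.2, 0.1]), np.array([0.6, 0.4]), np.array([0.9, 1.1])]
print(final_state_loss(states, np.array([1.0, 1.0])))
print(attractor_loss(states, [np.array([0.3, 0.3]),
                              np.array([0.6, 0.6]),
                              np.array([1.0, 1.0])]))
```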

