Deep learning

by Giorgio Satta

For the week of March 17th, we will use the family of Recurrent Neural Networks (RNNs) to implement language models.

For those of you who are attending Deep Learning this semester: RNNs will only be covered later, in April I think; you therefore need to go through the equations for these models on your own before March 17th. You can either look at the Wiki page on RNNs, or else ask the professor of the DL course to share with you, ahead of time, the slides that will later be used for the RNN lecture.
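As a quick reference, here is a sketch of the equations of a simple RNN language model, assuming the standard Elman formulation; the notation for the weight matrices is just one common convention, not necessarily the one that will be used in the DL course:

$$h_t = \tanh(W_{xh}\, x_t + W_{hh}\, h_{t-1} + b_h), \qquad \hat{y}_t = \operatorname{softmax}(W_{hy}\, h_t + b_y)$$

Here x_t is the embedding of the word at position t, h_t is the hidden state carried over from one position to the next, and \hat{y}_t is the predicted probability distribution over the next word.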

In reply to Giorgio Satta

Re: Deep learning

by Giorgio Satta

Starting on March 19th we will be using the neural network architecture called Transformer for several NLP tasks.

Those of you who are taking Deep Learning this year may not have heard about it yet at this point. I therefore recommend that you spend at least one hour on this architecture, by looking at any of the tutorials you can find on the web. You should especially focus on the notion of attention.
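If it helps to make the notion concrete, below is a minimal sketch of scaled dot-product attention, the core operation inside the Transformer, written in plain NumPy; all names and shapes here are illustrative assumptions, not taken from any specific library or from the course material.

import numpy as np

def softmax(x, axis=-1):
    # subtract the max for numerical stability before exponentiating
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K: (sequence_length, d_k); V: (sequence_length, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of every query with every key
    weights = softmax(scores, axis=-1)   # attention weights, each row sums to 1
    return weights @ V                   # weighted sum of the values

# toy self-attention example: queries, keys and values all come from the same input
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
print(scaled_dot_product_attention(x, x, x).shape)   # (4, 8)

In self-attention, each position of the sequence builds its output as a weighted combination of all positions, with the weights given by the softmax of the query-key similarities; this is the mechanism you should try to understand before March 19th.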

Two remarkable animated videos on the Transformer are also available at the following links: