Week 6 – Lecture: CNN applications, RNN, and attention

LECTURE Part A:
We discussed three applications of convolutional neural networks. We started with digit recognition and its application to 5-digit zip code recognition. For object detection, we discussed how a multi-scale architecture can be used in a face detection setting. Lastly, we saw how ConvNets are used in semantic segmentation tasks, with concrete examples from a robotic vision system and object segmentation in an urban environment.
0:00:43 – Word-level training with minimal supervision
0:20:41 – Face Detection and Semantic Segmentation
0:27:49 – ConvNet for Long Range Adaptive Robot Vision and Scene Parsing
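The digit-recognition pipeline discussed in Part A can be sketched as a small LeNet-style ConvNet in PyTorch. This is an illustrative sketch, not the exact architecture from the lecture: the layer sizes, the `DigitNet` name, and the 28×28 input are assumptions.

```python
import torch
import torch.nn as nn

# A minimal LeNet-style ConvNet for single-digit recognition.
# Layer sizes are illustrative, not the lecture's exact architecture.
class DigitNet(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 28x28 -> 24x24
            nn.ReLU(),
            nn.MaxPool2d(2),                  # -> 12x12
            nn.Conv2d(6, 16, kernel_size=5),  # -> 8x8
            nn.ReLU(),
            nn.MaxPool2d(2),                  # -> 4x4
        )
        self.classifier = nn.Linear(16 * 4 * 4, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))  # one logit per digit class

net = DigitNet()
logits = net(torch.randn(8, 1, 28, 28))  # batch of 8 grayscale 28x28 digits
print(logits.shape)  # torch.Size([8, 10])
```

Zip-code recognition then amounts to sliding such a recognizer over the image (or applying it convolutionally at multiple scales) and combining the per-digit outputs.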

LECTURE Part B:
We examined Recurrent Neural Networks, their problems, and common techniques for mitigating these issues. We then reviewed a variety of modules developed to resolve RNN issues, including attention, gated recurrent units (GRUs), long short-term memory (LSTM) networks, and Seq2Seq models.
0:43:40 – Recurrent Neural Networks and Attention Mechanisms
0:59:09 – GRUs, LSTMs, and Seq2Seq Models
1:16:15 – Memory Networks
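The RNN-with-attention idea from Part B can be sketched as a GRU encoder–decoder where, at each decoding step, the decoder state attends over all encoder states via dot-product scores. This is a minimal sketch under assumed dimensions; the `Seq2SeqAttn` name, vocabulary size, and dot-product scoring (rather than any specific variant from the lecture) are illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Sketch: GRU seq2seq with dot-product attention over encoder states.
class Seq2SeqAttn(nn.Module):
    def __init__(self, vocab=100, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRUCell(hidden, hidden)
        self.out = nn.Linear(2 * hidden, vocab)

    def forward(self, src, tgt):
        enc, h = self.encoder(self.embed(src))  # enc: (B, S, H)
        h = h.squeeze(0)                        # final state: (B, H)
        logits = []
        for t in range(tgt.size(1)):
            h = self.decoder(self.embed(tgt[:, t]), h)
            # attention: score each encoder state against the decoder state
            scores = torch.bmm(enc, h.unsqueeze(2)).squeeze(2)        # (B, S)
            weights = F.softmax(scores, dim=1)
            context = torch.bmm(weights.unsqueeze(1), enc).squeeze(1)  # (B, H)
            logits.append(self.out(torch.cat([h, context], dim=1)))
        return torch.stack(logits, dim=1)       # (B, T, vocab)

model = Seq2SeqAttn()
src = torch.randint(0, 100, (4, 7))  # batch of 4 source sequences, length 7
tgt = torch.randint(0, 100, (4, 5))  # target sequences (teacher forcing), length 5
print(model(src, tgt).shape)  # torch.Size([4, 5, 100])
```

Swapping `nn.GRU`/`nn.GRUCell` for their LSTM counterparts gives the LSTM variant; the attention step is what lets the decoder bypass the fixed-size bottleneck of a plain Seq2Seq model.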
