Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 8 – Translation, Seq2Seq, Attention

Stanford Online
For more information about Stanford’s Artificial Intelligence professional and graduate programs, visit: https://stanford.io/3Cbvt8s

Professor Christopher Manning & PhD Candidate Abigail See, Stanford University
http://onlinehub.stanford.edu/

Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)

To follow along with the course schedule and syllabus, visit: http://web.stanford.edu/class/cs224n/...

0:00 Introduction
1:07 Overview
2:46 1950s: Early Machine Translation
5:02 1990s-2010s: Statistical Machine Translation
8:51 What is alignment?
9:23 Alignment is complex
11:27 Learning alignment for SMT
12:36 Decoding for SMT
17:28 What is Neural Machine Translation?
20:54 Sequence-to-sequence is versatile!
27:56 Training a Neural Machine Translation system
32:05 Exhaustive search decoding
35:14 Beam search decoding: example
37:50 Beam search decoding: stopping criterion
39:25 Beam search decoding: finishing up
44:41 Disadvantages of NMT?
46:52 How do we evaluate Machine Translation?
50:32 MT progress over time
51:34 NMT: the biggest success story of NLP Deep Learning
53:31 So is Machine Translation solved?
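The 32:05–44:41 chapters step through beam search decoding (the example, the stopping criterion, and finishing up). As a rough illustration only, not the lecture's code, here is a minimal sketch of the technique: keep the k highest-scoring partial hypotheses at each step, set hypotheses aside when they emit the end token, and length-normalize before picking a winner. The toy hard-coded next-token table is an assumption made up for this example.

```python
import math

def beam_search(start, next_logprobs, beam_size, max_len, eos):
    """Keep the beam_size best partial hypotheses at each step;
    a hypothesis is finished once it emits the end token `eos`."""
    beams = [([start], 0.0)]          # (token sequence, total log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, lp in next_logprobs(seq):
                candidates.append((seq + [tok], score + lp))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_size]:
            if seq[-1] == eos:        # set finished hypotheses aside
                finished.append((seq, score))
            else:
                beams.append((seq, score))
        if not beams:                 # stopping criterion: beam exhausted
            break
    if not finished:
        finished = beams
    # finishing up: length-normalize so longer outputs are not penalized
    return max(finished, key=lambda c: c[1] / (len(c[0]) - 1))

# Hypothetical hard-coded "model" for illustration only.
TABLE = {
    ("<s>",): [("he", math.log(0.6)), ("she", math.log(0.4))],
    ("<s>", "he"): [("hit", math.log(0.7)), ("</s>", math.log(0.3))],
    ("<s>", "she"): [("ran", math.log(0.9)), ("</s>", math.log(0.1))],
    ("<s>", "he", "hit"): [("</s>", math.log(1.0))],
    ("<s>", "she", "ran"): [("</s>", math.log(1.0))],
}

def next_logprobs(seq):
    return TABLE.get(tuple(seq), [("</s>", 0.0)])

best, score = beam_search("<s>", next_logprobs, beam_size=2, max_len=5, eos="</s>")
print(best)   # highest length-normalized log-probability hypothesis
```

With beam_size=2 this keeps both "he …" and "she …" continuations alive at each step, unlike greedy decoding, which commits to the single best token immediately.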
Published on 1398/02/12 (Solar Hijri; 2019-05-02) · 120,375 views