Stanford CS224N NLP with Deep Learning | 2023 | Lecture 9 - Pretraining

Stanford Online
For more information about Stanford's Artificial Intelligence professional and graduate programs visit: https://stanford.io/ai

This lecture covers:
1. A brief note on subword modeling
2. Motivating model pretraining from word embeddings
3. Model pretraining three ways:
   a. Decoders
   b. Encoders
   c. Encoder-Decoders
4. Interlude: what do we think pretraining is teaching?
5. Very large models and in-context learning

To learn more about this course visit: https://online.stanford.edu/courses/c...
To follow along with the course schedule and syllabus visit: http://web.stanford.edu/class/cs224n/

John Hewitt
https://nlp.stanford.edu/~johnhew/

Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)

#naturallanguageprocessing #deeplearning