Neural Nets for NLP 2019: Sentence and Contextualized Word Representations

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Goal for Today
  3. Where would we need/use Sentence Representations?
  4. Sentence Classification
  5. Paraphrase Identification (Dolan and Brockett 2005): identify whether sentences A and B mean the same thing
  6. Textual Entailment (Dagan et al. 2006, Marelli et al. 2014)
  7. Model for Sentence Pair Processing
  8. Types of Learning
  9. Plethora of Tasks in NLP
  10. Rule of Thumb 2
  11. Standard Multi-task Learning
  12. Thinking about Multi-tasking, and Pre-trained Representations
  13. General Model Overview
  14. Language Model Transfer
  15. End-to-end vs. Pre-training
  16. Context Prediction Transfer (Skip-thought Vectors) (Kiros et al. 2015)
  17. Paraphrase ID Transfer (Wieting et al. 2015)
  18. Large Scale Paraphrase Data (ParaNMT-50M) (Wieting and Gimpel 2018)
  19. Entailment Transfer (InferSent) (Conneau et al. 2017)
  20. Bi-directional Language Modeling Objective (ELMo) (see the objective written out after this list)
  21. Masked Word Prediction (BERT) (see the sketch after this list)
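The bi-directional language modeling objective named in item 20 has a standard formulation in the ELMo paper (Peters et al. 2018); it is written out here for reference and is not transcribed from the lecture slides. For a sentence of tokens t_1, ..., t_N, a forward and a backward LSTM language model are trained jointly by maximizing:

```latex
\sum_{k=1}^{N} \Big( \log p(t_k \mid t_1, \ldots, t_{k-1}; \overrightarrow{\theta})
                   + \log p(t_k \mid t_{k+1}, \ldots, t_N; \overleftarrow{\theta}) \Big)
```

In the original formulation the two directions have separate LSTM parameters but share the token embeddings and the softmax layer.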
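For item 21, masked word prediction, here is a minimal sketch of the objective in action. It assumes the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint, neither of which is part of this lecture listing; it is an illustration of the technique, not material from the course.

```python
# A minimal sketch of BERT-style masked word prediction, assuming the
# Hugging Face `transformers` library and the `bert-base-uncased`
# checkpoint are available (neither is referenced in the lecture).
from transformers import pipeline

# The fill-mask pipeline loads a pretrained masked language model.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the [MASK] token from both left and right context,
# which distinguishes it from a left-to-right language model.
for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Because the model conditions on context from both directions, the top candidates for the masked slot reflect the whole sentence (here it typically ranks "paris" highest), in contrast to the left-to-right language model transfer covered earlier in the playlist.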
