Explore document-level models, entity coreference, and discourse parsing in this under-1-hour lecture by Graham Neubig, part of CMU's Neural Networks for NLP series.
Learn sentence and contextual word representations in NLP with Graham Neubig's 1-2 hour material, covering multi-task learning, pre-training, and language model transfer.
Explore the concept of attention in Neural Networks for NLP with Graham Neubig's 1-2 hour material, covering improvements, specialized varieties, and a case study.
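The core attention idea covered in this lecture can be sketched in a few lines: a query is compared to each key by dot product, the scores are softmax-normalized, and the resulting weights form a weighted average of the values. The sketch below uses plain Python; the function name and signature are illustrative, not from the course.

```python
import math

def attention(query, keys, values):
    """Minimal dot-product attention sketch (illustrative, not course code).

    Scores each key against the query by dot product, softmax-normalizes
    the scores, and returns the weighted average of the values.
    """
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(values[0])
    return [sum(w * v[d] for w, v in zip(weights, values)) for d in range(dim)]

# Usage: query aligned with the first key pulls the output toward the first value.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[1.0, 0.0], [0.0, 1.0]])
```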
Explore Recurrent Networks, LSTMs, and sentence modeling with Graham Neubig's 1-2 hour material. Gain insights into the strengths and weaknesses of recurrence in NLP.
Explore Neural Networks for NLP with Graham Neubig's material, covering Bag of Words, Convolution applications, and Convolutional Models of Sentence Pairs. Workload: 1-2 hours.
Explore Neural Networks for NLP with Graham Neubig's 1-2 hour material, covering word vectors, skip-grams, CBOW, and advanced methods for word vectors.
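The skip-gram and CBOW formulations mentioned in this blurb differ mainly in the direction of prediction: skip-gram predicts each context word from the center word, while CBOW predicts the center word from its surrounding context. A sketch of how the training pairs are extracted, assuming a tokenized corpus (helper names are my own, not from the course):

```python
def skipgram_pairs(tokens, window=2):
    """Skip-gram: each center word predicts each context word in the window."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def cbow_pairs(tokens, window=2):
    """CBOW: the context window (later averaged) predicts the center word."""
    pairs = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        pairs.append((tuple(context), center))
    return pairs

sent = "the cat sat on the mat".split()
print(skipgram_pairs(sent, window=1)[:3])
# → [('the', 'cat'), ('cat', 'the'), ('cat', 'sat')]
```

In a full implementation these pairs would feed a classifier over embedding vectors; the extraction step above is where the two models diverge.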
Explore neural networks for natural language processing with a focus on knowledge graphs in this under-1-hour lecture by Graham Neubig.
Explore neural nets for NLP with Graham Neubig's material, covering document-level models, language modeling, coreference resolution, and discourse parsing. (1-2 hour workload)
Explore neural models for dialogue response generation, diversity promoting objectives, and personality-infused dialog in this 1-2 hour material by Graham Neubig.
Explore latent random variables in neural networks for NLP with Graham Neubig's concise material, covering discriminative models, VAE objectives, and controllable text generation.
Explore neural semantic parsing with Graham Neubig's online material, covering topics from tree structures of syntax to neural models for semantic role labeling. Workload: under 1 hour.
Explore structured prediction basics in Neural Nets for NLP with Graham Neubig. Learn about types of prediction, sequence labeling, and training structured models in 1-2 hours.
Explore conditioned generation in neural nets for NLP with Graham Neubig's material, covering topics from language models to ensemble distillation. (1-2 hours)
Explore neural networks for NLP with Graham Neubig's online material. Learn to identify training problems, debug decoding, manage the loss function, and prevent overfitting in under an hour.
Explore Recurrent Neural Networks (RNNs) for Natural Language Processing (NLP) with Graham Neubig's online material. Expect 1-2 hours of study covering LSTM structure, mini-batching, and more.
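Mini-batching recurrent networks, one of the topics listed here, requires padding variable-length sequences to a common length and masking the padded positions so they do not contribute to the loss. A minimal sketch (the function name is my own, not from the course):

```python
def pad_batch(seqs, pad_id=0):
    """Pad variable-length token-id sequences to the batch max length.

    Returns the padded batch plus a 0/1 mask marking real (1) versus
    padded (0) positions, so padding can be excluded from the loss.
    """
    max_len = max(len(s) for s in seqs)
    padded = [s + [pad_id] * (max_len - len(s)) for s in seqs]
    mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in seqs]
    return padded, mask

batch, mask = pad_batch([[5, 6, 7], [8]])
# batch → [[5, 6, 7], [8, 0, 0]], mask → [[1, 1, 1], [1, 0, 0]]
```

Frameworks such as PyTorch offer packed-sequence utilities for the same purpose, but the padding-plus-mask pattern above is the underlying idea.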