Neural Language Models
Episode Synopsis
In this module, we'll take a look at neural-network-based language models which, unlike the N-gram language models covered earlier, represent their context with word embeddings. This lets them make much better probabilistic predictions about the next word in a sequence, and it has made them the foundation of the large pre-trained language models, such as ChatGPT, that have driven exciting innovations in NLP.
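To make the idea concrete, here is a minimal sketch of a fixed-window neural language model in PyTorch, in the style of the classic feed-forward architecture of Bengio et al. The class name, hyperparameters (vocabulary size, embedding and hidden dimensions, window size), and the random example context are all illustrative assumptions, not details from the episode:

```python
# Minimal fixed-window neural language model sketch (hypothetical example;
# all names and sizes are illustrative, not from the episode).
import torch
import torch.nn as nn

class NeuralLM(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int,
                 context_size: int, hidden_dim: int):
        super().__init__()
        # Each context word is mapped to a dense embedding vector.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # The concatenated context embeddings feed a hidden layer...
        self.hidden = nn.Linear(context_size * embed_dim, hidden_dim)
        # ...which projects to a score for every word in the vocabulary.
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context_ids: torch.Tensor) -> torch.Tensor:
        # context_ids: (batch, context_size) integer word indices
        e = self.embed(context_ids)            # (batch, context_size, embed_dim)
        e = e.view(e.size(0), -1)              # concatenate the window
        h = torch.tanh(self.hidden(e))
        return self.out(h)                     # unnormalized next-word logits

# Example: a distribution over the next word given a 3-word context.
model = NeuralLM(vocab_size=10_000, embed_dim=64, context_size=3, hidden_dim=128)
context = torch.randint(0, 10_000, (1, 3))     # one random 3-word context
probs = torch.softmax(model(context), dim=-1)  # P(next word | context)
print(probs.shape)                             # torch.Size([1, 10000])
```

The key contrast with an N-gram model is visible in the embedding layer: because similar words get similar vectors, the model can generalize to contexts it has never seen verbatim, rather than relying on exact N-gram counts.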
More episodes of the podcast Natural Language Generation
Practice Exam Review (26/04/2025)
Final Exam Review (17/04/2025)
Logical Representations of Sentence Meaning, Semantic Role Labeling & Information Extraction (03/04/2025)
Parsing and Dependency Parsing (31/03/2025)
Machine Translation (31/03/2025)
Encoder-Decoders, BERT and Fine-tuning (17/03/2025)
Transformers and Neural Text Generation (02/03/2025)
Parts of Speech & Grammars (28/02/2025)