Listen "Vector Space Models"
Episode Synopsis
This week, we continue our exploration of vector space semantics and embeddings. We'll begin the module by wrapping up word embeddings and discussing bias in vector space models. Then we'll discuss six goals that any representation of word meaning should aim to achieve; these goals will help us understand different aspects of word meaning and the relationships between words. Finally, we'll pivot to a coding demo that gives you hands-on experience working with vector space models and shows how word embeddings can be used to retrieve words with similar meanings and to solve word analogy tasks.
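To make the two demo tasks concrete, here is a minimal sketch of similarity retrieval and analogy solving in plain NumPy. The tiny hand-built 3-dimensional vectors are hypothetical, chosen only so the arithmetic comes out cleanly; an actual demo would load pre-trained embeddings such as word2vec or GloVe, but the retrieval and analogy logic is the same.

```python
import numpy as np

# Hypothetical hand-built 3-d embeddings; a real demo would load
# pre-trained vectors (e.g. word2vec or GloVe) instead.
vectors = {
    "man":   np.array([1.0, 0.0, 0.0]),
    "woman": np.array([0.0, 1.0, 0.0]),
    "king":  np.array([1.0, 0.0, 1.0]),
    "queen": np.array([0.0, 1.0, 1.0]),
}

def cosine(u, v):
    # Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def most_similar(query, exclude=()):
    # Rank every vocabulary word by cosine similarity to the query vector.
    candidates = {w: cosine(query, v) for w, v in vectors.items()
                  if w not in exclude}
    return max(candidates.items(), key=lambda item: item[1])

# Retrieval: find the nearest neighbour of "king".
print(most_similar(vectors["king"], exclude={"king"}))

# Analogy: man is to king as woman is to ?
# Solved with vector arithmetic: king - man + woman.
target = vectors["king"] - vectors["man"] + vectors["woman"]
print(most_similar(target, exclude={"king", "man", "woman"}))
```

Cosine similarity is the usual choice here because it compares vector directions rather than magnitudes, which vary with word frequency.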
More episodes of the podcast Natural Language Generation
Practice Exam Review (26/04/2025)
Final Exam Review (17/04/2025)
Logical Representations of Sentence Meaning, Semantic Role Labeling & Information Extraction (03/04/2025)
Parsing and Dependency Parsing (31/03/2025)
Machine Translation (31/03/2025)
Encoder-Decoders, BERT and Fine-tuning (17/03/2025)
Transformers and Neural Text Generation (02/03/2025)
Parts of Speech & Grammars (28/02/2025)