"Vector Space Semantics"
Episode Synopsis
In this module, we'll begin to explore vector space semantics in natural language processing. (This will continue into next week.) Vector space semantics is powerful because it represents words in a way that lets us measure similarity between words and capture several other kinds of meaning. We'll start this module by exploring important concepts that underpin the topic, like the distributional hypothesis and term-by-document matrices, and then switch to a more recent approach to vector space models: word embeddings.
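To make the idea concrete, here is a minimal sketch of a term-by-document matrix and a cosine-similarity comparison. The tiny corpus counts below are hypothetical, invented only for illustration; the point is that words appearing in similar documents end up with similar column profiles, and cosine similarity quantifies that.

```python
import numpy as np

# Hypothetical term-by-document counts: rows are words, columns are documents.
vocab = ["battle", "soldier", "fool", "clown"]
counts = np.array([
    [1, 0, 7, 13],   # battle
    [2, 0, 12, 36],  # soldier
    [36, 58, 1, 5],  # fool
    [20, 15, 2, 0],  # clown
], dtype=float)

def cosine(u, v):
    """Cosine similarity between two vectors: 1.0 means identical direction."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def word_vec(word):
    """Row of the term-by-document matrix for a word."""
    return counts[vocab.index(word)]

# "fool" and "clown" co-occur in the same documents, so their vectors align;
# "fool" and "battle" do not, so their similarity is much lower.
sim_fool_clown = cosine(word_vec("fool"), word_vec("clown"))
sim_fool_battle = cosine(word_vec("fool"), word_vec("battle"))
print(sim_fool_clown, sim_fool_battle)
```

This is the distributional hypothesis in miniature: a word's meaning is approximated by the contexts (here, documents) it appears in. Word embeddings, covered later in the module, replace these sparse count vectors with dense learned ones.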
More episodes of the podcast Natural Language Generation
Practice Exam Review
26/04/2025
Final Exam Review
17/04/2025
Logical Representations of Sentence Meaning, Semantic Role Labeling & Information Extraction
03/04/2025
Parsing and Dependency Parsing
31/03/2025
Machine Translation
31/03/2025
Encoder-Decoders, BERT and Fine-tuning
17/03/2025
Transformers and Neural Text Generation
02/03/2025
Parts of Speech & Grammars
28/02/2025