Word Embeddings - A simple introduction to word2vec
Episode Synopsis
Hey guys, welcome to another episode on word embeddings! In this episode we talk about another popular word embedding technique known as word2vec. We use word2vec to capture contextual meaning in our vector representations. I've found a useful reading on word2vec — do read it for an in-depth explanation.
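To give you an intuition for the "contextual" part: word2vec's skip-gram variant trains each word to predict the words in a small window around it, so words used in similar contexts end up with similar vectors. Here's a minimal sketch of how those (center, context) training pairs are generated — the corpus and window size are just illustrative choices, not from any particular library.

```python
def skipgram_pairs(tokens, window=2):
    """Build (center, context) pairs the way skip-gram word2vec does:
    each word is paired with every neighbour within `window` positions."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # a word is not its own context
                pairs.append((center, tokens[j]))
    return pairs

tokens = "the cat sat on the mat".split()
pairs = skipgram_pairs(tokens, window=1)
# "cat" is paired with its neighbours "the" and "sat" —
# these pairs are what the neural network actually trains on
```

A real word2vec model (for example `gensim.models.Word2Vec`) then learns a dense vector for each word by predicting these context words, which is where the contextual meaning gets baked in.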
p.s. Sorry for always posting episodes after a significant delay — I'm learning various things myself, I have different blogs to handle, and multiple projects in progress, so my daily schedule is pretty packed. I hope you all get some value from my podcasts and that they help you build an intuitive understanding of these topics.
See you in the next podcast episode!
More episodes of the podcast Code Logic
Collocations, Part Two (S3E2)
20/01/2022
Collocations, Part One (S3E1)
03/01/2022
Bag of Words in Natural Language Processing
09/10/2020
Lemmatization in Natural Language Processing
23/09/2020
Stemming in Natural Language Processing
17/09/2020
Tokenization in Natural Language Processing
14/09/2020
Data Cleaning in Natural Language Processing
13/09/2020