Creating word embeddings

Hi, I like the course so far and am practicing the concepts by applying them to radiology texts. In the assignment, word embeddings were provided for us to create our vector space models, but I am less clear on how to build the word embeddings themselves. Would this be done through the Word by Word Design or the Word by Document Design? Is there a more concrete example of how to do this over a large library of documents?
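
For context, this rough sketch is what I understand the Word by Word Design to mean: counting co-occurrences within a fixed window. The window size and the tiny corpus here are just placeholders I made up, not from the assignment:

```python
from collections import defaultdict

# Toy example sentences; in practice this would be a large library of reports.
corpus = [
    "no acute cardiopulmonary abnormality",
    "mild cardiomegaly without acute abnormality",
]

window = 2  # co-occurrence window size; an arbitrary choice for illustration
cooc = defaultdict(lambda: defaultdict(int))

for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        # Count each neighbour within `window` positions on either side.
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                cooc[word][tokens[j]] += 1

# Each row of `cooc` is the word-by-word vector for that word.
print(dict(cooc["acute"]))
```

As I understand it, the Word by Document Design would instead count how often each word appears in each document, so the columns index documents rather than neighbouring words.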

Thanks!


I am pretty sure this will be shown to you as you progress through the Specialization. There are models such as word2vec and GloVe that have already been developed and are ready to use. GloVe was developed at Stanford, and there are others too; instead of building the embedding model yourself, you can use one off the shelf. As far as I remember, this topic is covered later in the course as well.
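
For example, assuming you have gensim installed, you can pull down a set of pre-trained GloVe vectors; the model name below is one of gensim's pre-packaged downloads, not something specific to this course:

```python
import gensim.downloader as api

# Download pre-trained GloVe vectors (trained on Wikipedia + Gigaword).
glove = api.load("glove-wiki-gigaword-100")

# Every word in the vocabulary maps to a 100-dimensional vector.
print(glove["radiology"][:5])

# Nearest neighbours in the embedding space.
print(glove.most_similar("chest", topn=5))
```

gensim also lets you train word2vec on your own corpus (`gensim.models.Word2Vec`) if the pre-trained vectors don't cover your radiology vocabulary well.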
