The embedding matrix is just given to us as an input to the process in that slide. Needless to say, creating that embedding matrix was a very expensive training process, but someone else has already taken care of it for us: we can simply load their trained result and use it. There are a number of pretrained “word embedding” systems, e.g. Word2Vec and GloVe. The first 9 lectures in Week 2 of Course 5 discuss word embeddings, how they are trained, and how to use them. If you missed that, you can go back and review that material.
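If it helps to see it concretely, here is a minimal sketch of loading pretrained GloVe vectors into a word-to-vector dictionary. It assumes you have downloaded one of the standard GloVe files (the 50-dimensional glove.6B.50d.txt is used here just as an example):

```python
import numpy as np

def load_glove_embeddings(path):
    """Return a dict mapping each word to its GloVe vector."""
    word_to_vec = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split()
            word = parts[0]
            word_to_vec[word] = np.array(parts[1:], dtype=np.float64)
    return word_to_vec

# Example usage (assumes the file is in the working directory):
word_to_vec = load_glove_embeddings("glove.6B.50d.txt")
print(word_to_vec["king"].shape)   # (50,) for the 50-dimensional GloVe file
```

That word-to-vector mapping is effectively the embedding matrix that the slide treats as a given input.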
If I am missing your point in my response above, please give us a more specific reference to which lecture (and the time offset) you are asking about.
If that is the question, then it is created by training a word embedding model. Prof Ng covers several techniques for doing that in the lectures in Week 2.
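Just as a toy illustration of that training step (this is not the course's own implementation), here is how you might train embeddings on a tiny corpus with gensim's Word2Vec:

```python
from gensim.models import Word2Vec

# Toy corpus of tokenized sentences; a real corpus would contain millions of sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "chased", "the", "cat"],
]

# vector_size is the embedding dimension (this keyword is for gensim >= 4.0).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=20)

print(model.wv["cat"].shape)   # (50,)
```

Training on a serious corpus is where the real expense comes in, which is why loading a pretrained embedding matrix is usually the practical choice.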