So, should I get it from the previous lab?

Hey @someone555777,

Yes, you can definitely refer to the previous ungraded labs or the lecture videos for the formulae. But it would be better to note down the formulae and rewrite the code yourself instead of just copying it. This will help you understand how to implement the formulae for back-propagation, which is the motive behind this exercise. I hope this helps.

Cheers,

Elemento

Should I understand the computation of backpropagation in such depth?

Hey @someone555777,

If by “deep understanding” you mean that you are supposed to know the equations by heart, then no. But if you mean that you are supposed to know how to turn the mathematical equations into code, and how to modify them if needed, then yes. Just copying the code won’t help you in any way beyond completing the assignment, but if you play with the equations a little, it will help you understand how the CBOW model works much better. Let us know if this resolves your issue.

Cheers,

Elemento

OK, I understood from the lecture that I should use `np.sum`. But why was it not used in the ungraded labs? And the formulas there don’t say anything about `1_m.T`.

Hey @someone555777,

If you take a look at the **Lab 03: Training CBOW**, you will find that the reason has been specified inside the lab itself. Let me quote it here for your reference:

> Note: these formulas are slightly simplified compared to the ones in the lecture as you’re working on a single training example, whereas the lecture provided the formulas for a batch of examples. In the assignment you’ll be implementing the latter.
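To make the single-example vs. batch difference concrete, here is a minimal sketch (hypothetical shapes and variable names; it only illustrates where the sum and the `1/m` factor come from, not the full assignment code):

```python
import numpy as np

np.random.seed(0)
V, m = 5, 4                   # vocabulary size, batch size (hypothetical)
dZ = np.random.randn(V, m)    # gradient w.r.t. a pre-activation, one column per example

# Single training example (as in the ungraded lab): the bias gradient
# is just that one column; there is nothing to sum or average.
grad_b_single = dZ[:, 0:1]

# Batch of m examples (as in the assignment): sum the per-example
# gradient columns and average with the 1/m factor.
grad_b_batch = (1 / m) * np.sum(dZ, axis=1, keepdims=True)

print(grad_b_batch.shape)     # (5, 1)
```

With a single example the two versions coincide (up to the `1/m` factor with `m = 1`), which is why the lab formulas look simpler than the lecture ones.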

I hope this helps.

Cheers,

Elemento

Yes, it’s a bit confusing, because in all the other notebooks the formulas were written down explicitly. You can find the formulas in the video “Training a CBOW Model: Backpropagation and Gradient Descent” at minute 1:30. Hope that helps.

There are two ways. Another way to get the sum is to multiply by a column vector of ones (`np.ones`).
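A short sketch of that equivalence (hypothetical matrix, one column per example): multiplying a matrix by a column vector of ones dots each row with the ones vector, which is exactly a row-wise sum.

```python
import numpy as np

np.random.seed(1)
dZ = np.random.randn(5, 4)    # hypothetical gradient matrix, one column per example
m = dZ.shape[1]

# Way 1: sum across the batch dimension with np.sum.
s1 = np.sum(dZ, axis=1, keepdims=True)

# Way 2: multiply by a column vector of ones (the "1_m" in the formulas);
# each row of dZ is dotted with the ones vector, summing its entries.
s2 = np.dot(dZ, np.ones((m, 1)))

print(np.allclose(s1, s2))    # True
```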