Week 1 - Doubt in the Math

Hi!
I don’t understand how, when \lambda increases, the norm of the weight matrix is driven toward zero.

Since the regularization term is \frac{\lambda}{2m}\|W\|^2, shouldn’t the weight matrix increase as \lambda increases?
Where am I going wrong?


Did you look at the 2nd assignment for week 1 in course 2? It has all the equations for both forward and backward propagation.

I may not have caught your point fully, but here is my reading of your post.

I suppose what Andrew is talking about is updating the weights via backpropagation with the partial derivatives, dW^{[l]} = \frac{\partial J}{\partial W^{[l]}}.
If we add the L2 regularization term with \lambda, then dW^{[l]} also contains a \frac{\lambda}{m}W^{[l]} term. The update step subtracts \alpha \, dW^{[l]} from the previous weights, so if \lambda increases, the updated weights become smaller.
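
To make that concrete, here is a small numpy sketch (not the assignment code; the weights, gradient, learning rate, and m below are made-up values) showing that the same update applied with a larger \lambda leaves a smaller weight norm:

```python
import numpy as np

# Made-up example values, only to illustrate the effect of lambda on one update.
np.random.seed(0)
W = np.random.randn(3, 3)          # weights of one layer
dW_unreg = np.random.randn(3, 3)   # gradient of the unregularized cost (made up)
m = 100                            # number of training examples
alpha = 0.1                        # learning rate

for lambd in [0.0, 1.0, 10.0, 100.0]:
    # L2 regularization adds (lambd / m) * W to the gradient of the cost
    dW = dW_unreg + (lambd / m) * W
    W_new = W - alpha * dW         # one gradient-descent update
    print(f"lambda = {lambd:6.1f}  ->  ||W_new|| = {np.linalg.norm(W_new):.4f}")
```

Note that the update can be rearranged as W_new = (1 - \frac{\alpha\lambda}{m})W - \alpha \, dW_{unreg}, so a larger \lambda multiplies the old weights by a smaller factor; this is why L2 regularization is also called “weight decay.”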


If you are taking the product of two positive numbers and you increase both of them, what happens to the product? Or suppose you are trying to get the product to stay roughly the same, but one of them is increased. What would you have to do to the other factor in order for the product to remain the same?
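
For reference, the “product” in that hint is the penalty term of the regularized cost (written here in the course’s notation, with the squared Frobenius norms summed over the layers):

J_{regularized} = J_{cross\text{-}entropy} + \frac{\lambda}{2m}\sum_{l}\|W^{[l]}\|_F^2

Gradient descent tries to keep J_{regularized} small, so when \lambda is made larger, the only way to keep the penalty from growing is to make the norms \|W^{[l]}\|_F smaller.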