Course 1 Week 4 last assignment

Hello,
I wonder why I'm getting this kind of error:

```
in two_layer_model(X, Y, layers_dims, learning_rate, num_iterations, print_cost)
     92     # YOUR CODE STARTS HERE
     93
---> 94     parameters = update_parameters(parameters, grads, learning_rate)
     95
     96

~/work/release/W4A2/dnn_app_utils_v3.py in update_parameters(parameters, grads, learning_rate)
    378     # Update rule for each parameter. Use a for loop.
    379     for l in range(L):
--> 380         parameters["W" + str(l+1)] = parameters["W" + str(l+1)] - learning_rate * grads["dW" + str(l+1)]
    381         parameters["b" + str(l+1)] = parameters["b" + str(l+1)] - learning_rate * grads["db" + str(l+1)]
    382

ValueError: operands could not be broadcast together with shapes (7,12288) (1,7)
```

From the general broadcasting rule, (m, n) + (1, n) = (m, n) and (m, n) + (m, 1) = (m, n), it seems that in order to satisfy the rule we would need to transpose the parameters (7, 12288).
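Here is a minimal NumPy sketch that reproduces the failure with the shapes from the traceback (the 0.0075 learning rate is just a placeholder):

```python
import numpy as np

W1 = np.zeros((7, 12288))    # shape of W1 in the two-layer model
dW1_bad = np.zeros((1, 7))   # the mis-shaped gradient from the error

try:
    W1 - 0.0075 * dW1_bad    # same operation as line 380 above
except ValueError as e:
    print(e)  # operands could not be broadcast together with shapes (7,12288) (1,7)

# With a gradient of the same shape as W1 the update is element-wise:
dW1_ok = np.zeros((7, 12288))
print((W1 - 0.0075 * dW1_ok).shape)  # (7, 12288)
```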

I tried that, but it's not working.

Is there anything I'm missing?

Yes, what you are missing is that the gradient of an object should be the same shape as the base object. The line of code in question would not be a problem if the shape of dW1 were correct.
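One quick way to confirm this (just a debugging sketch, not part of the graded code; it assumes the notebook's usual `L = len(parameters) // 2` convention) is to check every gradient against its parameter before calling `update_parameters`:

```python
# Debugging sketch: every gradient must match its parameter's shape.
L = len(parameters) // 2  # number of layers
for l in range(1, L + 1):
    for p, g in (("W", "dW"), ("b", "db")):
        assert grads[g + str(l)].shape == parameters[p + str(l)].shape, \
            f"{g}{l} has shape {grads[g + str(l)].shape}, " \
            f"expected {parameters[p + str(l)].shape}"
```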

If you look at the shape of W1 in the “two layer” case, it is 7 x 12288. So the question is: why did the shape of dW1 end up as 1 x 7, when it should also have been 7 x 12288?
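For reference, the standard backward-propagation formula gives dW1 the correct shape by construction. A minimal sketch with illustrative dimensions (n_x = 12288 and n_h = 7 match the shapes above; m = 209 is just an example batch size):

```python
import numpy as np

# Illustrative dimensions: n_x input features, n_h hidden units, m examples
m, n_x, n_h = 209, 12288, 7
A0 = np.random.randn(n_x, m)   # A0 = X, the input activations
dZ1 = np.random.randn(n_h, m)  # gradient of the cost w.r.t. Z1

# dW1 = (1/m) * dZ1 . A0^T, which has the same shape as W1 by construction
dW1 = (1.0 / m) * np.dot(dZ1, A0.T)
print(dW1.shape)  # (7, 12288)
```

As an aside, (1, 7) happens to be exactly the shape that W2 (and hence dW2) has in this network, so one guess is that the gradients for the two layers got mixed up somewhere in your backward propagation.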