In the Week 4 programming assignment of the first course, I am facing a problem in my code. I can't figure out what I am doing wrong. Please help.
Please add the following lines to linear_forward
to debug the issue:
print(f'W.shape = {W.shape}')
print(f'A.shape = {A.shape}')
print(f'b.shape = {b.shape}')
Keep in mind that for the dot product of two matrices, the inner dimensions must match, i.e. (m, n) · (n, p) = (m, p).
For addition, ensure that the dimensions of both operands are compatible (numpy broadcasting).
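For example, here is a small standalone numpy sketch (the shapes are made up for illustration, not taken from the assignment's test case) showing how the shapes have to line up in a computation like Z = np.dot(W, A) + b:

import numpy as np

# Hypothetical shapes, purely for illustration:
# W is (n_curr, n_prev), A is (n_prev, m), b is (n_curr, 1).
W = np.random.randn(4, 3)    # (4, 3)
A = np.random.randn(3, 5)    # (3, 5)
b = np.random.randn(4, 1)    # (4, 1)

# Inner dimensions match: (4, 3) dot (3, 5) -> (4, 5)
Z = np.dot(W, A) + b         # b is (4, 1) and broadcasts across the 5 columns

print(Z.shape)               # (4, 5)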
All the dimensions are okay except the last one. What could be the reason?
The error is in your L_model_forward function, in the sigmoid case (the output layer). The input to that layer is the output of the last hidden layer, and that is not A_prev. Your parameters are also wrong: you have to grab them from the dictionary named parameters.
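To make that concrete, here is a minimal standalone sketch of the forward pass, [LINEAR -> RELU] x (L-1) followed by LINEAR -> SIGMOID. It is not the notebook's code (it skips the caches and the linear_activation_forward helper, and the names l_model_forward_sketch, sigmoid, and relu are my own), but it illustrates the two points above: the output layer consumes A, the activation of the last hidden layer, and pulls its W and b out of the parameters dictionary.

import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def relu(Z):
    return np.maximum(0, Z)

def l_model_forward_sketch(X, parameters):
    # Simplified forward pass; cache bookkeeping is omitted to keep the sketch short.
    A = X
    L = len(parameters) // 2                 # parameters holds W1..WL and b1..bL
    for l in range(1, L):                    # hidden layers: LINEAR -> RELU
        A_prev = A
        A = relu(np.dot(parameters['W' + str(l)], A_prev) + parameters['b' + str(l)])
    # Output layer: its input is A, the output of the LAST hidden layer (not A_prev),
    # and its weights come from the parameters dictionary at index L.
    AL = sigmoid(np.dot(parameters['W' + str(L)], A) + parameters['b' + str(L)])
    return AL

# Tiny usage example with made-up layer sizes (3 -> 4 -> 1):
parameters = {
    'W1': np.random.randn(4, 3), 'b1': np.zeros((4, 1)),
    'W2': np.random.randn(1, 4), 'b2': np.zeros((1, 1)),
}
AL = l_model_forward_sketch(np.random.randn(3, 5), parameters)
print(AL.shape)   # (1, 5): one output per example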
Here’s a thread that walks you through the dimensional analysis for that test case. It will help you see more of what Saif is talking about. It looks like your code for handling the output layer is wrong.