C1_W2_Programming_assignment (Logistic Regression with a Neural Network Mindset)

[Screenshot of the original error]

I’ve been trying to fix this…but I’m stuck.

You have to transpose one term in cost.

The error changes to this…
[Screenshot of the new error]

And when I remove the transpose in dw, I get this…
[Screenshot of the resulting error]

At this point, I don’t understand what I’m doing. I’m just trying to make the error go away by trial and error.
Please help me understand.

Using * (multiply) is not the same as np.dot.

Right! Please see this thread for more about when to use * and when to use np.dot.

Also note that once you fix this, you’ll probably have other issues to deal with. Maybe you have already fixed them, but the code you show for the cost above is incomplete even after the transpose issue is fixed. If you get the wrong cost value, please compare your code to the math formula for the cost shown in the instructions.
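To make the difference concrete, here is a minimal sketch (the values of Y and A are made up for illustration, and both are assumed to be row vectors of shape (1, m) as in the assignment). It shows that `*` multiplies element by element and keeps the (1, m) shape, so you still need a sum, while np.dot does the multiply and the sum in one step. The cross-entropy cost from the instructions is J = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log a^{(i)} + (1 - y^{(i)})\log(1 - a^{(i)})\right].

```python
import numpy as np

# Minimal sketch (made-up values): Y and A are assumed to be row vectors of shape (1, m).
Y = np.array([[1., 0., 1., 1.]])            # labels, shape (1, 4)
A = np.array([[0.9, 0.2, 0.7, 0.6]])        # activations, shape (1, 4)
m = Y.shape[1]

# Element-wise multiply keeps the (1, m) shape; you still need np.sum to reduce it.
elementwise = Y * np.log(A) + (1 - Y) * np.log(1 - A)    # shape (1, 4)
cost_elementwise = -np.sum(elementwise) / m

# np.dot of a (1, m) row with an (m, 1) column does the multiply AND the sum in one step.
cost_dot = -(np.dot(Y, np.log(A).T) + np.dot(1 - Y, np.log(1 - A).T)) / m   # shape (1, 1)

print(cost_elementwise, cost_dot.item())    # same number either way
```

Either route is fine, as long as the result is actually reduced to a scalar and the shapes of the operands line up.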

I changed my cost function and now the error is this…

[Screenshot: ex5_error]

When I used the dot product, I got wrong values.

I’ve DM’d you.

Thank you for sending me your code. As Paul and I mentioned earlier, using * is not the same as using np.dot. Please read the thread Paul has linked for you and avoid making the same mistake again.

Thank you for your patience…
I have edited it according to the notation he describes.
Before I added the np.sum, it was giving me a 4×4 matrix. I thought that the result of a dot operation is a number, not a matrix.
How, then, am I getting a matrix?

When I put np.sum on the numerator
[Screenshot: ex5_2ndcode]
it gives me a number, but it is the wrong value.

When you do a dot product between two vectors or matrices, the order matters: matrix multiplication is not a commutative operation in general. In this case you have two vectors of dimension 1 \times m. You did Y^T \cdot \log(A), which is m \times 1 dotted with 1 \times m, so you get an m \times m output, right? But you want a scalar value.
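To see those shapes in code, here is a small sketch (array values are made up for illustration):

```python
import numpy as np

# Sketch of the shape issue described above; Y and A are assumed to be (1, m) row vectors.
Y = np.array([[1., 0., 1., 1.]])             # shape (1, 4)
A = np.array([[0.9, 0.2, 0.7, 0.6]])         # shape (1, 4)

outer = np.dot(Y.T, np.log(A))               # (4, 1) dot (1, 4) -> (4, 4) matrix
inner = np.dot(Y, np.log(A).T)               # (1, 4) dot (4, 1) -> (1, 1) scalar

print(outer.shape, inner.shape)              # (4, 4) (1, 1)

# Summing the (4, 4) matrix adds every cross term y_i * log(a_j), not just the
# matched i == j terms, so np.sum(outer) is a number but not the one the cost needs.
print(np.sum(outer), inner.item(), np.sum(Y * np.log(A)))
```

The last two numbers agree; the first one does not, which is likely why adding np.sum gave you a number with the wrong value.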

Here’s a thread which shows an example of why your method is incorrect and shows the way to understand the correct method.

Thank you so much…