I’m working on the Course 1, Week 2 "Logistic Regression with a Neural Network Mindset" assignment, but I can’t find a way to compute J using a dot product.
I managed to get J like this:
{moderator edit - solution code removed}
But I can’t find a way to get J using np.dot. Can anyone help me with this?
The definition of the dot product of two vectors is that it’s the sum of the products of the corresponding elements of the vectors, so that is exactly what you need here. Doing it that way is more efficient, since it’s one vector operation instead of two separate ones (an elementwise multiply followed by a sum). But as Prof Ng explained in the lectures, we are using 2 dimensional arrays here, and the vectors A and Y in this case are both of dimension 1 x m. The rule for dot products is that the “inner” dimensions need to match. So you need to transpose the second operand so that the dot product becomes 1 x m dotted with m x 1, which gives you a 1 x 1 or scalar output, which is what we need here.
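Here’s a sketch of the idea with made-up values for A and Y (not the assignment’s actual solution code, which has been removed above): the elementwise-multiply-and-sum version and the `np.dot` version with a transposed second operand give the same cost.

```python
import numpy as np

m = 4
Y = np.array([[1, 0, 1, 1]])           # example labels, shape (1, m)
A = np.array([[0.9, 0.2, 0.8, 0.6]])   # example activations, shape (1, m)

# Elementwise version: two separate vector operations (multiply, then sum)
cost_elementwise = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m

# Dot-product version: transpose the second operand so that
# (1, m) dotted with (m, 1) yields a (1, 1) result
cost_dot = -(np.dot(np.log(A), Y.T) + np.dot(np.log(1 - A), (1 - Y).T)) / m
cost_dot = float(np.squeeze(cost_dot))  # reduce the (1, 1) array to a scalar

print(cost_elementwise, cost_dot)
```

Both compute the same cross-entropy cost; the `np.dot` form folds the multiply and the sum into a single operation.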
Here’s a thread which gives some concrete examples of how to do this.