Here is the error I got.
It seems the error is in sigmoid_backward(); how should I solve this?
One way I have in mind is to go into sigmoid_backward()'s definition and wrap -Z in float(), but that is obviously not how I am supposed to fix it.
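For reference, the definition I'm looking at is roughly this (paraphrased from the helper file, so details may differ):

```python
import numpy as np

def sigmoid_backward(dA, cache):
    # The activation cache is expected to hold Z from the forward pass.
    Z = cache
    s = 1 / (1 + np.exp(-Z))   # -Z raises a TypeError when Z is a string like "sigmoid"
    dZ = dA * s * (1 - s)      # dZ = dA * sigma'(Z)
    return dZ
```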
Why do you pass "activation" as the 2nd parameter to sigmoid_backward()?
Per the docstring: "activation – the activation to be used in this layer, stored as a text string: 'sigmoid' or 'relu'"
That is just a label. You need to pass the appropriate cache (the activation cache) as the 2nd parameter instead.
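For example, if the forward pass saved cache = (linear_cache, activation_cache), the backward step would look roughly like this (a sketch assuming the assignment's relu_backward() and linear_backward() helpers are in scope):

```python
def linear_activation_backward(dA, cache, activation):
    # The cache saved during the forward pass holds TWO values.
    linear_cache, activation_cache = cache

    if activation == "sigmoid":
        # Pass the activation cache (which holds Z), not the "activation" string.
        dZ = sigmoid_backward(dA, activation_cache)
    elif activation == "relu":
        dZ = relu_backward(dA, activation_cache)

    dA_prev, dW, db = linear_backward(dZ, linear_cache)
    return dA_prev, dW, db
```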
I didn't notice that the cache holds two values. Thanks for the reminder!