I am facing an issue at the lines of code below:
A1, cache1 = linear_activation_forward(X, W1, b1, activation="relu")
A2, cache2 = linear_activation_forward(X, W2, b2, activation="sigmoid")
I have uploaded the error image for your perusal.
Hi @hark99, in a two-layer neural network, A2 is fed by A1, not by X. So take a close look at your input parameters in the A2, cache2 step.
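Here is a minimal sketch of the corrected forward pass. Note that `linear_activation_forward` below is a simplified stand-in for the course helper (the real one lives in the assignment files), and the layer shapes are hypothetical, chosen just to make the example runnable:

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def relu(Z):
    return np.maximum(0, Z)

def linear_activation_forward(A_prev, W, b, activation):
    # Simplified stand-in: linear step Z = W·A_prev + b, then activation.
    Z = W @ A_prev + b
    A = relu(Z) if activation == "relu" else sigmoid(Z)
    cache = (A_prev, W, b, Z)
    return A, cache

# Hypothetical shapes: 3 input features, 4 hidden units, 1 output, 5 examples.
np.random.seed(0)
X = np.random.randn(3, 5)
W1, b1 = np.random.randn(4, 3), np.zeros((4, 1))
W2, b2 = np.random.randn(1, 4), np.zeros((1, 1))

A1, cache1 = linear_activation_forward(X, W1, b1, activation="relu")
A2, cache2 = linear_activation_forward(A1, W2, b2, activation="sigmoid")  # A1, not X

print(A1.shape)  # (4, 5)
print(A2.shape)  # (1, 5)
```

Passing `X` into the second layer fails (or silently misbehaves) because `W2` expects an input with as many rows as layer 1 has units, not as many rows as the raw features.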
The previous layer's activation becomes the input. Thanks!