I am struggling with how to write the for loop for the ReLU activations. My implementation is given below:
for l in range(1, L):
    A_prev = A
    # (≈ 2 lines of code)
    A, cache = linear_activation_forward(A_prev, parameters['W' + str(l)], parameters['b' + str(l)], "relu")
    caches.append(cache)
Hello @ras325
Your code for the ReLU activations is right, but you should recheck your implementation of the sigmoid activation. [Think: how to implement the activation function.]
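For reference, here is a minimal, self-contained sketch of what the two activation helpers typically look like in this style of assignment. The function names (relu, sigmoid) and the (A, cache) return signature are assumptions inferred from how linear_activation_forward is called above, not necessarily the exact starter code you were given.

import numpy as np

def sigmoid(Z):
    # Element-wise sigmoid; Z is returned as the cache for the backward pass
    A = 1.0 / (1.0 + np.exp(-Z))
    return A, Z

def relu(Z):
    # Element-wise ReLU; Z is returned as the cache for the backward pass
    A = np.maximum(0, Z)
    return A, Z

# Quick check on a small array
Z = np.array([[-1.0, 0.0, 2.0]])
print(relu(Z)[0])     # [[0. 0. 2.]]
print(sigmoid(Z)[0])  # [[0.26894142 0.5        0.88079708]]

The output layer is then usually handled with one more linear_activation_forward call after the loop, using index L and activation "sigmoid", so that only the hidden layers go through "relu".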