Problem in Week 4 Assignment #1 EX5

I am struggling with how to write the for loop for the ReLU activations. My implementation is given below:

for l in range(1, L):
    A_prev = A
    # (≈ 2 lines of code)
    A, cache = linear_activation_forward(A_prev, parameters['W' + str(l)], parameters['b' + str(l)], 'relu')
    caches.append(cache)

Please help

Hello @ras325
Your implementation is correct. Can you please share your error report so that we can figure out what is going wrong?

Please find my code attached.
Thank you.

Hello @ras325
Your code for the relu activation is right, but you need to recheck your implementation of the sigmoid activation. [Think about how to implement the activation function.]
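For reference, here is a minimal sketch of the shape that step usually takes, assuming the same linear_activation_forward helper and parameters dictionary used in the relu loop above. The variable name AL is illustrative, and this is only a pattern sketch, not the assignment's solution:

# Hypothetical sketch: after the relu loop over layers 1..L-1,
# the output layer L is computed once, outside the loop,
# with the sigmoid activation instead of relu.
AL, cache = linear_activation_forward(A, parameters['W' + str(L)], parameters['b' + str(L)], 'sigmoid')
caches.append(cache)

Note that this step is outside the for loop and uses the layer index L directly, since only the final layer uses sigmoid.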

All the best :+1: