Stuck on L_model_backward

Hi, I am stuck on this exercise. I am following the instructions in the docstring.

For the sigmoid layer, I am using cache[L-1] as the current cache, and then when calling linear backward I pass current_cache[0].

For the relu layer, I am using cache[l].

I am getting gradient values different from the expected output. I have spent several hours trying to fix it. Could an instructor please look into my code and point me in the right direction?
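To be concrete, here is a minimal sketch of the indexing pattern I am describing (not my exact code; the helper functions below are simplified stand-ins for the ones the notebook provides, and their names and signatures are my assumption):

```python
import numpy as np

# Simplified stand-ins for the notebook helpers (assumed signatures).
def sigmoid_backward(dA, activation_cache):
    Z = activation_cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

def relu_backward(dA, activation_cache):
    Z = activation_cache
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0          # gradient is zero where the relu input was non-positive
    return dZ

def linear_backward(dZ, linear_cache):
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def L_model_backward(AL, Y, caches):
    # caches[l] is assumed to be a (linear_cache, activation_cache) tuple for layer l+1
    grads = {}
    L = len(caches)
    Y = Y.reshape(AL.shape)

    # Derivative of the cross-entropy cost with respect to AL
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # Output (sigmoid) layer: current cache is caches[L-1];
    # the activation part goes to sigmoid_backward, the linear part to linear_backward.
    current_cache = caches[L - 1]
    dZ = sigmoid_backward(dAL, current_cache[1])
    grads["dA" + str(L - 1)], grads["dW" + str(L)], grads["db" + str(L)] = \
        linear_backward(dZ, current_cache[0])

    # Hidden (relu) layers: loop l = L-2, ..., 0 and use caches[l] the same way.
    for l in reversed(range(L - 1)):
        current_cache = caches[l]
        dZ = relu_backward(grads["dA" + str(l + 1)], current_cache[1])
        grads["dA" + str(l)], grads["dW" + str(l + 1)], grads["db" + str(l + 1)] = \
            linear_backward(dZ, current_cache[0])

    return grads
```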

Hi @Ankit_Goyal, thanks for your post and welcome to the community. Let’s see if we can help you out. First of all, which week and which exercise are you referring to?