Neural Networks and Deep Learning: Week 4, Assignment 2, Exercise 9


What should I put instead of the activation cache in linear_activation_backward?

Hi, Ali Waleed.

Just recall what you did in the forward pass: the activations go ReLU → Sigmoid. Now, while doing backward prop, which activation will you start with first?
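
In case it helps later readers: the `cache` argument is simply the whole tuple that was stored for that layer during forward prop, i.e. `(linear_cache, activation_cache)`, and the first backward call uses the sigmoid because it was the last activation applied going forward. Here is a minimal, self-contained sketch of how those pieces fit together. This is not the graded solution; the helper names and the cache layout are assumptions matching the assignment's conventions:

```python
import numpy as np

def relu_backward(dA, activation_cache):
    """dZ for a ReLU unit; activation_cache is the Z stored in forward prop."""
    Z = activation_cache
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0          # ReLU gradient is 0 where Z <= 0, 1 elsewhere
    return dZ

def sigmoid_backward(dA, activation_cache):
    """dZ for a sigmoid unit; activation_cache is the Z stored in forward prop."""
    s = 1 / (1 + np.exp(-activation_cache))
    return dA * s * (1 - s)  # sigmoid'(Z) = s(Z) * (1 - s(Z))

def linear_backward(dZ, linear_cache):
    """Gradients of the linear step; linear_cache is (A_prev, W, b)."""
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    """cache is the per-layer tuple from forward prop:
    (linear_cache, activation_cache)."""
    linear_cache, activation_cache = cache
    if activation == "sigmoid":
        dZ = sigmoid_backward(dA, activation_cache)
    else:  # "relu"
        dZ = relu_backward(dA, activation_cache)
    return linear_backward(dZ, linear_cache)

# Inside L_model_backward the order mirrors the forward pass in reverse:
# the output layer (sigmoid) comes first, then every hidden layer (relu).
#   dA_prev, dW, db = linear_activation_backward(dAL, caches[L - 1], "sigmoid")
#   for l in reversed(range(L - 1)):
#       dA_prev, dW, db = linear_activation_backward(dA_prev, caches[l], "relu")
```

Keeping the linear and activation caches bundled per layer is what lets the backward loop stay this short: each call unpacks exactly what its own layer stored on the way forward.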


Thank you, I solved the problem.