Hey Shan!
Check which layer index you are passing in each call for the activation function. For the sigmoid call it should be the capital L (the last layer), not the lowercase l.
Hello Shan,
You know very well why we are working on the forward and backward passes and how we do them, right?
So, check the ‘for loop’ that you are working on:
In the line A, cache = …, which layer are you trying to pull in for the call whose activation string is “relu”?
In the line AL, cache = …, which layer are you pulling in for the call whose activation string is “sigmoid”?
You have mixed them up in both cases.
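Just to show the pairing I mean, here is a minimal sketch of how the two calls usually line up. I'm assuming your helper is called linear_activation_forward(A_prev, W, b, activation) and that parameters is keyed 'W1', 'b1', …, 'WL', 'bL' as in the earlier exercises; rename things to match your own notebook:

```python
def l_model_forward_sketch(X, parameters):
    """Sketch only: relu for layers 1..L-1, sigmoid for layer L."""
    caches = []
    A = X
    L = len(parameters) // 2          # parameters holds W1, b1, ..., WL, bL

    # Hidden layers 1 .. L-1: lowercase l everywhere, relu activation
    for l in range(1, L):
        A_prev = A                    # output of the previous layer
        A, cache = linear_activation_forward(A_prev,
                                             parameters['W' + str(l)],
                                             parameters['b' + str(l)],
                                             activation="relu")
        caches.append(cache)

    # Output layer: capital L everywhere, sigmoid activation,
    # fed with the A produced by the last relu layer
    AL, cache = linear_activation_forward(A,
                                          parameters['W' + str(L)],
                                          parameters['b' + str(L)],
                                          activation="sigmoid")
    caches.append(cache)
    return AL, caches
```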
Yes, as Rashmi says, it’s still mixed up, but in a different way. The first time you had the correct A, but the wrong W. Second time around, you’ve got the correct W, but now the A value is wrong.
Here’s a thread which shows you how to do the “dimensional analysis”, so that you know what should be happening at each layer. And this is just the forward case, which is supposed to be the easy one. Things get trickier when we’re going backwards, so it helps to have a “roadmap”.
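To make that concrete, here is a tiny, self-contained shape check you can run on its own (the layer sizes and m below are just made-up example numbers). For each layer l, W[l] should be (n_l, n_{l-1}), b[l] should be (n_l, 1), and A[l] should be (n_l, m), so you can compare these against what your loop is actually passing in:

```python
# Hypothetical layer sizes: input, three hidden layers, output
layer_dims = [12288, 20, 7, 5, 1]
m = 209   # example number of training samples

print(f"A0 = X : ({layer_dims[0]}, {m})")
for l in range(1, len(layer_dims)):
    n_l, n_prev = layer_dims[l], layer_dims[l - 1]
    print(f"W{l}: ({n_l}, {n_prev})   b{l}: ({n_l}, 1)   "
          f"A{l} = g(W{l} @ A{l-1} + b{l}): ({n_l}, {m})")
```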