W4 Programming Assignment 1 Exercise 8 partial failure

In Exercise 8, after implementing linear_activation_backward and running the tests, I get failures for only two tests; the rest pass. Any pointers on debugging the failures? In this exercise I just need to invoke two already existing functions, sigmoid/relu_backward and linear_backward, which I implemented in earlier exercises and for which all tests passed.

Can you please share the full error?

Correct functions can return wrong answers if you pass them incorrect parameters, right? So the place to look for the bug is in linear_activation_backward. The usual errors here result in shape mismatches, but that didn’t happen in your case. Are you sure you used the right activation function pairing?
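To make that concrete, here is a toy sketch (my own illustration, not assignment code) of why a wrong pairing fails the value check even though the shape check still passes:

```python
import numpy as np

# Toy illustration: pairing dA with the wrong activation's derivative
# keeps the shape right but changes the values.
Z = np.array([[1.0, -2.0, 0.5]])
dA = np.array([[0.1, 0.2, 0.3]])

relu_grad = (Z > 0).astype(float)        # ReLU'(Z)
s = 1 / (1 + np.exp(-Z))
sigmoid_grad = s * (1 - s)               # sigmoid'(Z)

dZ_right = dA * relu_grad                # correct pairing for a ReLU layer
dZ_wrong = dA * sigmoid_grad             # wrong pairing

print(dZ_right.shape == dZ_wrong.shape)  # True  -> shape test passes
print(np.allclose(dZ_right, dZ_wrong))   # False -> value test fails
```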

Note that the fact that you passed 4 tests and failed 2 is not a cause for celebration. You can take a look at the actual test code by clicking “File → Open” and then opening the file public_tests.py. You’ll see that when they check an answer, there are usually three tests for each one:

  1. The type of the answer
  2. The shape of the answer (if it’s an array)
  3. The actual values

So if you pass 1) and 2) but fail 3), it still means your logic is broken.
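For intuition, the pattern in that file looks roughly like this (a sketch with made-up values, not the actual public_tests.py code):

```python
import numpy as np

# Rough sketch of the three-part check: type, then shape, then values.
expected = np.array([[0.1, 0.2], [0.3, 0.4]])
result = expected.copy()   # stand-in for your function's output

assert type(result) == np.ndarray, "wrong type"
assert result.shape == expected.shape, "wrong shape"
assert np.allclose(result, expected), "wrong values"
print("All checks passed")
```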

Rest of the error:

Code snippet for the exercise:

{moderator edit - solution code removed}

Why are you using np.multiply instead of simply calling relu_backward and sigmoid_backward? Did you look at how those functions work?
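In case it helps, here is a minimal stand-in showing what relu_backward does (assuming, as in the course's dnn_utils helpers, that the activation cache holds Z); the real helper already does that multiplication for you:

```python
import numpy as np

def relu_backward(dA, activation_cache):
    # Minimal stand-in for the course helper (assumption: the activation
    # cache holds Z). dZ is dA with entries zeroed wherever Z <= 0.
    Z = activation_cache
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

dA = np.array([[0.5, -0.5], [1.0, 2.0]])
Z = np.array([[1.0, -1.0], [-2.0, 3.0]])
print(relu_backward(dA, Z))
# [[ 0.5  0. ]
#  [ 0.   2. ]]
```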

Oh yes, got it, I misread the text. Thanks for the help!

Getting a ValueError: “not enough values to unpack (expected 3, got 2)”.

The error seems to point to the call to linear_backward in linear_activation_backward when the activation is “sigmoid”. I am calling linear_backward there with dZ and cache as inputs. I’m not sure I understand this error. Any help would be much appreciated.

Thank you

That probably means you passed the wrong value as the cache when you called linear_backward. It’s supposed to be the “linear cache”, right? But you must have passed the whole cache entry for the layer. At each layer, the cache entry looks like this:

((A, W, b), Z)

So it is a “2-tuple”: a Python tuple with 2 entries. The first entry in that tuple is the “linear cache”, which is a 3-tuple, right? You could also figure this out by reading the code we wrote in forward propagation that created the caches in the first place. The course authors were nice and gave you the logic in the template code to extract the two entries from the layer cache. Now all you have to do is use the appropriate element in both cases here.
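Here is a small sketch (toy values, not assignment code) that reproduces the error and shows the correct unpacking:

```python
import numpy as np

# Toy cache with the layout described above: ((A, W, b), Z).
A = np.zeros((3, 2)); W = np.zeros((4, 3)); b = np.zeros((4, 1))
Z = np.zeros((4, 2))
cache = ((A, W, b), Z)

linear_cache, activation_cache = cache  # unpack the 2-tuple
A_prev, W, b = linear_cache             # unpack the 3-tuple

# Passing the whole layer cache where the 3-tuple is expected
# reproduces the reported error:
try:
    A_prev, W, b = cache
except ValueError as e:
    print(e)  # not enough values to unpack (expected 3, got 2)
```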

I understand, and this worked! Thank you!