Week 3 assignment: bug in the test

Never mind… I forgot to look in the cache

My code said that every function in this lab was implemented correctly. But when I ran the final test at the end of Lab 3, I didn’t get the same output as the tutorial guide.

That’s very different from what I see. Your cost values are lower, but the accuracy is way worse. That’s a puzzling combination! My guess is that you “hard-coded” something in such a way that the tests don’t catch it. FWIW here’s what I see in that section:

One thing to notice is that your cost values are pretty close to, but not exactly the same as, those generated by the small test case for nn_model. I’m not sure what to conclude from that, but maybe you did something that hard-wires the number of samples or the X and Y values.
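
To make the “hard-wiring the number of samples” failure mode concrete, here is a hypothetical sketch (the cost formula matches the assignment’s cross-entropy, but the specific values are invented for illustration):

```python
import numpy as np

# Toy stand-ins for the notebook's variables (values invented for illustration)
Y = np.array([[1, 0, 1]])         # labels, shape 1 x m
A2 = np.array([[0.9, 0.2, 0.7]])  # output activations, shape 1 x m
logprobs = Y * np.log(A2) + (1 - Y) * np.log(1 - A2)

# Brittle: hard-codes m = 3 from the small test case; this passes the
# notebook test but silently breaks when the grader uses more samples.
cost_bad = -np.sum(logprobs) / 3

# Robust: derive m from the data itself, so any input size works.
m = Y.shape[1]
cost = -np.sum(logprobs) / m
```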

@paulinpaloalto Hi, I am having a similar issue to other posters here. All tests have passed, but I get 88/100. Could it be an error in my code, or is it something related to the test code? How can I solve it? Thanks

It is an error in one of your functions. The problem is that, for some reason, the grader can’t reveal which function is the one that fails. Are you sure you carefully checked all your test outputs in the notebook? Of course the tests in the notebook can’t catch all errors. E.g., are you hard-coding anything in a way that agrees with the sizes of the tests in the notebook but then fails the grader?

  1. I checked with the “validate” option and got “Success! Your notebook passes all the tests.” as the validation result.
  2. I manually checked each test output along the notebook and can’t detect a problem (“All tests passed!” in all the exercises).
  3. Checking whether I had hardcoded something, I found that at the beginning one task asked us to hardcode the n_h value to 4. Then inside nn_model, when I call (n_x, n_h, n_y) = layer_sizes(X, Y), that hardcoded value overwrites the n_h passed in and is then used in parameters = initialize_parameters(n_x, n_h, n_y). I tried changing the call to (n_x, n_h2, n_y) = layer_sizes(X, Y) so that n_h keeps the value from the parameter list. This fixed a problem with the “6 - Tuning hidden layer size” output, which previously always showed the same value and I couldn’t figure out why, but it does not solve the 88/100 grader result. I can’t find any other hardcoded value. Could it be something else? @paulinpaloalto

I will need to look at your notebook to figure this out. Please check your DMs …

You should not have had to make that change to how layer_sizes was called in nn_model. They gave you that logic in the template code and they specifically ignored the returned n_h value. But if you rewrite it the way you did, then, yes, you have to make sure to use the n_h that is passed in at the top level.
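
For reference, the pattern being described looks roughly like this; pick_sizes is a hypothetical stand-in for the first few lines of nn_model, not the actual template code:

```python
import numpy as np

def layer_sizes(X, Y):
    # As in the notebook: n_h is hard-coded to 4 inside this helper
    return (X.shape[0], 4, Y.shape[0])

def pick_sizes(X, Y, n_h):
    # Template pattern: take n_x and n_y from layer_sizes but deliberately
    # ignore the n_h it returns; the n_h passed in as an argument wins
    n_x = layer_sizes(X, Y)[0]
    n_y = layer_sizes(X, Y)[2]
    return n_x, n_h, n_y  # these are what initialize_parameters receives

X = np.zeros((2, 5))  # toy input: 2 features, 5 samples
Y = np.zeros((1, 5))
print(pick_sizes(X, Y, n_h=10))  # (2, 10, 1), not (2, 4, 1)
```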

But the above is a warning sign of the type of thing to look out for: it seems like you got maybe a bit too creative in how you implemented things and rewrote parts of the given template code that didn’t need to be touched. It’s fine to do that, but you need to be sure you really understand what is going on if that’s the way you want to go. You should only have to modify things in the “YOUR CODE HERE” blocks.

I am experiencing the same issue. Can someone please assist me?

Have you read all the earlier discussions? There are a lot of issues on this thread. Please show us the error output you are getting.

Hello, thank you for responding to me.
Here is a screenshot of the bug.

Anytime you get a shape mismatch, the first question is “what shape is it?” Print the shape of A2 or Z2.

print(A2.shape)

What is it? It is supposed to be 1 x m, where m is the second dimension of X (i.e. the number of samples). Now you need to figure out why it came out wrong.
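
A quick assertion makes that expectation explicit. A minimal sketch, using toy stand-ins for the notebook’s X and A2:

```python
import numpy as np

X = np.zeros((2, 400))           # toy input: 2 features, m = 400 samples
A2 = np.zeros((1, X.shape[1]))   # output activations, expected shape 1 x m

m = X.shape[1]
assert A2.shape == (1, m), f"A2 has shape {A2.shape}, expected (1, {m})"
print("A2 shape OK:", A2.shape)
```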

Thank you again for assisting me.

Here is the shape of A2. It seems like my first dimension of A2 is off. I am trying to figure out why.

UPDATE: I figured it out. I was extracting b1’s value from the dictionary and assigning it (erroneously) to b2.
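
For anyone hitting the same symptom, that bug would look something like this (a hypothetical reconstruction; only the key names b1/b2 come from the notebook):

```python
import numpy as np

parameters = {
    "b1": np.zeros((4, 1)),  # hidden-layer bias, shape (n_h, 1)
    "b2": np.zeros((1, 1)),  # output-layer bias, shape (n_y, 1)
}

# Buggy: b2 is read from the wrong key, so it has shape (4, 1) instead of
# (1, 1); broadcasting then inflates Z2 and A2 to 4 x m instead of 1 x m.
b2 = parameters["b1"]  # should be parameters["b2"]
print(b2.shape)        # (4, 1): the telltale wrong first dimension
```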

It’s great to hear that you were able to find the solution under your own power. Congrats and thanks for confirming.

Hi Britton_Quist,
I think I have the same issue. Although my cost values are slightly different, I also end up with an accuracy of 58%. Did you solve this? Thx!
