Hello,
I'd appreciate help with this result:
The size of the input layer is: n_x = 5
The size of the hidden layer is: n_h = 4
The size of the output layer is: n_y = 2
Isn't the input just x1 and x2 (a 2-unit input layer) instead of n_x = 5?
For the output, shouldn't it be 1 output unit instead of n_y = 2?
We are writing general purpose functions. Just because the actual data we are using in this exercise has 2 features and 1 output, that doesn't mean that the layer_sizes function shouldn't be able to handle any other values for those parameters. The test case is checking that you did not "hard-code" any of those values other than n_h, which we have no choice but to hard-code.
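For reference, here is a minimal sketch of what a general-purpose layer_sizes function could look like, assuming the usual course convention that X has shape (n_x, m) and Y has shape (n_y, m), with n_h fixed at 4 as the assignment requires:

def layer_sizes(X, Y):
    n_x = X.shape[0]   # number of input features = rows of X
    n_h = 4            # hidden layer size, hard-coded per the assignment
    n_y = Y.shape[0]   # number of output units = rows of Y
    return (n_x, n_h, n_y)

Written this way, it returns (2, 4, 1) for the planar dataset and (5, 4, 2) for the test case's arrays, with nothing hard-coded except n_h.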
I've been able to successfully complete all the other exercises in the lab, but the grader fails me on Exercise 2. In Exercise 1, I set shape_X = X.shape, so in Exercise 2 I didn't hard-code n_x but instead set n_x = shape_X[0], which gives the value 2 (not 5, as the grader expects). So I received the error below. What am I missing?
The size of the input layer is: n_x = 2
The size of the hidden layer is: n_h = 4
The size of the output layer is: n_y = 1
AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>
      5 print("The size of the output layer is: n_y = " + str(n_y))
      6
----> 7 layer_sizes_test(layer_sizes)

~/work/release/W3A1/public_tests.py in layer_sizes_test(target)
     14
     15     assert type(output) == tuple, "Output must be a tuple"
---> 16     assert output == expected_output, f"Wrong result. Expected {expected_output} got {output}"
     17
     18     X = np.random.randn(7, 5)

AssertionError: Wrong result. Expected (5, 4, 2) got (2, 4, 1)
You must be hard-coding the values of the first and last dimensions. You are only supposed to hard-code the second dimension as 4. The others you need to derive from the shapes of the actual input parameters, not global variables.
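To make that concrete, here is a hypothetical comparison (the names and shapes are chosen for illustration and are not the actual public_tests.py code):

import numpy as np

def layer_sizes_hardcoded(X, Y):
    # Mistake: returns the planar dataset's sizes no matter what is passed in.
    return (2, 4, 1)

def layer_sizes_general(X, Y):
    # Derives the sizes from the arguments themselves.
    return (X.shape[0], 4, Y.shape[0])

X_t = np.random.randn(5, 3)   # test-style data with 5 features
Y_t = np.random.randn(2, 3)   # and 2 output units

print(layer_sizes_hardcoded(X_t, Y_t))   # (2, 4, 1) -> fails the assert above
print(layer_sizes_general(X_t, Y_t))     # (5, 4, 2) -> matches expected_output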
Thank you! Fixed by setting n_x directly from X.shape[0] inside the function, instead of from the variable shape_X.
For my clarification, is there a reason why setting n_x to shape_X fails if I had previously defined the variable shape_X as X.shape[0] in an earlier part of the lab?
(…also editing my prior post: I had erroneously set n_x to the variable shape_X, instead of X.shape[0].)
This is a question about the concept of "scope" of variables in python. It is always a mistake to reference global variables from within your functions. That is what you did there: the variable shape_X is not defined within the scope of the layer_sizes function. What will the value of shape_X be when the grader calls your layer_sizes function? You have no idea, and that is exactly why writing it that way is a mistake.
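Here is a small, purely hypothetical illustration of the scope problem (X_global and shape_X stand in for values set earlier in the notebook):

import numpy as np

X_global = np.random.randn(2, 400)   # data loaded earlier in the notebook
shape_X = X_global.shape             # global variable: (2, 400)

def layer_sizes_uses_global(X, Y):
    # Bug: n_x is read from the global shape_X, not from the X argument,
    # so it is 2 no matter what the caller actually passes in.
    n_x = shape_X[0]
    return (n_x, 4, Y.shape[0])

X_t = np.random.randn(5, 3)
Y_t = np.random.randn(2, 3)
print(layer_sizes_uses_global(X_t, Y_t))   # (2, 4, 2), not the expected (5, 4, 2)

The fix is to read the shapes of X and Y inside the function, as described above.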
These courses are not set up as beginning python courses: it is assumed that you are already comfortable with python. If you are not familiar with the term “scope” or “global” that is a big warning sign and you should spend some time reading python documentation or tutorials on that topic. E.g. try googling “variable scope python” and read some articles that you find.
Or to put it another way: the reason that the test case failed is exactly because you depended on the setting of shape_X, which had nothing to do with the actual parameters that were being passed to you in that test case. That test case was specifically designed to catch the type of mistake you were making there.
Thank you, Paul, for the explanation. That makes sense, and it's a good reminder to develop a better grasp of Python so I can properly handle the labs in this specialization.