Problems with neural network structure

They are determined by the shapes of the input values that are passed in the test case. You can examine the code by clicking "File → Open" and then opening the file public_tests.py. Note that there are several different test cases with different shapes, precisely to test that you are not hard-coding anything other than n_h.
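For illustration only, here is a sketch of how such a test might vary the shapes. The function name, sizes, and structure here are assumptions, not the actual contents of public_tests.py:

import numpy as np

def layer_sizes_shape_check(layer_sizes):
    # Hypothetical checker: try several (n_x, m, n_y) combinations so
    # that a solution which hard-codes anything other than n_h will
    # fail at least one case.
    for n_x, m, n_y in [(2, 400, 1), (5, 3, 2)]:
        X = np.random.randn(n_x, m)   # shape (n_x, m)
        Y = np.random.randn(n_y, m)   # shape (n_y, m)
        out = layer_sizes(X, Y)       # expected to return (n_x, n_h, n_y)
        assert out[0] == n_x and out[2] == n_y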


Thanks, though now it's a lot to take in…
But just a quick question: since the values are random, how is it possible that I get the same results in my notebook?
Shouldn't I have needed an np.random.seed(1), as you have in this public_tests.py file?


But I forgot the vital question… beg your pardon.

In a real case, when I pass X and Y, they are just (2, 400) and (1, 400), and these are the values fed to the model, right?
After all, the parameters are calculated from these training values.


The contents of the arrays are random, but their shapes are fixed by the parameters you pass to np.random.randn. Of course for the purposes of layer_sizes, we only care about the shapes, right?


Yes, we pass the training data to the model, so those are the shapes that it will see.


Note that the point of this exercise is not to explain the concept of variable scope. That is something you should already know. The point is to understand how to extract the shape of an array. They gave examples of how to do that earlier and in the Week 2 exercises as well.
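For example, a minimal sketch of that extraction (the array here is arbitrary):

import numpy as np

X = np.random.randn(2, 400)
n_x = X.shape[0]   # number of input features (rows)
m = X.shape[1]     # number of training examples (columns)
print(n_x, m)      # prints: 2 400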


Thanks for pointing out the difference, I needed that… but I thought the whole point of np.random.seed() was exactly to get consistent results… so it turns out that what matters here are the shape arguments, because those are the values you defined?
But shouldn't np.random.seed(1) also have been declared in my notebook, as was done for the initialize_parameters function?
I hope my question is clear…
Many thanks


Did you read the code in layer_sizes_test in public_tests.py? It sets the random seed, but it wouldn’t have needed to. The shapes are determined by the arguments passed to np.random.randn.

You do not need to set the seed in your code. If it is needed, it will be set in the test cells or test functions to make sure the results are consistent.

To see what I mean about not needing to set it for layer_sizes, watch this:

import numpy as np

A = np.random.randn(2, 3)
print(f"A = {A}")
print(f"A.shape = {A.shape}")
A = np.random.randn(2, 3)
print(f"A = {A}")
print(f"A.shape = {A.shape}")

Running that prints:

A = [[-0.29514235 -0.25292172 -0.67321961]
 [-0.03369255  1.14873569 -0.74375326]]
A.shape = (2, 3)
A = [[-1.36191267  1.80211363 -1.06122973]
 [-1.89979391 -0.36907566 -1.25025587]]
A.shape = (2, 3)

As you can see, the values of A are different in the two cases, but the shapes are the same, because I specifically asked for a 2 x 3 array. And the shape is all we care about in the layer_sizes function.
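By contrast, when the actual values matter (as they do for initialize_parameters), the test sets the seed first so that every run produces the same numbers. A quick sketch of that behavior:

import numpy as np

np.random.seed(1)
print(np.random.randn(2, 3))   # some fixed array of values
np.random.seed(1)
print(np.random.randn(2, 3))   # the identical array, because the seed was reset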


AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>
----> 1 Y_hat = forward_propagation(X, parameters)
      2
      3 print(Y_hat)

<ipython-input> in forward_propagation(X, parameters)
     22     ### END CODE HERE ###
     23
---> 24     assert(Y_hat.shape == (n_y, X.shape[1]))
     25
     26     return Y_hat

AssertionError:
I get this error and am not able to solve it. Please guide me.


Anytime you get a shape mismatch, the first question is “what shape is it?” Then the next question is “Ok, how did it get that way?”

So the first step is to add a print statement like this to your logic:

print(f"Y_hat.shape = {Y_hat.shape}")

Then you need to figure out why that is not n_y x m.
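For reference, here is a sketch of how the shapes should flow through the two-layer forward pass, assuming the usual parameter shapes for this assignment (tanh hidden layer, sigmoid output); the sizes chosen are arbitrary examples:

import numpy as np

n_x, n_h, n_y, m = 2, 4, 1, 400   # example sizes, chosen arbitrarily
X  = np.random.randn(n_x, m)      # (n_x, m)
W1 = np.random.randn(n_h, n_x)    # (n_h, n_x)
b1 = np.zeros((n_h, 1))           # (n_h, 1)
W2 = np.random.randn(n_y, n_h)    # (n_y, n_h)
b2 = np.zeros((n_y, 1))           # (n_y, 1)

Z1 = np.dot(W1, X) + b1           # (n_h, m)
A1 = np.tanh(Z1)                  # (n_h, m)
Z2 = np.dot(W2, A1) + b2          # (n_y, m)
Y_hat = 1 / (1 + np.exp(-Z2))     # sigmoid, so Y_hat is (n_y, m)
assert Y_hat.shape == (n_y, m)

If your Y_hat shape differs, comparing your intermediate shapes against the comments above should show where the mismatch first appears.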


How do I fix the problem?


What does the print statement that I suggested show in your case?


Hi, Atif.

Any update on this? My suggestion is to print n_y and the shapes of X and Y_hat to get a sense of the nature of the problem:

print(f"n_y = {n_y}“)
print(f"X.shape = {X.shape}”)
print(f"Y_hat.shape = {Y_hat.shape}")


I'm getting the same error. Two test cases pass but two fail. I can't figure out where I'm going wrong. Can someone please help me!
