C4_W2_Assig1_Ex2

Hi,

Sorry, again an error I cannot find. In this case it seems to be a shape error. I checked all the params passed to the Keras functions and all of them seem correct.
Please, can you help me find where the mistake is?

Many thanks

Joan Parés



It’s not a shape error.
The assert message says your code doesn’t give the correct values when “training” is set to False.

Hi @TMosh

Yes, but since I did not understand it, I picked up the previous error, thinking it was a list. Now I realize it is a selector, not a list.

Anyway, choosing the correct parameters is our job in this exercise, and I have checked them many times without seeing anything wrong. What else can I do?

The BatchNormalization documentation says: with training=False, the layer will normalize its inputs using the mean and variance of its moving statistics, learned during training.

And yet the values are wrong… Sorry, I am unable to work out the origin of the error.
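
Just to confirm my understanding of that behaviour, here is a minimal standalone sketch (my own illustration with made-up shapes, not the assignment code):

```python
import tensorflow as tf

# The same BatchNormalization layer normalizes differently depending on
# the `training` argument passed at call time.
bn = tf.keras.layers.BatchNormalization(axis=3)
x = tf.random.normal((2, 4, 4, 3))

y_train = bn(x, training=True)   # uses the mean/variance of this batch
y_infer = bn(x, training=False)  # uses the moving statistics learned so far
```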

Can you give a hint?

Many thanks

JoanSitges

There are lots of things that can be wrong besides just the BatchNorm calls. This whole assignment is an excruciating exercise in proof-reading. You have to check everything. Since your shapes are correct, the problem is probably not that you got the number of filters or strides or padding wrong anywhere. One common error is to use the wrong input value for the “shortcut” path. Other than that you just have to carefully check everything relative to the instructions. Note that your BatchNorm calls should be the same as the one example they show in the “first component” part of the template code.
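
To be concrete about the pattern, here is a generic sketch of one "component" in the Keras functional style. The parameter values are placeholders, not the graded solution; the point is only the shape of the BatchNorm call:

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation

def conv_bn_relu(X, filters, kernel_size, strides, padding, training=True):
    # Generic component: Conv2D -> BatchNorm -> ReLU. Every BatchNorm
    # call in the block should follow the same form as this one.
    X = Conv2D(filters=filters, kernel_size=kernel_size,
               strides=strides, padding=padding)(X)
    X = BatchNormalization(axis=3)(X, training=training)
    return Activation('relu')(X)
```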

Hi @paulinpaloalto

Thanks for your instructions. I suspected an error in the input of the BatchNorm of the shortcut: it should be the output of the Conv2D, so X_shortcut instead of X. I tried that hopefully, because I related it to your comments, but nothing: the error remains exactly the same as I attached.

There are so few things to edit that the error should be evident, but I can't see it. By the way, in Add(), why do they recommend the order X, X_shortcut? It should be the same either way, shouldn't it?

Many thanks
Joansitges

It turns out you can do the add in either order. Addition is commutative of course, but the order does affect the structure of the compute graph that you get. That used to matter for the comparator logic later, but I think they moved that logic so that it is now part of the template code.

The key point is that it sounds like there may be some confusion about the fundamentals of the shortcut layer: the saved shortcut value from the beginning is the input to that layer, right? And then its output gets added to the X that is the output of the straight-through path.
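
In other words, the join looks roughly like this. This is a hypothetical sketch with placeholder names and values, just to show the structure, not the assignment code:

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization, Add, Activation

def join_paths(X, X_shortcut_input, filters, s=2, training=True):
    # The shortcut path starts from the *saved input* value ...
    sc = Conv2D(filters=filters, kernel_size=1, strides=s,
                padding='valid')(X_shortcut_input)
    sc = BatchNormalization(axis=3)(sc, training=training)
    # ... and its output is added to the main-path output X.
    # Add is commutative, so [X, sc] and [sc, X] give the same values.
    X = Add()([X, sc])
    return Activation('relu')(X)
```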