Residual networks: wrong values when training=True

Where is the “training” parameter?

{moderator edit - solution code removed}

For the shortcut path, did you use X or X_shortcut as the input to the convolution layer?
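To see why this question matters, here is a generic NumPy sketch of the residual idea (this is not the assignment's graded code; the variable names `X` and `X_shortcut` follow the thread, and the 1x1 convolution is simplified to a matrix multiply on flattened features). The shortcut convolution must see the block's original input, not the already-transformed main-path output:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1x1(x, w):
    # A 1x1 convolution on (batch, features) data is just a matrix multiply.
    return x @ w

x_input = rng.standard_normal((4, 8))    # activations entering the block
w_main = rng.standard_normal((8, 8))     # main-path weights
w_short = rng.standard_normal((8, 8))    # shortcut-path weights

X_shortcut = x_input                     # save the block's input FIRST
X = np.maximum(conv1x1(x_input, w_main), 0.0)  # main path transforms X

# Correct: the shortcut conv takes the ORIGINAL input X_shortcut.
out_correct = X + conv1x1(X_shortcut, w_short)
# Bug this thread is about: feeding the transformed X into the shortcut
# defeats the skip connection and gives different values.
out_wrong = X + conv1x1(X, w_short)

print(np.allclose(out_correct, out_wrong))
```

With random weights the two versions almost surely disagree, which is why using `X` instead of `X_shortcut` shows up as wrong output values rather than a crash.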


The training flag is managed for you by the test case. It controls whether training (learning) is done for the batch normalization parameters. Whether you do that or not is independent of whether you are running the model in what we normally think of as “training”, as opposed to “prediction” (inference) mode.
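The distinction can be sketched in plain NumPy. `SimpleBatchNorm` below is a hypothetical minimal class, not the Keras `BatchNormalization` implementation: with `training=True` the layer normalizes with the current batch's statistics and updates its moving averages; with `training=False` it uses the frozen moving statistics, so the same input produces different outputs in the two modes:

```python
import numpy as np

class SimpleBatchNorm:
    """Minimal batch-norm sketch (hypothetical, not the Keras class)."""
    def __init__(self, dim, momentum=0.9, eps=1e-3):
        self.moving_mean = np.zeros(dim)
        self.moving_var = np.ones(dim)
        self.momentum, self.eps = momentum, eps

    def __call__(self, x, training=False):
        if training:
            # training=True: use THIS batch's statistics and update the
            # moving averages (this is the "learning" the flag controls).
            mean, var = x.mean(axis=0), x.var(axis=0)
            self.moving_mean = self.momentum * self.moving_mean + (1 - self.momentum) * mean
            self.moving_var = self.momentum * self.moving_var + (1 - self.momentum) * var
        else:
            # training=False: use the frozen moving statistics (inference).
            mean, var = self.moving_mean, self.moving_var
        return (x - mean) / np.sqrt(var + self.eps)

rng = np.random.default_rng(1)
x = rng.standard_normal((16, 4)) * 5 + 2   # deliberately not zero-mean
bn = SimpleBatchNorm(4)
y_train = bn(x, training=True)
y_infer = bn(x, training=False)
print(np.allclose(y_train, y_infer))  # the two modes disagree on this input
```

This is why the test case exercises `training=True` explicitly: it forces the batch-statistics path, where subtle parameter mistakes show up as wrong values.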

In the test for the convolutional_block function, only the “training=True” case is actually checked against reference values, so a failure there means something is wrong in your logic. This whole assignment is a test of your careful proofreading abilities: there are lots of details to translate from the instructions into your code, and any one of them being off will throw off the results.

Please take another careful look through all the parameters that you are setting. If you still can’t see your mistake, then we can go to “Plan B” and look at your code.

Oh, sorry, you already gave us your code and what we can see matches the instructions. The one thing we can’t see, because it’s off the right side of the image, is the point that Deepti is asking about.

Update: I tried making the mistake of passing X as the input to the first shortcut function, but that gave me different results than you show.

Then I looked more closely at your code and you did not follow the instructions for the second thing you need to do in the shortcut layer. Please have another look at the instructions.

I will edit your post to remove the code.

X_shortcut

Thanks, I’ve fixed the second thing in the shortcut layer. I appreciate the quick response and the help. Everything works now; please delete the code.

Oh, you already did, nice.