/tf/W2A1/test_utils.py in comparator(learner, instructor)
     21     "\n\n does not match the input value: \n\n",
     22     colored(f"{a}", "red"))
---> 23     raise AssertionError("Error in test")
     24     print(colored("All tests passed!", "green"))
     25
The number that is not matching there is the number of trainable parameters in that layer. The output shape is correct, but somehow you have too many trainable parameters. Note that it could be one of the conv layers in either the identity block or the convolutional block that is wrong. A bug was recently discovered: the unit tests in the notebook do not catch padding errors in the identity block, so you might want to check all of those parameters carefully. You can also figure out which layer is wrong by studying the summary of your model to find that layer; that will tell you exactly at what point your code goes off the rails, but the summary is pretty long, so it may not be easy to interpret. One shortcut is sketched below.
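Rather than reading the full summary by eye, you can loop over the layers and print only the ones that carry weights, then compare those counts against the expected values shown by the test. This is just a sketch, not part of the course materials; it assumes your constructed ResNet is in a variable named model:

for i, layer in enumerate(model.layers):
    # Only show layers that actually have weights (Conv2D, BatchNorm, Dense, ...)
    if layer.count_params() > 0:
        print(i, layer.name, layer.output_shape, layer.count_params())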
Of course padding itself does not change the number of trainable parameters in a conv layer, and neither does the stride; only the filter size and the number of filters enter the parameter count. And the output dimensions came out right, so it would have to be two compensating errors: too big a filter size (which inflates the parameter count) together with a wrong padding or stride (which puts the output shape back where it should be).
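To see why, recall that a Conv2D layer has (f * f * C_in + 1) * C_out trainable parameters, where the + 1 is the bias per filter; padding and stride do not appear anywhere in that formula. A quick sanity check (the channel counts below are just for illustration, not the assignment's values):

def conv2d_params(f, c_in, c_out):
    # filter_height * filter_width * input_channels weights + 1 bias, per filter
    return (f * f * c_in + 1) * c_out

print(conv2d_params(1, 64, 64))  # 1x1 conv: 4160 parameters
print(conv2d_params(3, 64, 64))  # 3x3 conv: 36928 parameters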
Actually, I think I have a better theory: I'll bet you changed the code in the shortcut logic for convolutional_block. Of course addition is commutative, so when you add the shortcut value to the straight-through value, the order of the operands does not change the resulting value. But if you change the order of the operands to "Add", it changes the order of the computation graph that the "summary" function prints, and that printed summary is what the comparator checks. They gave you the "Add" of X and X_shortcut as part of the template code in the new version of the course, but if you copied your code forward from an earlier version, you need to be careful to do the add in the order X plus X_shortcut. Here's what that section of the template code looks like in my notebook (this is towards the end of the convolutional_block function):
### END CODE HERE
# Final step: Add shortcut value to main path (Use this order [X, X_shortcut]), and pass it through a RELU activation
X = Add()([X, X_shortcut])
X = Activation('relu')(X)
Please check that yours looks the same. Note that it is not an Honor Code violation to publish this section, since it is given to everyone as part of the template. It's not illegal to change the template code, but extreme care is required if you "go there".
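If you want to convince yourself that the operand order only affects the printed graph and not the math, here is a minimal self-contained sketch (not part of the assignment; the shapes and layer sizes are made up for illustration):

import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, Add

# Toy graph with a conv main path and an identity shortcut.
inputs = Input(shape=(8, 8, 3))
X = Conv2D(3, (1, 1), padding='valid')(inputs)  # shared by both models below
X_shortcut = inputs

same_order = tf.keras.Model(inputs, Add()([X, X_shortcut]))
flipped = tf.keras.Model(inputs, Add()([X_shortcut, X]))

# Numerically identical outputs (the Conv2D weights are shared):
x = tf.random.normal((1, 8, 8, 3))
print(tf.reduce_max(tf.abs(same_order(x) - flipped(x))).numpy())  # 0.0

# But the "Connected to" entries for the Add layer differ between the two
# printed summaries, which is enough to fail a summary-based comparison.
same_order.summary()
flipped.summary()

So the values match either way; only the printed graph, and therefore the grader's comparison, cares about the order.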