Week 2 assignment, 2nd graded function consistently fails with "AssertionError: Wrong values when training=True". Stuck here

Course: Deep Learning Specialization / Convolutional Neural Networks,
Week 2 assignment, Exercise 2.

UNQ_C2

GRADED FUNCTION: convolutional_block

I cannot get past this error. Any help is most appreciated, thank you. I suspect this is a Jupyter environment problem again. Is it?

Where are you setting training=True? On the Conv2D layer or on BatchNormalization?

Also, a common mistake is made on the shortcut path, where one needs to use X_shortcut for the Conv2D layer.

Actually, the instructions before the graded cell explain all of these points very clearly.

Refer to them again; you should be able to debug your code, as I know you are good at it.

I am not setting training=True anywhere. I suspect this may be in the training/test/assertion libraries. Also, on the shortcut path I use X_shortcut, so I have… (not all the code is displayed). Does this look wrong?
[screenshot of the code and test cell]

This is the issue: set training to True when you do BatchNormalization in every component, as well as on the shortcut path. The instructions given below the exercise header indirectly tell you to set training=True, when they mention that if you set training to False, the weights will not get updated.

Also make sure you have used X_shortcut in the Conv2D layer on the shortcut path too, something like the sketch below.
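A minimal illustration of both call patterns (not the full graded solution; the toy tensor, filter count, and kernel size are made up just so the calls run):

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization

X = tf.random.normal((1, 4, 4, 3))  # toy input, just so the calls run
X_shortcut = X                      # save the block input for the shortcut path

# Main-path component: the training flag is passed at call time,
# to the BatchNormalization layer (not to Conv2D)
X = Conv2D(filters=6, kernel_size=1)(X)
X = BatchNormalization(axis=3)(X, training=True)

# Shortcut path: apply the Conv2D to X_shortcut, not to X
X_shortcut = Conv2D(filters=6, kernel_size=1)(X_shortcut)
X_shortcut = BatchNormalization(axis=3)(X_shortcut, training=True)
```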

Brilliant, that was it (both mistakes). Thank you so much. :pray: :grin:


It’s possible that there are different versions of this course still active, but in the current version that I see we do not set training = True anywhere: it is handled in the test code.

The other important thing about the current version is that they supply their own import for the BatchNormalization function from the local utility file. It is a mistake to use the standard TF version of BatchNormalization.

But if you have gotten full credit from the grader, then your code is consistent with the version you are using, so maybe this is a red herring.
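As a sketch of that convention (the helper name and signature here are my own illustration, not the notebook's actual code): the training flag arrives as a function argument and is forwarded to each BatchNorm call, so the test cell decides the mode rather than your implementation.

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization

def block_component(X, filters, training):
    """One Conv -> BatchNorm step; the caller decides the training flag."""
    X = Conv2D(filters=filters, kernel_size=1)(X)
    # Forward the caller's flag instead of hard-coding training=True
    return BatchNormalization(axis=3)(X, training=training)

X = tf.random.normal((1, 4, 4, 3))                 # toy input
out_train = block_component(X, 6, training=True)   # batch statistics update
out_infer = block_component(X, 6, training=False)  # moving statistics are used
```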

The version I had mentioned both points otherwise, but as I am not a mentor for this course, I don't have access to its latest version. I am wondering, though, how @Cormac_Garvey's issue was still related to the same thing if, as you mention, the labs were updated. Probably the l.t. needs to be notified of this, to avoid confusion.

From what I remember, the training part was removed from the Sequence Models course assignment after a course update, but I don't remember for CNN.

Thanks for the update, though.

@Mubsi, can you check if the test cell still needs any update based on the current version, as Paul mentions that training doesn't need to be set to True?

Hi @paulinpaloalto and @Deepti_Prasad,

Cormac's notebook seems to be the latest: looking at the screenshot shared, training = True is part of the test code.

The idea is to not specify the training flag within the exercise implementation anymore; the test sets it on its own.

Cormac might have set the training flag to True/False in the implementation, and/or the error was occurring because of the TF version of BatchNormalization.


Hi, I can confirm that for the 2nd graded exercise I used BatchNormalization(axis=3)(X, training=True) for the 3 components. I used a similar approach on the shortcut path (but with X_shortcut).

Can you try removing training=True and re-running your code?
Did you pass the assignment?


It would be worth looking at your notebook to confirm what version you are using. I’ll send you a DM about that.


To close the loop on the public thread: Cormac is using the current version of the notebook. The setting of training = True must have been added at the same time as the other fix (using X_shortcut on the shortcut path), so training = True itself is a red herring. The current code will still pass the tests if you remove the setting of training = True on the BatchNorm calls.
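For anyone who lands here later, a minimal sketch of the two call forms being compared (toy tensor, illustrative only):

```python
import tensorflow as tf
from tensorflow.keras.layers import BatchNormalization

X = tf.random.normal((1, 4, 4, 3))  # toy tensor, just so the calls run

# Hard-coding the flag at the call site (what the fixed code had):
out_hardcoded = BatchNormalization(axis=3)(X, training=True)

# Omitting the flag (what the current notebook expects; the test code,
# or Keras at model call time, resolves it):
out_default = BatchNormalization(axis=3)(X)
```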