C4 W2 EX2 TFOpLambda instead of TensorFlowOpLayer

For Assignment 2 of C4 W2 I get the following:

Test failed 
 Expected value 

 ['TensorFlowOpLayer', [(None, 160, 160, 3)], 0] 

 does not match the input value: 

 ['TFOpLambda', (None, 160, 160, 3), 0]

AssertionError                            Traceback (most recent call last)
<ipython-input-27-0346cb4bf847> in <module>
     10                     ['Dense', (None, 1), 1281, 'linear']] #linear is the default activation
     11 
---> 12 comparator(summary(model2), alpaca_summary)
     13 
     14 for layer in summary(model2):

~/Desktop/coursera_dl_spec_ass/W2A2/test_utils.py in comparator(learner, instructor)
     21                   "\n\n does not match the input value: \n\n",
     22                   colored(f"{a}", "red"))
---> 23             raise AssertionError("Error in test")
     24     print(colored("All tests passed!", "green"))
     25 

AssertionError: Error in test

These are the two layers corresponding to the data_augmentation. It seems that tf.keras.layers.experimental.preprocessing.RandomFlip and tf.keras.layers.experimental.preprocessing.RandomRotation show up as 'TFOpLambda' instead of 'TensorFlowOpLayer'.

Is this related to some change in a new tensorflow version? Anyone able to reproduce it?

Thanks

Yes, I think that means you are using a different version of TF. It looks like you are running the notebook on your own computer, so you will need to make sure you're using the same versions of all packages if you want smooth sailing. Everything mutates really fast in this space. This is a non-trivial process and there are no "official" instructions. You're on your own if you use an environment different from the course website, but here's a thread from a fellow student with a lot of helpful pointers on how to get this working.

4 Likes

I am getting this error while using the course website environment. Is there any other possible reason I am getting this error, or is there a way to ensure that my version of TF is the correct version? (My TF version is 2.10.0)

If you are using the course website, then the TF version is the matching one by definition. So there must be other ways to get this error. Most likely it means you have used a different TF/Keras layer function somewhere in the assignment than was expected. Is this the Transfer Learning with MobileNet assignment?
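By the way, to answer the version question: you can check which TF version the notebook kernel is actually running with a one-liner (the printed value below is just an example, not a claim about your environment):

```python
import tensorflow as tf

# Print the version the kernel is actually using. On the course website
# this should match the course's pinned version without you doing anything.
print(tf.__version__)  # e.g. "2.9.1"
```

If this prints something other than what the course expects, the kernel picked up a different install.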

Yes, it was the Transfer Learning with MobileNet assignment, and I was able to get it to work. I had upgraded TF in the notebook because I saw a suggestion on Stack Overflow to do so for an error I was getting. I looked at the thread you linked in the previous reply and saw that the TF version should be 2.3.0. Uninstalling TF and reinstalling that specific version ended up working!

2 Likes

Glad to hear you found the real solution. For future reference the lesson there is that you can’t believe everything you find on StackExchange. It is never a good idea to install different versions of software packages used by the assignment if you are running in the course environment.

Sorry that I did not pick up the wrong version of TF. It didn’t occur to me that anyone would do something as crazy as run the installer within the notebook. :laughing:

Hi, I am also getting a similar error using the course website environment for Exercise 1 of Week 2 (ResNet-50):

"Expected value

['Add', (None, 15, 15, 256), 0]

does not match the input value:

['TFOpLambda', (None, 15, 15, 256), 0]"

I printed the model summary, and the mismatch seems to occur before the 7th Activation, so I am guessing it is before the end of the first identity block of Stage 2 (which consists of a convolutional_block and 2 identity blocks):

at `X = X + X_shortcut` …

The course TensorFlow version is 2.9.1.

Would appreciate any pointers on how to proceed.

They specifically pointed you to the “Add()” function in the instructions for the identity block. The two operations may be functionally equivalent (Add() and +), but they don’t look the same on the model “summary” output.
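To see the difference concretely, here is a minimal sketch (the shapes are illustrative, not taken from the assignment). On TF 2.x the `+` operator on Keras tensors gets recorded as a `TFOpLambda` layer, while `Add()` appears in the summary under its own class name:

```python
import tensorflow as tf

inp = tf.keras.Input(shape=(8,))
shortcut = tf.keras.Input(shape=(8,))

# The Keras layer the instructions ask for:
model_add = tf.keras.Model([inp, shortcut],
                           tf.keras.layers.Add()([inp, shortcut]))

# The raw "+" operator instead:
model_plus = tf.keras.Model([inp, shortcut], inp + shortcut)

# Same math, different layer class in the model summary:
print(type(model_add.layers[-1]).__name__)   # Add
print(type(model_plus.layers[-1]).__name__)  # e.g. TFOpLambda on TF 2.x
```

The comparator in test_utils checks the layer class name string, which is why the functionally equivalent `+` still fails the test.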

2 Likes

Thanks a lot - it works. Apologies for not paying attention to instructions.

No worries! But there is a lesson to be learned there: “saving time” by not reading the instructions carefully is usually not a net savings of time. You save a couple of minutes and then waste a lot more than that trying to figure out why it doesn’t work. :nerd_face:

1 Like