Having trouble with the alpaca_model function: the data_augmentation Sequential layer has an incorrect output shape

I’m working on exercise 2 of the Transfer Learning with MobileNet assignment, where we have to build the alpaca_model function.

I supplied input_shape to the inputs layer of the model and passed those inputs into the data_augmentation sequential model. However, the test fails because this layer’s output shape is reported as (None, None, 160, None) instead of (None, 160, 160, 3). All the other layers have the correct shapes, all my previous and subsequent tests passed, and I was even able to train the model to 90% accuracy. What could be going wrong here?

Assertion error:

Test failed 
 Expected value 

 ['Sequential', (None, 160, 160, 3), 0] 

 does not match the input value: 

 ['Sequential', (None, None, 160, None), 0]
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-85-0346cb4bf847> in <module>
     10                     ['Dense', (None, 1), 1281, 'linear']] #linear is the default activation
     11 
---> 12 comparator(summary(model2), alpaca_summary)
     13 
     14 for layer in summary(model2):

~/work/W2A2/test_utils.py in comparator(learner, instructor)
     21                   "\n\n does not match the input value: \n\n",
     22                   colored(f"{a}", "red"))
---> 23             raise AssertionError("Error in test")
     24     print(colored("All tests passed!", "green"))
     25 

AssertionError: Error in test

Alpaca model summary:

Model: "functional_33"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_61 (InputLayer)        [(None, 160, 160, 3)]     0         
_________________________________________________________________
sequential_1 (Sequential)    (None, None, 160, None)   0         
_________________________________________________________________
tf_op_layer_RealDiv_18 (Tens [(None, 160, 160, 3)]     0         
_________________________________________________________________
tf_op_layer_Sub_18 (TensorFl [(None, 160, 160, 3)]     0         
_________________________________________________________________
mobilenetv2_1.00_160 (Functi (None, 5, 5, 1280)        2257984   
_________________________________________________________________
global_average_pooling2d_18  (None, 1280)              0         
_________________________________________________________________
dropout_17 (Dropout)         (None, 1280)              0         
_________________________________________________________________
dense_17 (Dense)             (None, 1)                 1281      
=================================================================
Total params: 2,259,265
Trainable params: 1,281
Non-trainable params: 2,257,984
_________________________________________________________________

Interesting. I don’t think I’ve ever seen that specific error before. Are you sure you are calling data_augmentation, which is the actual function that was passed to you? As opposed to data_augmenter or something else?

We may need to see your code to figure this one out. I’ll DM you about that.

Hi @cfox

in case your issue is still not resolved: your error output shows that the input shape is not being carried through to the Sequential layer, so I suspect the code below needs a closer look

Create the input layer (Same as the imageNetv2 input size)
make sure you created this layer as

tf.keras.Input and not as tf.keras.layers.Input
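As a sketch of what I mean (this uses a hypothetical stand-in for the notebook's data_augmenter, not the actual assignment code), the wiring that keeps the shape at (None, 160, 160, 3) looks roughly like this:

```python
import tensorflow as tf

# Hypothetical stand-in for the notebook's augmentation helper; the real
# assignment supplies its own data_augmenter() with specific layers.
def data_augmenter():
    return tf.keras.Sequential([
        tf.keras.layers.RandomFlip("horizontal"),
        tf.keras.layers.RandomRotation(0.2),
    ])

IMG_SHAPE = (160, 160, 3)
data_augmentation = data_augmenter()

# Create the symbolic input tensor, then *call* the built augmentation
# model on it. Augmentation layers preserve the image shape.
inputs = tf.keras.Input(shape=IMG_SHAPE)
x = data_augmentation(inputs)

print(x.shape)  # (None, 160, 160, 3)
```

If the augmentation model is never actually called on the input tensor (or the data_augmenter function is passed around instead of the built data_augmentation model), the shapes reported in the summary can come out wrong.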

Regards
DP

Yes, I used data_augmentation instead of data_augmenter so it must be something else, hm. Code sent.

Hi @Deepti_Prasad. That is exactly how I called it.

Strangely, after re-opening the notebook today and re-running the code I no longer get the error (I made no changes). I’m not sure why that happened! Thanks to everyone who tried to help.


It’s great that things work now. So your code was probably already correct, but the state of the notebook was out of sync. E.g. if you type new code into a cell and then call the function again without re-executing that cell, nothing changes: you are still running the old code. You actually have to press “Shift-Enter” on the modified cell to get the new code compiled into the runtime image. Now that you’re aware of this, you can run a little experiment to prove to yourself how this works:

Take a function that you already know works and passes its test cases. Now intentionally break it, e.g. multiply the return value by 2. Run the test cell again and it still passes. Then press “Shift-Enter” on the newly broken cell and run the test again. Kaboom!

A cheap and easy way to make sure that what you see is what is actually being run is:

Kernel -> Restart and Clear Output
Save
Cell -> Run All Above

Note that the “Save” there is to make sure the Grader is also seeing the same code you’re seeing. Clicking “Submit” does not always do an autosave for you. In some courses it seems to, but I’m pretty sure that does not happen in DLS C4 and C5. It’s easy to click “Save”, though, so better safe than sorry.


Must’ve been! I definitely re-ran the cell with the function definition as I was printing different things, but never re-ran all the cells above or restarted the kernel. Thanks for your help!