Week 2 Assignment 2, Exercise 2

Test failed
Expected value

['Sequential', (None, 160, 160, 3), 0]

does not match the input value:

['TensorFlowOpLayer', [(None, 160, 160, 3)], 0]

AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>
     10          ['Dense', (None, 1), 1281, 'linear']] # linear is the default activation
     11
---> 12 comparator(summary(model2), alpaca_summary)
     13
     14 for layer in summary(model2):

~/work/W2A2/test_utils.py in comparator(learner, instructor)
     19               "\n\n does not match the input value: \n\n",
     20               colored(f"{a}", "red"))
---> 21         raise AssertionError("Error in test")
     22     print(colored("All tests passed!", "green"))
     23

AssertionError: Error in test
Please help me out.

Are you sure that you used the parameter that specifies the data augmentation function rather than directly referencing the global definition of the function that happens to get passed in to alpaca_model?

But data_augmentation is a function which takes an argument, right? Did you read the comment line there? Here’s what it says:

# apply data augmentation to the inputs

So what might that look like?

Should it be
x = data_augmentation(inputs)?

Yes, that looks right. But notice you then can’t use “inputs” as the input to the next step, right?

I used
x = preprocess_input(x)
in the next step, but it still shows the same error.
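For reference, a minimal sketch of how those two steps chain together in the Keras functional API. The augmentation pipeline below is a hypothetical stand-in for the assignment's helper (the real one is passed in as the data_augmentation parameter); the point is that each step consumes the previous step's output, not `inputs` again:

```python
import tensorflow as tf
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input

# Hypothetical augmentation pipeline, standing in for the helper the
# assignment passes to alpaca_model as the data_augmentation parameter.
data_augmentation = tf.keras.Sequential([
    tf.keras.layers.RandomFlip('horizontal'),
])

inputs = tf.keras.Input(shape=(160, 160, 3))
x = data_augmentation(inputs)   # augment the raw inputs first...
x = preprocess_input(x)         # ...then preprocess the *augmented* tensor, not `inputs`
```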

Ok, please show the current complete contents of that code cell.

Hmmm, that looks pretty similar to mine. The only differences are that you used Global Max pooling where they told you to use Global Average pooling, and that I did not specify an activation on the Dense layer.

And what is the actual error output you are getting?

Changing max to avg worked out :sweat_smile:
Thanks a lot!
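For anyone who hits the same pair of errors, here is a hedged sketch of how the whole cell fits together. This is not the official solution: the augmentation helper and the Dropout rate are assumptions, and `weights=None` is used here only to avoid the ImageNet download (the assignment uses `weights='imagenet'`). The two fixes from this thread are marked in comments:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Dropout, GlobalAveragePooling2D
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input

IMG_SIZE = (160, 160)

def data_augmenter():
    # Hypothetical stand-in for the assignment's augmentation helper
    return tf.keras.Sequential([
        tf.keras.layers.RandomFlip('horizontal'),
        tf.keras.layers.RandomRotation(0.2),
    ])

def alpaca_model(image_shape=IMG_SIZE, data_augmentation=data_augmenter()):
    input_shape = image_shape + (3,)
    base_model = tf.keras.applications.MobileNetV2(
        input_shape=input_shape,
        include_top=False,   # drop the ImageNet classification head
        weights=None)        # assignment uses 'imagenet'; None avoids the download
    base_model.trainable = False  # freeze the pretrained base

    inputs = Input(shape=input_shape)
    x = data_augmentation(inputs)    # fix 1: use the parameter, not the global function
    x = preprocess_input(x)          # scale pixels for MobileNetV2
    x = base_model(x, training=False)
    x = GlobalAveragePooling2D()(x)  # fix 2: average pooling, not max pooling
    x = Dropout(0.2)(x)              # 0.2 is an assumed rate
    outputs = Dense(1)(x)            # linear activation by default
    return tf.keras.Model(inputs, outputs)
```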


It never hurts to actually read the comments and I’ll bet the error message was also pretty explicit in this case. :nerd_face:

Glad to hear that you got it all to work!

Also thanks a lot for editing your posts to remove the source code! Now that you’ve got the solution, it’s better not to leave that lying around on the forums. Thanks!
