@AJnoble, your mistake is in this step:
apply data augmentation to the inputs
You have to pass the inputs as an argument to the model created by the augmentater() method.
You can also check the example at this link; it will help you sort out your mistakes.
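A minimal sketch of the pattern being described: the augmentation function returns a small Keras model, and that model must then be called on the `inputs` tensor inside the functional graph. The function name `data_augmenter` and the layer choices here are assumptions for illustration; the course notebook defines its own version.

```python
import tensorflow as tf

def data_augmenter():
    # Hypothetical augmentation pipeline for illustration only;
    # the assignment notebook provides its own implementation.
    return tf.keras.Sequential([
        tf.keras.layers.RandomFlip("horizontal"),
        tf.keras.layers.RandomRotation(0.2),
    ])

inputs = tf.keras.Input(shape=(160, 160, 3))
augmenter = data_augmenter()
# The key point: the augmentation model is CALLED on the inputs tensor,
# not merely created and left unused.
x = augmenter(inputs)
```

The common mistake is building the augmenter but never applying it, so the graph from `inputs` to `outputs` skips augmentation entirely.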
I don’t know the default value of the training parameter, as it is not actually mentioned in the documentation. Assuming the default value is True, then yes, if you remove it we will still get the same results, for the reasons you mentioned.
Thank you so much, it worked. The link was really helpful.
Can someone please help me find the mistake in my code? I tried to follow the links provided in the discussion, but I am still getting errors. I think there is some issue with “preprocess_input”, but I couldn’t figure it out. Thanks!
{moderator edit - solution code removed}
You can try “x = preprocess_input(x)”
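To illustrate where that call fits: `preprocess_input` from MobileNetV2 rescales raw pixel values from [0, 255] into [-1, 1], and in the functional API it is applied directly to the tensor flowing through the graph. The input shape below is an assumption for illustration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input

inputs = tf.keras.Input(shape=(160, 160, 3))
# Applied to the symbolic tensor, just like a layer call
x = preprocess_input(inputs)

# Sanity check on concrete values: 0 -> -1, 127.5 -> 0, 255 -> 1
sample = preprocess_input(np.array([0.0, 127.5, 255.0]))
```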
I also got this result. How did you fix it?
Hi,
I was facing some problems with alpaca_model. I compared my model’s (model2) summary against the expected output and found that it does not match the required functional layer.
I think these images will help you understand the problem.
I hope you can help me with this.
Thank you
Can someone help with this error?
The first bits of the error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-20-11ffc7a7acb3> in <module>
----> 1 model2 = alpaca_model(IMG_SIZE, data_augmentation)
<ipython-input-19-c3ef45acbf9d> in alpaca_model(image_shape, data_augmentation)
47 ### END CODE HERE
48
---> 49 model = tf.keras.Model(inputs, outputs)
50
51 return model
/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in __new__(cls, *args, **kwargs)
240 # Functional model
241 from tensorflow.python.keras.engine import functional # pylint: disable=g-import-not-at-top
--> 242 return functional.Functional(*args, **kwargs)
243 else:
244 return super(Model, cls).__new__(cls, *args, **kwargs)
My code is below … I’m pretty sure it’s right. Or at least I thought it was …
{moderator edit - solution code removed}
Why is the Dense layer different than every other layer? At least in the way you have implemented it …
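The hint above points at a common functional-API slip: every layer, including Dense, is first constructed and then called on a tensor; writing `Dense(1, x)` or passing the tensor to the constructor produces errors like the traceback shown. A minimal sketch of a correctly wired output head (the pooling/dropout choices and single-logit output are assumptions for illustration, not the assignment's exact architecture):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(160, 160, 3))
x = tf.keras.layers.GlobalAveragePooling2D()(inputs)
x = tf.keras.layers.Dropout(0.2)(x)
# Dense is constructed first, then called on the tensor,
# exactly like every other layer in the graph
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)
```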
Thanks for posting those links, @mourka!
I was lost on this exercise and the links above really helped. Thanks!