Course 2 Week 1 Assignment: Cats vs Dogs (target: 95% accuracy)

Hi,

I’ve been working on the Course 2 Week 1 assignment (Cats vs Dogs), where the target is 95% accuracy.

With 5 convolution layers (Conv2D with 8, 16, 32, 64, 128 filters):
optimizer=tf.keras.optimizers.Adam
Epoch 15/15
1125/1125 - 89s - loss: 0.0671 - accuracy: 0.9734 - val_loss: 0.6122 - val_accuracy: 0.8608 - 89s/epoch - 79ms/step

With 3 convolution layers (Conv2D with 32, 64, 128 filters):
optimizer=tf.keras.optimizers.Adam
Epoch 15/15
1125/1125 - 102s - loss: 0.0215 - accuracy: 0.9939 - val_loss: 1.1969 - val_accuracy: 0.8016 - 102s/epoch - 91ms/step
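
Roughly, the first setup looks like the sketch below. The 150x150 input size and the dense head here are just placeholders, not the exact assignment code; the point is the Conv2D stack and where the optimizer is plugged in:

```python
import tensorflow as tf

# Rough sketch of the first setup (5 Conv2D layers: 8, 16, 32, 64, 128 filters).
# The 150x150 input size and the dense head are placeholders, not the exact
# assignment code; the key parts are the Conv2D stack and the optimizer choice.
model = tf.keras.models.Sequential([
    tf.keras.Input(shape=(150, 150, 3)),
    tf.keras.layers.Conv2D(8, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(16, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(128, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

# Adam optimizer, as in the runs above; binary cross-entropy for the two-class problem.
model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss='binary_crossentropy',
              metrics=['accuracy'])
```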

With tf.keras.optimizers.RMSprop, accuracy never went above 90%.

As I understand it, the Adam optimizer improved accuracy a lot.

My question: what is the difference between the optimizers?
Why do all the examples use tf.keras.optimizers.RMSprop? Why doesn’t the course cover Adam, or explain which optimizer to use?

Could you please explain these chart results to me?
[image: training vs. validation results chart]

Best regards, Taras

That looks to me like overfitting. The model gets really good at minimizing the training loss. Too good. So good that it loses the ability to generalize to the validation set. Here’s an example graph I found on the interweb…

You can search on ‘graph model overfitting’ to find related images and articles.
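
If you want to see it from your own run, here is a minimal sketch, assuming you kept the History object that model.fit() returns (the generator names in the example are placeholders):

```python
import matplotlib.pyplot as plt

def plot_curves(history):
    """Plot training vs. validation loss from the History object returned by model.fit()."""
    epochs = range(1, len(history.history['loss']) + 1)
    plt.plot(epochs, history.history['loss'], label='training loss')
    plt.plot(epochs, history.history['val_loss'], label='validation loss')
    plt.xlabel('epoch')
    plt.ylabel('loss')
    plt.legend()
    plt.show()

# e.g.:
# history = model.fit(train_generator, epochs=15, validation_data=validation_generator)
# plot_curves(history)
```

When the training loss keeps dropping while the validation loss climbs, that divergence is the overfitting pattern described above.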

Perfect explanation yet again @ai_curious.
Coming to your query about optimizers: you can definitely try others and explore them too.
Adam is definitely a popular one and is widely used.
Beyond the choice of optimizer, the learning rate also has to be kept in mind.
A learning rate that suits RMSprop might not suit Adam, and vice versa (see the sketch below).
Try playing with these and you will definitely learn a lot more.
Also, the model shouldn’t overfit the training data, and that too depends on the learning rate and optimizer used.
As I said, play with all of these and see which gives the best fit on both the training and test splits.
Hope this helps.
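
For example, a minimal sketch of swapping optimizers at compile time. The learning rates here are just common starting points, not values tuned for this assignment, and `model` is whatever network you built above:

```python
import tensorflow as tf

# Common default starting points; these are not tuned for this assignment.
adam = tf.keras.optimizers.Adam(learning_rate=1e-3)
rmsprop = tf.keras.optimizers.RMSprop(learning_rate=1e-4)

# Swap the optimizer here, keep everything else the same, and compare the
# training and validation curves between runs.
model.compile(optimizer=adam,   # or optimizer=rmsprop
              loss='binary_crossentropy',
              metrics=['accuracy'])
```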

Thanks, that clears things up about the optimizer.

The problem was that I first tried adding more conv layers, thinking that would help reach 95% accuracy, but then I found a suggestion on the community to use the Adam optimizer, and that helped a lot.


Glad you got it cleared up!