Can we use the Adam optimizer?

On the C2W2 lab, I figured I’d try the Adam optimizer to speed things up, since training is very slow and needs babysitting or the session cancels itself. However, when I tried it with the same syntax I used for RMSprop, it threw an error. In particular, this works:

  from tensorflow.keras.optimizers import RMSprop
  model.compile(optimizer=RMSprop(learning_rate=0.001),
                loss='binary_crossentropy',
                metrics=['accuracy'])

But this throws an error. What am I doing wrong?

  from tensorflow.keras.optimizers import Adam
  model.compile(optimizer=Adam(learning_rate=0.001),
                loss='binary_crossentropy',
                metrics=['accuracy'])

What you need to do is go to the TensorFlow website and check how to use the Adam optimizer; that documentation is the general guide for how to do it.

Thanks for the reply. Checking the documentation was the first thing I tried before posting, and I observed that at a high level, the two optimizers’ documentation suggested identical syntax. I also checked my DLS C2 notes which covered Adam, but they did not use it in TensorFlow. If you or others have a more specific hint or can point to an example similar to those used in the course, that would be welcome. Thanks again.

Changing from RMSprop to Adam works as expected… for me. Is there a specific error (a syntax error or anything else) that you encounter?
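
For reference, here is a minimal, self-contained sketch that compiles with Adam on TensorFlow 2.x; the one-layer Dense model is just a placeholder, not the lab’s actual network. If this runs for you but the lab notebook still fails, one thing worth checking is the TensorFlow version, since older releases named the argument lr rather than learning_rate.

  import tensorflow as tf
  from tensorflow.keras.optimizers import Adam

  # Placeholder model: a single sigmoid unit standing in for the lab's network.
  model = tf.keras.Sequential([
      tf.keras.layers.Dense(1, activation='sigmoid', input_shape=(10,))
  ])

  # Same compile call as with RMSprop; only the optimizer class changes.
  model.compile(optimizer=Adam(learning_rate=0.001),
                loss='binary_crossentropy',
                metrics=['accuracy'])

  model.summary()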