On the C2W2 lab, I figured I'd try the Adam optimizer to speed things up, since training is very slow and either needs babysitting or cancels itself. However, when I tried to use it with the same syntax as RMSprop, it threw an error. In particular, this works:
from tensorflow.keras.optimizers import RMSprop

model.compile(optimizer=RMSprop(learning_rate=0.001),
              loss='binary_crossentropy',
              metrics=['accuracy'])
But this throws an error. What am I doing wrong?
from tensorflow.keras.optimizers import Adam

model.compile(optimizer=Adam(learning_rate=0.001),
              loss='binary_crossentropy',
              metrics=['accuracy'])
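
In case it helps, here is a minimal, self-contained version of what I'm attempting. The tiny Sequential model below is just a placeholder I made up to reproduce the compile call outside the lab; it is not the lab's actual model:

import tensorflow as tf
from tensorflow.keras.optimizers import Adam

# Placeholder model, only there so compile() has something to work on
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

# Same compile call as above, with Adam in place of RMSprop
model.compile(optimizer=Adam(learning_rate=0.001),
              loss='binary_crossentropy',
              metrics=['accuracy'])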