Hello @Nevermnd, @paulinpaloalto,
Use the following code to get a reproducible model. The reference linked in the code comments explains what's going on with determinism.
import tensorflow as tf

EPOCHS = 40
VAL_SUBSPLITS = 5
BUFFER_SIZE = 500
BATCH_SIZE = 32
# Key additions:
tf.config.experimental.enable_op_determinism()
tf.keras.utils.set_random_seed(1)
# For more on determinism, check out
# https://www.tensorflow.org/api_docs/python/tf/config/experimental/enable_op_determinism
train_dataset = (
processed_image_ds
.cache()
.shuffle(BUFFER_SIZE) # as in the original; better (though not needed here) to set the seed parameter explicitly
.batch(BATCH_SIZE)
)
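As the comment on shuffle hints, shuffle also accepts its own seed argument. A minimal sketch of pinning the shuffle order directly (the range dataset is just a stand-in, not the assignment's data):

```python
import tensorflow as tf

# Stand-in dataset; seed pins the shuffle order, and
# reshuffle_each_iteration=False keeps that order fixed across epochs.
ds = tf.data.Dataset.range(10)
a = list(ds.shuffle(10, seed=1, reshuffle_each_iteration=False).as_numpy_iterator())
b = list(ds.shuffle(10, seed=1, reshuffle_each_iteration=False).as_numpy_iterator())
assert a == b                        # same order both times
assert sorted(a) == list(range(10))  # still a permutation of the data
```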
unet = unet_model((img_height, img_width, num_channels))
unet.compile(
optimizer='adam',
loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
metrics=['accuracy']
)
model_history = unet.fit(train_dataset, epochs=EPOCHS)
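To check that those two "key additions" actually buy reproducibility, here is a minimal sketch (assuming TensorFlow 2.x with tf.keras; the tiny dense model and synthetic data are placeholders for the U-Net, not part of the assignment): two from-scratch training runs end with identical weights.

```python
import tensorflow as tf

def train_once():
    # Re-seed and rebuild everything from scratch for each run.
    tf.keras.utils.set_random_seed(1)
    tf.config.experimental.enable_op_determinism()
    x = tf.random.normal((64, 4))
    y = tf.random.uniform((64,), maxval=3, dtype=tf.int32)
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(8, activation='relu'),
        tf.keras.layers.Dense(3),
    ])
    model.compile(
        optimizer='adam',
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
    model.fit(x, y, epochs=2, batch_size=16, verbose=0)
    return model.get_weights()

w1, w2 = train_once(), train_once()
# Every weight tensor matches exactly across the two runs.
assert all((a == b).all() for a, b in zip(w1, w2))
```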
My results:
Epoch 1/40
34/34 [==============================] - 18s 216ms/step - loss: 2.0005 - accuracy: 0.4070
...
Epoch 5/40
34/34 [==============================] - 3s 83ms/step - loss: 0.4040 - accuracy: 0.8793
...
Epoch 40/40
34/34 [==============================] - 3s 82ms/step - loss: 0.3304 - accuracy: 0.9006
Cheers,
Raymond