Predicting with the model

Hi,
I am practicing with the “Mirrored Strategy: Basic” lab and I am trying to use model.predict().
I couldn't extract a picture from the test dataset the way it is input in the example file, so instead I used:

from keras.datasets import fashion_mnist
import numpy as np
(train_X, train_y), (test_X, test_y) = fashion_mnist.load_data()

pred = train_X[0]
pred = np.expand_dims(pred, axis=-1)
pred.shape

Result: (28, 28, 1)

When I run model.predict(pred), it gives me this error:

InvalidArgumentError: Graph execution error:
transpose expects a vector of size 3. But input(1) is a vector of size 4
[[{{node sequential/conv2d/ArithmeticOptimizer/ReorderCastLikeAndValuePreserving_uint8_Conv2D-0-TransposeNHWCToNCHW-LayoutOptimizer}}]] [Op:__inference_predict_function_75782]

I checked the model: its input shape should be (28, 28, 1), and pred has the same shape. Any idea why I get this error and how I can fix it?
Thanks!

I think you might need to pass a batched input for predict to work; after all, you are training with a batched dataset:
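As a minimal NumPy-only sketch of the shape issue (the array values here are placeholders, not real Fashion-MNIST data): a Conv2D model expects a 4-D input of shape (batch, height, width, channels), so a single image needs a leading batch axis in addition to the channel axis.

```python
import numpy as np

# A single 28x28 grayscale image, standing in for train_X[0]
img = np.zeros((28, 28), dtype=np.uint8)

# Adding only the channel axis gives (28, 28, 1) -- three dims,
# which is what triggers the "vector of size 3" transpose error
pred = np.expand_dims(img, axis=-1)
print(pred.shape)      # (28, 28, 1)

# Add a leading batch axis as well: (1, 28, 28, 1)
batched = np.expand_dims(pred, axis=0)
print(batched.shape)   # (1, 28, 28, 1)

# model.predict(batched) would now receive the expected 4-D input
```

With the extra axis, model.predict sees a batch of one image rather than an unbatched 3-D array.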

Batch the input data

BUFFER_SIZE = len(train_images)
BATCH_SIZE_PER_REPLICA = 64
GLOBAL_BATCH_SIZE = BATCH_SIZE_PER_REPLICA * strategy.num_replicas_in_sync

Create Datasets from the batches

train_dataset = tf.data.Dataset.from_tensor_slices((train_images, train_labels)).shuffle(BUFFER_SIZE).batch(GLOBAL_BATCH_SIZE)
test_dataset = tf.data.Dataset.from_tensor_slices((test_images, test_labels)).batch(GLOBAL_BATCH_SIZE)

Create Distributed Datasets from the datasets

train_dist_dataset = strategy.experimental_distribute_dataset(train_dataset)
test_dist_dataset = strategy.experimental_distribute_dataset(test_dataset)
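The same batching idea can be checked without the tf.data pipeline. This sketch uses a placeholder array in place of test_images (shapes match Fashion-MNIST, values are dummies) to show what one GLOBAL_BATCH_SIZE slice of properly shaped input looks like:

```python
import numpy as np

# Placeholder for the test images loaded in the lab: (N, 28, 28)
test_images = np.zeros((10000, 28, 28), dtype=np.uint8)
GLOBAL_BATCH_SIZE = 64

# Add the channel axis so each image is (28, 28, 1)
test_images = test_images[..., np.newaxis]

# Take one batch, mirroring what .batch(GLOBAL_BATCH_SIZE) yields
one_batch = test_images[:GLOBAL_BATCH_SIZE]
print(one_batch.shape)   # (64, 28, 28, 1)

# model.predict(one_batch) now gets a 4-D batched input,
# just like the batches the model saw during training
```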