Tensor("input_63:0", shape=(None, 160, 160, 3), dtype=float32)
Tensor("sequential_2/random_rotation_2/transform_26/ImageProjectiveTransformV2:0", shape=(1, 160, 160, 3), dtype=float32)
Model: "functional_45"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_63 (InputLayer) [(1, 160, 160, 3)] 0
_________________________________________________________________
sequential_2 (Sequential) (1, 160, 160, 3) 0
_________________________________________________________________
tf_op_layer_RealDiv_25 (Tens [(1, 160, 160, 3)] 0
_________________________________________________________________
tf_op_layer_Sub_25 (TensorFl [(1, 160, 160, 3)] 0
_________________________________________________________________
mobilenetv2_1.00_160 (Functi (None, 5, 5, 1280) 2257984
_________________________________________________________________
average_pooling2d_23 (Averag (1, 2, 2, 1280) 0
_________________________________________________________________
dropout_24 (Dropout) (1, 2, 2, 1280) 0
_________________________________________________________________
dense_18 (Dense) (1, 2, 2, 1) 1281
=================================================================
Total params: 2,259,265
Trainable params: 1,281
Non-trainable params: 2,257,984
As you can see above, the output shape of every layer is reported as (1, …) instead of (None, …). This started happening after I implemented data augmentation. When I print the layer shapes, the strange part is that even the input tensor passed into the data_augmentation() function has changed from (None, …) to (1, …). Please help!
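To make the symptom concrete, here is a minimal sketch of the two behaviors (not my actual code; the augmentation layers are assumptions based on the "sequential_2" / "random_rotation_2" names in the summary, and it assumes a TF version where RandomFlip/RandomRotation live in tf.keras.layers):

```python
import tensorflow as tf

# Hypothetical augmentation pipeline mirroring the summary above
# (an assumption, since my exact code is not shown here).
data_augmentation = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.2),
])

# Called on the symbolic Input, the batch dimension stays None:
inputs = tf.keras.Input(shape=(160, 160, 3))
x = data_augmentation(inputs)
print(x.shape)  # (None, 160, 160, 3)

# Called on a concrete batch-of-1 tensor (e.g. a single image expanded
# with tf.expand_dims(image, 0) for visualization), the batch dimension
# is fixed at 1 -- which matches the (1, ...) shapes in my summary:
single_image = tf.zeros((1, 160, 160, 3))
y = data_augmentation(single_image)
print(y.shape)  # (1, 160, 160, 3)
```

So the (1, …) shapes suggest my model graph was somehow traced from a concrete batch-of-1 tensor rather than the symbolic Input, but I don't see where that happens in my code.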