C2W4 ValueError: Shapes (None, 1) and (None, 26) are incompatible

Hi all,

Got this error in this assignment:
ValueError: Shapes (None, 1) and (None, 26) are incompatible

model summary
Model: “sequential_15”


Layer (type)                     Output Shape           Param #
=================================================================
conv2d_34 (Conv2D)               (None, 26, 26, 32)     320

max_pooling2d_33 (MaxPooling2D)  (None, 13, 13, 32)     0

conv2d_35 (Conv2D)               (None, 11, 11, 64)     18496

max_pooling2d_34 (MaxPooling2D)  (None, 5, 5, 64)       0

flatten_10 (Flatten)             (None, 1600)           0

dense_33 (Dense)                 (None, 128)            204928

dense_34 (Dense)                 (None, 26)             3354

=================================================================
Total params: 227,098
Trainable params: 227,098
Non-trainable params: 0

What am I doing wrong? Shouldn't the last layer have 26 units, one per class?

Please fix the loss function.

I solved the problem after

  1. Ensuring that the input_shape=(H, W, C) parameter of my first layer matched the shape of the training & validation images: the correct height, width, and number of channels (1 channel for a grayscale image)
  2. Using the correct loss function. For a hint about the correct loss function, read the documentation for the categorical crossentropy loss, which was used in this week’s lab.
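For reference, here is a minimal sketch of a model matching the summary above. This is a hypothetical reconstruction, not the assignment's official solution: 3×3 kernels, 2×2 pooling, and a 28×28×1 grayscale input are assumptions, though they are consistent with the printed shapes (28 − 3 + 1 = 26, and (3·3·1 + 1)·32 = 320).

```python
import tensorflow as tf

# Hypothetical reconstruction of the model in the summary above.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu',
                           input_shape=(28, 28, 1)),   # grayscale: 1 channel
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(26, activation='softmax'),   # one unit per class
])

# Integer labels of shape (batch,) or (batch, 1) pair with the *sparse* loss;
# plain categorical_crossentropy would expect one-hot labels of shape (batch, 26),
# which is exactly the (None, 1) vs (None, 26) mismatch in the error above.
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```

With these assumed shapes, the parameter counts line up with the summary (320 + 18,496 + 204,928 + 3,354 = 227,098).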

I’m getting a similar error: ValueError: Shapes (None,) and (None, 5, 5, 3) are incompatible. I’ve ensured that the input_shape of the first Conv2D layer is correct (using train_generator.x.shape[1:]). Any hints?

Please click my name and message your notebook as an attachment.

@Roger_Yu
There are 2 mistakes:

  1. Loss function.
  2. Number of neurons in the output layer.

@ghasi002
Please fix your loss function.

According to the tf.keras.utils.image_dataset_from_directory documentation, the label_mode argument is a string describing the encoding of labels. Options are:

  • ‘int’: labels are encoded as integers (e.g. for sparse_categorical_crossentropy loss).
  • ‘categorical’: labels are encoded as a categorical (one-hot) vector (e.g. for categorical_crossentropy loss).
  • ‘binary’: labels (there can be only 2) are encoded as float32 scalars with values 0 or 1 (e.g. for binary_crossentropy).
  • None: no labels.

Choose the correct loss function according to the label type.
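A quick NumPy sketch of how the two multi-class losses relate (the labels and probabilities here are made up for illustration): sparse_categorical_crossentropy consumes integer class ids directly, while categorical_crossentropy needs the same labels one-hot encoded, and both compute the same value.

```python
import numpy as np

# Hypothetical integer labels for a 26-class problem (label_mode='int')
labels_int = np.array([0, 3, 25])

# Made-up predicted probabilities: 0.75 on the true class, 0.01 elsewhere
probs = np.full((3, 26), 0.01)
probs[np.arange(3), labels_int] = 0.75  # rows already sum to 1.0

# sparse categorical crossentropy: indexes probs with the integer labels
sparse_ce = -np.log(probs[np.arange(3), labels_int])

# categorical crossentropy: needs one-hot labels of shape (batch, 26)
one_hot = np.eye(26)[labels_int]
cat_ce = -(one_hot * np.log(probs)).sum(axis=1)

print(np.allclose(sparse_ce, cat_ce))  # the losses agree
```

So the error in this thread comes from handing integer labels of shape (None, 1) to a loss that expects the one-hot shape (None, 26).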


@Kouboura

This lab uses ImageDataGenerator but the same logic holds for specifying the loss function (i.e. depends on the representation of labels).

Although this API is deprecated as of TensorFlow 2.9, the certificate exam still expects candidates to be familiar with the older API.


@Jian_WU1
Please fix the loss function of the NN. Use the loss function that’s applicable to multi-class classification where labels are encoded as integers and not as one-hot representation.

I am also having a similar issue, but with a different error message.
How can I fix this?

The error is different from the post under which it’s reported. Please create a new post.

I think you forgot to Flatten() 🙂


Hint → [code removed - moderator] !!!
You saved me!

@Darcos
Here are some hints:

  1. In a multi-class classification problem, the number of units in the output layer should equal the number of classes.
  2. It’s sufficient to specify the input_shape parameter on the first layer of the network.
  3. The loss function should account for the encoding of the labels: fix it based on whether the labels are one-hot encoded.
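Hint 3 can be sketched as a small helper that inspects a batch of labels and names the matching Keras loss. This is a hypothetical illustration (pick_loss is not part of the assignment), assuming labels arrive as a NumPy array:

```python
import numpy as np

def pick_loss(labels):
    """Hypothetical helper: pick a Keras loss name from the label encoding.

    Integer class ids come as shape (batch,) or (batch, 1);
    one-hot labels come as shape (batch, num_classes).
    """
    if labels.ndim == 1 or labels.shape[-1] == 1:
        return "sparse_categorical_crossentropy"  # integer class ids
    return "categorical_crossentropy"             # one-hot rows

print(pick_loss(np.array([0, 5, 25])))    # integer labels -> sparse loss
print(pick_loss(np.eye(26)[[0, 5, 25]]))  # one-hot labels -> categorical loss
```

Printing train_labels.shape before compiling is usually enough to make the same decision by eye.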

Sorry but I tried all of these techniques without result

Please explain what you mean by “without result”.
Send me your notebook as an attachment via a direct message.

Thanks, now it is working 🙂

I had this same error. I was using the ‘categorical_crossentropy’ loss function. I got around the error by one-hot encoding my labels, but that felt wrong, since the assignment’s expected output seemed to use the integer labels directly. Not quite sure what I missed…

Is there a loss function that doesn’t require the labels to be one-hot encoded?

Please go through this section of the page to pick the loss that accepts integer labels instead of one-hot encoded representations.
