Strange dimensionality mismatch error

According to the hint:

You can do this by changing the units argument of the output Dense layer from 1 to 2, with one for each of the classes (i.e. cats and dogs).

So the last layer should output 2 units.
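Something like this minimal sketch (the backbone and input shape here are placeholders, not the assignment's actual model; only the final Dense layer is the point):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Placeholder backbone, not the assignment's exact architecture.
# The point is the final Dense layer: 2 units (cat, dog) instead of 1.
model = tf.keras.Sequential([
    layers.Input(shape=(300, 300, 3)),
    layers.Conv2D(16, 3, activation='relu'),
    layers.GlobalAveragePooling2D(),
    layers.Dense(2, activation='softmax'),  # units changed from 1 to 2
])
model.summary()
```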

However, when loading the model checkpoint with model.load_weights('0_epochs.h5'), it seems the output needs to be 1001.

I am not sure how to fix this error. It seems that we have to discard the last layer. However, this contradicts the code here.

Yes, but the expected output will be one-hot encoded. In do_salience(image, model, label, prefix) you have these lines:

```python
# Define the expected output array by one-hot encoding the label
# The length of the array is equal to the number of classes
expected_output
```
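A minimal sketch of what that could look like (the label convention below is an assumption; the notebook's exact code may differ):

```python
import tensorflow as tf

num_classes = 2              # cats and dogs
label = 0                    # assumed convention, e.g. 0 = cat, 1 = dog
expected_output = tf.one_hot([label], num_classes)
print(expected_output)       # tf.Tensor([[1. 0.]], shape=(1, 2), dtype=float32)
```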

Right. I actually found that out as you were replying ~ Now there is a casting problem; I think I need to cast something to tf.float32.

There are instructions asking you to do that in the do_salience function!
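For reference, the cast itself is a one-liner with tf.cast; this is only a sketch, since which tensor needs casting depends on those instructions inside do_salience:

```python
import tensorflow as tf

# tf.cast changes a tensor's dtype so it matches the rest of the computation.
image = tf.zeros((1, 300, 300, 3), dtype=tf.uint8)   # placeholder image tensor
image = tf.cast(image, tf.float32)
print(image.dtype)                                    # <dtype: 'float32'>
```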