Hi,
I am getting an error when evaluating the model with the example batch. The error message is:
Your model is not compatible with the dataset you defined earlier. Check that the loss function and last layer are compatible with one another.
I have used sparse_categorical_crossentropy as the loss, with softmax as the activation of the last layer of the model. I don't understand why there is an error.
(Code removed, as posting it is against community guidelines.)
All the unit tests up to the previous preprocessing step pass.
Please explain the meaning of this error (see the quoted string above).
Since the test prints the error message when it catches an exception, how about printing the stack trace and figuring out where the mistake is? You can create a new cell, run the code without catching the exception, and see the results. Odds are good that the final layer has an incorrect shape.
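For example, here is a minimal sketch of running the failing call in a fresh cell without the try/except, assuming the notebook exposes the compiled `model` and a `train_dataset` (the actual variable names in your lab may differ):

```python
import tensorflow as tf

# Pull one example batch and run it through the model directly, so any
# shape mismatch raises a full traceback instead of the caught message.
for features, labels in train_dataset.take(1):
    preds = model(features)
    print("features:", features.shape)
    print("labels:  ", labels.shape)
    print("preds:   ", preds.shape)   # expect (batch_size, num_classes)

    # Let the loss run as well, so the traceback points at the exact mismatch.
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
    print("loss:", loss_fn(labels, preds).numpy())
```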
For reference, look at other multi-class classification problems and observe the shape of the output layer via model.summary().
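For instance, a toy multi-class setup where the loss and the last layer are compatible might look like this (the feature count of 4 is made up purely for illustration and is not the lab's actual input size):

```python
import tensorflow as tf

# Toy 5-class classifier: one output unit per class, softmax activation.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(5, activation='softmax')   # one unit per class
])

model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',           # labels are integer class ids
    metrics=['sparse_categorical_accuracy']
)

model.summary()   # the last layer's output shape should read (None, 5)
```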
Thank you so much for your hint. Since column names are not assigned to the tensor slices (the dataset variable), extracting columns by name won't work. I have changed the map/lambda function to access columns by position, and it now works with no issues.
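To illustrate the difference, here is a small self-contained sketch with a made-up array and a made-up column layout (features first, integer label in the last column); because from_tensor_slices on a plain array drops column names, the map/lambda has to address columns by position:

```python
import numpy as np
import tensorflow as tf

# Hypothetical data: 3 feature columns plus a label column in the last position.
data = np.array([[5.1, 3.5, 1.4, 0],
                 [6.2, 2.9, 4.3, 1],
                 [7.3, 2.9, 6.3, 2]], dtype=np.float32)

dataset = tf.data.Dataset.from_tensor_slices(data)

# Each element is a plain tensor, so columns are addressed by position,
# not by name: row[:-1] are the features, row[-1] is the label.
dataset = dataset.map(lambda row: (row[:-1], tf.cast(row[-1], tf.int32)))

for features, label in dataset.take(1):
    print(features.numpy(), label.numpy())
```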
Error message:
Your model is not compatible with the dataset you defined earlier. Check that the loss function and last layer are compatible with one another.
The final layer used 5 units with a softmax activation function.
Hi, I also have this issue and have tried some of the suggestions, but I still get the same error ("model is not compatible", etc.). I have followed the hints in the lab. There is one and only one Dense layer in the model, configured with 5 units and softmax. From the hints here, I am now using a loss of sparse_categorical_crossentropy and a metric of sparse_categorical_accuracy, along with the Adam optimizer. Any more hints? Thanks!