I cannot achieve 95% training accuracy. I am getting a training accuracy of 0.2270 and loss of -946.9255
Please check the output layer activation function and the loss function.
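A negative crossentropy loss is a strong hint that the labels do not match what the loss function expects. As a quick illustration (a minimal sketch, not the assignment code), binary_crossentropy goes negative as soon as the targets fall outside [0, 1], which is exactly what happens when multi-class integer ids are fed to it:

```python
import tensorflow as tf

# Binary crossentropy assumes targets in [0, 1]. Feeding it integer class ids
# like 3 or 4 drives the loss negative, which is why large negative loss values
# plus accuracy stuck near 0.2 are a telltale sign of this mismatch on a
# 5-class problem.
y_true = tf.constant([[3.0], [4.0]])   # multi-class ids treated as binary targets
y_pred = tf.constant([[0.9], [0.9]])   # probability-style model outputs
print(tf.keras.losses.binary_crossentropy(y_true, y_pred).numpy())  # negative values
```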
Hello, I have a similar problem: accuracy = 0.2270 (exactly) and val_loss = -67.514. My output layer activation function was softmax and my loss function was binary_crossentropy.
I have an additional question. When I followed the instruction to put 5 units in the final Dense layer, I got the error message “ValueError: logits and labels must have the same shape, received ((None, 5) vs (None, 1)).” My model only ran when I put a single unit in the Dense layer.
@gustavyeung
Your loss function is incorrect. Please look at the labels and use the correct loss function.
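One quick way to look at them (the array name train_labels below is illustrative, not necessarily the notebook's variable):

```python
import numpy as np

# Illustrative label array; in the notebook, print the real one instead.
train_labels = np.array([0, 3, 1, 4, 2])

print(train_labels.shape)   # (5,)  -> one integer class id per example
print(train_labels[:5])     # [0 3 1 4 2]

# Integer ids like these pair with sparse_categorical_crossentropy;
# only a (num_examples, 5) array of one-hot rows pairs with categorical_crossentropy.
```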
I switched the loss function to categorical_crossentropy and changed my last Dense layer to 5 units, but the shape problem came back:
ValueError: Shapes (None, 1) and (None, 5) are incompatible
Your loss function is incorrect. Please look at the labels again. categorical_crossentropy is a valid choice when the labels are one-hot encoded.
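In other words (a minimal sketch with made-up labels), there are two consistent configurations, and both keep the final layer at Dense(5, activation='softmax'):

```python
import numpy as np
import tensorflow as tf

labels = np.array([0, 3, 1, 4, 2])                               # integer ids, shape (5,)
one_hot = tf.keras.utils.to_categorical(labels, num_classes=5)   # 0/1 rows, shape (5, 5)

# Option 1: keep the integer ids      -> loss='sparse_categorical_crossentropy'
# Option 2: train on the one-hot rows -> loss='categorical_crossentropy'
# Mixing integer ids with categorical_crossentropy is what raises
# "Shapes (None, 1) and (None, 5) are incompatible".
```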
I changed the loss function to [code removed - moderator] and the model seems to be working. Thanks!
Hi, I’m trying C3_W2_Lab_2_sarcasm_classifier.ipynb but I’m not able to get val_acc > 0.85. I have even tried dropout, changed hyperparameters, … What’s going wrong?
Thanks
Carlos
The notebook says this:
If you used the default hyperparameters, you will get around 99% training accuracy and 80% validation accuracy.
Why are you concerned about achieving validation accuracy greater than 85%?
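For context, the quoted figures come from a model along these lines (a sketch in the spirit of the lab's defaults; the exact hyperparameter values in your notebook may differ):

```python
import tensorflow as tf

# Illustrative hyperparameters, not necessarily the notebook's exact values.
vocab_size, embedding_dim = 10000, 16

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    tf.keras.layers.GlobalAveragePooling1D(),        # average the word embeddings per headline
    tf.keras.layers.Dense(24, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')   # binary: sarcastic or not
])

# Binary labels -> sigmoid output + binary_crossentropy
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```

With a small embedding classifier like this, a gap between roughly 99% training accuracy and 80% validation accuracy is ordinary overfitting on a small dataset, so the default setup is not expected to clear 85% on validation.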