I’ve passed the assignment, but with some strange results. My model reaches nearly perfect accuracy in the 1st epoch, then accuracy=1.0 for every epoch after that, while val_accuracy=1.0 for all epochs as well.
Look at these weird images:
My model definition is:
[edit]
I’ve replicated these results locally in PyCharm, so the problem is reproducible and not specific to the notebook environment.
I suspect there’s something wrong with the data. Any hypotheses?
Posting code in a public topic is discouraged and can get your account suspended. It’s fine to share a stack trace in a public post, but please send the code itself to a mentor via direct message. Please clean up the post.
Here’s the community user guide to get started.
Click my name and message your notebook as an attachment.
There’s a mistake in parse_data_from_file. As a result, the entire dataset is treated as belonging to a single label. Here’s what your label distribution looks like:
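You can verify this yourself with a quick check along these lines. This is a minimal sketch, assuming (as in the assignment) that parse_data_from_file returns an images array and a labels array; the filename here is just a placeholder:

```python
import numpy as np

# Placeholder filename; use whatever CSV the assignment provides.
images, labels = parse_data_from_file("training_data.csv")

# Count how many examples fall under each label.
values, counts = np.unique(labels, return_counts=True)
print(dict(zip(values, counts)))
# A healthy dataset shows many classes with roughly similar counts;
# a single entry here means every row was parsed into the same label.
```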
One way you can get zero cost and 100% accuracy without doing any training is if the dataset is all zeros and the labels are all zeros. Then, if the model is initialized with zero weights, it outputs 0 for every example right away, and 0 matches 0.
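Here’s a toy sketch of that degenerate case (not the assignment’s model, just zero inputs, zero labels, and zero-initialized weights):

```python
import numpy as np
import tensorflow as tf

# All-zero inputs and all-zero labels.
x = np.zeros((100, 4), dtype="float32")
y = np.zeros((100, 1), dtype="float32")

# A single linear unit whose weights and bias start at zero,
# so its output is exactly 0 for every example.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, kernel_initializer="zeros",
                          bias_initializer="zeros"),
])
model.compile(optimizer="sgd", loss="mse", metrics=["accuracy"])

loss, acc = model.evaluate(x, y, verbose=0)
print(loss, acc)  # 0.0 loss and 1.0 accuracy before any training
```

With an MSE loss, Keras resolves the "accuracy" metric to an exact-match comparison between predictions and labels, which is why it reads 1.0 here before a single training step.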