I’ve spent hours searching these message boards for a solution to why I can’t match the expected accuracy. Could a Mentor please take a look at my notebook? I’ve tried multiple loss functions, one-hot encoding the labels to make the non-sparse loss work, training with different numbers of epochs as a starting point to boost accuracy, switching to the fallback runtime (but it’s unavailable), etc. Nothing gets me above 82% average accuracy.
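For context, this is the kind of label/loss pairing I mean. It’s just a minimal sketch with a placeholder input size and class count, not my actual notebook code:

```python
import numpy as np
import tensorflow as tf

NUM_CLASSES = 5                   # placeholder, not the assignment's real class count
labels = np.array([0, 2, 4, 1])   # integer class ids

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16,)),   # placeholder input size
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])

# Option A: integer labels work directly with the sparse loss
model.compile(loss='sparse_categorical_crossentropy',
              optimizer='adam', metrics=['accuracy'])

# Option B: one-hot encode the labels to make the non-sparse loss work
one_hot_labels = tf.keras.utils.to_categorical(labels, num_classes=NUM_CLASSES)
model.compile(loss='categorical_crossentropy',
              optimizer='adam', metrics=['accuracy'])
```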
You say you used the fallback runtime, but you also mention it’s unavailable? I don’t understand the part about it being unavailable.
Kindly refer to the post linked below and cross-check your code. You can make changes based on that comment. If you still don’t reach the required accuracy after making the changes, let me know.
I’ve looked at that post and made the changes that I believe you are referencing. I get the same output (I’m pretty sure I had already applied your suggestions in the past, even the one-hot encoding of the labels).
The fallback runtime didn’t produce an error this morning, so I applied that step as well.
It looks like the network is learning; it just doesn’t reach 95% accuracy by the 18th epoch.
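For what it’s worth, this is roughly how I’m checking the 95% target each epoch (a sketch; the callback name and details are illustrative, not the assignment’s grading code):

```python
import tensorflow as tf

class StopAtAccuracy(tf.keras.callbacks.Callback):
    """Stop training once training accuracy crosses a target threshold."""
    def __init__(self, target=0.95):
        super().__init__()
        self.target = target

    def on_epoch_end(self, epoch, logs=None):
        acc = (logs or {}).get('accuracy')
        if acc is not None and acc >= self.target:
            print(f"\nReached {self.target:.0%} accuracy at epoch {epoch + 1}; stopping.")
            self.model.stop_training = True

# Usage (assuming the model was compiled with metrics=['accuracy']):
# history = model.fit(x_train, y_train, epochs=18, callbacks=[StopAtAccuracy()])
```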
OK, send me your notebook: first rename it to Justin Copy of C3W4 Assignment, then download it as an .ipynb file. Once it’s downloaded, you can DM it to me by clicking my name and then Message.
I suspected it was some small but key mistake on my part. I should also point out that I tried running the code in Colab on my own copy of the script, and it failed to reach 95% accuracy (it got 92%).
I had to open Colab via the assignment link, copy my code in, and switch to the fallback version; only then did I reach 96% accuracy and pass. It seems this assignment is quite sensitive to changes, so I really appreciate your quick responses.
I knew you would have to use the fallback runtime again after the correction.
A few of the TensorFlow courses have had these issues due to a version update. Sorry for the inconvenience, but I’ve noticed that the people who hit this setback learn the most.