I have checked this and this and it does not help.
I am not shuffling the training data, nor did I forget to take the absolute value of the gradients. I am sure there is something in my code I am not doing right, but I have been tweaking it for the past 48 hours with no luck.
My structural similarity index is 0.91 and I cannot improve it. Can someone please help? I am happy to share any snippet of my code.
Is your train split set to use only 80%?
Is your learning rate set to 0.001 for training?
Does your training look like this?
Hi and thanks so much for jumping in!
Here is what my training looks like:
And yes: an 80% split for training, and a learning rate of 0.001 for RMSProp.
Here is the output of the grader:
Maybe I am not using the right loss function. Any hint please?
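(Side note for anyone following along: since the optimizer settings above are RMSProp with a learning rate of 0.001, here is a rough sketch of what that update rule does, using plain NumPy on a toy quadratic loss. The `rho` and `eps` values are assumed Keras-style defaults, not values taken from the assignment.)

```python
import numpy as np

# Sketch (assumption): the RMSProp update rule with learning rate 0.001,
# applied to a toy loss L(w) = w**2. rho and eps are assumed defaults.
lr, rho, eps = 0.001, 0.9, 1e-7
w, s = 5.0, 0.0
for _ in range(10_000):
    g = 2.0 * w                         # dL/dw
    s = rho * s + (1 - rho) * g ** 2    # running average of squared gradients
    w -= lr * g / (np.sqrt(s) + eps)    # update scaled by RMS of past gradients
print(abs(w) < 0.5)                     # w has been driven close to the minimum
```

Because the update is normalized by the gradient's running RMS, the effective step size stays near `lr` regardless of the gradient's raw scale, which is why the choice of 0.001 matters more than the loss surface's scaling.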
Your code should look substantially similar to the fragments in the Week 4 lecture notes PDF. You might also compare it to the code in the ungraded labs mentioned in the notebook. Unfortunately, I finished this course over a year ago, didn't save copies of the ungraded labs, and no longer seem to have access to them. I do observe that your accuracy and loss during training are very similar to what I got. However, my submission obtained an average of 98%, so something else must be different.
I notice that there is a structural similarity index utility method in the Python package skimage. Might be interesting to add it to your code and see if you get the same scores the grader does?
from skimage.metrics import structural_similarity as ssim
print("ssim = " + str(ssim(img, normalized_tensor)))
My first try at this raised an exception on incompatible sizes and I’m too lazy to track it down. But seems like it might be worthwhile if you’re stuck for this long. Hope it helps.
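For the record, the incompatible-sizes exception usually just means the two arrays have different shapes (or dtypes). Here is one way the snippet above can be made to run; the array names and shapes are made up for illustration, not taken from the grader:

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim
from skimage.transform import resize

# Hypothetical stand-ins for the reference image and the model's
# normalized output; mismatched shapes like these are a common cause
# of the "incompatible sizes" exception.
img = np.random.rand(150, 150)
normalized_tensor = np.random.rand(148, 148)

# Resize the prediction to the reference shape before comparing.
pred = resize(normalized_tensor, img.shape, anti_aliasing=True)

# For float inputs, recent skimage versions want data_range passed explicitly.
score = ssim(img, pred, data_range=1.0)
print("ssim = " + str(score))
```

The score lands in [-1, 1], with 1.0 meaning the two images are identical, so comparing an image with itself is a quick sanity check.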
Thanks so much for the reply. I did compare my code to the ungraded labs, and it is similar except for the choice of loss function, since the assignment uses a two-neuron Dense output layer rather than a one-neuron one.
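One thing worth knowing about that choice: for two classes, a 2-neuron softmax head with (sparse) categorical crossentropy computes exactly the same loss as a 1-neuron sigmoid head with binary crossentropy, so the head size shouldn't change the result by itself. A small NumPy check (the logit values here are made up for illustration):

```python
import numpy as np

# One sample, 2-neuron output (hypothetical logits for illustration).
logits = np.array([1.2, -0.3])
probs = np.exp(logits) / np.exp(logits).sum()   # softmax over 2 classes

y = 0                                            # true class index
cce = -np.log(probs[y])                          # sparse categorical crossentropy

# Equivalent 1-neuron formulation: sigmoid of the logit difference
# gives P(class 1), and binary crossentropy on it matches cce.
p1 = 1.0 / (1.0 + np.exp(-(logits[1] - logits[0])))
bce = -(y * np.log(p1) + (1 - y) * np.log(1 - p1))

print(np.isclose(cce, bce))
```

This works because softmax over two logits reduces to a sigmoid of their difference, so the two losses are term-by-term identical.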
Good call on the skimage package; I will give it a try.
I had an extra MaxPooling layer before the GlobalAveragePooling2D; that was the issue. Thanks @ai_curious for your help on this!
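For anyone who hits the same thing: an extra 2x2 pooling layer quarters the spatial area that GlobalAveragePooling2D then averages over, so GAP sees a few max values instead of the full feature map. A rough NumPy sketch with a made-up 6x6x32 feature map (shapes are illustrative, not from the assignment):

```python
import numpy as np

# Hypothetical conv output: (batch, height, width, channels).
features = np.random.rand(1, 6, 6, 32)

# A non-overlapping 2x2 max pool, done by reshaping into 2x2 windows
# and taking the max within each window.
pooled = features.reshape(1, 3, 2, 3, 2, 32).max(axis=(2, 4))
print(pooled.shape)   # spatial dims halved to (1, 3, 3, 32)

# GlobalAveragePooling2D averages over the spatial axes either way,
# but the pooled path averages 9 max values instead of 36 raw ones.
gap_direct = features.mean(axis=(1, 2))   # (1, 32)
gap_pooled = pooled.mean(axis=(1, 2))     # (1, 32)
print(gap_direct.shape, gap_pooled.shape)
```

The output shape fed to the Dense head is (1, 32) in both cases, which is why the model still trains; only the statistics GAP computes change, which can be enough to shift the grader's score.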