My val_loss is reducing but the grader is failing my test.
History .pkl file attached, renamed to a txt file to pass the forum’s file filter.
history.py (1.2 KB)
The grader looks okay to me. This is the passing criteria: To pass this assignment the slope of your val_loss curve should be 0.0005 at maximum.
The code below is based on your history pickle file (epochs and val_loss are taken from it):

In [1]: from scipy.stats import linregress
In [2]: slope, *_ = linregress(epochs, val_loss)
In [3]: slope
Out[3]: 0.0007160037980043799
In [4]: slope > 0.0005
Out[4]: True
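For reference, the check above can be reproduced as a standalone sketch. The pickle layout is an assumption here: a dict with a 'val_loss' list, like the history attribute of a Keras History object.

```python
import pickle
from scipy.stats import linregress

def val_loss_slope(history):
    """Slope of the val_loss curve over epochs.

    `history` is assumed to be a dict holding a 'val_loss' list,
    e.g. the `history` attribute of a pickled Keras History object.
    """
    val_loss = history["val_loss"]
    epochs = range(len(val_loss))
    slope, *_ = linregress(epochs, val_loss)
    return slope

# Usage (file name is just an example):
# with open("history.pkl", "rb") as f:
#     print(val_loss_slope(pickle.load(f)))
```

The grader's pass condition would then be `val_loss_slope(history) <= 0.0005`.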
But my ending val_loss is lower than my starting one, and I thought that was the underlying criterion: "To pass this assignment your val_loss (validation loss) should either be flat or decreasing."
linregress fits a straight line to the points in val_loss. The slope indicates how flat the regression line is.
Even though your model's validation loss may be decreasing overall, its fitted slope is not flat enough to meet the grader's criterion.
So I should make the contrast between the starting and ending val_loss stronger?
Please answer this:
What’s the slope of a line when the y value (validation loss) decreases as the x value (epochs) increases?
This should help clarify what is expected of the validation loss curve over epochs.
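As a numeric illustration of that question (the numbers below are made up, not from your history):

```python
from scipy.stats import linregress

# A validation loss curve that falls as epochs increase
epochs = [0, 1, 2, 3, 4]
val_loss = [0.50, 0.42, 0.36, 0.31, 0.28]

slope, *_ = linregress(epochs, val_loss)
print(slope < 0)  # a decreasing curve gives a negative slope
```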