Week 3: Performance on other datasets

Hi everyone!

As I learned from Prof. A.J., the more complex a model is, the more flexible it is. When we increase the number of hidden neurons, we increase the flexibility of the final model, so we expect it to fit the dataset better and reach higher accuracy on the training set (even if that accuracy is the result of overfitting).

When I tested different hidden layer sizes on the Gaussian dataset (one of the extra datasets available for further practice), I found something that seemed strange to me. The results are as follows.

[Image: decision boundaries for different hidden layer sizes on the Gaussian dataset]

As the image above shows, when the hidden layer is populated with 50 neurons, the model has less flexibility than when it has only 5 neurons.

I expected the 50-neuron case to produce a boundary with a lot of curvature in it, but it did not!

Anyway, I would be really grateful if someone could explain the probable reasons.

A test on the noisy moons dataset shows the same problem.
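For anyone who wants to poke at this themselves, here is a rough sketch of the setup I mean. It uses scikit-learn stand-ins (make_gaussian_quantiles for "the Gaussian dataset" and MLPClassifier for the one-hidden-layer model), so treat it as an approximation of the experiment, not the course's own code:

```python
# Rough stand-in for the experiment (assumed dataset and model,
# not the assignment's code).
from sklearn.datasets import make_gaussian_quantiles
from sklearn.neural_network import MLPClassifier

# Two-class, 2-D Gaussian dataset
X, y = make_gaussian_quantiles(n_samples=400, n_features=2,
                               n_classes=2, random_state=0)

# Train one-hidden-layer networks of increasing width and report
# training accuracy; the extra flexibility of the wider layer
# should show up as a higher training score.
for n_hidden in (1, 2, 5, 20, 50):
    clf = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                        max_iter=5000, random_state=0)
    clf.fit(X, y)
    print(f"{n_hidden:3d} hidden units: train accuracy = {clf.score(X, y):.3f}")
```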

Hi @saiman, nice work on going “off-piste” to see what you might learn!

I would be interested to see your cost curves (cost against iterations) for your training sessions. Different models, even if differentiated only by the size of the hidden layer, typically require adjustments to the optimization hyperparameters. In this case, we are pretty much limited to the learning rate.
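For example, here is a minimal sketch of that comparison, again using scikit-learn's MLPClassifier as a stand-in for the assignment's model (the dataset, learning rates, and iteration count are assumptions):

```python
# Sketch: train the same 50-unit model with several learning rates
# and plot cost against iterations (assumed setup, not the course code).
import matplotlib.pyplot as plt
from sklearn.datasets import make_gaussian_quantiles
from sklearn.neural_network import MLPClassifier

X, y = make_gaussian_quantiles(n_samples=400, n_features=2,
                               n_classes=2, random_state=0)

for lr in (0.001, 0.01, 0.1, 1.0):
    clf = MLPClassifier(hidden_layer_sizes=(50,), solver="sgd",
                        learning_rate_init=lr, max_iter=2000,
                        random_state=0)
    clf.fit(X, y)
    # loss_curve_ records the training cost at each iteration
    plt.plot(clf.loss_curve_, label=f"learning rate {lr}")

plt.xlabel("iteration")
plt.ylabel("training cost")
plt.legend()
plt.show()
```

Plotting the curves side by side makes the problem visible at a glance: a learning rate that is too small leaves the cost still falling when training stops, while one that is too large makes it oscillate or diverge.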

Stay tuned: a large part of the second course is dedicated to optimization algorithms.


Hi @kenb.

I am now in the 4th course, CNNs.
You are completely right: when I increased the learning rate, the 50 neurons showed me their flexibility (:slightly_smiling_face:)!

Thanks a lot for helping me!

[Image: decision boundary of the 50-neuron model after increasing the learning rate]

Nicely done, and thanks for sharing! A good experiment, and by the way, one that provides a cautionary tale on overfitting.
