Wiggly convergence of loss function

Hi all, to test my knowledge from this week’s lecture, I applied an MLP neural network to another dataset (acquired from Kaggle). It is a regression problem: predicting material strength from 8 features, with 1030 samples in total. I used TensorFlow to define 5 dense layers (4 hidden layers with ReLU and 1 output layer with a linear activation), used Adam as the optimizer and MAE as the loss function, and trained for 150 epochs. I plotted the loss during training; the graph is shown below.

[Figure: training loss vs. epoch, decreasing overall but wiggly]
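For context, my setup looks roughly like this. It's a minimal sketch: the layer widths are placeholders (I haven't listed my exact values), and the random arrays just stand in for the Kaggle data so the snippet runs on its own:

```python
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# Dummy data standing in for the Kaggle dataset (1030 samples, 8 features).
rng = np.random.default_rng(0)
x_train = rng.random((1030, 8)).astype("float32")
y_train = rng.random((1030, 1)).astype("float32")

# 4 hidden ReLU layers + 1 linear output; the widths here are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="linear"),
])
model.compile(optimizer="adam", loss="mae")

history = model.fit(x_train, y_train, epochs=150, verbose=0)

# Plot the per-epoch training loss.
plt.plot(history.history["loss"])
plt.xlabel("epoch")
plt.ylabel("MAE")
plt.show()
```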

My questions are:

  1. Is this wiggly convergence plot OK?
  2. How do I make this graph smoother? What parameters should I change?
Reply:

  1. Yes. Some wiggle is expected: with stochastic mini-batch training, each update is computed on a different random batch, so the loss won't decrease perfectly monotonically.
  2. You don’t really need to make it smoother. If you only want a smoother-looking plot, see the sketch below.
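If it's just about presentation, you can smooth the logged values for display rather than changing the training itself. A minimal sketch, assuming the per-epoch losses are in `history.history["loss"]` from your `fit` call (the window size of 5 is arbitrary):

```python
import numpy as np
import matplotlib.pyplot as plt

# Per-epoch losses from model.fit(...) above.
losses = np.asarray(history.history["loss"])

# Simple moving average, purely for display; window size is arbitrary.
window = 5
smoothed = np.convolve(losses, np.ones(window) / window, mode="valid")

plt.plot(losses, alpha=0.3, label="raw")
plt.plot(np.arange(window - 1, len(losses)), smoothed, label="smoothed")
plt.xlabel("epoch")
plt.ylabel("MAE")
plt.legend()
plt.show()
```

This only changes the plot, not the optimization; the underlying training curve stays exactly as wiggly as before.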

Bonus question: Did you really need five layers? Did you try fewer layers? If so, what were the results?