Hey community!
Could somebody please explain why we need two Dense layers here instead of just one? And why were tanh and relu chosen as the activation functions? I tried to find the answer in the papers mentioned throughout the course, but couldn't.
```python
from tensorflow.keras.layers import Dense

densor1 = Dense(10, activation="tanh")  # hidden layer of the small attention network
densor2 = Dense(1, activation="relu")   # one unnormalized "energy" score per timestep
```
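
For context, here's roughly where these two layers sit in the notebook's `one_step_attention`. This is my sketch from memory, so the helper names (`repeator`, `activator`, `dotor`) and the sizes `Tx`, `n_a`, `n_s` are my reconstruction rather than quotes from the assignment:

```python
from tensorflow.keras.layers import (RepeatVector, Concatenate,
                                     Dense, Softmax, Dot)

Tx, n_a, n_s = 30, 32, 64                # assumed sizes, similar to the notebook's

repeator = RepeatVector(Tx)              # copy s_prev across all Tx input steps
concatenator = Concatenate(axis=-1)
densor1 = Dense(10, activation="tanh")   # same two layers as above
densor2 = Dense(1, activation="relu")
activator = Softmax(axis=1)              # normalize energies over the Tx axis
dotor = Dot(axes=1)                      # weighted sum of the encoder states

def one_step_attention(a, s_prev):
    """a: (m, Tx, 2*n_a) encoder states, s_prev: (m, n_s) decoder state."""
    s_prev = repeator(s_prev)             # (m, Tx, n_s)
    concat = concatenator([a, s_prev])    # (m, Tx, 2*n_a + n_s)
    e = densor1(concat)                   # (m, Tx, 10)
    energies = densor2(e)                 # (m, Tx, 1)
    alphas = activator(energies)          # attention weights, sum to 1 over Tx
    return dotor([alphas, a])             # context vector, (m, 1, 2*n_a)
```

If I'm reading it right, `densor2` produces one unnormalized "energy" per input timestep and the softmax turns those into the attention weights `alphas`, so my two questions are really about why this little energy network is two layers deep and why those particular activations were picked for it.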