Tanh Activation Function

Hi Mentor,

Will the content below be covered in more detail in the upcoming course?

The mean activation coming out of a hidden layer will be close to zero; this has the effect of centering the data so that its mean is zero, which makes learning in the next layer a little bit easier.

This explanation is an aside that won't be developed further. You will understand better, step by step, as you gain experience, why centering the values in the nodes of the hidden layers helps. A similar philosophy appears with the Batch Normalization layers covered in the second course :slight_smile:
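
Not part of the course material, but a quick numerical sketch may make the quoted point concrete. The snippet below (plain NumPy; all variable names are hypothetical) feeds the same zero-mean pre-activations through tanh and through sigmoid: tanh's output mean stays near zero, while sigmoid's sits near 0.5, so tanh keeps the inputs to the next layer roughly centered.

```python
# Minimal sketch: tanh activations are roughly zero-centered, sigmoid's are not.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hidden-layer pre-activations: zero-mean inputs times small random weights.
X = rng.standard_normal((1000, 64))        # 1000 examples, 64 input features
W = rng.standard_normal((64, 32)) * 0.1    # one hidden layer of 32 units
Z = X @ W                                  # pre-activations, roughly zero-mean

tanh_out = np.tanh(Z)                      # tanh maps to (-1, 1), symmetric about 0
sigmoid_out = 1.0 / (1.0 + np.exp(-Z))     # sigmoid maps to (0, 1), mean near 0.5

print(f"mean of tanh activations:    {tanh_out.mean():+.4f}")  # close to  0.0
print(f"mean of sigmoid activations: {sigmoid_out.mean():+.4f}")  # close to +0.5
```

Running this shows the tanh mean within a few hundredths of zero and the sigmoid mean near 0.5, which is exactly the centering effect the lecture comment refers to.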