But the point is that the loss function (log loss) is tied to the sigmoid activation in that it expects output values between 0 and 1, i.e., probabilities. In other words, you can’t just arbitrarily change the output activation by itself: you need to adjust the loss function to match. So what loss function would you use if tanh, with its range of (-1, 1), is your output activation?
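One minimal answer, as a sketch: since (-1, 1) is just a scaled and shifted version of (0, 1), you can map the tanh output to a probability with p = (t + 1) / 2 and then apply the usual log loss to p. The function name `tanh_log_loss` below is mine, purely for illustration:

```python
import numpy as np

def tanh_log_loss(y_true, t, eps=1e-12):
    """Log loss for a tanh output t in (-1, 1).

    y_true: labels in {0, 1}
    t: tanh activations, same shape as y_true
    """
    # Map the tanh output from (-1, 1) to a probability in (0, 1)
    p = (t + 1.0) / 2.0
    # Clip to avoid log(0)
    p = np.clip(p, eps, 1.0 - eps)
    # Standard binary cross-entropy on the rescaled output
    return -np.mean(y_true * np.log(p) + (1.0 - y_true) * np.log(1.0 - p))

# A confident correct prediction gives a small loss,
# the same prediction against the wrong label a large one.
print(tanh_log_loss(np.array([1.0]), np.array([0.9])))  # ~0.051
print(tanh_log_loss(np.array([0.0]), np.array([0.9])))  # ~2.996
```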
BTW it turns out you can scale and shift tanh so that it matches sigmoid exactly: sigmoid(x) = (tanh(x/2) + 1) / 2 (note the input is scaled too, not just the output). They are very closely related mathematically.
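A quick numerical sanity check of that identity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-5, 5, 11)
# sigmoid(x) == (tanh(x/2) + 1) / 2, up to floating-point error
print(np.allclose(sigmoid(x), (np.tanh(x / 2) + 1) / 2))  # True
```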