Can the sigmoid function ever be equal to 1 or 0?

Can someone please explain, in both theoretical and practical terms, whether and when this can happen, and how it would affect the logistic loss function in logistic regression?

There are already several great articles on the matter. One of them is

https://kharshit.github.io/blog/2018/04/20/don't-use-sigmoid-neural-nets

TL;DR: The sigmoid activation function can saturate and kill gradients, so learning slows drastically or stops altogether.
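
To address the question directly: mathematically, $\sigma(z) = 1/(1 + e^{-z})$ only approaches 0 and 1 asymptotically, so it is strictly inside $(0, 1)$ for every finite $z$. In floating-point arithmetic, however, it does hit the endpoints exactly: in float64 it rounds to 1.0 once $z$ is around 37, and a naive $1/(1+e^{-z})$ returns exactly 0.0 once $e^{-z}$ overflows (roughly $z < -709$). At those points $\log p$ in the logistic loss becomes infinite and the gradient $\sigma(z)(1-\sigma(z))$ is exactly zero. Below is a minimal NumPy sketch of this; the specific thresholds assume float64 and the naive formula, and the exact cutoffs can differ with a numerically stable implementation:

```python
import numpy as np

def sigmoid(z):
    """Plain logistic sigmoid, 1 / (1 + exp(-z))."""
    # np.exp overflows for z < ~-709 in float64, which is exactly
    # what pushes the result to 0.0 below; suppress the warning here.
    with np.errstate(over="ignore"):
        return 1.0 / (1.0 + np.exp(-z))

z = np.array([0.0, 10.0, 37.0, -37.0, -800.0])
p = sigmoid(z)
print(p)            # [0.5, 0.99995..., 1.0, 8.5e-17, 0.0]
print(p == 1.0)     # True at z = 37: exp(-37) < eps/2, so 1 + exp(-37) rounds to 1
print(p == 0.0)     # True at z = -800: exp(800) overflows to inf, 1/inf = 0

# Effect on the logistic loss -[y*log(p) + (1-y)*log(1-p)]:
y = np.array([1.0, 1.0, 0.0, 1.0, 1.0])
with np.errstate(divide="ignore"):
    loss = -(y * np.log(p) + (1 - y) * np.log(1 - p))
print(loss)         # inf wherever the model saturates on the wrong side
                    # (p == 1 with y == 0, or p == 0 with y == 1)

# The sigmoid gradient sigma(z) * (1 - sigma(z)) is exactly 0 at the
# saturated points, so backprop passes no signal through them.
print(p * (1 - p))

# Common mitigation: clip probabilities before taking logs, or compute the
# loss directly from logits (log-sum-exp form), as most libraries do.
eps = 1e-12
p_clipped = np.clip(p, eps, 1.0 - eps)
print(-(y * np.log(p_clipped) + (1 - y) * np.log(1 - p_clipped)))  # finite
```

Clipping keeps the loss finite, but it does not undo the saturation itself: once the pre-activation is far into the flat region, the gradient is still effectively zero, which is the "killed gradients" problem the linked post describes.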