Is "activation function" a jargon term?

What does it mean to activate a neuron's output? My English is not that good.

I think it's just a mapping function, but not a simple map_range function like the one below; it must be something more special.

def map_range(value, start1, end1, start2, end2):
    """
    Maps a value from one range to another range using linear interpolation.

    Parameters:
    value (float): The value to be mapped.
    start1 (float): The start of the original range.
    end1 (float): The end of the original range.
    start2 (float): The start of the target range.
    end2 (float): The end of the target range.

    Returns:
    float: The mapped value in the target range.
    """
    return start2 + (end2 - start2) * ((value - start1) / (end1 - start1))

An activation function is the shaping or modifier function you apply to the value computed in a unit; sigmoid() and ReLU() are two common examples. First you compute the pre-activation (w*x + b), then you apply the activation to that result.
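As a minimal sketch of those two steps (the function names here are illustrative, not from any particular library):

```python
import math

def sigmoid(z):
    # Squash any real number into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    # Pass positive values through unchanged, clamp negatives to zero
    return max(0.0, z)

def unit_output(w, x, b, activation):
    # Step 1: the linear pre-activation
    z = w * x + b
    # Step 2: shape it with the activation function
    return activation(z)

print(unit_output(0.5, 2.0, -1.0, sigmoid))  # z = 0.0, sigmoid(0.0) = 0.5
print(unit_output(0.5, 2.0, -1.0, relu))     # z = 0.0, relu(0.0) = 0.0
```

Without the second step the unit is purely linear, which is why the activation is described as a shaping function applied on top.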

What does your map_range() function have to do with activations? Can you give more context?

Basically my question is: apart from mapping the input to a certain range (which is a property of the activation function), is there anything else that an activation function does?

For example, a property of sigmoid is that any real number is squeezed into the range (0, 1).
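That squeezing property can be checked directly (a quick sketch; inputs are kept modest because `math.exp` overflows for very large arguments):

```python
import math

def sigmoid(z):
    # Maps any real z into (0, 1): large negative z -> near 0, large positive z -> near 1
    return 1.0 / (1.0 + math.exp(-z))

for z in [-30.0, -5.0, 0.0, 5.0, 30.0]:
    s = sigmoid(z)
    assert 0.0 < s < 1.0  # strictly inside the open interval
    print(f"sigmoid({z:6.1f}) = {s:.10f}")
```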

Activation functions are also chosen so that their derivatives are easy to compute. This makes evaluating gradients efficient, which helps gradient descent run well.

That makes sense. Is this one of the reasons why non-differentiable activation functions (discontinuous ones, like the step function) are no longer used?