What activation function can I use for the hard-thresholding operator?

Let x be an n-dimensional vector. The hard-thresholding operator L_k(x) returns the vector that retains the k largest-magnitude elements of x and sets the remaining entries to zero.

For example, if x = (1, 2, 4, -7, 8, 3, 11), then L_3(x) = (0, 0, 0, -7, 8, 0, 11).
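For reference, here is a minimal sketch of the operator L_k as defined above, using NumPy's `argpartition` to find the k largest-magnitude entries (the function name `hard_threshold` is my own, not from the question):

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x; zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]  # indices of the k largest |x_i|
    out[idx] = x[idx]
    return out

x = np.array([1, 2, 4, -7, 8, 3, 11])
print(hard_threshold(x, 3))  # [ 0  0  0 -7  8  0 11]
```

Note that the selection here is data-dependent (top-k), not a fixed cutoff.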

My question is: what activation function can I use to implement this operator in a deep neural network?

That specific example looks like taking the absolute value and then applying ReLU with a bias of -5.5: ReLU(|x| - 5.5) is positive exactly for the three largest-magnitude entries. Note, though, that this uses a fixed threshold rather than a data-dependent top-k, and ReLU(|x| - 5.5) itself gives shifted magnitudes rather than the original signed values, so the sign and value must be restored separately.