Course 4, Week 2: 1x1 convolution

Prof. Ng said that ReLU is applied during a 1x1 convolution. My doubt is whether the ReLU is applied explicitly, or whether the 1x1 convolution by its nature implicitly applies it; if so, how does that happen?

There is no such thing as an activation function being “implicit”. You have to specify which activation to use at each layer: it is a choice that you make, regardless of whether the particular convolution layer is 1 x 1 or not.
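To make that concrete, here is a minimal sketch in TensorFlow/Keras (the framework used in the course); the shapes are just illustrative, borrowed from the lecture's 28 x 28 x 192 example. The ReLU shows up only because it is requested explicitly, either via the `activation` argument or as a separate `Activation` layer; the 1x1 convolution itself is a purely linear operation.

```python
import tensorflow as tf

# Illustrative input: a 28x28x192 activation volume
inputs = tf.keras.Input(shape=(28, 28, 192))

# 1x1 convolution with ReLU specified explicitly as an argument
x = tf.keras.layers.Conv2D(filters=32, kernel_size=1, activation='relu')(inputs)

# Equivalent: a linear 1x1 convolution followed by an explicit ReLU layer
y = tf.keras.layers.Conv2D(filters=32, kernel_size=1)(inputs)
y = tf.keras.layers.Activation('relu')(y)

model = tf.keras.Model(inputs=inputs, outputs=[x, y])
model.summary()
```

If you omit the `activation` argument and don't add an `Activation` layer, the 1x1 convolution stays linear; nothing about the 1x1 kernel size adds a ReLU for you.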