Week 3: ReLU vs tanh

Hi all
I tested both activation functions.
I expected ReLU to run faster, but the two turned out to be almost identical in terms of efficiency.
I should note that I used np.where for ReLU and its derivative implementation.
Is that approach okay?
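
In case it helps, here is a minimal sketch of what an np.where-based ReLU and its derivative could look like (the function names are illustrative, not taken from the actual notebook):

```python
import numpy as np

def relu(x):
    # Elementwise ReLU: keep x where it is positive, otherwise 0
    return np.where(x > 0, x, 0.0)

def relu_derivative(x):
    # Gradient is 1 for x > 0 and 0 for x < 0; at x == 0 the derivative
    # is undefined, and returning 0 there is a common convention
    return np.where(x > 0, 1.0, 0.0)
```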

What method did you use to measure the “efficiency”?

Hi
Thanks for your reply.
I thought runtime would be an appropriate measure.
I would be keen to know if there is a better method for measuring efficiency, but for reference, a timing setup like the sketch below is what I had in mind.
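
A minimal sketch of how such a runtime comparison might be set up with timeit (the array shape and repeat counts here are illustrative assumptions, not from the notebook):

```python
import timeit
import numpy as np

x = np.random.randn(1000, 1000)

# timeit.repeat runs each snippet several times; taking the minimum
# of the repeats damps out transient background load
relu_times = timeit.repeat(lambda: np.where(x > 0, x, 0.0), number=100, repeat=5)
tanh_times = timeit.repeat(lambda: np.tanh(x), number=100, repeat=5)

print(f"ReLU: {min(relu_times):.3f}s  tanh: {min(tanh_times):.3f}s  (per 100 calls)")
```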

Since the notebook executes in the cloud, you can't really measure the runtime accurately: the machine is shared, so its load, and therefore your timings, will vary from run to run.