Why do you need Non-Linear Activation Functions?

Excuse me for interrupting, but isn't the ReLU function a linear function?
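It's a fair question, since ReLU is a straight line on each side of zero. But a linear function must satisfy f(a + b) = f(a) + f(b) for all inputs, and ReLU breaks that property the moment a negative value is involved. A minimal sketch (the `relu` helper here is just an illustrative definition, not from any particular library):

```python
def relu(x):
    # ReLU: identity for positive inputs, zero for negative inputs.
    return max(0.0, x)

a, b = 2.0, -3.0

# For a truly linear f, these two values would be equal.
print(relu(a) + relu(b))  # 2.0
print(relu(a + b))        # 0.0 -> additivity fails, so ReLU is non-linear
```

Because the kink at zero breaks additivity, ReLU is only *piecewise* linear, which is exactly what lets stacked layers model non-linear functions.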