Hi @Anas_Al_Zabiby, this is an interesting query, and here's a thread you can go through to get an idea of why (or why not) to use ReLU as the activation function for the output layer.
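For some quick intuition in the meantime, here is a minimal NumPy sketch (my own illustration, not from the course materials) of the main caveat: a ReLU output layer clips every negative pre-activation to zero, so the network can never predict negative targets, and the gradient through those clipped units is zero.

```python
import numpy as np

def relu(z):
    """Element-wise ReLU: max(0, z)."""
    return np.maximum(0.0, z)

# Hypothetical pre-activations from the final linear layer
z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])

# A linear output layer can produce any real value
linear_out = z

# A ReLU output layer clips all negatives to 0, so negative
# targets are unreachable and those units get zero gradient
relu_out = relu(z)

print(linear_out)  # [-2.  -0.5  0.   1.5  3. ]
print(relu_out)    # [0.  0.  0.  1.5  3. ]
```

So ReLU at the output can make sense when the target is known to be non-negative (e.g. counts or prices), but for general regression a linear output layer is the usual choice.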
Also, @paulinpaloalto sir and the other mentors may be able to offer other practical perspectives on this query. Thanks!