How does regularization work on a layer with the “relu” activation in a neural network?

Improving Deep Neural Networks: Hyperparameter Tuning, Week 1, Regularization
I'm getting an error in my personal project: the ReLU activation is affecting the cost function. Any help would be appreciated.
Regularization.py (6.2 KB)
TRY_MODEL.ipynb (8.5 KB)

I need help. Thank you.

ReLU zeroes out negative pre-activations, not the weights themselves, so the weights in a ReLU layer can still be positive or negative. L2 regularization penalizes the magnitude of all of those weights, pushing them toward zero regardless of sign, which in turn keeps the activations and the cost from growing too large.
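To make that concrete, here is a minimal sketch (not your attached code) of how L2 regularization enters the cost for a network with a ReLU hidden layer. The parameter layout (`W1`, `b1`, ...), the helper names, and `lambd = 0.7` are assumptions chosen for illustration, roughly following the course's notation:

```python
import numpy as np

def relu(z):
    # ReLU zeroes negative pre-activations; it never touches the weights.
    return np.maximum(0, z)

def compute_cost_with_l2(AL, Y, parameters, lambd):
    """Binary cross-entropy cost plus an L2 penalty on every weight matrix.

    AL: predictions from the output layer, shape (1, m)
    Y:  true labels, shape (1, m)
    parameters: dict with keys "W1", "b1", "W2", ... (assumed layout)
    lambd: L2 regularization strength
    """
    m = Y.shape[1]
    eps = 1e-8  # small constant for numerical safety in the logs
    cross_entropy = -np.mean(Y * np.log(AL + eps) + (1 - Y) * np.log(1 - AL + eps))

    # L2 term: (lambda / 2m) * sum of squared entries of every W, whatever the sign.
    l2 = sum(np.sum(np.square(W)) for name, W in parameters.items() if name.startswith("W"))
    return cross_entropy + (lambd / (2 * m)) * l2

# In backprop, the same penalty adds a "weight decay" term to each gradient:
#   dW_l = (gradient from cross-entropy) + (lambd / m) * W_l
# so both positive and negative weights are pulled toward zero.

if __name__ == "__main__":
    # Tiny example with made-up shapes: 3 inputs, 4 ReLU units, sigmoid output, 5 samples.
    rng = np.random.default_rng(0)
    params = {"W1": rng.standard_normal((4, 3)), "b1": np.zeros((4, 1)),
              "W2": rng.standard_normal((1, 4)), "b2": np.zeros((1, 1))}
    X = rng.standard_normal((3, 5))
    A1 = relu(params["W1"] @ X + params["b1"])
    AL = 1 / (1 + np.exp(-(params["W2"] @ A1 + params["b2"])))  # sigmoid output layer
    Y = np.array([[1, 0, 1, 1, 0]])
    print(compute_cost_with_l2(AL, Y, params, lambd=0.7))
```

The key point is that the penalty is applied to the weight matrices in the cost and the gradients, independently of which activation the layer uses; ReLU only changes how the activations flow forward.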

Please can you explain this for me? And if possible, kindly run the Python files and make recommendations on the errors. Thank you.