Hello! I hope you are doing well.
I’ve completed all the courses of MLS and the first course of DLS. Now I am building my own NN model from scratch, using random data. However, I am facing some issues (attached picture). I am using ReLU for both the hidden and output layers (the input and output are both numerical values). Kindly guide me on how to resolve this; I will be highly indebted to you.

About the data:
X shape = (20, 1)
y shape = (20, 1)
n_x = X.shape[0] = 20
n_h = 7 (units/neurons in the hidden layer)
n_y = y.shape[0] = 20
L = len(parameters) // 2 = 2

It seems a division by zero is the problem, and it is probably happening in the A variable, which as far as I remember is the activation (A = g(WX + b), right?). You need to look at where exactly that division by zero occurs and fix it. I am guessing you might be initializing the weights all to zero; it’s better to initialize them randomly. Also make sure the cost computation formula is right!

A is np.maximum(0, Z), while Z is np.dot(W, X) + b.
I cannot figure out where that division by zero is. I initialized the weights randomly, multiplying by 0.01.
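For reference, here is a minimal sketch of that initialization and forward pass. One thing worth double-checking: in the DLS convention X is (n_x, m), features by examples, so 20 examples of one numeric feature would be X.shape == (1, 20) with n_x = 1 — an X of shape (20, 1) would instead mean 20 features and a single example. The shapes and the linear output below are assumptions, not your exact code:

```python
import numpy as np

np.random.seed(1)

# DLS convention: X is (n_x, m) -- features x examples.
# 20 examples of a single numeric feature -> shape (1, 20).
m = 20
X = np.random.randn(1, m)
Y = np.random.randn(1, m)

n_x, n_h, n_y = 1, 7, 1

# Random initialization (all-zero weights would make every
# hidden unit identical and can stall learning entirely).
W1 = np.random.randn(n_h, n_x) * 0.01
b1 = np.zeros((n_h, 1))
W2 = np.random.randn(n_y, n_h) * 0.01
b2 = np.zeros((n_y, 1))

# Forward pass
Z1 = np.dot(W1, X) + b1
A1 = np.maximum(0, Z1)   # ReLU hidden layer
Z2 = np.dot(W2, A1) + b2
A2 = Z2                  # linear output is the usual choice for regression

print(A1.shape, A2.shape)  # (7, 20) (1, 20)
```

Note the output layer here is linear rather than ReLU; for a numerical regression target that is the more common setup, and it also avoids A2 getting stuck at exactly 0.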

Actually, I copied all the functions from the DLS_C1_W4 assignments and changed the input and output, with a few small modifications. But I am still struggling to find my mistake(s).

Well, either the argument of np.log(AL) or of np.log(1 - AL) is becoming zero or negative, so trace why that happens and prevent it. I don’t remember the exact notation right now, but that’s where to start.
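One common guard, assuming the cost is the usual cross-entropy −(1/m)·Σ[Y·log(AL) + (1−Y)·log(1−AL)]: clip AL away from exactly 0 and 1 before taking logs, so np.log never sees an invalid argument. (Though if the target is a numerical regression value, a squared-error cost avoids the log entirely.) A sketch:

```python
import numpy as np

def cross_entropy_cost(AL, Y, eps=1e-8):
    """Cross-entropy cost with AL clipped into (eps, 1 - eps)
    so np.log(AL) and np.log(1 - AL) stay finite."""
    m = Y.shape[1]
    AL = np.clip(AL, eps, 1 - eps)
    cost = -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m
    return float(cost)

# Values that would break the raw formula (log(0) and division issues):
AL = np.array([[0.0, 0.5, 1.0]])
Y  = np.array([[0.0, 1.0, 1.0]])
print(cross_entropy_cost(AL, Y))  # finite, no divide-by-zero warning
```

With ReLU in the network, AL hitting exactly 0 is very easy, which is one plausible source of the divide-by-zero warning.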

My first NN model performs poorly and is still giving one error (attached picture). After several hours of debugging, I realized that maybe I am using the wrong derivative for A2 (the output). I copied this derivative from the DLS assignment, which was for logistic regression: dA2 = -(np.divide(Y, A2) - np.divide(1 - Y, 1 - A2))
But my own model is doing linear regression. Do I need to change dA2? If yes, what should it be? Sorry to bother you again.
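For a regression output, a common choice is the mean-squared-error cost J = (1/2m)·Σ(A2 − Y)², whose gradient with respect to A2 is simply (A2 − Y)/m — no logs or divisions, so none of the earlier numerical issues arise. A minimal sketch, assuming a linear output layer (A2 = Z2, so this gradient is also dZ2):

```python
import numpy as np

def mse_cost(A2, Y):
    """Mean-squared-error cost for a linear (regression) output."""
    m = Y.shape[1]
    return float(np.sum((A2 - Y) ** 2) / (2 * m))

def mse_backward(A2, Y):
    """dJ/dA2 for the cost above. With a linear output A2 = Z2,
    this is also dJ/dZ2, so no activation derivative is needed."""
    m = Y.shape[1]
    return (A2 - Y) / m

A2 = np.array([[1.0, 2.0, 3.0]])
Y  = np.array([[1.0, 1.0, 1.0]])
print(mse_cost(A2, Y))      # ~0.833
print(mse_backward(A2, Y))  # [[0.  0.333...  0.667...]]
```

The logistic-regression dA2 you copied only makes sense when A2 is a probability in (0, 1); with a numerical target it both gives the wrong gradient and divides by quantities that can be zero.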