Help implementing my own deep learning model

Hi.
I am trying to implement my own neural network model using NumPy, pandas, and a CSV dataset.
Now I'm running into these problems:
When I train my model without changing anything, sometimes it works well and the accuracy is OK, but sometimes it isn't; it switches between these two states, and sometimes the predictions become NaN. I tried various hyperparameters, but the results stay the same.
Sometimes the cost decreases fine during training, and sometimes, with those same hyperparameters, it stays almost constant for every batch.

I would appreciate it if you'd take a look :pray:

I haven’t had a chance yet to look at your model, but up front, what do you mean by ‘sometimes’? Are you changing your training set/hyperparameters each time (which, if you want consistency at least, you obviously shouldn’t)?

For each set of hyperparameters I chose, I got different results across runs without changing the hyperparameters.

General note:
Neural networks do not have convex cost functions - so there can be local minima.

But more likely, you could be making better choices about the weight initialization, learning rate, or feature normalization.
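
For example, a minimal sketch of what I mean by better weight initialization (plain NumPy; the function and variable names are just for illustration, not from your code):

import numpy as np

rng = np.random.default_rng(0)  # a fixed seed also makes runs repeatable

def init_layer(n_in, n_out):
    # Xavier/Glorot-style initialization keeps activations in a reasonable
    # range instead of pushing the sigmoid into its flat, saturated region
    limit = np.sqrt(6.0 / (n_in + n_out))
    W = rng.uniform(-limit, limit, size=(n_in, n_out))
    b = np.zeros((1, n_out))
    return W, b

Note that random initialization is also why the same hyperparameters can give different results on different runs: each run starts from a different point. Fixing the seed makes the behaviour reproducible.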


I think the problem comes from the sigmoid function, as it sometimes outputs NaN. You can add an epsilon to it to avoid a lot of problems like that:

import numpy as np

def sigmoid(z):
    epsilon = np.finfo(z.dtype).eps  # machine epsilon for z's dtype
    return 1 / (1 + np.exp(-z + epsilon))

I will leave my own implementation here to help you understand what I mean.

Thank you :pray::pray:

Thank you :pray::pray:
I tried your approach and chose better hyperparameters, and now it is a bit more stable. But now it generally overfits, so I need to implement regularization to see if it gets better.
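
The kind of L2 regularization I have in mind would look roughly like this (just a sketch; lambda_ and the list of weight matrices are placeholders for whatever the model actually uses):

import numpy as np

lambda_ = 0.01  # regularization strength, another hyperparameter to tune

def l2_penalty(weight_matrices, m):
    # added to the cost: (lambda / (2m)) * sum of all squared weights
    return (lambda_ / (2 * m)) * sum(np.sum(W ** 2) for W in weight_matrices)

def l2_gradient(W, m):
    # added to dW for the corresponding layer during backprop
    return (lambda_ / m) * W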

Look at the numerical stability of all the formulas in your implementation, especially the sigmoid- and log-related ones. I have had this problem of getting NaN outputs because of it.
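
For example, two common guards (just a sketch, assuming a binary cross-entropy cost; adapt it to your own code):

import numpy as np

def stable_sigmoid(z):
    # clipping z keeps np.exp from overflowing (exp(500) is still finite in float64)
    z = np.clip(z, -500, 500)
    return 1.0 / (1.0 + np.exp(-z))

def binary_cross_entropy(y_true, y_pred):
    # clipping predictions away from exact 0 and 1 keeps np.log finite,
    # which is the usual source of nan in the cost
    eps = 1e-12
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))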


If you normalize the features of the data set, you generally will not have numerical issues with sigmoid or log().
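
For example, a minimal z-score normalization sketch (assuming X is a NumPy array of training features; reuse the same mu and sigma on the test set):

import numpy as np

def normalize(X):
    # each column ends up with mean ~0 and std ~1, so z = X @ W + b stays
    # in a range where sigmoid and log behave well
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0] = 1.0  # guard against constant columns
    return (X - mu) / sigma, mu, sigma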
