The most common issue is a for-loop with incorrect indentation.
Python uses indentation to delimit code blocks, so a statement indented one level too deep runs on every iteration instead of once.
For example, if you're computing f_wb by iterating over the features, be careful to add 'b' only once, after the loop rather than inside it.
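A minimal sketch of that prediction loop, assuming the usual f_wb = w·x + b setup (the function and variable names here are illustrative, not the assignment's exact code):

```python
import numpy as np

def predict(x, w, b):
    """Compute f_wb = w . x + b for a single example."""
    n = x.shape[0]
    f_wb = 0.0
    for j in range(n):       # loop body: indented under the for
        f_wb += w[j] * x[j]  # accumulate w_j * x_j each iteration
    f_wb += b                # b added ONCE, outside the loop
    return f_wb

x = np.array([1.0, 2.0])
w = np.array([0.5, 0.5])
print(predict(x, w, 1.0))  # 2.5
```

If `f_wb += b` were indented one level deeper, b would be added n times and the prediction would be wrong.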
The same indentation care applies when computing the bias gradient and the weight gradients.
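Here is a hedged sketch of the gradient computation for a squared-error cost, showing where each statement belongs relative to the loops (names like `compute_gradient`, `dj_dw`, `dj_db` are assumptions for illustration):

```python
import numpy as np

def compute_gradient(X, y, w, b):
    """Gradients of the mean squared-error cost w.r.t. w and b."""
    m, n = X.shape
    dj_dw = np.zeros(n)
    dj_db = 0.0
    for i in range(m):                      # outer loop: training examples
        err = (np.dot(X[i], w) + b) - y[i]  # f_wb(x_i) - y_i
        for j in range(n):                  # inner loop: features
            dj_dw[j] += err * X[i, j]
        dj_db += err                        # once per example, inside the outer loop
    dj_dw /= m                              # divide by m ONCE, after both loops
    dj_db /= m
    return dj_dw, dj_db
```

Common indentation bugs here: updating `dj_db` inside the inner feature loop (it gets added n times per example), or dividing by m inside the outer loop (it gets divided m times).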
Also check your training data before running: different gradient values are often caused by a changed input data set (x_train, y_train).
The expected values are only reproduced if you load the actual training data set; supplying your own data will give different results.