Hello @NeelD1999,
Welcome and good job working on the logistic regression problem. You are almost there!
If you check the comments for this particular task, you get the advice to use np.dot for the cost computation (# compute cost using np.dot. Don't use loops for the sum.). However, you can also use np.sum, as long as you make sure the resulting shape matches the shape you would get from np.dot for this task.
I will give you an example that hopefully helps you understand the difference between np.sum and np.dot for this task:
import numpy as np

v = np.array([[1.],[2.],[3.]])    # column vector of shape (3, 1)
print(v)

cost = np.sum(v)                  # plain np.sum collapses everything to a 0-d scalar
print("cost:", cost)
print("cost.dtype:", cost.dtype)
print("cost.shape:", cost.shape)

cost2 = np.sum(v, keepdims=True)  # keepdims=True preserves the (1, 1) shape
print("cost2:", cost2)
print("cost2.dtype:", cost2.dtype)
print("cost2.shape:", cost2.shape)

ones = np.array([[1],[1],[1]])
cost3 = np.dot(ones.T, v)         # dotting a row of ones with v also sums it, result is (1, 1)
print("cost3:", cost3)
print("cost3.dtype:", cost3.dtype)
print("cost3.shape:", cost3.shape)
Output:
[[1.]
 [2.]
 [3.]]
cost: 6.0
cost.dtype: float64
cost.shape: ()
cost2: [[6.]]
cost2.dtype: float64
cost2.shape: (1, 1)
cost3: [[6.]]
cost3.dtype: float64
cost3.shape: (1, 1)
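As a side note, if you ever need to go back from the (1, 1) array to a plain scalar, np.squeeze does exactly that (continuing the example above):

print(np.squeeze(cost3))        # 6.0
print(np.squeeze(cost3).shape)  # ()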
You can read more about np.sum on numpy.sum — NumPy v1.20 Manual.
Try changing the cost calculation so that it uses only np.dot operations. It is a good exercise in linear algebra.
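To give you an idea of the direction (this is only a minimal sketch; the names A for the activations of shape (1, m) and Y for the labels of shape (1, m) are my assumptions here, not necessarily the exact variables in your notebook):

import numpy as np

# Hypothetical toy values: A are sigmoid activations, Y the true labels, both shape (1, m).
A = np.array([[0.9, 0.2, 0.8]])
Y = np.array([[1., 0., 1.]])
m = Y.shape[1]

# Cross-entropy cost written with np.dot instead of np.sum; the result has shape (1, 1).
cost = -(np.dot(Y, np.log(A).T) + np.dot(1 - Y, np.log(1 - A).T)) / m
print("cost:", cost, "shape:", cost.shape)

The row-vector-times-column-vector product plays the same role as the row of ones in the example above: it sums the element-wise terms for you, so no loop and no np.sum is needed.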
Please let me know if you have any additional questions.