Is this vectorized implementation correct?

I am solving this assignment using vectorization, and I was wondering whether the implementation below is correct. I am getting the error "Wrong output. Expected: 2.15510667, got: 1.8975454646678849".

def compute_cost(X, y, w, b, lambda_= 1):
    """
    Computes the cost over all examples
    Args:
      X : (ndarray Shape (m,n)) data, m examples by n features
      y : (array_like Shape (m,)) target value 
      w : (array_like Shape (n,)) Values of the parameters of the model
      b : (scalar)              Value of the bias parameter of the model
      lambda_: unused placeholder
    Returns:
      total_cost: (scalar)         cost 
    """

    m, n = X.shape
    
    ### START CODE HERE ###

    # code removed
    
    ### END CODE HERE ### 

    return total_cost

It seems I found the problem: I was applying np.sum() twice, once when calculating z and once when calculating total_cost.

The first np.sum() isn't necessary, since np.dot() already computes the sum of the products.
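
To illustrate with a toy sketch (made-up values, not the assignment's cost code): np.dot(x, w) already returns the summed element-wise products, so adding an extra np.sum() around a matrix-vector product collapses the per-example values into a single scalar, which is where a wrong cost like the one above can come from.

import numpy as np

# Toy data, purely illustrative -- not the assignment's values.
x_i = np.array([1.0, 2.0, 3.0])    # one example with n = 3 features
w = np.array([0.5, -1.0, 2.0])     # parameter vector
b = 0.25

# np.dot already sums the products, so no extra np.sum() is needed:
z_dot = np.dot(x_i, w) + b         # 0.5 - 2.0 + 6.0 + 0.25 = 4.75
z_manual = np.sum(x_i * w) + b     # same value, written out explicitly
assert np.isclose(z_dot, z_manual)

# Summing the matrix-vector product collapses the per-example z values
# into one scalar, which then produces a wrong total cost:
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
z_per_example = X @ w + b          # shape (m,): one z per example -> correct
z_collapsed = np.sum(X @ w) + b    # single scalar -> the bug described above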

I have edited your message to remove the code. Please don’t post your code for the programming assignments on the Forum.