C1W1 Assignment - gradientDescent error

I’m getting an error for this line:
theta = theta - (alpha/m) * (x_t * (h - y))

error:
ValueError: operands could not be broadcast together with shapes (3,10) (10,1)

I transposed x according to the instructions and am having a hard time finding my error.

The multiplication operator works element-wise. It (usually) requires that the two operands have the same shape.

Try using np.dot() instead.
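To illustrate the difference (using made-up arrays with the same shapes as in the error message): `*` tries to broadcast the two operands element-wise and fails on (3,10) vs (10,1), while `np.dot()` contracts the inner dimension and returns the (3,1) gradient you want.

```python
import numpy as np

# Hypothetical arrays matching the shapes from the traceback:
# x_t is (3, 10) and (h - y) is (10, 1)
x_t = np.ones((3, 10))
h_minus_y = np.ones((10, 1))

# Element-wise multiplication: shapes (3,10) and (10,1) are not
# broadcast-compatible (3 vs 10 in the first dimension), so this raises
try:
    x_t * h_minus_y
except ValueError as e:
    print("ValueError:", e)

# Matrix multiplication: the inner dimensions (10 and 10) match,
# producing a (3, 1) result
grad = np.dot(x_t, h_minus_y)
print(grad.shape)  # (3, 1)
```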

ValueError Traceback (most recent call last)
in
8
9 # Apply gradient descent
---> 10 tmp_J, tmp_theta = gradientDescent(tmp_X, tmp_Y, np.zeros((3, 1)), 1e-8, 700)
11 print(f"The cost after training is {tmp_J:.8f}.")
12 print(f"The resulting vector of weights is {[round(t, 8) for t in np.squeeze(tmp_theta)]}")

in gradientDescent(x, y, theta, alpha, num_iters)
32
33 # update the weights theta
---> 34 theta = theta - (alpha/m) * (x_t * (h - y))
35
36 ### END CODE HERE ###

ValueError: operands could not be broadcast together with shapes (3,10) (10,1)

^^^ I seem to be getting the same error with np.dot()

I don’t see where you used np.dot().


@TMosh thank you for that. I thought the dot product didn’t work, but the function hadn’t actually updated because of a mismatched-parenthesis error. It’s working now, thank you.