Course 2 Week 2, programming assignment, Exercise 6.1 - Mini-Batch Gradient Descent: "ValueError: operands could not be broadcast together with shapes (1,44) (1,64)"

Hi all. Is there an error in this exercise? All tests passed in every previous exercise, and this cell is not supposed to contain any student code:

6.1 - Mini-Batch Gradient Descent



---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input> in <module>
      1 # train 3-layer model
      2 layers_dims = [train_X.shape[0], 5, 2, 1]
----> 3 parameters = model(train_X, train_Y, layers_dims, optimizer = "gd")
      4
      5 # Predict

<ipython-input> in model(X, Y, layers_dims, optimizer, learning_rate, mini_batch_size, beta, beta1, beta2, epsilon, num_epochs, print_cost)
     56
     57         # Compute cost and add to the cost total
---> 58         cost_total += compute_cost(a3, minibatch_Y)
     59
     60         # Backward propagation

~/work/release/W2A1/opt_utils_v1a.py in compute_cost(a3, Y)
     98     """
     99
--> 100     logprobs = np.multiply(-np.log(a3),Y) + np.multiply(-np.log(1 - a3), 1 - Y)
    101     cost_total = np.sum(logprobs)
    102

ValueError: operands could not be broadcast together with shapes (1,44) (1,64)

Thank you!

Hi sotofernando,

As a belated reply:
The public tests do not capture all possible bugs in your code. As the traceback indicates, there is a mismatch of dimensions between a3 and Y inside compute_cost: a3 has 44 columns while minibatch_Y has 64. This means there is a bug somewhere in the code that builds the mini-batches a3 and minibatch_Y are computed from, most likely in your random_mini_batches function (for example, slicing X and Y with different column ranges for the last, partial mini-batch).
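To make the failure mode concrete, here is a minimal NumPy sketch, under the assumption that the bug is a mismatched Y slice for the last partial mini-batch (the setup values 300 and 64 mirror the assignment, but the buggy slice itself is illustrative, not your actual code):

```python
import numpy as np

# Hypothetical setup mirroring the assignment: m = 300 training examples,
# mini-batch size 64, so the last mini-batch holds 300 - 4*64 = 44 examples.
np.random.seed(1)
m, mini_batch_size = 300, 64
X = np.random.randn(2, m)                      # features, one column per example
Y = (np.random.rand(1, m) > 0.5).astype(int)   # labels, shape (1, m)

num_complete = m // mini_batch_size            # 4 full mini-batches
start = num_complete * mini_batch_size         # 256

mini_X = X[:, start:m]                         # last partial batch of X: (2, 44)
mini_Y_bad = Y[:, 0:mini_batch_size]           # WRONG slice of Y: (1, 64)

# Stub forward pass: a3 has one activation per column of mini_X.
a3 = 1.0 / (1.0 + np.exp(-mini_X.sum(axis=0, keepdims=True)))   # shape (1, 44)

try:
    np.multiply(-np.log(a3), mini_Y_bad)       # reproduces the broadcast error
except ValueError as e:
    print(e)  # operands could not be broadcast together with shapes (1,44) (1,64)

# Fix: slice Y with exactly the same column range used for X.
mini_Y = Y[:, start:m]                         # (1, 44), matches a3
cost = np.sum(np.multiply(-np.log(a3), mini_Y))
```

Checking that every (mini_batch_X, mini_batch_Y) pair returned by your random_mini_batches has the same number of columns is usually the fastest way to localize this bug.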