Good day to all!
I’m having a problem executing #UNQ_C5 GRADED FUNCTION: gradient_descent.
Unfortunately, I can’t figure out what is causing the error:
TypeError                                 Traceback (most recent call last)
in
      7 num_iters = 150
      8 print("Call gradient_descent")
----> 9 W1, W2, b1, b2 = gradient_descent(data, word2Ind, N, V, num_iters)

in gradient_descent(data, word2Ind, N, V, num_iters, alpha, random_seed, initialize_model, get_batches, forward_prop, softmax, compute_cost, back_prop)
     48
     49         # get gradients
---> 50         grad_W1, grad_W2, grad_b1, grad_b2 = back_prop(x, y, yhat, h, W2, b1, b2, batch_size)
     51
     52         # update weights and biases

TypeError: back_prop() missing 1 required positional argument: 'batch_size'
As far as I can tell, the line flagged in the error is written correctly:
# get gradients
grad_W1, grad_W2, grad_b1, grad_b2 = back_prop(x, y, yhat, h, W2, b1, b2, batch_size)
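For context, here is a small toy example I ran on my own (it is not the assignment code, just a sketch) which shows that Python raises this exact TypeError whenever a call supplies one positional argument fewer than the function's definition declares:

# Toy reproduction of the error type, unrelated to the assignment functions
def demo(a, b, c):
    return a + b + c

try:
    demo(1, 2)   # one positional argument short of the definition
except TypeError as err:
    print(err)   # prints: demo() missing 1 required positional argument: 'c'

So the interpreter apparently thinks my call to back_prop passes one argument fewer than its definition expects, even though I am passing batch_size explicitly, and I don't understand why.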
Could you please tell me what my mistake is?