Exercise 8 - model: LogisticRegression

Hi! I’ve been trying to find the mistake in this code. Can someone please help me? I think the problem is in the initialization of the parameters w and b, but I just can’t figure out how to do it.

# YOUR CODE STARTS HERE
w, b = initialize_with_zeros(dim)

parameters, grads, costs = optimize(w, b, X_train, Y_train, num_iterations=2000, learning_rate=0.05, print_cost=True)

w = params["w"]
b = params["b"]

Y_prediction_train = predict(w, b, X_train)
Y_prediction_test = predict(w, b, X_test)

and the error is:


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-...> in <module>
----> 1 model_test(model)

~/work/release/W2A2/public_tests.py in model_test(target)
    109     y_test = np.array([1, 0, 1])
    110
--> 111     d = target(X, Y, x_test, y_test, num_iterations=50, learning_rate=1e-4)
    112
    113     assert type(d['costs']) == list, f"Wrong type for d['costs']. {type(d['costs'])} != list"

<ipython-input-...> in model(X_train, Y_train, X_test, Y_test, num_iterations, learning_rate, print_cost)
     37
     38
---> 39     parameters, grads, costs = optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost)
     40
     41     w = params["w"]

<ipython-input-...> in optimize(w, b, X, Y, num_iterations, learning_rate, print_cost)
     35     # grads, cost = ...
     36     # YOUR CODE STARTS HERE
---> 37     grads, cost = propagate(w, b, X, Y)
     38     # YOUR CODE ENDS HERE
     39

<ipython-input-...> in propagate(w, b, X, Y)
     31
     32
---> 33     A = sigmoid(np.dot(w.T, X) + b)
     34     a1 = (1 - Y)
     35     a2 = np.log(1 - A)

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (1,2) and (4,3) not aligned: 2 (dim 1) != 4 (dim 0)

You are hard-coding the values of the learning rate and the number of iterations when you call optimize from model. E.g., no matter what value is passed into model, you will run 2000 iterations in optimize.
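As a minimal sketch of what that call might look like (using the function and variable names from your own snippet; note also that your code assigns to parameters but then reads from params, which is never defined):

# inside model: forward the arguments model received instead of hard-coding them
parameters, grads, costs = optimize(w, b, X_train, Y_train,
                                    num_iterations=num_iterations,
                                    learning_rate=learning_rate,
                                    print_cost=print_cost)
w = parameters["w"]   # same name as the assignment above, not params
b = parameters["b"]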

Also note that you should not have to reimplement the initialization code in your model function. Just call initialize_with_zeros with the appropriate dimension value.
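That dimension value is exactly what the ValueError is pointing at: w.T has shape (1, 2) while X has shape (4, 3), so w was initialized for 2 features while the test input has 4. The number of features is the number of rows of X_train, so the fix is a one-liner (a sketch, assuming the initialize_with_zeros helper from the assignment):

# dim must equal the number of features, i.e. the number of rows of X_train
w, b = initialize_with_zeros(X_train.shape[0])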

Also, please note that we are not supposed to be sharing source code here on the Discourse forums. Please edit your post to remove the code now that you’ve gotten the answer.