Logistic_Regression_with_a_Neural_Network_mindset — Week 2, Exercise 8

In the model function, I have initialized w and b.

{moderator edit - solution code removed}

When I call the optimize function, I get the error below:

ValueError: shapes (1,12288) and (4,7) not aligned: 12288 (dim 1) != 4 (dim 0)

Stack trace:

ValueError                                Traceback (most recent call last)
<ipython-input> in <module>
      1 from public_tests import *
      2
----> 3 model_test(model)

~/work/release/W2A2/public_tests.py in model_test(target)
    123     y_test = np.array([[0, 1, 0]])
    124
--> 125     d = target(X, Y, x_test, y_test, num_iterations=50, learning_rate=0.01)
    126
    127     assert type(d['costs']) == list, f"Wrong type for d['costs']. {type(d['costs'])} != list"

<ipython-input> in model(X_train, Y_train, X_test, Y_test, num_iterations, learning_rate, print_cost)
     26     # Gradient descent
     27     # params, grads, costs = ...
---> 28     params, grads, costs = optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost=False)
     29
     30     # Retrieve parameters w and b from dictionary "params"

<ipython-input> in optimize(w, b, X, Y, num_iterations, learning_rate, print_cost)
     36     # YOUR CODE STARTS HERE
     37
---> 38     grads, cost = propagate(w, b, X, Y)
     39
     40     # YOUR CODE ENDS HERE

<ipython-input> in propagate(w, b, X, Y)
     35     cost = None
     36
---> 37     A = sigmoid(np.dot(w.T, X) + b)
     38     cost = (-1/m) * np.sum((Y * np.log(A)) + (1 - Y) * np.log(1 - A), axis=1)

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (1,12288) and (4,7) not aligned: 12288 (dim 1) != 4 (dim 0)

You are calling initialize_with_zeros incorrectly: you are "hard-coding" the dimensions to match the actual image data (12288 features). But we are writing code here that should handle inputs with any number of features, right? The test case you fail has 4 features instead of 12288, which is exactly why the dot product np.dot(w.T, X) fails: w.T has shape (1, 12288) but X has shape (4, 7).

Hard-coding things is always a bad idea unless they specifically tell you to do that. :nerd_face:

So how could you write the code in such a way that it derives the number of features from the actual input data, rather than assuming it already knows the correct number of features?
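As a hint, the pattern usually looks something like the sketch below. This is not the assignment's solution code (which was removed above), just a minimal illustration of reading the feature count from the data's shape; initialize_with_zeros here is my own stand-in matching the helper's documented signature:

```python
import numpy as np

def initialize_with_zeros(dim):
    """Return a zero weight column vector of shape (dim, 1) and a scalar bias."""
    w = np.zeros((dim, 1))
    b = 0.0
    return w, b

# X_train has shape (n_features, m_examples), so the number of features
# is X_train.shape[0] -- never a hard-coded constant like 12288.
X_train = np.zeros((4, 7))                       # 4 features, 7 examples, like the test case
w, b = initialize_with_zeros(X_train.shape[0])   # w.shape == (4, 1)
print(w.shape)
```

Because w now has shape (4, 1), np.dot(w.T, X_train) produces a (1, 7) activation row with no shape mismatch, and the same code works unchanged on the real 12288-feature image data.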

Thanks for the feedback. I have resolved it now.
