C1W2 lab Logistic Regression with a Neural Network mindset

Good afternoon Mentors, I have been having these errors, and can’t figure out what is wrong.
AssertionError                            Traceback (most recent call last)
      1 from public_tests import *
----> 3 model_test(model)

~/work/release/W2A2/public_tests.py in model_test(target)
    131 assert type(d['w']) == np.ndarray, f"Wrong type for d['w']. {type(d['w'])} != np.ndarray"
    132 assert d['w'].shape == (X.shape[0], 1), f"Wrong shape for d['w']. {d['w'].shape} != {(X.shape[0], 1)}"
--> 133 assert np.allclose(d['w'], expected_output['w']), f"Wrong values for d['w']. {d['w']} != {expected_output['w']}"
    135 assert np.allclose(d['b'], expected_output['b']), f"Wrong values for d['b']. {d['b']} != {expected_output['b']}"

AssertionError: Wrong values for d['w']. [[ 0.14449502]
 [-0.1429235 ]
 [ 0.21265053]] != [[ 0.08639757]
 [ 0.12866053]]

All my functions worked well and tested alright!

One common mistake students make in this exercise is to hardcode the learning rate.

Thank you so much, boss. I had actually hard-coded more than just the learning rate. Thanks once again, brother.

What does it mean to hardcode the learning rate? How does one fix this? It seems like the learning rate is set by the exercise… should I adjust it?

Hardcoding means setting a parameter to a fixed value inside your code. For example, learning_rate = 0.1 is hardcoded.

The exercise gives the learning rate a default value: when the caller does not pass a value for it, the function falls back on that default. You should not change the default itself. Instead, when you call the optimize function from inside model(), pass the value through flexibly (like x = x) rather than writing in a fixed number.
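To see why this matters, here is a minimal sketch of the idea, using hypothetical optimize_demo() and model-style wrappers rather than the actual assignment code:

```python
def optimize_demo(rate=0.5):
    """Stand-in for optimize(): just reports the rate it actually used."""
    return rate

def model_wrong(rate=0.5):
    # BUG: hardcodes the rate, silently ignoring whatever the caller passed in.
    return optimize_demo(rate=0.1)

def model_right(rate=0.5):
    # Correct: forwards the caller's value (rate=rate) down to the inner call.
    return optimize_demo(rate=rate)

print(model_wrong(rate=0.9))   # 0.1 -- the grader's value never reaches optimize_demo
print(model_right(rate=0.9))   # 0.9 -- matches what the caller asked for
```

The grader calls your model() with its own hyperparameter values, so any value you hardcode inside the function body will produce mismatched numbers like the ones in the traceback above.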

I’m stuck on exercise 8, merging functions to model.
When I call the optimize function, “optimize (w, b, X, Y)”, for defining the model function, isn’t X supposed to be X_train?

Hi @Molly_Ng ,

Could you post a screenshot of the error to help us understand your problem? When calling any function, passing arguments by name is good practice.
In the case of calling optimize(), all of the arguments are either local variables, such as w and b, or input arguments of model(), such as X_train, Y_train, num_iterations, and learning_rate.
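As a generic illustration of why calling by name is good practice, here is a hypothetical train() helper (not the assignment's optimize()) called both ways:

```python
def train(w, b, num_iterations=100, learning_rate=0.009):
    """Stand-in that just reports the hyperparameters it received."""
    return {"iters": num_iterations, "rate": learning_rate}

# Positional call: it is easy to put a value in the wrong slot.
by_position = train(0.0, 0.0, 2000, 0.5)

# Keyword call: each value is explicitly bound to the parameter you intend.
by_name = train(w=0.0, b=0.0, num_iterations=2000, learning_rate=0.5)

print(by_position == by_name)  # True -- but the keyword form is self-documenting
```

Both calls behave identically here, but the keyword form makes it obvious at the call site which value is the iteration count and which is the learning rate.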

Hi @Molly_Ng,

In your model() function, only 4 arguments are passed to optimize() when making the call. This means optimize() will use its default values for the arguments you have not specified. The consequence is that when the test case supplies different values, the result produced will not match the expected output.

Please be mindful of the community code of conduct: do not post solution source code in public.