Course 1 Week 2 Graded Assignment Exercise 8

Edit: There is definitely something strange going on with this question. I just confirmed that we need to feed the num_iterations variable from the model function into the optimize function (that is 100% the case), which means this question is bugged.

Hi, I’m doing exercise 8 and can’t seem to get my optimized d[‘w’] values to match the expected values.

I did notice something unusual though.

In the function definition provided by the course, num_iterations is set to 2000:

def model(X_train, Y_train, X_test, Y_test, num_iterations=2000, learning_rate=0.5, print_cost=False):

The issue is that when I feed that value to the optimize function, optimize appends every 100th cost calculation to a list. The grader then checks my cost calculations and expects that list to have length 1, but with 2000 iterations the length would have to be 20.
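To illustrate the counting (a toy sketch, assuming, as described above, that optimize appends a cost whenever the iteration index i satisfies i % 100 == 0):

def recorded_costs(num_iterations):
    # count how many costs would be appended if one is recorded whenever i % 100 == 0
    return sum(1 for i in range(num_iterations) if i % 100 == 0)

print(recorded_costs(100))   # 1  -> the list length the test seems to expect
print(recorded_costs(2000))  # 20 -> the list length you get with 2000 iterations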

In either case, I cannot manage to produce the expected values for d[‘w’].

All my other assignment questions are passing; it is only question 8 that fails.

Here is my code for the model function

w, b = initialize_with_zeros(X_train.shape[0])
params, grads, costs = optimize(w, b, X_train, Y_train, num_iterations=2000, learning_rate=0.5, print_cost=False)

w, b = params.values()
Y_prediction_test = predict(w, b, X_test)
Y_prediction_train = predict(w, b, X_train)

Hi @haich ,

In the model() definition, num_iterations=2000 is the default value. That is to say, if the function model() is called with num_iterations set to a different value, then the new value is used, not the default one. So when making a function call, we pass the parameter along to the function being called; that way, we can change the parameter value to suit. Parameter passing is common practice in programming.
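For example (a generic sketch, unrelated to the assignment code; the names here are made up purely for illustration):

def greet(name, punctuation="!"):
    # the default "!" is only used when the caller does not supply punctuation
    return "Hello, " + name + punctuation

print(greet("Ada"))                   # uses the default   -> Hello, Ada!
print(greet("Ada", punctuation="?"))  # overrides default  -> Hello, Ada?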

It is preset in model(), but it is also preset in optimize():

def optimize(w, b, X, Y, num_iterations=100, learning_rate=0.009, print_cost=False):

I assumed I should use the value set in the model() function, since that is the function I am currently working on. In any case, whether I set it to 100 or to 2000, it doesn’t make a difference: I still get different d[‘w’] values from what is expected.

I did play around with the learning rate, though, and managed to tweak it so that I got almost exactly the expected output. If I manually set the learning rate in the optimize function to 0.005004872, I get almost the correct result for w and b, so my code must be doing something right.

I feel like this assignment might be bugged, to be honest. Or maybe I’m doing something slightly wrong; I’m not sure.

Hi @haich ,

The issue is in your code. If your code is incorrect, tweaking other parameters will not fix it. This course has been running for a while; if there were any issue with the test/assignment, it would have been ironed out by now. Other learners have successfully finished this course and passed with flying colours. So let’s focus on how to put things right. Whatever parameters you have changed, please return them to their original values.

If d['w'] is different from the expected value, then you have to go back to the propagate() function to find out why. Here are a few suggestions to debug:

  1. Have the helper functions called by model() passed their unit tests?
  2. Are the model() and optimize() functions passing their input arguments through to the functions they call, rather than hard-coded values?
  3. Use print statements in propagate() to print dw and db from grads (see the sketch after this list).
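For point 3, here is a rough sketch of the kind of temporary prints I mean. This is a generic logistic-regression forward/backward pass written from scratch for illustration, not the assignment's exact propagate(); it only assumes that grads is a dictionary with "dw" and "db" keys, as in the earlier exercise:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate_debug(w, b, X, Y):
    # generic logistic-regression gradients, with temporary debug prints added
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)                              # forward pass
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m  # cross-entropy cost
    dw = np.dot(X, (A - Y).T) / m                                # backward pass
    db = np.sum(A - Y) / m
    grads = {"dw": dw, "db": db}
    print("dw =", grads["dw"])   # temporary debug output; remove when done
    print("db =", grads["db"])
    print("cost =", cost)
    return grads, cost

# tiny toy inputs: 2 features, 3 examples
w = np.zeros((2, 1))
b = 0.0
X = np.array([[1.0, -2.0, 0.5], [3.0, 0.5, -1.0]])
Y = np.array([[1, 0, 1]])
propagate_debug(w, b, X, Y)

Comparing what such prints show against the expected dw and db values from the propagate() unit test should tell you whether the problem is inside propagate() itself or in how model() calls optimize().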

I think the key point here is that you are a bit unclear about how “keyword” parameters work in Python. Try googling “python keyword parameters” and do some reading.

The short summary is that defining a function is a completely different thing than calling a function. When you see num_iterations = 2000 in the definition of model, that means what Kin said: that parameter is optional and if you do not pass it on a particular call to model, then 2000 is the value that will be used. But you can also pass a different value like 100 and that is what will actually be used, not 2000. My guess is that your bug is that you also “hard-coded” the optional parameters on the call from model to optimize.
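Here is a generic illustration of the difference (the function names are invented for this example, not the assignment’s):

def inner(x, repeats=100):
    return x * repeats

def outer_hardcoded(x, repeats=2000):
    return inner(x, repeats=2000)      # bug: ignores whatever the caller asked for

def outer_forwarded(x, repeats=2000):
    return inner(x, repeats=repeats)   # correct: forwards the caller's value

print(outer_hardcoded(1, repeats=50))  # 2000 -- the 50 is silently ignored
print(outer_forwarded(1, repeats=50))  # 50

The same pattern applies to the optimize() call inside model(): each optional argument should be forwarded, not replaced with a literal value.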

I mean, I was pretty clear that I tried many variations of num_iterations and learning_rate. I tried passing them as they are defined by the function (num_iterations=num_iterations, learning_rate=learning_rate). I tried num_iterations=2000, learning_rate=0.5. I tried not passing any values at all (so the defaults in optimize would be used). I tried a learning rate of 0.005 (closest to the expected answer).

None of these worked, even though I got every other question right.

But the reason I know for certain it is bugged is that if I pass num_iterations=num_iterations (which is the most intuitive approach), the test claims the length of my costs list is wrong, and that it should be 1 and not 20.

But then if you run the next cells, they expect it to be length 20 (i.e. 2000 iterations), so there’s clearly an issue there.

Edit: I guess what I’m trying to say is that the only way to make the test run 100 iterations (without altering the testing code) is to hard-code the number of iterations to 100 in the model function (which would make the num_iterations argument redundant). But doing that would make the next few lines of code run 100 iterations instead of 2000.

Also, I’m not saying I haven’t done anything wrong; I probably have. But there is still definitely an issue with the question.

Hi @haich ,

When passing a parameter to a function, all you need to do is give the name of the parameter. Also, after you make a change to the code, you need to rerun that code cell. If in doubt, do a kernel restart, clear all output, and rerun the code from the start. This will ensure the execution environment is up to date.

Here is a link to a tutorial on parameter passing.


Hello @haich! I hope you are well.

I just read this thread. First, as Kin already mentioned: "Whatever parameters you have changed, please return them to their original values." If things have become messy for you, read this post to get a clean copy of your assignment. You will then have to do all the exercises again, so it is better to save your work somewhere else, or download it, before getting a clean copy of your assignment.

After that, do all your exercises and share the full error from Exercise 8 - model with us. Again, do not hardcode anything.

Best,
Saif.

I didn’t change the value of any parameter set in the assignment.

The only thing I did was try setting different values for num_iterations when calling the optimize function inside the model function. That’s not changing the default value of a parameter.

Just to be clear, I have not changed any code outside of a "YOUR CODE STARTS HERE" section.

Please check your DMs for a message from me about how to proceed here.

Ok, things are cleared up now. The hard-coding of num_iterations had already been fixed, but the same bug still existed w.r.t. learning_rate and print_cost. I don’t think the tests or the grader care about print_cost, but they definitely care about having the learning rate handled correctly.