Week 3 Assignment: cost minimizes well, but parameters do not seem to learn

In the Week 3 assignment, the cost minimizes quite well after training the model.
But when I compared the initial parameters with the output parameters, I found that they were equal.

I then tried saving the parameters from trainable_variables after the training loop, but the two sets of parameters were still equal.

Here is my trial code:

```python
def model():
    parameters = initialize_parameters()
    parameters_init = parameters

    # ... training loop (omitted) ...

    parameters_trained = {}
    parameters_trained['W1'] = trainable_variables[0]
    parameters_trained['b1'] = trainable_variables[1]

    return parameters_init, parameters_trained

parameters_init, parameters_trained = model()
print(parameters_init['W1'] == parameters_trained['W1'])
```

It returned all True.

I don't understand how to save the parameters into a separate variable, and I have no clue why the two sets of parameters are equal even though the cost curve looks fine.

Could I get some hints on these issues?

Hi, @Damon.

Be careful: parameters_init = parameters does not create a copy of parameters! It only binds a second name to the same dictionary object. That is what happened here, and an actual copy is probably what you were trying to make.
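Here is a minimal sketch of that aliasing behavior in plain Python (no TensorFlow needed; the dict contents are made-up placeholder values). Assignment binds a second name to the *same* dict, so in-place updates during "training" are visible through both names; `copy.deepcopy` is one way to take a real snapshot:

```python
import copy

# Placeholder parameter values, standing in for the assignment's variables.
parameters = {"W1": [0.1, 0.2], "b1": [0.0]}
parameters_init = parameters        # alias: both names refer to one dict

parameters["W1"][0] = 99.0          # a "training update" mutates in place
print(parameters_init["W1"][0])     # 99.0 -- the "saved" dict changed too

# To really save the initial values, copy them before training:
snapshot = copy.deepcopy(parameters)
parameters["W1"][0] = -1.0
print(snapshot["W1"][0])            # still 99.0 -- unaffected by later updates
```

The same thing happens with a dict of tf.Variables: copying the dict reference does not copy the variables inside it.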

I think you'll understand why model works after going through those links, but please let me know if something's still not clear :slight_smile:

P.S.: Here are the technical details in case you’re interested.


Wow! @nramon ,

You really got me right, and this guidance perfectly solved my confusion.

Based on that understanding, there's no need for extra operations to save the trained parameters, such as parameters_trained["W1"] = trainable_variables[0].

That's because the dict parameters always points to the same "W1" object, both before and after its value is updated in place.
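That point can be sketched without TensorFlow using a tiny stand-in class (the `Var` class below is hypothetical, mimicking a tf.Variable's in-place `assign`): the dict entry and the entry in the trainable-variables list are the very same object, so the dict "sees" every optimizer update automatically.

```python
class Var:
    """Toy stand-in for tf.Variable: a mutable box updated in place."""
    def __init__(self, value):
        self.value = value

    def assign(self, value):
        self.value = value

parameters = {"W1": Var(0.5)}
trainable_variables = [parameters["W1"]]  # same object, collected for training

trainable_variables[0].assign(2.0)        # optimizer-style in-place update
print(parameters["W1"].value)             # 2.0 -- the dict entry was updated
print(parameters["W1"] is trainable_variables[0])  # True -- one object, two names
```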

Thank you so much!


That is correct. They already refer to the same object :slight_smile:

I’m very glad I could help. Keep up the good work!
