Week 4 assignment: DNN-Application accuracy

Hello,

I have passed the last assignment for Course 1, even though I have not reached the expected accuracy or cost after 2499 iterations. To improve this, should I tweak some hyperparameters, such as the hidden layer dimensions or the learning rate? Any pointers would be helpful. (Note: this is for L_layer_model.)


Your cost value is better than the expected value (smaller is better), so it doesn't need to improve. Keep moving.


Welcome @salman0149! If your code is valid, the model hyperparameters will not need to be tweaked to reproduce the expected output. First, examine your call to the function L_layer_model:

parameters, costs = L_layer_model(train_x, train_y, layer_dims, num_iterations=2500, print_cost=True)

The function L_layer_model, in turn, makes a number of calls to functions that you completed in the previous notebook (A1).
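For reference, here is a minimal sketch of the shape that loop typically takes. The helper names (initialize_parameters_deep, L_model_forward, compute_cost, L_model_backward, update_parameters) are assumed from the Week 4 notebooks; the body is only an illustration of the flow, not the graded solution:

```python
def L_layer_model(X, Y, layers_dims, learning_rate=0.0075,
                  num_iterations=3000, print_cost=False):
    # Sketch of the training loop: each iteration runs forward prop,
    # computes the cost, runs backward prop, then updates the parameters.
    costs = []
    parameters = initialize_parameters_deep(layers_dims)
    for i in range(num_iterations):
        AL, caches = L_model_forward(X, parameters)
        cost = compute_cost(AL, Y)
        grads = L_model_backward(AL, Y, caches)
        # The learning_rate argument is forwarded here, not a literal number.
        parameters = update_parameters(parameters, grads, learning_rate)
        if print_cost and (i % 100 == 0 or i == num_iterations - 1):
            print(f"Cost after iteration {i}: {cost}")
            costs.append(cost)
    return parameters, costs
```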

Your instinct is good. From your output, gradient descent seems to be in good working order. So a likely suspect is that the learning rate is different (in your case somewhat larger, but why?) from the one set in the keyword argument, learning_rate=0.0075. First, make sure that you did not accidentally change that.

If not, take a look at where L_layer_model uses the learning_rate in the body of the function. It should enter as an argument to a function that you wrote in the previous assignment, specifically the update_parameters function. You want to be sure that the learning_rate is not "hard-coded" there, i.e. set to a numerical value. Rather, it needs to inherit the learning rate from the setting in the L_layer_model function, i.e. learning_rate=learning_rate.
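To make that concrete, here is a minimal sketch of how update_parameters should receive the step size. The W/b dictionary layout is assumed from the earlier notebook; the point is only that learning_rate arrives as a function argument rather than as a number typed into the body:

```python
import copy

def update_parameters(params, grads, learning_rate):
    # One step of gradient descent. The step size is whatever the caller
    # (L_layer_model) passed in; nothing numeric is hard-coded here.
    parameters = copy.deepcopy(params)
    L = len(parameters) // 2  # number of layers with W/b pairs
    for l in range(1, L + 1):
        parameters["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
        parameters["b" + str(l)] -= learning_rate * grads["db" + str(l)]
    return parameters
```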

Review that, and let me know. If it doesn't work out, please resend your output from running the cell, this time with the entire record of costs, i.e. including the initial iteration.


Thanks for the pointers. Let me take a closer look at the earlier assignments to see if I have hard-coded some values.

Sorry I am replying to this so late. I had the same problem as you, with exactly the same values for the iterations. I went through my code and after a while realised I hadn't run a small cell which I must have missed somehow. It's right above the graded function cell.

Hi @Aadham_Ahmad, so what is your actual question? Please help me understand. Thanks.

Hi @Rashmi, I don't have a question; I have what I believe to be the solution to the initial problem. Since I had exactly the same values after each 100 iterations as the initial post, it seems most likely that we share the same error. My previous reply and picture describe my solution for anyone else who runs into it.

Okay, I got it @Aadham_Ahmad. Yes, you are quite right. Missing any cell can cause an error in a later part of the assignment, as each of the functions is integrated to get the desired output.