C2 W2 Optimization Methods Assignment

In this function, shouldn’t we set the counter t to zero every time a new epoch starts?

def model(X, Y, layers_dims, optimizer, learning_rate = 0.0007, mini_batch_size = 64, beta = 0.9,
          beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8, num_epochs = 5000, print_cost = True):

@jakhon77 Recall what we are trying to do here: find a (hopefully global) minimum. We want to edge a bit closer with each epoch, not revert to where we started, so it doesn’t make sense to reset the counter; see the sketch below.
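To make that concrete, here is a minimal, self-contained toy sketch (this is not the assignment’s solution code; the function and variable names are made up for illustration). The point is that t is initialized once, before the epoch loop, and is never reset:

import numpy as np

def adam_step(w, grad, v, s, t, lr=0.0007, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponentially weighted averages of the gradient and its square
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    # Bias correction: this is where t enters, as an exponent
    v_hat = v / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s

w, v, s = 5.0, 0.0, 0.0   # toy parameter and moment estimates
t = 0                     # Adam step counter: initialized ONCE, before the epoch loop

for epoch in range(3):
    for step in range(4):       # stand-in for the mini-batch loop
        grad = 2 * w            # gradient of the toy cost f(w) = w**2
        t += 1                  # one increment per update; never reset at a new epoch
        w, v, s = adam_step(w, grad, v, s, t)

print(w)   # keeps shrinking toward 0 across all three epochs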


That’s not the way I wrote the code and my code passes the tests. It’s just a question of how the algorithm is defined. Maybe it’s worth watching the lectures about Adam again. Does Prof Ng address this point in the lectures? Please give us a reference to the point at which he explains that this should be done the way you suggest.


@jakhon77 @paulinpaloalto I also don’t want to say too much here, but peeking at it, also recall that $t$ is going to impact your $\beta_1^t$ and $\beta_2^t$ terms. And hmmm… LaTeX is not working for me here…
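For reference, here is where that happens. The bias-correction step of Adam (the standard formulas, written in the course’s $v$/$s$ notation) is

$v^{corrected} = \frac{v}{1 - \beta_1^t}$ and $s^{corrected} = \frac{s}{1 - \beta_2^t}$,

followed by the update $w \leftarrow w - \alpha \, \frac{v^{corrected}}{\sqrt{s^{corrected}} + \varepsilon}$. Since $t$ appears only as an exponent in those denominators, restarting it at 1 every epoch would re-inflate $v^{corrected}$ and $s^{corrected}$ each time, as if training had just begun.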


No, he doesn’t say that. I understand it now. Thank you for the clarification.

Awesome! Very well explained!!!

Re: LaTeX: use single dollar signs.