Hi everyone,
I was able to solve an initial error where my cost output did not match the expected output during the optimization stage. I was initially creating a variable `lr` inside `def optimize(...):` because I did not see how I could update the learning rate. Then I read the function signature more carefully and saw that `learning_rate` was already predefined. I used `learning_rate` instead and was able to optimize successfully, but now I am wondering why one variable assignment works and the other does not?
The problem you’re describing involves variable scoping and parameter passing within a function.
When you used `learning_rate`, the predefined parameter, its value was passed into `def optimize(...):` at call time, so it was already set when the function started running. That same value was then used throughout the various stages of the optimization process. The test case supplies a specific (non-editable) `learning_rate` and expects the function to use exactly that value.
When you created a new variable `lr` inside `def optimize(...):`, you introduced a local variable separate from the one the caller supplied. Because `lr` was not connected to the value the rest of the code passed in, the learning rate applied during optimization no longer matched what the test expected, which is why your cost output diverged.
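A minimal sketch of the difference, using a toy cost function. The function names echo the assignment, but the bodies here are illustrative, not the actual course code:

```python
def optimize(w, num_iterations=100, learning_rate=0.009):
    """Gradient descent on the toy cost (w - 3)**2."""
    for _ in range(num_iterations):
        dw = 2 * (w - 3)            # gradient of the toy cost
        w = w - learning_rate * dw  # uses the *parameter* the caller passed
    return w

def optimize_buggy(w, num_iterations=100, learning_rate=0.009):
    """Same loop, but with a hard-coded local variable."""
    lr = 0.5                        # local lr: the caller's value is ignored
    for _ in range(num_iterations):
        dw = 2 * (w - 3)
        w = w - lr * dw             # learning_rate is silently unused
    return w
```

Calling both with `learning_rate=0.001` makes the bug visible: `optimize` takes a tiny step as requested, while `optimize_buggy` steps with 0.5 no matter what the caller asked for, so its results can never match a test that chose a specific rate.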
It sounds like you are new to python. It would be a good idea to google “python named parameters” or “python keyword parameters” and do a bit of reading.
There are two kinds of parameters in python:
- Positional parameters. These come first in the function definition and are required on every call to the function.
- Keyword parameters. These are listed after the positional parameters in the function definition. Keyword parameters are optional on any given call to the function, but it is important to understand what that means. They are declared with a default value, which will be used if the parameter is omitted on a given call.
So if you call `optimize` and don’t specify the learning rate, the default value will be used. In all the test cases we have here, that would be a big mistake: you need to pass the actual value that is passed in to `model` at the top level. Notice that `learning_rate` is also a keyword parameter in the `model` API, but if you look at all the test cases, you will see that they all pass a particular value.
That’s my short summary of how this works. If this is a new idea to you, it really would be a good idea to do the reading I suggested at the beginning.
Thank you, gentlemen. I will read up on these differences between positional and keyword parameters.