I am stuck on train_step (UNQ_C5).
I have checked every hint for the solution but still have a problem.
Code Cell UNQ_C1: Function 'compute_content_cost' is correct.
Code Cell UNQ_C2: Function 'gram_matrix' is correct.
Code Cell UNQ_C3: Function 'compute_layer_style_cost' is correct.
Code Cell UNQ_C4: Function 'total_cost' is correct.
Code Cell UNQ_C5: Function 'train_step' is incorrect. Check implementation.
If you see many functions being marked as incorrect, try to trace back your steps & identify if there is an incorrect function that is being used in other steps.
This dependency may be the cause of the errors.
Use alpha = 40, beta = 10, contrary to the instructions in Exercise 6 - train_step ("Use alpha = 10 and beta = 40").
Also re-run all the cells above, because generated_image changes at each iteration.
Sorry, I don’t understand your point. I believe that the instructions as written are correct. My code follows the instructions and passes the grader. Are you sure you didn’t make the mistake Tom was describing in his earlier post: reversing the roles of the style and content cost in the total cost? If you did that, then I could believe that reversing the alpha and beta values could result in passing the grader. But that would be two errors compensating for each other …
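To see why the two mistakes compensate, here is a minimal sketch. The function name and signature are assumptions modeled on the assignment's description (a weighted sum of the content and style costs), not the actual course code:

```python
# Hypothetical stand-in for the assignment's total_cost; the real one
# operates on TF tensors, but the weighting logic is just a weighted sum.
def total_cost(J_content, J_style, alpha=10, beta=40):
    return alpha * J_content + beta * J_style

J_content, J_style = 2.0, 5.0

# Correct call, per the instructions: alpha = 10, beta = 40
correct = total_cost(J_content, J_style, alpha=10, beta=40)

# Two compensating errors: the cost arguments are swapped AND the weights
# are swapped, so the value comes out identical and the grader passes it.
swapped = total_cost(J_style, J_content, alpha=40, beta=10)

assert correct == swapped
```

The value alone cannot distinguish the two calls, which is exactly why a grader checking only the output would accept both.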
Paul,
You are right. Thank you. That was my mistake: I swapped the costs in the arguments of the calling function.
Anyway, one has to re-run the cell in section 5.3 to re-initialize generated_image. There is a note in Exercise 6: # You always must run the last cell before this one. You will get an error if not.
A note should also be added telling you to re-run the cell in section 5.3. Right?
Could you please tell me why we must do this? Which Python operation changes the global variable generated_image?
Honestly, I am not an expert in Python. I would have thought that the following statement
generated_image.assign(clip_0_1(generated_image))
cannot change that global variable, because generated_image is an argument of the function train_step(generated_image), so it is a local variable inside that function. Isn't it?
The semantics of passing objects on function calls in Python is a bit more complicated than that. Python passes object references by value (often called "pass by object reference"): the parameter is a new local name, but it is bound to the very same object the caller holds. So whether the global object is modified depends entirely on how the function is implemented. If the function mutates that object in place (without first making a deep copy), the caller sees the change. tf.Variable.assign does exactly that, so the global variable does get modified in this case. There is also the question of the statefulness of the TF mechanisms for computing gradients.
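Here is a plain-Python analogue of that behavior, with no TensorFlow required. The Var class is a stand-in for tf.Variable (its assign method mutates in place, which is an assumption that matches the TF docs), and clip_0_1 and train_step mimic the assignment's code:

```python
class Var:
    """Stand-in for tf.Variable: assign() mutates the object in place."""
    def __init__(self, value):
        self.value = value

    def assign(self, new_value):
        self.value = new_value  # same object, new contents
        return self

def clip_0_1(v):
    # Clamp the stored value to [0, 1], like the notebook's clip helper
    return min(max(v.value, 0.0), 1.0)

def train_step(generated_image):
    # `generated_image` is a new local NAME, but it refers to the
    # caller's object, so assign() changes what the caller sees.
    generated_image.assign(clip_0_1(generated_image))

img = Var(1.7)
train_step(img)
print(img.value)  # 1.0 — the outer object was modified through the local name
```

The local name goes away when the function returns, but the mutation it performed on the shared object does not.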
We’re swimming in pretty deep water here with OOP in python, so there is a lot to know. Of course once you graduate to OOP, python is just the framework. You also have to invest the effort to understand the properties of the various classes you are dealing with. Fortunately the TF docs are pretty good in general, but there’s a lot of them.
Actually, if you think a little more carefully about how this is all being used, it becomes clear that the whole point is to modify the global state, right? train_step gets invoked repeatedly as you step through the training iterations. If it didn't change the global state, what would be the point?
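A toy loop makes this concrete. This is not the assignment's train_step; it is a minimal sketch of the same pattern, minimizing f(x) = (x - 3)^2 by mutating one shared container on every call:

```python
# Mutable container standing in for the global generated_image
x = [10.0]

def train_step(x, lr=0.1):
    """One gradient-descent step on f(x) = (x - 3)^2, updated in place."""
    grad = 2.0 * (x[0] - 3.0)  # df/dx
    x[0] -= lr * grad           # in-place update: the caller sees the change

for _ in range(100):
    train_step(x)

print(round(x[0], 3))  # converges toward the minimum at 3.0
```

If train_step returned a fresh value instead of mutating x, each iteration would start over from the same initial point and the loop would accomplish nothing, which is exactly the situation being described above.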