Exercise 9 - L_model_backward activation function

In exercise 9 I am wondering whether I did everything correctly. I managed to get "All tests passed." However, my understanding throughout the entire assignment was that each layer has its own activation function (either "sigmoid" or "relu"), and that this activation function is stored separately for each layer in a variable - except for the loop, where it alternates between "linear" and "relu".

Now, in exercise 9 I only managed to get "All tests passed." by passing the activation function hard-coded as "relu" or "sigmoid", not by passing it as a variable, because I don't see which variable I would use.

So either my understanding is somehow wrong, or there is a variable that can (or should?) be used because it was defined somewhere at the beginning of the assignment and I'm simply not able to recall it?

Can you please help me out here?

Many thanks.

Yes, each layer has its own activation function: relu for all layers except the output layer, which uses sigmoid. No, they are not stored or "cached" (except indirectly through the values computed during forward propagation and reused in backward propagation, e.g. the A's, Z's, ...). There is no need to cache the strings referring to the functions (used in the function calls) or the functions themselves. You can see this by examining all of the objects returned by each of the functions.
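To make that concrete, here is a minimal sketch of what one entry of `caches` typically holds in this kind of assignment; treat the exact names and shapes as assumptions for illustration, not the official solution:

```python
import numpy as np

# Hypothetical shapes: layer l has 3 units, layer l-1 has 4 units, batch of 5 examples.
A_prev = np.random.randn(4, 5)
W, b = np.random.randn(3, 4), np.zeros((3, 1))
Z = W @ A_prev + b                      # pre-activation values for layer l

linear_cache = (A_prev, W, b)           # inputs to the linear step
activation_cache = Z                    # input to the activation step
cache_l = (linear_cache, activation_cache)

# Note: the cache holds arrays only -- no "relu"/"sigmoid" string is stored anywhere.
```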

As for the charge of "hard-coding," I think that you are not guilty. Look at the signature of the function: L_model_backward(AL, Y, caches). Given what I stated above, neither a keyword reference to the activation function nor the function itself is passed in. So if you have called one or more of the helper functions higher up the stack to complete your L_model_backward() function, and that helper takes a keyword argument for the activation, e.g. activation = 'relu', then you are good to go. Not guilty! :grinning:
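For reference, here is a minimal sketch of how those literal strings end up in the calls. This is not the official solution; the helpers relu_backward, sigmoid_backward, and linear_activation_backward are re-sketched here under assumed signatures so the example runs on its own:

```python
import numpy as np

def relu_backward(dA, Z):
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def sigmoid_backward(dA, Z):
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

def linear_activation_backward(dA, cache, activation):
    (A_prev, W, b), Z = cache            # cache = (linear_cache, activation_cache)
    dZ = relu_backward(dA, Z) if activation == "relu" else sigmoid_backward(dA, Z)
    m = A_prev.shape[1]
    dW = (1 / m) * dZ @ A_prev.T
    db = (1 / m) * np.sum(dZ, axis=1, keepdims=True)
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def L_model_backward(AL, Y, caches):
    """Backward pass for [LINEAR->RELU]*(L-1) -> LINEAR->SIGMOID."""
    grads = {}
    L = len(caches)
    Y = Y.reshape(AL.shape)

    # Derivative of the cross-entropy cost with respect to AL
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # Output layer: the activation is written out as the literal string "sigmoid"
    dA_prev, dW, db = linear_activation_backward(dAL, caches[L - 1], activation="sigmoid")
    grads["dA" + str(L - 1)] = dA_prev
    grads["dW" + str(L)] = dW
    grads["db" + str(L)] = db

    # Hidden layers: the literal string "relu" is used for every remaining layer
    for l in reversed(range(L - 1)):
        dA_prev, dW, db = linear_activation_backward(
            grads["dA" + str(l + 1)], caches[l], activation="relu")
        grads["dA" + str(l)] = dA_prev
        grads["dW" + str(l + 1)] = dW
        grads["db" + str(l + 1)] = db

    return grads
```

The point is simply that the strings "sigmoid" and "relu" are fixed by the model architecture (relu hidden layers, sigmoid output), so writing them out in the two call sites is exactly what is expected; there is no per-layer variable to look up.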

Many thanks!
What I need is more practical experience.