# Wk2 Ex8 Retrieve Output from Previous Functions

Hi,

I’m doing Exercise 8 from Week 2. I think it requires us to put all the previous functions together and compute a final result. But how do we retrieve the outputs from the previous functions? For example, if we want to propagate `w` and `b` from the initialization step, how do we get those two variables?

Hi, @ChrisBi. Correct, you get to put all of your previous work into play. At this point you have filled out the code for a bunch of functions. Suppose that these functions are called `intermediate_function_1` and `intermediate_function_2`. You have previously called these functions to test the validity of your code:

`result1 = intermediate_function_1(arg1, arg2)`
`result2 = intermediate_function_2(arg3, arg4)`

Note that the results produced by calling these functions are “global” variables and are specific to the values of the arguments passed to these functions. At this point in the assignment, these results are irrelevant. The results naturally change with the argument values passed to these functions. That’s what makes functions useful.
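To make that concrete, here is a minimal sketch (with a hypothetical `intermediate_function_1`, not the assignment’s actual helper) showing that each stored result is just a snapshot of one particular call:

```python
# Hypothetical stand-in for one of your completed helper functions.
def intermediate_function_1(a, b):
    return a + b

# Each result is tied to the specific arguments used in that call:
result1 = intermediate_function_1(2, 3)    # 5
result2 = intermediate_function_1(10, 20)  # 30

print(result1, result2)
```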

The function that you are working on, `model()`, can be thought of as a function that accomplishes what you originally set out to do, by aggregating your previous functions (i.e. calling them within that function) to produce the objects of your heart’s desire:

`def my_aggregation_function(arg1, arg2, arg3, arg4):`
`    new_result_1 = intermediate_function_1(arg1, arg2)`
`    new_result_2 = intermediate_function_2(arg3, arg4)`
`    # Do something with new_result_1 and new_result_2 to produce a final result ...`
`    return final_result`

You will be calling `my_aggregation_function` to produce the “final result” for different values of `arg1` through `arg4`. Example:

`cool_result = my_aggregation_function(arg1x, arg2x, arg3x, arg4x)`

Those are supplied to you in the test cell.

Key observation: the global objects of interest are the functions, not the specific values of `result1` and `result2`.

This flexibility is the power of functional programming. You’re going to love it!

Cheers,
Ken B

I get the idea. Thank you so much! It’s so nice of you to talk this much!

Here is an error I got. What I did for the costs was to call the propagate and optimize functions. I’m guessing the problem is in the parameters of the optimize function. I’m calling it with `num_iterations=100`, `learning_rate=0.009`, `print_cost=False` as parameters, and then fetching `w` with `params["w"]`. I believe these are the correct steps. How come `dw` has the wrong shape?

Thanks!

The short of it is that you should not be “hard-coding” any of the arguments. The test case that you run sets these up for you. Assuming that your previous functions have passed their tests, the assertion error (or “exception”) thrown by the test case indicates to me that the hard-coding is getting you into trouble.

Note that the `model()` function is defined with some “keyword arguments”: `num_iterations=2000`, `learning_rate=0.5`, and the “boolean” argument `print_cost=False`. With the function defined this way, the assigned values serve as defaults. This means that if you called the function as, for example, `my_model = model(my_X_train, my_Y_train, my_X_test, my_Y_test)`, the keyword arguments would default to the values in the function definition. In this example, the function is called with only the “positional” arguments. These are required and must appear in the proper order. That’s why they are called “positional”.
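As a sketch of that mechanism (using a hypothetical function, not the real `model()`):

```python
# Hypothetical function with the same style of keyword arguments as model().
def model_sketch(X_train, Y_train, num_iterations=2000, learning_rate=0.5,
                 print_cost=False):
    # Return the settings so we can see which values were actually used.
    return num_iterations, learning_rate, print_cost

# Positional arguments only: every keyword argument falls back to its default.
print(model_sketch([1], [0]))                      # (2000, 0.5, False)

# A keyword argument supplied at the call site overrides its default.
print(model_sketch([1], [0], num_iterations=100))  # (100, 0.5, False)
```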

Your mission is to fill out the code (with calls to the “helper” functions) so that it inherits the values of any of these arguments and works for all settings of all arguments. It seems that you have arbitrarily hard-coded the default values as set up in the definition of the `optimize()` function. Don’t do that.
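In other words, the keyword arguments that `model()` receives should be passed straight through to `optimize()`. A minimal sketch of the pattern, with placeholder bodies (the real helpers in the assignment do actual work):

```python
# Placeholder optimize(): the real one runs gradient descent; only the
# argument flow matters in this sketch.
def optimize(w, b, num_iterations, learning_rate, print_cost):
    return {"w": w, "b": b}

def model(X_train, Y_train, num_iterations=2000, learning_rate=0.5,
          print_cost=False):
    w, b = 0.0, 0.0  # stand-in for your initialization step
    # Forward model()'s own arguments -- do not hard-code 100 or 0.009 here.
    params = optimize(w, b, num_iterations, learning_rate, print_cost)
    return params

# Whatever the caller chooses now reaches optimize() unchanged:
print(model([1], [0], num_iterations=50))  # {'w': 0.0, 'b': 0.0}
```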

Also, `parameters` is not a function, but rather a Python dictionary in which the optimized values of the weights (`w` and `b`) are stored from the call to the `optimize()` function. It is well worth your time to get these concepts straight in this assignment, as they are fundamental to the rest of the course, and to the Specialization. If, after a sufficient struggle (and Googling!), these concepts do not crystallize, there is no shortage of introductory Python courses on the Coursera platform to choose from if you wish to re-engage this course at a later time.
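Retrieving values from such a dictionary is plain key lookup. A tiny sketch with made-up numbers (not the assignment’s actual values):

```python
# Hypothetical dictionary of the kind optimize() returns.
parameters = {"w": [[0.1], [0.2]], "b": 0.5}

w = parameters["w"]  # look up the value stored under the key "w"
b = parameters["b"]

print(w, b)  # [[0.1], [0.2]] 0.5
```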

Good luck!

Hi @ChrisBi, just checking to see if you have made progress on your problem. Any luck?