Course 1 Week 4 Lab 1 Exercise 4

Hi! Could a TA or anyone else help explain what should be stored in the “linear_cache”, please?

I suppose the two functions provided, sigmoid() and relu(), already return the “activation_cache” (which should be Z?), so I couldn’t figure out what else I should store for the “linear_cache” parameter.

In the back propagation phase, the template code gives you the logic to extract the linear_cache and activation_cache from the full cache entry. For clarity, the full cache entry is a Python “tuple” with two entries:

((A, W, b), Z)

The first entry in the tuple is the “linear cache” and it is a 3-tuple.

The second entry in the cache tuple is the “activation cache” which has only one entry: Z.
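To make the structure above concrete, here is a minimal sketch of how that ((A, W, b), Z) tuple gets built and then unpacked in backprop. The variable names follow the assignment template, but the shapes and values here are just illustrative, not the official solution:

```python
import numpy as np

# Illustrative shapes: a layer with 3 inputs, 4 units, batch of 2 examples
A_prev = np.random.randn(3, 2)   # activations from the previous layer
W = np.random.randn(4, 3)        # weights for this layer
b = np.random.randn(4, 1)        # biases for this layer

Z = W @ A_prev + b               # pre-activation value

linear_cache = (A_prev, W, b)    # 3-tuple needed by linear_backward
activation_cache = Z             # needed by sigmoid/relu backward
cache = (linear_cache, activation_cache)   # the full entry: ((A, W, b), Z)

# In the back propagation phase, the template unpacks it like this:
linear_cache, activation_cache = cache
A_prev, W, b = linear_cache
```

So the “activation cache” is exactly Z, and the “linear cache” is everything the linear part of backprop needs to compute dW, db, and dA_prev.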

The caches were created during the forward propagation phase: the linear_cache is one of the return values of the linear_forward function, right? The code to create it was also given to you in the template.
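For reference, here is a hedged sketch of how the forward side fits together. The function names match the assignment, but the bodies are simplified reconstructions, not the graded solution:

```python
import numpy as np

def linear_forward(A, W, b):
    # Returns Z and the linear cache (A, W, b) — the 3-tuple
    # that linear_backward will need later.
    Z = W @ A + b
    cache = (A, W, b)
    return Z, cache

def relu(Z):
    # Simplified stand-in for the provided relu(): returns the
    # activation plus Z as the "activation cache".
    return np.maximum(0, Z), Z

def linear_activation_forward(A_prev, W, b):
    # Combines both caches into the full entry: ((A_prev, W, b), Z)
    Z, linear_cache = linear_forward(A_prev, W, b)
    A, activation_cache = relu(Z)
    cache = (linear_cache, activation_cache)
    return A, cache
```

The key point is that linear_forward builds the linear cache for you; linear_activation_forward just packages it together with the Z returned by the activation function.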

Thanks @paulinpaloalto! Now everything makes a lot of sense to me :smiling_face: