Following the instructions in the comments to calculate the derivatives for both modes, “max” and “average”, I was able to pass the shape test. However, the returned values are all wrong.

I suspect that my implementation of these two commented lines is wrong:

```
# Set dA_prev to be dA_prev + (the mask multiplied by the correct entry of dA) (≈1 line)
...
# Distribute it to get the correct slice of dA_prev. i.e. Add the distributed value of da. (≈1 line)
```

For the first comment, I understand that the correct entry of `dA` would be `dA[i, vert_start:vert_end, horiz_start:horiz_end, c]`, and for the latter, the distributed value of `da` would also be `dA[i, vert_start:vert_end, horiz_start:horiz_end, c]`.

I also converted the values of `mask` to boolean values (0 or 1).

Is there anything that I’m missing here?


Yes, I think you’re missing the point that only one value of dA is used at each iteration of the loop. The slicing happens in the *input* space, but we are traversing the *output* space one “point” at a time, right? Forward and backward propagation are mirror images of each other. Think about what happens in forward propagation:

You select the appropriate “slice” of the h and w dimensions based on the stride and the pooling size and then that “projects” to a single value of the Z output right?
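To make that concrete, here is a minimal sketch of that forward projection, assuming a single-example, single-channel input with a hypothetical pooling size `f` and stride (not the assignment’s actual code):

```python
import numpy as np

# Hypothetical illustration: each (f, f) slice of the input "projects"
# to ONE scalar in the output Z.
A_prev = np.arange(16, dtype=float).reshape(1, 4, 4, 1)  # (m, n_H_prev, n_W_prev, n_C)
f, stride = 2, 2
n_H = n_W = (4 - f) // stride + 1
Z = np.zeros((1, n_H, n_W, 1))

for h in range(n_H):
    for w in range(n_W):
        vert_start, horiz_start = h * stride, w * stride
        # slice in the INPUT space...
        a_slice = A_prev[0, vert_start:vert_start + f, horiz_start:horiz_start + f, 0]
        # ...collapses to a single value in the OUTPUT space
        Z[0, h, w, 0] = np.max(a_slice)
```

Backward propagation just runs this projection in reverse, one output point at a time.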

So in backward propagation, we are “projecting” the other direction: from a single [i, h, w, c] point in the output, back into the “slice” of the input. The only question is whether that projection only affects the maximum input value within that slice (in the max pooling case) or whether it has the same average effect on each element of the input slice.

So what you’re doing on the RHS of the two assignment statements is selecting one element of dA and then either using the mask to hit the right element of the input or using the “distribute” function to average it over all the inputs in the slice.


That’s exactly what I was missing. Thanks so much for pointing that out!