NLP C2_W4: Backpropagation formula notation

I am trying to understand what the notation .step(Z1) means in the backpropagation calculation of the partial derivatives of W1 and b1. It isn't explained in the video, the PDF file, or the assignment notebook, so I am stuck here failing the tests…

I just ignored it and all tests pass.

In this context, .step(Z1) refers to the derivative (gradient) of the activation function g with respect to its input, evaluated at Z1.

During backpropagation, dZ1 = dA1 * g'(Z1), and this g'(Z1) is the .step(Z1) here.
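The name likely comes from ReLU being the hidden-layer activation (an assumption here, suggested by the naming): the derivative of ReLU is the Heaviside step function, which is 1 where the input is positive and 0 elsewhere. A minimal NumPy sketch of the dZ1 computation (the `step` helper and the toy arrays are illustrative, not the assignment's actual code):

```python
import numpy as np

def step(z):
    # Derivative of ReLU: 1 where z > 0, else 0 (Heaviside step function)
    return (z > 0).astype(float)

# Toy pre-activation values and upstream gradient
Z1 = np.array([[-0.5, 1.2],
               [ 0.3, -2.0]])
dA1 = np.array([[0.1, 0.2],
                [0.3, 0.4]])

# Elementwise chain rule: gradient flows only where ReLU was active
dZ1 = dA1 * step(Z1)
print(dZ1)  # [[0.  0.2]
            #  [0.3 0. ]]
```

Since step(Z1) is a 0/1 mask, multiplying by it simply zeroes out the gradient at units that ReLU had clipped to zero.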