Need clarification on function parameter

def linear_activation_forward(A_prev, W, b, activation):
    return A, cache

This is the built-in function stub provided. All the other parameters are derived earlier in the notebook, but I'm confused about the `activation` argument (ReLU and Sigmoid). Do we need to implement those functions ourselves, or can we get them from somewhere?

Those functions are provided for you in an “import” file. Have a look at the first code cell in the notebook. If you want to look at how those functions are implemented, there’s a topic about that on the DLS FAQ Thread.
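For anyone curious, here is a minimal sketch of what those helpers and the forward function typically look like. This is an illustration under assumed conventions (shapes `W: (n_l, n_prev)`, `A_prev: (n_prev, m)`, caches holding `Z` and the linear inputs for backprop), not the course's actual graded code, which comes from the imported utility file:

```python
import numpy as np

def sigmoid(Z):
    # Element-wise sigmoid; Z is returned as the activation cache for backprop
    A = 1 / (1 + np.exp(-Z))
    return A, Z

def relu(Z):
    # Element-wise ReLU; Z is returned as the activation cache for backprop
    A = np.maximum(0, Z)
    return A, Z

def linear_activation_forward(A_prev, W, b, activation):
    # Linear step: Z = W . A_prev + b
    Z = W @ A_prev + b
    if activation == "sigmoid":
        A, activation_cache = sigmoid(Z)
    elif activation == "relu":
        A, activation_cache = relu(Z)
    else:
        raise ValueError("activation must be 'sigmoid' or 'relu'")
    # Cache both the linear inputs and Z, needed later for the backward pass
    cache = ((A_prev, W, b), activation_cache)
    return A, cache
```

So the `activation` parameter is just a string selecting which provided helper to call; you pass `"relu"` for the hidden layers and `"sigmoid"` for the output layer.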