C1 W4 Assignment forward propagation

Hello there,

I got this error when calling the linear_activation_forward() function from two_layer_model() with the required parameters.
To me it seems there is something wrong in the test files. Or am I passing the wrong parameters? (I don't believe so.)
This is how I am calling linear_activation_forward(), which takes A_prev, W, b, and activation.

 A1, cache1 = linear_activation_forward(X, W1, b1, relu)

What other value does relu(Z) return besides A? I think that is what is causing the problem.
I couldn't access the dnn_app_utils_v3.py file to check.
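
In case it helps, this is roughly what I assume relu and sigmoid look like inside dnn_app_utils_v3.py (I can't open the file, so this is only a guess based on how they are used): each one returns the activation A plus a cache holding Z for the backward pass.

```python
import numpy as np

# My guess at the helpers in dnn_app_utils_v3.py (not verified):
# each returns the activation output A and a cache (the input Z),
# which is reused later during back propagation.
def relu(Z):
    A = np.maximum(0, Z)
    cache = Z
    return A, cache

def sigmoid(Z):
    A = 1 / (1 + np.exp(-Z))
    cache = Z
    return A, cache
```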
I appreciate any help.

Hello @TAN548,
check this post:

Thank you @jenitta.jebaraj

My problem was not a misspelling. Putting relu and sigmoid in quotation marks solved the problem.
A minor problem, really.
The error message misled me :smile:
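
For anyone who hits the same error, here is a minimal sketch of the fix. The helper body below is simplified and only my own illustration, not the assignment's code; the point is that the activation has to be passed as a string, so the function can dispatch on "relu" / "sigmoid" instead of receiving the function object.

```python
import numpy as np

# Minimal sketch, not the assignment's implementation: the activation is
# selected by a *string*, so passing the function object (relu instead of
# "relu") breaks the dispatch and produces a confusing error.
def linear_activation_forward(A_prev, W, b, activation):
    Z = W @ A_prev + b
    if activation == "relu":
        A = np.maximum(0, Z)
    elif activation == "sigmoid":
        A = 1 / (1 + np.exp(-Z))
    else:
        raise ValueError("activation must be the string 'relu' or 'sigmoid'")
    cache = (A_prev, W, b, Z)  # simplified cache, just for illustration
    return A, cache

# Corrected calls -- note the quotation marks around the activation names:
np.random.seed(1)
X = np.random.randn(4, 3)                       # 4 features, 3 examples
W1, b1 = np.random.randn(5, 4) * 0.01, np.zeros((5, 1))
W2, b2 = np.random.randn(1, 5) * 0.01, np.zeros((1, 1))

A1, cache1 = linear_activation_forward(X, W1, b1, activation="relu")
A2, cache2 = linear_activation_forward(A1, W2, b2, activation="sigmoid")
print(A1.shape, A2.shape)                       # (5, 3) (1, 3)
```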