Not understanding syntax error, Wk4, first lab

I'm not getting something here. I've reproduced a segment of code from my notebook below. It's from the linear_activation_forward function, which implements the LINEAR->ACTIVATION layer in an N-layer deep neural network.

if activation == "sigmoid":
    # (≈ 2 lines of code)
    # Z, linear_cache = ...
    # A, activation_cache = ...

    # YOUR CODE STARTS HERE
    Moderator Edit: Solution code Removed
    # YOUR CODE ENDS HERE

With sigmoid: A = (array([[0.96890023, 0.11013289]]), array([[ 3.43896131, -2.08938436]]))
With ReLU: A = (array([[3.43896131, 0. ]]), array([[ 3.43896131, -2.08938436]]))
Error: Datatype mismatch with sigmoid activation in variable 0. Got type: <class 'numpy.ndarray'> but expected type <class 'tuple'>
Error: Wrong shape with sigmoid activation for variable 0.
Error: Wrong shape with sigmoid activation for variable 0.
Error: Wrong shape with sigmoid activation for variable 1.
Error: Wrong shape with sigmoid activation for variable 2.
Error: Wrong shape with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 2.
Error: Wrong output with sigmoid activation for variable 1.
Error: Datatype mismatch with relu activation in variable 0. Got type: <class 'numpy.ndarray'> but expected type <class 'tuple'>
Error: Wrong shape with relu activation for variable 0.
Error: Wrong shape with relu activation for variable 0.
Error: Wrong shape with relu activation for variable 1.
Error: Wrong shape with relu activation for variable 2.
Error: Wrong shape with relu activation for variable 1.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
Error: Wrong output with relu activation for variable 1.
0 Tests passed
6 Tests failed

First, sharing your code in a public thread is not allowed, so please avoid posting it.
Second, your code is not correct. You have to call linear_forward to compute Z and linear_cache.

For A and activation_cache, check the instructions in the notebook: the activation helpers return two values, the activation and its cache, so both need a name on the left-hand side of the assignment.
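Your own printout is the telltale sign: A shows up as a pair of arrays, which means it is holding the entire tuple returned by the helper rather than just the activation. Here is a minimal, generic sketch of the difference, using a toy helper (not the assignment code):

import numpy as np

def toy_helper(z):
    # Toy stand-in for a helper that, like the notebook's activation
    # functions, returns both a result and a cache as a tuple.
    return np.maximum(0, z), z

z = np.array([[3.43896131, -2.08938436]])

a_bad = toy_helper(z)     # assigns the whole 2-tuple to a_bad
a, cache = toy_helper(z)  # tuple unpacking: a gets only the array

print(type(a_bad))  # <class 'tuple'>
print(type(a))      # <class 'numpy.ndarray'>

If only one name appears on the left-hand side of your assignment, you get the first behavior, which matches the tuple shown in your test output.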


Thanks, Saif. I fixed my code.
