# W4 A2: Screen Shot 2023-11-27 at 12.36.19 AM.png

Hello,
The errors below were observed in Exercise 1.

ASSUMPTION:
Since `activation` was not pre-declared, "sigmoid" is used in each case:
- Forward propagation (A1 and A2).
- Backward propagation (dA1 and dA2).

Please advise whether this choice of activation is correct.

If needed, I could share the code.
Sincerely,
A


```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
in
----> 1 parameters, costs = two_layer_model(train_x, train_y, layers_dims = (n_x, n_h, n_y), num_iterations = 2, print_cost=False)
      2
      3 print("Cost after first iteration: " + str(costs[0]))
      4
      5 two_layer_model_test(two_layer_model)

in two_layer_model(X, Y, layers_dims, learning_rate, num_iterations, print_cost)
     47     # linear_activation_forward
     48     A1, cache1 = linear_activation_forward(X, W1, b1, activation = "sigmoid")
---> 49     A2, cache2 = linear_activation_forward(X, W2, b2, activation = "sigmoid")
     50     # YOUR CODE ENDS HERE
     51

~/work/release/W4A2/dnn_app_utils_v3.py in linear_activation_forward(A_prev, W, b, activation)
    201     if activation == "sigmoid":
    202         # Inputs: "A_prev, W, b". Outputs: "A, activation_cache".
--> 203         Z, linear_cache = linear_forward(A_prev, W, b)
    204         A, activation_cache = sigmoid(Z)
    205

~/work/release/W4A2/dnn_app_utils_v3.py in linear_forward(A, W, b)
    176     """
    177
--> 178     Z = W.dot(A) + b
    179
    180     assert(Z.shape == (W.shape[0], A.shape[1]))

ValueError: shapes (1,7) and (12288,209) not aligned: 7 (dim 1) != 12288 (dim 0)
```

Expected output:

The cost after iteration 1 must be around 0.69.

The docstring tells you which activations to use.

Yes, you got the activation function wrong for the first layer, but the error being thrown here has nothing to do with activation functions. Activation functions act elementwise, so they cannot cause a dimension mismatch like the one you are getting. The mistake is that your arguments are wrong in the second call to `linear_activation_forward`. The point of forward propagation is that the input to the second layer is the output of the first layer, right? That is not what your code does, with predictable consequences.
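To illustrate the point, here is a minimal NumPy sketch. The shapes are chosen to match the traceback (`n_x = 12288`, `n_h = 7`, `n_y = 1`, `m = 209`); the code below is a stand-in, not the course helpers:

```python
import numpy as np

np.random.seed(0)
n_x, n_h, n_y, m = 12288, 7, 1, 209
X = np.random.randn(n_x, m)
W1, b1 = np.random.randn(n_h, n_x) * 0.01, np.zeros((n_h, 1))
W2, b2 = np.random.randn(n_y, n_h) * 0.01, np.zeros((n_y, 1))

sigmoid = lambda z: 1 / (1 + np.exp(-z))

# Elementwise activations never change the shape of their input,
# so they cannot cause a dimension mismatch:
A1 = np.maximum(0, W1.dot(X) + b1)   # relu; shape stays (n_h, m)
assert A1.shape == (n_h, m)

# Layer 2 must consume A1, not X: W2 is (1, 7), so W2.dot(X) with
# X of shape (12288, 209) raises the "shapes not aligned" ValueError.
A2 = sigmoid(W2.dot(A1) + b2)        # shape (n_y, m)
assert A2.shape == (n_y, m)
```

Passing `X` instead of `A1` is exactly what produces `(1,7)` against `(12288,209)` in the error message.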

Thanks.

A. Change the forward activation to "relu". For layer 2, change X to A1.

```
A1, cache1 = linear_activation_forward(X, W1, b1, activation = "relu")
A2, cache2 = linear_activation_forward(A1, W2, b2, activation = "relu")
```

B. Change AL to A2, as it is a two-layer model.

```
cost = compute_cost(A2, Y)
```

C. Backward propagation:

```
dA1, dW2, db2 = linear_activation_backward(dA2, cache2, activation = "sigmoid")
dA0, dW1, db1 = linear_activation_backward(dA1, cache1, activation = "sigmoid")
```
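For reference, the expected first-iteration cost of about 0.69 falls straight out of the cross-entropy formula at initialization. A minimal sketch (the function name is hypothetical, not the course helper; note it assumes `A2` comes from a sigmoid, so its values lie strictly in (0, 1)):

```python
import numpy as np

def compute_cost_sketch(A2, Y):
    # Binary cross-entropy cost; finite only when 0 < A2 < 1,
    # which a sigmoid output layer guarantees.
    m = Y.shape[1]
    cost = -np.sum(Y * np.log(A2) + (1 - Y) * np.log(1 - A2)) / m
    return float(np.squeeze(cost))

# At initialization the predictions hover near 0.5, so the cost is
# near -log(0.5) = log(2) ~ 0.693, matching "around 0.69".
A2 = np.full((1, 4), 0.5)
Y = np.array([[1, 0, 1, 0]])
print(compute_cost_sketch(A2, Y))  # ≈ 0.6931
```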

The error below was observed; it does not seem to point to any specific line in the code that would explain the wrong output.

Sincerely,
A
——————————————————————————————————

Error Observed (attached):

```
Cost after iteration 1: nan
Cost after first iteration: nan
Cost after iteration 1: nan
Cost after iteration 1: nan
Cost after iteration 1: nan
Error: Wrong output for variable W1.
Error: Wrong output for variable b1.
Error: Wrong output for variable W2.
Error: Wrong output for variable b2.
Error: Wrong output for variable 0.
Cost after iteration 2: nan
Error: Wrong output for variable W1.
Error: Wrong output for variable b1.
Error: Wrong output for variable W2.
Error: Wrong output for variable b2.
Error: Wrong output for variable 0.
 2 Tests passed
 2 Tests failed
```

```
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
in
      3 print("Cost after first iteration: " + str(costs[0]))
      4
----> 5 two_layer_model_test(two_layer_model)

~/work/release/W4A2/public_tests.py in two_layer_model_test(target)
     75     ]
     76
---> 77     multiple_test(test_cases, target)
     78
     79

~/work/release/W4A2/test_utils.py in multiple_test(test_cases, target)
    140     print('\033[92m', success, " Tests passed")
    141     print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142     raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143

AssertionError: Not all tests were passed for two_layer_model. Check your equations and avoid using global variables inside the function.
```

Expected output:
The cost after iteration 1 must be around 0.69.

You are still making mistakes in both forward and backward propagation. First, what is the activation for the last layer (A2)? Second, when calling backward propagation, which activation should you choose for `dA0, dW1, db1`?

I think the concept is not clear to you, and you are just trying activation functions at random to see if you pass the test. If that is the case, you need to revisit the lectures.

Yes, there must be something fundamentally wrong with your implementation even on the forward propagation side, before you get to back prop. Note that you get `NaN` as the cost value at iteration 1. So the arguments you are passing to `compute_cost` must be wrong at the very least. There really aren’t many moving parts here and we already worked through how to use all these functions in the previous assignment, right? I suggest you take a walk around the block to clear your head and then come back and just read the instructions in the notebook again with a calm mind. As usual, they lay everything out for you pretty clearly. Also don’t forget that the instructions include the comments that they give you in the template code.
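The `NaN` cost can be reproduced directly. If the output layer uses relu instead of sigmoid, `A2` can contain exact zeros and values above 1, so the cross-entropy takes the log of zero or of a negative number. A small sketch under that assumption (the sample values are illustrative, not from the grader):

```python
import numpy as np

# Plausible relu outputs: an exact zero and a value above 1.
A2_relu = np.array([[0.0, 1.7, 0.3]])
Y = np.array([[1, 0, 1]])
m = Y.shape[1]

# log(0) = -inf and log(1 - 1.7) = log(-0.7) = nan, so the
# cross-entropy cost comes out as NaN.
with np.errstate(divide="ignore", invalid="ignore"):
    cost = -np.sum(Y * np.log(A2_relu) + (1 - Y) * np.log(1 - A2_relu)) / m
print(np.isnan(cost))  # True: matches "Cost after iteration 1: nan"
```

A sigmoid output layer keeps `A2` strictly inside (0, 1), which is why the expected cost is a finite value near 0.69 instead.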

Hello Saif,
Thanks. Looking further into the material should help clarify the concepts.
At this stage, the only option is to practice as much as possible.