Please help, I am struggling to find the product of the weight matrix W1 (4, 2) and the input matrix X of all three examples (2, 3) for the first linear calculation.

When I use `np.dot()`, I get an assertion error for the shape of A2. Note that I perform the second linear operation with W2 and A1.

What could be the problem?

The shapes that you quote sound right and should work with `np.dot`. Here’s what I see with some added print statements:

```
X.shape = (2, 3)
W1.shape = (4, 2)
A1.shape = (4, 3)
A2.shape = (1, 3)
A2 = [[0.21292656 0.21274673 0.21295976]]
X.shape = (2, 3)
W1.shape = (4, 2)
A1.shape = (4, 3)
A2.shape = (1, 3)
All tests passed!
```

Please show us the actual error message that you are getting. Note that there are no transposes involved here, unlike in the Logistic Regression case in Week 2. Also note that the problem could be with the addition that happens after the `np.dot`.

Thanks! Please print the shape of your A2 value to understand why it is incorrect. You can see what it should be from my print statements above.

Do I have to type it within the cell with the forward propagation function?

I meant: should I print the shape of A2 in the cell with the forward propagation function?

You can do it either way, but sometimes the test cells are not modifiable. I put the print statements in the body of the `forward_propagation` function. Then you can delete them or comment them out once you figure out the problem.
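A minimal sketch of what those print statements might look like. This assumes the usual parameter names (`W1`, `b1`, `W2`, `b2`) and a tanh/sigmoid architecture; your function's exact signature and activations may differ:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward_propagation(X, parameters):
    # Retrieve each parameter from the dictionary
    W1, b1 = parameters["W1"], parameters["b1"]
    W2, b2 = parameters["W2"], parameters["b2"]

    # First linear step, then tanh activation
    Z1 = np.dot(W1, X) + b1
    A1 = np.tanh(Z1)

    # Second linear step, then sigmoid activation
    Z2 = np.dot(W2, A1) + b2
    A2 = sigmoid(Z2)

    # Temporary debug output -- delete or comment out once fixed
    print("X.shape =", X.shape)
    print("W1.shape =", W1.shape)
    print("A1.shape =", A1.shape)
    print("W2.shape =", W2.shape)
    print("A2.shape =", A2.shape)
    return A2

# Dummy parameters with the shapes discussed in this thread
rng = np.random.default_rng(0)
parameters = {
    "W1": rng.standard_normal((4, 2)),
    "b1": np.zeros((4, 1)),
    "W2": rng.standard_normal((1, 4)),
    "b2": np.zeros((1, 1)),
}
X = rng.standard_normal((2, 3))
A2 = forward_propagation(X, parameters)  # A2.shape should be (1, 3)
```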

Note that the shape of your A2 is 4 x 3, so how could that happen?

Print the shapes of A1 and W2. Here’s what I see:

```
X.shape = (2, 3)
W1.shape = (4, 2)
A1.shape = (4, 3)
W2.shape = (1, 4)
A2.shape = (1, 3)
A2 = [[0.21292656 0.21274673 0.21295976]]
X.shape = (2, 3)
W1.shape = (4, 2)
A1.shape = (4, 3)
W2.shape = (1, 4)
A2.shape = (1, 3)
All tests passed!
```

So if W2 is 1 x 4 and A1 is 4 x 3, then `np.dot(W2, A1)` should be 1 x 3, right? So why did that not happen in your code?
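To sanity-check the shape rule (an (m, k) matrix times a (k, n) matrix gives an (m, n) result), here is a quick standalone check with dummy arrays:

```python
import numpy as np

W2 = np.ones((1, 4))   # second-layer weights
A1 = np.ones((4, 3))   # first-layer activations, one column per example

Z2 = np.dot(W2, A1)
print(Z2.shape)  # (1, 3): one output row, one column per example
```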

I found out what the error was.

I retrieved the value of b1 and assigned it to b2.
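For reference, that mix-up explains the 4 x 3 shape: adding a (4, 1) bias to the (1, 3) product broadcasts the result to (4, 3), so NumPy silently produces the wrong shape instead of raising an error. A quick sketch of the mistake with zero-valued placeholders:

```python
import numpy as np

Z = np.zeros((1, 3))   # correct shape of np.dot(W2, A1)
b1 = np.zeros((4, 1))  # first-layer bias, used by mistake
b2 = np.zeros((1, 1))  # the bias that should have been used

print((Z + b1).shape)  # (4, 3): broadcasting hides the bug
print((Z + b2).shape)  # (1, 3): the intended shape
```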

Thank you very much, Paulin.

That’s great news that you found the problem. Congratulations!