Week 4 - Exercise 5: incorrect shapes of W and b in sigmoid activation

I have a problem when I call the sigmoid activation in forward propagation in Exercise 5.

I pass WL and bL as parameters, but I get the error message: “ValueError: shapes (1,3) and (4,4) not aligned: 3 (dim 1) != 4 (dim 0)”

Does anyone know how to solve this?

Hi @Peeteerrr ,

It looks like there is a problem with how parameters are passed during a function call, or the execution environment is not up to date. Add print statements at the call site to show the values being passed in, and at the start of the function to show the input arguments it receives; that will help trace where the error lies.
Be sure to rerun your code from the start after editing it. To make sure it runs in a clean environment, restart the kernel and clear all output first, then run the cells from the top:
Kernel -> Restart and Clear Output
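As a concrete illustration of the advice above, here is a minimal sketch of shape-printing inside an activation-forward function. The function name and body are illustrative, not the assignment's actual code:

```python
import numpy as np

# Hypothetical sketch: print the shapes of the inputs at the top of the
# function to trace where a dimension mismatch comes from.
def linear_activation_forward(A_prev, W, b, activation):
    # Debug prints -- remove once the shapes look right.
    print("A_prev shape:", A_prev.shape)
    print("W shape:", W.shape)
    print("b shape:", b.shape)

    Z = np.dot(W, A_prev) + b    # requires W.shape[1] == A_prev.shape[0]
    if activation == "sigmoid":
        A = 1 / (1 + np.exp(-Z))
    else:                        # "relu"
        A = np.maximum(0, Z)
    return A

# Example with compatible shapes: W is (1, 3), A_prev is (3, 1).
A = linear_activation_forward(np.ones((3, 1)), np.ones((1, 3)),
                              np.zeros((1, 1)), "sigmoid")
print(A.shape)  # (1, 1)
```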

Hi @Kic ,

Thanks for your help. I tried what you suggested, but I get the same error. I printed my WL and bL for the linear → sigmoid step. They are:

WL = [[ 0.9398248 0.42628539 -0.75815703]]
bL = [[-0.16236698]]

So I understand the error: the shape of WL, (1,3), doesn’t match the shape of A (apparently (4,4)). But I don’t understand why A is a 4x4 matrix; I would expect it to be (3,1).

Note: in my view, A here is in fact A^(L-1) on the right-hand side of the formula for AL.

What am I missing?
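The reported error can be reproduced with a tiny numpy snippet (the array values are illustrative, only the shapes matter):

```python
import numpy as np

WL = np.random.randn(1, 3)   # output-layer weights, shape (1, 3)
A = np.random.randn(4, 4)    # wrong activations reaching the output layer

try:
    np.dot(WL, A)            # inner dimensions 3 and 4 do not match
except ValueError as e:
    print(e)                 # shapes (1,3) and (4,4) not aligned: 3 (dim 1) != 4 (dim 0)

# With activations of the expected shape (3, 1), the product lines up:
A_prev = np.random.randn(3, 1)
Z = np.dot(WL, A_prev)       # (1, 3) x (3, 1) -> (1, 1)
print(Z.shape)               # (1, 1)
```

So the mismatch means the A reaching the output layer is not A^(L-1) of the right shape; the question is which earlier step produced a (4,4) array.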

Could you click my name and send me your notebook in a direct message? I am away from my desk for 2 hours, but I will get back to you as soon as I can.

The way to debug this is to do the “dimensional analysis”, which should give you a clear picture of where the problem is. Here’s a thread that describes it for what I think is the same test case you are talking about.

My guess is you are not correctly managing the A_prev value for the output layer, which is after you fall out of the loop over the “hidden” layers.


It is not stated in the code, but note that it is necessary to add a line updating the activation value in each iteration of the loop “for l in range(1, L):”. Please let me know if that solves your problem.
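To illustrate the point about updating the activation inside the loop, here is a minimal sketch of an L-layer forward pass. The function name, dictionary keys, and layer sizes are assumptions for illustration, not the assignment's actual code:

```python
import numpy as np

def L_model_forward(X, parameters):
    A = X
    L = len(parameters) // 2            # number of layers (one W, b pair each)

    for l in range(1, L):               # hidden layers: linear -> relu
        A_prev = A
        Z = np.dot(parameters["W" + str(l)], A_prev) + parameters["b" + str(l)]
        A = np.maximum(0, Z)            # update A so the next layer sees it

    # Output layer: linear -> sigmoid, using the last hidden activation A,
    # i.e. A^(L-1). If A is not updated in the loop, this step receives an
    # activation of the wrong shape and np.dot raises the alignment error.
    ZL = np.dot(parameters["W" + str(L)], A) + parameters["b" + str(L)]
    AL = 1 / (1 + np.exp(-ZL))
    return AL

# Illustrative two-layer check: X is (2, 4), so the output should be (1, 4).
rng = np.random.default_rng(0)
parameters = {"W1": rng.standard_normal((3, 2)), "b1": np.zeros((3, 1)),
              "W2": rng.standard_normal((1, 3)), "b2": np.zeros((1, 1))}
print(L_model_forward(np.ones((2, 4)), parameters).shape)  # (1, 4)
```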

Hi @wilder_flores ,

The problem appears to be resolved: there has been no further update from @Peeteerrr since his last reply, in which he reported finding an indentation misalignment that caused the code logic to skip one layer.