It looks like there is a problem either with how parameters are passed into the function calls or with an execution environment that is out of date. You can add print statements at the call site to show the values being passed in, and at the start of the called function to show what arguments it actually received, to help trace where the error lies.
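For example, here is a minimal sketch of that kind of print-based tracing; the function name `linear_forward` and the shapes below are just placeholders for illustration, not the actual assignment code:

```python
import numpy as np

def linear_forward(A, W, b):
    # Print the shapes of the incoming arguments before using them
    print("linear_forward received A:", A.shape, "W:", W.shape, "b:", b.shape)
    Z = np.dot(W, A) + b
    return Z

# At the call site, print what is about to be passed in
A_prev = np.random.randn(3, 4)   # hypothetical shapes, just for illustration
W = np.random.randn(1, 3)
b = np.zeros((1, 1))
print("calling linear_forward with A_prev:", A_prev.shape)
Z = linear_forward(A_prev, W, b)
```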
Be sure to rerun your code from the start after editing it. To make sure it is running in a clean environment, restart the kernel and clear all output first, then run the cells from the top:
Kernel -> Restart & Clear Output
So I understand that the failure is that the shape of WL (1,3) doesn't match the shape of A (apparently (4,4)). But I don't understand why A is a 4x4 matrix. I would expect it to be (3,1).
Note: the A in the error is, in my view, actually A^(L-1) on the right-hand side of the formula for AL.
The way to debug this is to do the "dimensional analysis", which should give you a clear picture of where the problem is. Here's a thread that describes that for what I think is the same test case you are talking about.
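As a concrete illustration of that dimensional analysis, the snippet below uses layer sizes made up to match the shapes in the error message (not the actual test case): WL of shape (1,3) only composes with an A^(L-1) whose first dimension is 3.

```python
import numpy as np

# Hypothetical shapes matching the error message
W_L = np.random.randn(1, 3)      # output layer weights: (n[L], n[L-1]) = (1, 3)
A_good = np.random.randn(3, 4)   # what A^(L-1) should look like: (n[L-1], m) = (3, 4)
A_bad = np.random.randn(4, 4)    # what the code actually produced: (4, 4)

print(np.dot(W_L, A_good).shape)  # (1, 4) -- inner dimensions 3 and 3 agree
try:
    np.dot(W_L, A_bad)            # inner dimensions 3 and 4 do not agree
except ValueError as e:
    print(e)
```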
My guess is that you are not correctly managing the A_prev value for the output layer, which is handled after you fall out of the loop over the "hidden" layers.
It is not stated in the code, but be aware that it is necessary to add a line updating the activation value after each iteration of the loop "for l in range(1, L):". Please let me know if that solves your problem.
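Here is a minimal sketch of the loop structure being described, assuming a simple parameters dictionary with keys W1..WL and b1..bL; it is only an illustration under those assumptions, not the actual assignment code:

```python
import numpy as np

def relu(Z):
    return np.maximum(0, Z)

def L_model_forward_sketch(X, parameters):
    # Sketch: propagate through L-1 hidden layers, then the output layer.
    L = len(parameters) // 2          # assumes keys W1..WL and b1..bL
    A = X
    for l in range(1, L):
        A_prev = A                    # carry the previous layer's activation into this layer
        Z = np.dot(parameters["W" + str(l)], A_prev) + parameters["b" + str(l)]
        A = relu(Z)                   # update A every iteration, or the next layer reuses stale input
    # After falling out of the loop, the output layer must consume the last hidden A, not X
    ZL = np.dot(parameters["W" + str(L)], A) + parameters["b" + str(L)]
    AL = 1.0 / (1.0 + np.exp(-ZL))    # sigmoid output layer
    return AL

# Example with hypothetical layer sizes (4 -> 3 -> 1) and 4 training examples
params = {"W1": np.random.randn(3, 4), "b1": np.zeros((3, 1)),
          "W2": np.random.randn(1, 3), "b2": np.zeros((1, 1))}
print(L_model_forward_sketch(np.random.randn(4, 4), params).shape)  # (1, 4)
```

If the `A = relu(Z)` update (or the `A_prev = A` assignment) is missing or mis-indented, the output layer ends up receiving the wrong activation, which is exactly the kind of shape mismatch reported above.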
The problem appears to have been resolved: there has been no update from @Peeteerrr since his last response, in which he found an indentation misalignment in the code that was causing the logic to skip one layer.