Hi, I am getting an error about my matrix sizes. I believe it means that for the equation Z = np.dot(W, A) + b, W is of shape (1,3) and A is of shape (4,4)? I'm not sure how to correct this, as all the previous sections I ran yielded correct results.
The error there is not actually a matrix size mismatch: it is caused by passing an incorrect value for the activation argument when you call linear_activation_forward from L_model_forward. If you examine the logic in the lower-level function, you can see that it expects the string name of the activation function. You are passing an object reference to the function itself, so neither of the branches that would set the cache is taken.
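To illustrate the failure mode, here is a minimal sketch (not the actual course code — the helper bodies and cache layout are assumptions) of how such a linear_activation_forward typically dispatches on a string, and why passing the function object instead leaves the cache variable unassigned:

```python
import numpy as np

def sigmoid(Z):
    # Returns the activation and a cache (here, just Z) -- assumed helper
    return 1 / (1 + np.exp(-Z)), Z

def relu(Z):
    # Assumed helper with the same return convention
    return np.maximum(0, Z), Z

def linear_activation_forward(A_prev, W, b, activation):
    Z = np.dot(W, A_prev) + b
    if activation == "sigmoid":      # compares against the STRING "sigmoid"
        A, activation_cache = sigmoid(Z)
    elif activation == "relu":
        A, activation_cache = relu(Z)
    # If `activation` is a function object (e.g. activation=sigmoid),
    # neither branch runs and activation_cache is never assigned,
    # raising UnboundLocalError on the next line.
    cache = ((A_prev, W, b), activation_cache)
    return A, cache

# Correct call: pass the string name of the activation
A_prev = np.random.randn(3, 4)
W = np.random.randn(1, 3)
b = np.zeros((1, 1))
A, cache = linear_activation_forward(A_prev, W, b, activation="sigmoid")
print(A.shape)  # (1, 4)

# Incorrect call that triggers the error:
#   linear_activation_forward(A_prev, W, b, activation=sigmoid)
```

So the fix is simply to call it as `linear_activation_forward(..., activation="relu")` (or `"sigmoid"`), matching the strings the branches test against.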