Hi! I don't know which parameters I should pass to the forward function. For the ReLU call, it is supposed to return A and cache, taking A_prev, t_W, t_b, and activation, right? Or am I wrong? And for the sigmoid call, which returns AL and cache, the parameter A is supposed to be the A obtained from the ReLU in the last iteration (2). But I am still getting this error; if someone can give me some help I'd be so grateful.
ValueError: shapes (1,3) and (5,4) not aligned: 3 (dim 1) != 5 (dim 0)
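Just to show where the error comes from, here is a tiny standalone numpy example (made-up zero arrays with the same shapes as in my traceback, not my actual assignment code) that reproduces it:

```python
import numpy as np

W = np.zeros((1, 3))   # shape (1, 3), like the first operand in the error
A = np.zeros((5, 4))   # shape (5, 4), like the second operand in the error

# np.dot(W, A) requires W.shape[1] == A.shape[0]; here 3 != 5, so numpy raises:
# ValueError: shapes (1,3) and (5,4) not aligned: 3 (dim 1) != 5 (dim 0)
Z = np.dot(W, A)
```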
Hi @paolaruedad, let me see if I can help you and give you a hint. Your reasoning about the parameters in your first post seems to be all right, so you are on the right track, no worries.
My guess is you are in Exercise 5 of the first assignment of Week 4? Are you able to share a snippet of your code and the output so I can give you a hint?
The error is pointing you in the right direction: the shapes of W and A are what to check (note the (3, 4)). Then remember the matrix multiplication rule (see the diagram on Wikipedia): an (n, k) matrix can only be multiplied by a (k, m) matrix, and the result has shape (n, m), so the inner dimensions must match.
I think it's the A matrix that doesn't have the right dimensions. The A matrix's dimensions should be (4, 1) instead of (3, 4) for Z to come out as (4, 1).
However, I can't just assume that A is the one with the wrong dimensions, so I also have to check W to make sure that it has the correct shape to be multiplied with A.
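So what I would do is print the shapes right before the multiplication (just a generic debugging sketch; W, b and A_prev here stand for whatever is being passed into the linear step):

```python
# quick sanity check before computing Z = np.dot(W, A_prev) + b
print("W shape:", W.shape)            # should be (n_current_layer, n_previous_layer)
print("A_prev shape:", A_prev.shape)  # should be (n_previous_layer, number_of_examples)
assert W.shape[1] == A_prev.shape[0], "inner dimensions must match for np.dot(W, A_prev)"
```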
I got the same problem. I am defining the second call, AL, cache, the same way as the first one but with sigmoid instead of relu, yet it keeps giving me a dimension error. Can you please help me solve it? I have been trying since yesterday.
Hi @Mohammad_Ali_Chamass, yes, the parameters would be almost the same as in the (A, cache) call above, except for A. What you are passing in for A needs to be checked. Additionally, you must set the activation to 'sigmoid', which it seems you have already done. Thanks!
Hello @Rashmi, I tried passing A in several ways but I haven't found a solution so far. If I am not mistaken, the 'A' calculated above by the ReLU calls should be the output that gets passed into the 'AL', cache call, so I am calling AL, cache = linear_activation_forward(A, ...) with the same remaining parameters as in the A, cache call.
Hi, yeah, that seems correct. Maybe you haven't re-run the cells and they are still using previously entered variables. You need to run all the cells above before running this current cell again. Hope you reach your destination then. Thanks!
The thing now is that I am getting the same output as the expected one, but the tests are telling me this: AL = [[0.03921668 0.70498921 0.19734387 0.04728177]]
Error: Wrong shape for variable 0.
Error: Wrong shape for variable 1.
Error: Wrong shape for variable 2.
Error: Wrong shape for variable 1.
Error: Wrong output for variable 0.
Error: Wrong output for variable 1.
Error: Wrong output for variable 2.
Error: Wrong output for variable 1.
The parameters I am passing in the A, cache call are like this: parameters['W' + str(l)], and the same for b, along with the activation function "relu". In the AL, cache call I am passing them the same way, except that I use l+1 because I want to access the last layer. Is that correct?
You are doing forward propagation to obtain the output. Check the parameters carefully when writing the calls for A, cache and AL, cache, keeping in mind that the two use different activations and different layer indices: the hidden layers use the loop index (l), while the output layer uses (L). I think you are not passing the right arguments somewhere, and that is why you are getting this problem; see the sketch below.
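To make that concrete, here is a rough sketch of how such an L-layer forward pass is usually structured (this is not the notebook's exact code; linear_activation_forward is the helper discussed in this thread, and the names are only illustrative):

```python
def model_forward_sketch(X, parameters):
    caches = []
    A = X
    L = len(parameters) // 2          # parameters holds W1..WL and b1..bL

    # hidden layers 1 .. L-1 use ReLU; each layer's input is the previous layer's output
    for l in range(1, L):
        A_prev = A
        A, cache = linear_activation_forward(A_prev, parameters['W' + str(l)],
                                             parameters['b' + str(l)], activation="relu")
        caches.append(cache)

    # output layer L uses sigmoid, and its input is A from the last ReLU layer, not X
    AL, cache = linear_activation_forward(A, parameters['W' + str(L)],
                                          parameters['b' + str(L)], activation="sigmoid")
    caches.append(cache)

    return AL, caches
```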
Go through the public.utils.py file by clicking File -> Open and you will get an idea then.