Hi all, I could not proceed further with the assignment. Can you please help me here?
A, cache = linear_activation_forward(A_prev, parameters["W1"], parameters["b1"], "relu")
caches = list.append((cache))
AL, cache = linear_activation_forward(A_prev, parameters["W1"], parameters["b1"], "sigmoid")
caches = list.append((cache))
I get the error as follows:
Could you please check why I'm getting this error?
caches is a list, which supports the append() functionality.
In L_model_forward(), "cache" is a tuple and "caches" is a list.
append() is a method of the list object, i.e., "caches". So the answer is simply to append the "cache" tuple to the "caches" list.
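As a tiny illustration of that relationship (plain Python with placeholder values, not the actual assignment data):

cache = ("linear_cache", "activation_cache")   # a tuple, like the one linear_activation_forward returns
caches = []                                    # a list
caches.append(cache)                           # the tuple becomes one element of the list
print(len(caches), caches[0])                  # 1 ('linear_cache', 'activation_cache')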
@balaji.ambresh, sorry, I was not aware of your post. It looks like yours was a few seconds earlier than mine…
@RAJA_KARTHEEK_KUMAR We are talking about the same thing. Sorry if my post confused you.
caches = list.append(cache)
This is not working; I'm getting the same error.
Now I get a different error. Thanks!
No worries, Nobu. Have a good one.
A dot product can be performed only on matrices whose inner dimensions match.
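For example, a small NumPy sketch with made-up shapes:

import numpy as np

W = np.random.randn(3, 4)   # shape (3, 4)
A = np.random.randn(4, 5)   # shape (4, 5): the inner dimensions (4 and 4) match
Z = np.dot(W, A)            # works; Z has shape (3, 5)
# np.dot(A, W) would raise an error, since the inner dimensions 5 and 3 do not match.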
Hi Balaji,
Thank you for the details. All the previous functions completed successfully, but now I have received a new error. Could you please check this one as well?
I tried np.dot(W.T, A) + b, but that failed too.
Do not touch it.
The problem is not there.
When you call the linear_activation_forward() function, you hard-code "W1" and "b1". But that call is inside a for loop. Please use the iteration count "l" (a lowercase L) to select the right weights, like W1, W2, etc., depending on the iteration count.
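Roughly, the idea looks like this (only a sketch, assuming the usual "W" + str(l) / "b" + str(l) key names and the notebook's linear_activation_forward; adapt it to your own code):

# Inside L_model_forward: pick the parameters for layer l instead of hard-coding "W1"/"b1".
for l in range(1, L):                                   # hidden layers 1 .. L-1 use relu
    A_prev = A
    A, cache = linear_activation_forward(A_prev,
                                         parameters["W" + str(l)],
                                         parameters["b" + str(l)],
                                         "relu")
    caches.append(cache)

AL, cache = linear_activation_forward(A,                # the output layer L uses sigmoid
                                      parameters["W" + str(L)],
                                      parameters["b" + str(L)],
                                      "sigmoid")
caches.append(cache)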
The bug is not in the lower level function: it is in your L_model_forward logic. A perfectly correct function can still throw errors if you pass it incorrect or (as in this case) mismatching arguments. That means your logic for handling the various different layers is not correct. The way to debug this is to start with the “dimensional analysis”, which tells you what shapes you should be getting at each layer. Here’s a thread which shows how to do that for this particular test case.
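If it helps, one quick way to do that dimensional analysis in code is to print the shapes layer by layer (a throwaway debugging sketch, not part of the graded function; it assumes the same parameters dictionary and an input X):

import numpy as np

L = len(parameters) // 2                 # number of layers, assuming keys W1..WL and b1..bL
A = X
print("X:", X.shape)
for l in range(1, L + 1):
    W = parameters["W" + str(l)]
    b = parameters["b" + str(l)]
    print("layer", l, "W:", W.shape, "b:", b.shape, "A_prev:", A.shape)
    # np.dot(W, A) only works when W.shape[1] == A.shape[0];
    # the activations are elementwise, so they do not change the shapes.
    A = np.dot(W, A) + b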
caches = list.append(cache) is not the right format for appending into caches.
caches.append(cache) is the form you want to use.
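To spell out why the first form fails (plain Python, with a placeholder tuple):

caches = []
cache = ("linear_cache", "activation_cache")   # placeholder standing in for a real cache tuple
# list.append(cache) calls append on the list class itself with no list instance to append to, so Python raises a TypeError.
# Also, append() returns None, so writing "caches = caches.append(cache)" would overwrite caches with None.
caches.append(cache)                           # correct: modifies the caches list in place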