W4_Assignment-1_E5: Parameters W and b are not available

Hello Everyone,

> Requirement:

CODE REMOVED BY MODERATOR

Thank you for your assistance.
Sincerely,
A

You have to use the parameters dictionary here; 'W' is just a prefix letter in the dictionary's keys and cannot be accessed that way. What you can do is something like this:

parameters['W' + …]

Hello,
Thanks for response.
It is Exercise 5. Sorry for the typo.

Exercise 5 is using “linear_activation_forward” function.

It is still not clear to me:
how to procure W, b, and activation parameters inside function “def L_model_forward(X, parameters):”

Thanks.

You have to concatenate two characters together, using the ‘+’ operator.

  • The first character is ‘W’ or ‘b’.
  • The second character is the loop index (or the number of layers), converted to a string using the “str(…)” function.
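To make the concatenation concrete, here is a minimal sketch assuming a parameters dictionary with keys like "W1", "b1", "W2", "b2" (the shape the assignment uses); the string values are placeholders:

```python
# Hedged sketch: building parameter-dictionary keys by concatenating
# the prefix letter with the stringified layer index.
parameters = {"W1": "layer-1 weights", "b1": "layer-1 biases",
              "W2": "layer-2 weights", "b2": "layer-2 biases"}

L = len(parameters) // 2        # two entries ("W" and "b") per layer

for l in range(1, L + 1):
    key_W = "W" + str(l)        # e.g. "W" + str(1) -> "W1"
    key_b = "b" + str(l)
    print(key_W, "->", parameters[key_W])
```

The same "W" + str(…) pattern works for any layer index, including the layer count itself.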

Thanks for clarifying.

The above suggestion (parameters['W' + …]) works just fine for the "relu" loop.
However, the same implementation fails for 'sigmoid'.

AL, cache = linear_activation_forward(A, parameters["W" + str(l)], parameters["b" + str(l)], "sigmoid")
caches.append(cache) 

ERROR: ValueError: shapes (3,4) and (3,4) not aligned: 4 (dim 1) != 3 (dim 0)
(Attached picture).

It is not clear to me why there is a shape error in the "sigmoid" logic but not in the "relu" logic.

Sincerely,
A

It seems the shape of W or A that is being passed to linear_forward() is not correct.

So trace back and check for any calculations that could mangle W or A.

Note particularly that where linear_activation_forward() is called with sigmoid() activation, it is outside of the ‘for’ loop. So the loop index is not reliable.

Hint: So you can’t use exactly the same index variable there as you used for ReLU activation.
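The reported ValueError can be reproduced in isolation. The shapes below are illustrative only (a hypothetical 3x4 weight matrix), but they show how np.dot fails when the weight matrix from the wrong layer is paired with activations whose row count does not match its column count:

```python
import numpy as np

# Illustration of the reported error: W's inner dimension (4 columns)
# does not match A's outer dimension (3 rows), so the dot product fails.
W = np.ones((3, 4))   # weight matrix picked up from the wrong layer
A = np.ones((3, 4))   # activations: 3 rows, but W expects 4

try:
    np.dot(W, A)      # inner dimensions 4 != 3 -> ValueError
except ValueError as err:
    print(err)
```

This is why a stale or wrong loop index silently fetches a W whose shape cannot multiply the current A.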

Thanks for response.
I used the "l" index so that it takes its final value.
Even if the "l" index is removed, the attached error is observed.


NameError Traceback (most recent call last)
in
----> 1 t_X, t_parameters = L_model_forward_test_case_2hidden()
2 t_AL, t_caches = L_model_forward(t_X, t_parameters)
3
4 print("AL = " + str(t_AL))
5

NameError: name 'L_model_forward_test_case_2hidden' is not defined

Code Change performed:
AL, cache = linear_activation_forward(A, parameters["W"], parameters["b"], "sigmoid")
The error remains the same even if the code is changed to use the variable A_prev.

Could there be any other considerations in function “L_model_forward_test_case_2hidden()” ?

That’s not where you needed to make changes.

I looked at all previous code from exercises 1-4, and Exercise 5 (relu); they are all shown as passing.
Is there a way I could share the code for exercise 5?

Thanks.

I believe the issue is in this line of code, it’s from L_model_forward() where you are implementing the output layer Linear → Sigmoid:

The key item here is that the variable you use inside the “str(…)” function cannot be the same variable you used for the hidden ReLU layers.

Since sigmoid() is only used in the output layer, the variable used there should be the number of layers, which was computed earlier in that function, just before the for-loop.
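Putting the thread's advice together, here is a hedged sketch of the structure being described. The real linear_activation_forward comes from the assignment; a stand-in version is used here so the snippet runs on its own, and the layer shapes are illustrative only:

```python
import numpy as np

# Stand-in for the assignment's linear_activation_forward (illustrative only).
def linear_activation_forward(A_prev, W, b, activation):
    Z = np.dot(W, A_prev) + b
    A = np.maximum(0, Z) if activation == "relu" else 1 / (1 + np.exp(-Z))
    return A, (A_prev, W, b, activation)

def L_model_forward(X, parameters):
    caches = []
    A = X
    L = len(parameters) // 2          # number of layers, computed BEFORE the loop

    # Hidden layers 1 .. L-1 use ReLU; the loop index l builds each key.
    for l in range(1, L):
        A_prev = A
        A, cache = linear_activation_forward(
            A_prev, parameters["W" + str(l)], parameters["b" + str(l)], "relu")
        caches.append(cache)

    # Output layer uses sigmoid and the layer count L, NOT the loop index l.
    AL, cache = linear_activation_forward(
        A, parameters["W" + str(L)], parameters["b" + str(L)], "sigmoid")
    caches.append(cache)
    return AL, caches

# Illustrative shapes: 3 input features, 4 hidden units, 1 output unit, 2 examples.
X = np.ones((3, 2))
parameters = {"W1": np.ones((4, 3)) * 0.1, "b1": np.zeros((4, 1)),
              "W2": np.ones((1, 4)) * 0.1, "b2": np.zeros((1, 1))}
AL, caches = L_model_forward(X, parameters)
print(AL.shape)   # (1, 2): one output row per example column
```

The key point the sketch illustrates: inside the loop the key is built from l, but after the loop the output-layer key must be built from L.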