AssertionError: Wrong values. Check you formulas for parameters['W1'] : Exercise 6 update Adam parameter

Based on multiple forum topics, I tried moving epsilon out of np.sqrt(…), but I still see this problem. Any help?

{moderator edit - solution code removed}

W1 =
[[ 1.66754616 -0.63495721 -0.55137255]
[-1.09616942 0.84220683 -2.2583379 ]]
W2 =
[[ 0.36223989 -0.27257117 1.50530873]
[-2.01693991 -0.27921641 -0.34085356]
[ 1.17697024 -1.05669047 -0.12922741]]
b1 =
[[ 1.78801256]
[-0.71800611]]
b2 =
[[-0.90105921]
[ 0.01901295]
[ 0.55961442]]

AssertionError Traceback (most recent call last)
in
13 print(f"b2 = \n{parameters['b2']}")
14
---> 15 update_parameters_with_adam_test(update_parameters_with_adam)

~/work/release/W2A1/public_tests.py in update_parameters_with_adam_test(target)
257 assert type(parameters[key]) == np.ndarray, f"Wrong type for parameters['{key}']. Expected np.ndarray"
258 assert parameters[key].shape == parametersi[key].shape, f"Wrong shape for parameters['{key}']. The update must keep the dimensions of parameters inputs"
--> 259 assert np.allclose(parameters[key][0], expected_parameters[key]), f"Wrong values. Check you formulas for parameters['{key}']"
260 #print(f"{key}: \n {str(parameters[key])}")
261

AssertionError: Wrong values. Check you formulas for parameters['W1']

Expected values:

W1 =
[[ 1.63942428 -0.6268425 -0.54320974]
[-1.08782943 0.85036983 -2.2865723 ]]
W2 =
[[ 0.33356139 -0.26425199 1.47707772]
[-2.04538458 -0.30744933 -0.36903141]
[ 1.14873036 -1.09256871 -0.15734651]]
b1 =
[[ 1.75854357]
[-0.74616067]]
b2 =
[[-0.89228024]
[ 0.02707193]
[ 0.56782561]]

The issue with my code has been resolved; this query can be ignored.

Glad to hear you found the answer under your own power. For anyone else who has this issue, study your “order of operations” carefully. The epsilon value must be in the denominator, but not under the square root, right?
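
For anyone comparing the two placements, here is a minimal NumPy sketch of that order-of-operations point. It is not the assignment solution; the names (W, v_corrected, s_corrected, learning_rate, epsilon) are just the usual Adam notation applied to random stand-in arrays:

```python
import numpy as np

# Illustrative only: random stand-ins for a parameter and its bias-corrected
# Adam moment estimates (same shape as the parameter being updated).
np.random.seed(1)
W = np.random.randn(2, 3)            # parameter
v_corrected = np.random.randn(2, 3)  # bias-corrected first moment
s_corrected = np.random.rand(2, 3)   # bias-corrected second moment (non-negative)
learning_rate = 0.01
epsilon = 1e-8

# Correct order of operations: epsilon sits in the denominator,
# added to the square root rather than placed under it.
W_correct = W - learning_rate * v_corrected / (np.sqrt(s_corrected) + epsilon)

# The other placement, with epsilon under the square root, gives a
# slightly different update.
W_other = W - learning_rate * v_corrected / np.sqrt(s_corrected + epsilon)

print(np.max(np.abs(W_correct - W_other)))  # small but nonzero difference
```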

Also for future reference note that you filed this question under “General Discussion”. I moved it for you (by using the little “edit pencil” on the title) to DLS Course 2, which is a better place for it.


Apologies, but I have essentially the same code and am unable to figure out what the bug in it is :frowning:
Here's my code:

{moderator edit - solution code removed}

Apologies for the trouble; I figured it out myself.

Glad to hear you found the solution under your own power. Onward! :nerd_face: