Neural_machine_translation_with_attention_v4a - Exercise 1 - one_step_attention

I could not find the error in what I have done. Please help me resolve this error.

Thank you very much.

My interpretation of that error trace is that the s_prev value being passed to the one_step_attention function in the test is the wrong shape: it's a scalar, but a 2D tensor is expected. It also looks like you may have edited the test function by commenting a couple of previous lines out, but that may just be a misinterpretation on my part. I'll go look at that test in more detail, but as a general matter, if the test fails, the solution is not to change the test; it's to figure out what the failure is telling you about your code. :grinning:

Sir,
I did not edit any test cell. Since this morning I have been trying different code to solve it, but I could not.

So I tried to trace each step of the code to see what is happening behind it, and to see the output of s_prev, that's all.
Yes, I added some code in the test cell, and those are the only lines I commented out; other than that I did not modify any test cell.

Here’s what that test cell looks like in a clean notebook:

def one_step_attention_test(target):

    m = 10
    Tx = 30
    n_a = 32
    n_s = 64
    a = np.random.uniform(1, 0, (m, Tx, 2 * n_a)).astype(np.float32)
    s_prev = np.random.uniform(1, 0, (m, n_s)).astype(np.float32) * 1
    context = target(a, s_prev)
    assert type(context) == tf.python.framework.ops.EagerTensor, "Unexpected type. It should be a Tensor"
    assert tuple(context.shape) == (m, 1, n_s), "Unexpected output shape"
    assert np.all(context.numpy() > 0), "All output values must be > 0 in this example"
    assert np.all(context.numpy() < 1), "All output values must be < 1 in this example"

    #assert np.allclose(context[0][0][0:5].numpy(), [0.50877404, 0.57160693, 0.45448175, 0.50074816, 0.53651875]), "Unexpected values in the result"
    print("\033[92mAll tests passed!")

So you can see that the shape of s_prev that is passed into your Function Under Test will have shape 10 x 64, right? So your code must have done something strange before that point. The calls to the repeator should be the first code that touches s_prev in our function, right?
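If you want to confirm for yourself what the test hands your function, a quick standalone check using the same construction as the test cell above shows that s_prev is a 2D array (one hidden-state vector per example), not a scalar:

```python
import numpy as np

m, n_s = 10, 64
# Same construction the test cell uses for s_prev
s_prev = np.random.uniform(1, 0, (m, n_s)).astype(np.float32)

print(s_prev.shape)  # (10, 64) -- one row per example, n_s values each
print(s_prev.ndim)   # 2
```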

Sir,
I tried it this way:

So you didn’t edit any test cells except where you edited the test cells is what you’re saying. :laughing:

I added some print statements in my one_step_attention function and here’s what I see:

before repeator s_prev.shape = (10, 64)
after repeator s_prev.shape = (10, 30, 64)
concat.shape (10, 30, 128)
All tests passed!
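For reference, here is a standalone sketch that reproduces those shapes, assuming RepeatVector and Concatenate layers instantiated the same way the notebook does (the layer names and dimensions come from the exercise, but this fragment is just an illustration, not the full graded function):

```python
import numpy as np
from tensorflow.keras.layers import RepeatVector, Concatenate

m, Tx, n_a, n_s = 10, 30, 32, 64

# Instantiated once, as in the notebook: the repeat count Tx is baked in here
repeator = RepeatVector(Tx)
concatenator = Concatenate(axis=-1)

a = np.random.uniform(0, 1, (m, Tx, 2 * n_a)).astype(np.float32)
s_prev = np.random.uniform(0, 1, (m, n_s)).astype(np.float32)

print("before repeator s_prev.shape =", s_prev.shape)      # (10, 64)
s_prev_rep = repeator(s_prev)   # just pass s_prev; Tx is already set
print("after repeator s_prev.shape =", s_prev_rep.shape)   # (10, 30, 64)
concat = concatenator([a, s_prev_rep])
print("concat.shape", concat.shape)                        # (10, 30, 128)
```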

Looking at your exception trace, you are invoking the repeator function incorrectly: it is an “instantiated” layer function. Look at where it was defined: it already knows the size it needs to use for the repeat. You just call it with s_prev as the argument.
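As a concrete contrast, a minimal sketch of the instantiate-once, call-with-the-tensor pattern (the commented-out lines are hypothetical examples of the kind of call that raises an error):

```python
import numpy as np
from tensorflow.keras.layers import RepeatVector

Tx = 30
repeator = RepeatVector(Tx)   # the repeat count is fixed at instantiation

s_prev = np.zeros((10, 64), dtype=np.float32)
out = repeator(s_prev)        # correct: the only argument is the tensor
print(tuple(out.shape))       # (10, 30, 64)

# Incorrect invocations (do NOT do these):
#   repeator(s_prev, Tx)      # the layer does not take the count again
#   RepeatVector(s_prev)      # this builds a new layer; it repeats nothing
```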

Thank you very, very much, sir.
I have been struggling since this morning.

Sir, one more question: if we get an error, how do we trace it? Is there a process for checking the test cells in the test utility files?