C4W1_Assignment - Translate Function

Hi - In the translate function, don't we need to pass a different next_token each time?

Iterate for max_length iterations

for _ in range(max_length):
    # Generate the next token
    try:
        next_token, logit, state, done = generate_next_token(
            decoder=model.decoder,
            context=context,
            next_token=next_token,
            done=done,
            state=state,
            temperature=temperature
        )
    except:
        raise Exception("Problem generating the next token")

Unit test results
Failed test case: translate didn’t return the same logit when using temperature of 0.0.
Expected: -0.6533634066581726
Got: -0.5493094921112

Hi @Manjunath_RN

We do pass next_token each time. The problem is somewhere else. Maybe you do not break out of the loop when “done”?

Thanks, and yes, I do break out:

if done:
    break

Then maybe you do not initialize the initial state to zeros?

The example you’re given above the exercise initializes the initial state with random uniform values, so maybe you copied that line instead of modifying it to return all zeros?
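To make the difference concrete, here is a minimal sketch of the two initializations. The shapes (`batch_size`, `units`) and the two-tensor state list are assumptions based on a typical LSTM decoder; the point is that a random initial state makes the returned logit non-deterministic, which is exactly what the temperature-0.0 unit test catches.

```python
import tensorflow as tf

# Hypothetical shapes: one sentence per batch, a decoder with 256 RNN units.
batch_size, units = 1, 256

# What the example above the exercise does: random uniform initial state.
# Copying this into translate makes the output logit change on every run.
state_random = [tf.random.uniform((batch_size, units)),
                tf.random.uniform((batch_size, units))]

# What the translate function should do: start from an all-zeros state,
# so the decoding (and the logit checked by the grader) is deterministic.
state = [tf.zeros((batch_size, units)),
         tf.zeros((batch_size, units))]
```

With the zero initialization, running translate twice with temperature 0.0 returns the same logit, which is what the failed test case was checking.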


That was it. Thank you for the clarification!

Finished the nlp module!


I wondered why the TA recommended tf.zeros at the beginning of the exercise… Now I know 🙂
