[Week 4] Assignment 1 >> Decoder class error

I keep triggering the assertion of

assert np.allclose(outd[1, 1], [-0.34560338, -0.8762897,  -0.4767484,   1.6986415]), "Wrong values in outd"

My implementation is pretty similar to the Encoder class and passes all the previous unit tests. If I comment out this assertion, the rest of the unit test passes.
My outd is
tf.Tensor([-0.26879537 -0.9024167 -0.51610225 1.6873143 ], shape=(4,), dtype=float32)
Any thoughts on what I could be doing wrong?


I have the same output as you.

@lsomera I just saw your post. Our code is almost identical; I can't see where it could have gone wrong!

How did you guys declare angle_rads in the positional_encoding function (at the beginning of the assignment)? I'm stuck there.

get_angles(positions_vector, index_vector, d)
Make sure that the vectors are properly shaped (look again at the get_angles docstring).

That’s right, but the index_vector isn’t defined.

Yes. Think about what it represents ( :point_left: this is defined, kind of)
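Following the hint above, here is a rough sketch in NumPy of how the vectors can be shaped before being passed to get_angles. This is not the assignment's solution; the get_angles body and the (pos, k, d) signature are assumptions based on the standard sinusoidal positional-encoding formula:

```python
import numpy as np

def get_angles(pos, k, d):
    # pos: column vector of positions, shape (positions, 1)
    # k:   row vector of dimension indices, shape (1, d)
    # Each pair of dimensions (2i, 2i+1) shares the same angle rate.
    i = k // 2
    return pos / np.power(10000, 2 * i / d)

def positional_encoding(positions, d):
    # Broadcasting a column vector against a row vector
    # yields the full (positions, d) matrix of angles.
    angle_rads = get_angles(np.arange(positions)[:, np.newaxis],
                            np.arange(d)[np.newaxis, :],
                            d)
    # Apply sin to even indices and cos to odd indices.
    angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])
    angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])
    # Add a leading batch dimension: (1, positions, d).
    return angle_rads[np.newaxis, ...]
```

At position 0 every angle is 0, so the first row alternates sin(0) = 0 and cos(0) = 1, which is a quick sanity check for the shapes.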

I got the same output. If I continue to the last exercise, it fails more assertions.

Yes it fails the same “outd” assertion in function 8 (the following exercise) too!

Hi @vanessaca, the issue is with the last two graded functions of the assignment (7/8 and 8/8). All the preceding functions pass their own unit tests.
The assignment uses Keras’s MultiHeadAttention implementation so I guess the function you are referring to isn’t the issue.
In fact my Decoder implementation looks very similar to the implementation in the link you shared. Which is more confusing :thinking:

Well, I am not even that far. I'm working on other things and coming back to the assignment now and then for a change. I'll delete that comment then.

Thanks for reporting the problem. An issue report has been filed, and the course staff is working on it.

Hi @edwardyu

Any idea on when this will be fixed? This is my only remaining assignment for the specialization, and my subscription is about to end soon. :frowning: Thanks!

It depends on the course staff’s schedule. I’ll post a message once it’s fixed.

Alright, same output here.
Since we are waiting anyway, have you noticed that you pass the tests whether or not you pass “training” to the dropout layer in Exercise 4? Since I would pass the argument as training=True (or False), I added training=training to all self.dropout layers. I’d like to know if that was right. How did you implement it?

Passing training=training to the dropout layers inside the “call” function is the correct approach. This assignment doesn’t actually train the model, so if you hard-coded the flag it would probably still work in the context of “this” assignment, but it wouldn’t behave correctly if you used the code in your own projects.
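To illustrate the point about forwarding the flag, here is a minimal toy layer (not the assignment's DecoderLayer; names are illustrative) showing how Keras Dropout behaves depending on the training argument it receives:

```python
import tensorflow as tf

class TinyBlock(tf.keras.layers.Layer):
    """Toy layer demonstrating how to forward `training` into Dropout."""

    def __init__(self, rate=0.5):
        super().__init__()
        self.dropout = tf.keras.layers.Dropout(rate)

    def call(self, x, training=None):
        # Forward the flag instead of hard-coding True/False:
        # Dropout zeroes units only when training=True; with
        # training=False it is the identity function.
        return self.dropout(x, training=training)

block = TinyBlock(rate=0.5)
x = tf.ones((2, 4))
out_eval = block(x, training=False)  # dropout disabled: output equals input
```

Hard-coding training=False would make the two calls indistinguishable, which is why it can silently pass tests in an assignment that never trains the model.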

1 Like

Looking forward to that as for me too, this issue is blocking me from completing the specialization which is a bit frustrating :pensive:

Hi there, I just finished the assignment. I had an assertion error on Exercise 6 (DecoderLayer), so I decided to mute the assertion and continue. The following exercises then worked perfectly, and I managed to finish with 100/100. Let me know if I can help you.

@sunwaee I passed the DecoderLayer unit tests. My problem is with the last two. I commented out their assertion tests and submitted again just now, but I got the same result: 75/100! This assignment has serious issues, given that you had errors elsewhere and still passed.
I guess I’ll have to wait for the course staff to resolve the issue.

1 Like

I had the same issue where the Decoder and the Transformer class failed the tests. Submitting the assignment got me only 75/100.

So, while googling the problem, I found that TensorFlow has a tutorial for Transformers. There, in the second MultiHeadAttention layer (mha2) of the DecoderLayer class, they pass the encoder output, enc_output, as the argument for Q and K, and the output from the first block, out1, as the argument for V. Doing so will throw an assertion error in the cell below; just comment it out.

Now the other tests should pass, and if you submit, you should get 100.
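For anyone double-checking their argument order: Keras’s MultiHeadAttention is called positionally as (query, value, key), which makes it easy to swap tensors without noticing. A minimal sketch of the call signature (shapes and variable names are illustrative, not the assignment’s code, and this doesn’t settle which ordering the grader expects):

```python
import tensorflow as tf

# Keras MultiHeadAttention is called as mha(query, value, key=...).
mha = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=4)

out1 = tf.random.normal((1, 5, 8))        # e.g. decoder block-1 output
enc_output = tf.random.normal((1, 6, 8))  # e.g. encoder output

# Output length follows the query; attention weights have
# shape (batch, num_heads, query_len, key_len).
attn, weights = mha(query=out1, value=enc_output, key=enc_output,
                    return_attention_scores=True)
```

Printing attn.shape and weights.shape makes it obvious which tensor you used as the query, since the query length shows up in both.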