Wk2: incorrect weights in the embedding matrix

Hi,

I am getting the following error, which says the weights of the embedding matrix are incorrect.
The dimensions of the embedding layer (vocab_size x emb_dim) I have used seem correct.
I have tried resetting the page and redoing the tasks, but I still get the error:

AssertionError                            Traceback (most recent call last)
in
     28
     29
---> 30 pretrained_embedding_layer_test(pretrained_embedding_layer)

in pretrained_embedding_layer_test(target)
     24     [[[ 3, 3], [ 3, 3], [ 2, 4], [ 3, 2], [ 3, 4],
     25      [-2, 1], [-2, 2], [-1, 2], [-1, 1], [-1, 0],
---> 26      [-2, 0], [-3, 0], [-3, 1], [-3, 2], [ 0, 0]]]), "Wrong vaulues"
     27     print("\033[92mAll tests passed!")
     28

AssertionError: Wrong vaulues

Any clues?

Thanks,
Suprio

I can’t tell what the code problem is from the error message.


Hi Tom,

Thanks for checking! Kic was trying to help me with this, but we are still stuck.

Here is the code:
{mentor edit: code removed}

Thanks,
Suprio

That should be "word_to_vec_map", not "word_to_index".

The key clue there is that the vocabulary is the word_to_vec_map dictionary.
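To make the fix concrete, here is a minimal sketch of the idea (not the course's actual implementation, and the function and variable names are illustrative): the embedding matrix is filled by looking each word's vector up in `word_to_vec_map`, while `word_to_index` only supplies the row index for that word.

```python
import numpy as np

def build_embedding_matrix(word_to_vec_map, word_to_index):
    """Illustrative sketch: build a (vocab_size, emb_dim) embedding matrix.

    Assumes word_to_index maps words to 1-based integer indices and
    word_to_vec_map maps the same words to fixed-length NumPy vectors
    (GloVe-style), as in the assignment's setup.
    """
    vocab_size = len(word_to_index) + 1              # +1 so index 0 stays reserved
    any_word = next(iter(word_to_vec_map))
    emb_dim = word_to_vec_map[any_word].shape[0]     # dimension comes from the vectors

    emb_matrix = np.zeros((vocab_size, emb_dim))
    for word, idx in word_to_index.items():
        # The key point from this thread: the *values* come from
        # word_to_vec_map, not from word_to_index.
        emb_matrix[idx, :] = word_to_vec_map[word]
    return emb_matrix
```

Using `word_to_index` on the right-hand side of that assignment would fill the rows with integer indices instead of word vectors, which produces exactly a "wrong values" assertion from the test.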


Also, since you didn’t use a private message (instead you posted your code on this thread), I’m going to edit your message to remove the code.


I apologize, Tom. I thought I sent you a private message but I was wrong!!

To send a private message, click on the user’s name, and click on the “Message” button.
