C5_W2: LSTM Emoji v2 Emojify_V2_test fails but no clue

Hit the following error in Emojify_V2_test:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-49-bb78fe22f36c> in <module>
     28 
     29 
---> 30 Emojify_V2_test(Emojify_V2)

<ipython-input-49-bb78fe22f36c> in Emojify_V2_test(target)
     20 
     21     maxLen = 4
---> 22     model = target((maxLen,), word_to_vec_map, word_to_index)
     23 
     24     assert type(model) == Functional, "Make sure you have correctly created Model instance which converts \"sentence_indices\" into \"X\""

<ipython-input-48-31fd2c7f4b76> in Emojify_V2(input_shape, word_to_vec_map, word_to_index)
     25     # Propagate sentence_indices through your embedding layer
     26     # (See additional hints in the instructions).
---> 27     embeddings = embedding_layer(sentence_indices)
     28 
     29     # Propagate the embeddings through an LSTM layer with 128-dimensional hidden state

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
    925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
    927 
    928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
   1096         # Build layer if applicable (if the `build` method has been
   1097         # overridden).
-> 1098         self._maybe_build(inputs)
   1099         cast_inputs = self._maybe_cast_inputs(inputs, input_list)
   1100 

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _maybe_build(self, inputs)
   2653           # Using `init_scope` since we want variable assignment in
   2654           # `set_weights` to be treated like variable initialization.
-> 2655           self.set_weights(self._initial_weights)
   2656       else:
   2657         self.set_weights(self._initial_weights)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in set_weights(self, weights)
   1808           'with a weight list of length %s, but the layer was '
   1809           'expecting %s weights. Provided weights: %s...' %
-> 1810           (self.name, len(weights), expected_num_weights, str(weights)[:50]))
   1811 
   1812     weight_index = 0

ValueError: You called `set_weights(weights)` on layer "embedding_6" with a weight list of length 15, but the layer was expecting 1 weights. Provided weights: [[ 3.  3.]
 [ 3.  3.]
 [ 2.  4.]
 [ 3.  2.]
 [ 3. ...
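For reference, a minimal sketch outside the notebook reproduces the same ValueError (the 15×2 shape just mirrors the truncated weight dump above; this is a toy matrix, not the real GloVe data):

```python
import numpy as np
import tensorflow as tf

emb_matrix = np.full((15, 2), 3.0)       # same shape as the dump in the error

layer = tf.keras.layers.Embedding(15, 2)
layer.build((None,))                     # an Embedding has exactly 1 variable

layer.set_weights([emb_matrix])          # OK: a list of 1 array for 1 variable

# Passing the bare matrix instead of a 1-element list makes Keras treat
# each of the 15 rows as a separate weight array:
try:
    layer.set_weights(emb_matrix)
except ValueError as err:
    print(err)                           # "... a weight list of length 15,
                                         #  but the layer was expecting 1 ..."
```

Note that the traceback shows set_weights being called on self._initial_weights inside _maybe_build, which is the path Keras takes when a weights= argument is handed to the layer constructor, so (if I read it right) the same length mismatch also occurs with Embedding(15, 2, weights=emb_matrix) instead of weights=[emb_matrix].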

Does the error happen at the embedding_layer.set_weights([emb_matrix]) line in pretrained_embedding_layer? If so, why does it still pass pretrained_embedding_layer_test?

Of course a correct function can still fail if you pass it incorrect arguments. The first thing to check is whether the sentence_indices argument being passed to embedding_layer is initialized correctly. I added some print statements to my Emojify_V2 code:

    print(f"type(sentence_indices) {type(sentence_indices)}")
    print(f"input_shape {input_shape}")
    print(f"sentence_indices.shape {sentence_indices.shape}")

Here’s what I see when I run the test cell with those added statements:

    type(sentence_indices) <class 'tensorflow.python.framework.ops.Tensor'>
    input_shape (4,)
    sentence_indices.shape (None, 4)
    After dropout and before LSTM X.shape = (None, 4, 128)
    All tests passed!
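(Aside: the (None, 4) shape is expected. Keras's Input layer describes a single example and prepends a batch dimension, which is None, i.e. unknown, at graph-construction time. A minimal check:)

```python
from tensorflow.keras.layers import Input

# Input(shape=...) takes the per-example shape; Keras adds the batch
# dimension itself, so the symbolic tensor's shape is (None, 4).
sentence_indices = Input(shape=(4,), dtype="int32")
print(sentence_indices.shape)  # (None, 4)
```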

What do you see if you try that experiment?

The next thing to consider is that the embedding_layer you are invoking there is itself the return value of a function call. Perhaps embedding_layer is misconfigured because you invoked pretrained_embedding_layer with incorrect arguments. But that may be a bit trickier to diagnose and will probably require looking at your code, which we can’t do in a public thread. Please check your DMs for a message from me about how to proceed if the first investigation above doesn’t shed any light.
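For reference, the usual shape of that helper looks like the sketch below. This is the generic pretrained-embedding pattern, not the assignment solution; vocab_len, emb_dim and emb_matrix are placeholder names, and the toy matrix stands in for the real GloVe vectors:

```python
import numpy as np
import tensorflow as tf

def pretrained_embedding_sketch(emb_matrix):
    """Build a non-trainable Embedding layer from a pretrained matrix.

    emb_matrix is assumed to have shape (vocab_len, emb_dim).
    """
    vocab_len, emb_dim = emb_matrix.shape
    layer = tf.keras.layers.Embedding(vocab_len, emb_dim, trainable=False)
    # The layer must be built before its weights can be set.
    layer.build((None,))
    # set_weights expects a LIST of arrays, one per layer variable.
    layer.set_weights([emb_matrix])
    return layer

emb = np.arange(30, dtype="float32").reshape(15, 2)
layer = pretrained_embedding_sketch(emb)
print(layer.get_weights()[0].shape)  # (15, 2)
```

If the shapes or argument order going into your version of this function are wrong, the layer it returns will fail later, at the point where Emojify_V2 calls it.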


This Stack Overflow post helped: https://stackoverflow.com/questions/53417537/keras-initialize-large-embeddings-layer-with-pretrained-embeddings
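The trick in that post is to hand the pretrained matrix to the layer through a Constant initializer, so there is no separate set_weights call to get wrong. A sketch with a made-up 15×2 matrix:

```python
import numpy as np
import tensorflow as tf

emb_matrix = np.random.rand(15, 2).astype("float32")

# Initialize the embedding table directly from the pretrained matrix.
layer = tf.keras.layers.Embedding(
    input_dim=15,
    output_dim=2,
    embeddings_initializer=tf.keras.initializers.Constant(emb_matrix),
    trainable=False,
)
layer.build((None,))
print(np.allclose(layer.get_weights()[0], emb_matrix))  # True
```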