I hit the following error in Emojify_V2_test:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-49-bb78fe22f36c> in <module>
28
29
---> 30 Emojify_V2_test(Emojify_V2)
<ipython-input-49-bb78fe22f36c> in Emojify_V2_test(target)
20
21 maxLen = 4
---> 22 model = target((maxLen,), word_to_vec_map, word_to_index)
23
24 assert type(model) == Functional, "Make sure you have correctly created Model instance which converts \"sentence_indices\" into \"X\""
<ipython-input-48-31fd2c7f4b76> in Emojify_V2(input_shape, word_to_vec_map, word_to_index)
25 # Propagate sentence_indices through your embedding layer
26 # (See additional hints in the instructions).
---> 27 embeddings = embedding_layer(sentence_indices)
28
29 # Propagate the embeddings through an LSTM layer with 128-dimensional hidden state
/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
924 if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
925 return self._functional_construction_call(inputs, args, kwargs,
--> 926 input_list)
927
928 # Maintains info about the `Layer.call` stack.
/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
1096 # Build layer if applicable (if the `build` method has been
1097 # overridden).
-> 1098 self._maybe_build(inputs)
1099 cast_inputs = self._maybe_cast_inputs(inputs, input_list)
1100
/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _maybe_build(self, inputs)
2653 # Using `init_scope` since we want variable assignment in
2654 # `set_weights` to be treated like variable initialization.
-> 2655 self.set_weights(self._initial_weights)
2656 else:
2657 self.set_weights(self._initial_weights)
/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in set_weights(self, weights)
1808 'with a weight list of length %s, but the layer was '
1809 'expecting %s weights. Provided weights: %s...' %
-> 1810 (self.name, len(weights), expected_num_weights, str(weights)[:50]))
1811
1812 weight_index = 0
ValueError: You called `set_weights(weights)` on layer "embedding_6" with a weight list of length 15, but the layer was expecting 1 weights. Provided weights: [[ 3. 3.]
[ 3. 3.]
[ 2. 4.]
[ 3. 2.]
[ 3. ...
Does the error happen at the embedding_layer.set_weights([emb_matrix]) line in pretrained_embedding_layer? If so, why does it still pass pretrained_embedding_layer_test?
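For context on the "length 15, but the layer was expecting 1 weights" message: Keras's `set_weights(weights)` takes a *list* of weight arrays, one per layer variable, and an Embedding layer has exactly one variable (the embedding matrix). If the matrix is passed directly instead of wrapped in a list, each of its rows is counted as a separate weight. This mismatch can be sketched with plain NumPy (`emb_matrix` here is a hypothetical stand-in for the small matrix the test builds, not the actual test fixture):

```python
import numpy as np

# Hypothetical tiny embedding matrix like the one in the test: 15 words, 2 dims.
emb_matrix = np.ones((15, 2))

# set_weights treats its argument as a sequence of weight arrays.
# Passing the matrix bare makes each of its 15 rows count as one weight:
print(len(list(emb_matrix)))   # 15 -> "a weight list of length 15"

# Wrapping it in a list yields the single weight the Embedding layer expects:
print(len([emb_matrix]))       # 1
```

Note also that the traceback shows the failure surfacing inside `_maybe_build`, i.e. when the layer is first *called* on `sentence_indices`, because the weights are stored and only applied once the layer is built; that delay could explain why an earlier test of the layer in isolation behaves differently.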