[Week 2] Assignment: Emojify Exercise UNQ_C5 - Bad LSTM Result/error

Code deleted by me, because the problem was solved.

The error I was getting:

ValueError                                Traceback (most recent call last)
in <module>
     22
     23
---> 24 Emojify_V2_test(Emojify_V2)

in Emojify_V2_test(target)
     16
     17     maxLen = 4
---> 18     model = target((maxLen,), word_to_vec_map, word_to_index)
     19
     20     expectedModel = [['InputLayer', [(None, 4)], 0], ['Embedding', (None, 4, 2), 30], ['LSTM', (None, 4, 128), 67072, (None, 4, 2), 'tanh', True], ['Dropout', (None, 4, 128), 0, 0.5], ['LSTM', (None, 128), 131584, (None, 4, 128), 'tanh', False], ['Dropout', (None, 128), 0, 0.5], ['Dense', (None, 5), 645, 'linear'], ['Activation', (None, 5), 0]]

in Emojify_V2(input_shape, word_to_vec_map, word_to_index)
     25     # Propagate sentence_indices through your embedding layer
     26     # (See additional hints in the instructions).
---> 27     embeddings = LSTM( units=embedding_layer , return_sequences=True )(sentence_indices)
     28
     29     # Propagate the embeddings through an LSTM layer with 128-dimensional hidden state

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py in __call__(self, inputs, initial_state, constants, **kwargs)
    661
    662     if initial_state is None and constants is None:
--> 663       return super(RNN, self).__call__(inputs, **kwargs)
    664
    665     # If any of `initial_state` or `constants` are specified and are Keras

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
    925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
    927
    928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    178                          'expected ndim=' + str(spec.ndim) + ', found ndim=' +
    179                          str(ndim) + '. Full shape received: ' +
--> 180                          str(x.shape.as_list()))
    181     if spec.max_ndim is not None:
    182       ndim = x.shape.ndims

ValueError: Input 0 of layer lstm_16 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 4]

But I don't understand why. What did I do wrong?

Resolved, but I can't understand embedding_layer(sentence_indices). What does it mean?
Never mind, I understood it, so the topic can be considered solved and closed.

Hi, what was the problem? I have the same error message. I have been looking at the code for ages and I don’t see what is wrong with it.
Cheers!

TensorFlow and company have this somewhat strange way of propagating a definition made earlier through later instances. It confuses me too.

The definition returns a callable, and in this case you use it in the next step to get the following definitions, and so on through all the tasks of your ML workflow.

But I think you can look at it however you prefer (as definitions or as instances; that depends on your perception).
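For a concrete picture, here is a minimal generic sketch of that pattern (my own illustration with made-up layer sizes, not the assignment code): a layer object is created once, and calling it on a tensor returns a new tensor.

from tensorflow.keras.layers import Input, Dense

# Step 1: create the layer object (the "definition").
dense = Dense(units=8, activation='relu')

# Step 2: call that object on a tensor; the call returns a new tensor.
x = Input(shape=(4,))
y = dense(x)          # same pattern as embedding_layer(sentence_indices)

print(y.shape)        # (None, 8)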

{mentor edit: code removed}

Note that you could also build it as:
{mentor edit: code removed}
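As for the original error: sentence_indices has shape (None, maxLen), a 2-D tensor of word indices, while an LSTM expects a 3-D (batch, timesteps, features) input. The Embedding layer is what adds the feature axis. A generic sketch of the shapes involved (placeholder vocabulary and vector sizes, my own illustration rather than the removed solution code):

from tensorflow.keras.layers import Input, Embedding, LSTM

maxLen = 4
sentence_indices = Input(shape=(maxLen,), dtype='int32')      # ndim=2: (None, 4)

# Passing sentence_indices straight into an LSTM reproduces the
# "expected ndim=3, found ndim=2" error. Map the indices to vectors first:
embedding_layer = Embedding(input_dim=400001, output_dim=50)  # placeholder sizes
embeddings = embedding_layer(sentence_indices)                # ndim=3: (None, 4, 50)

X = LSTM(units=128, return_sequences=True)(embeddings)        # (None, 4, 128)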

Hi,
thank you for your explanation. You are right. The problem is that my less compact code is almost exactly the same; the only difference is the (input_shape[0]) part, where I'm just using shape = input_shape.

It's not as compact, but it's supposed to be right, and yet I get my error. I still get the error after changing my code to your (input_shape[0]), which I would be happy about if it worked, but then I wouldn't understand why. :slight_smile:

Thanks

I found the problem: it was not before the LSTM but in the call to the first LSTM. I was getting a single element and not the sequence…
Thanks!
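For anyone else who lands here, that difference shows up directly in the output shapes; below is a tiny sketch with made-up input, assuming TensorFlow 2.

import tensorflow as tf

x = tf.random.normal((1, 4, 50))                             # (batch, timesteps, features)

seq = tf.keras.layers.LSTM(128, return_sequences=True)(x)    # output for every time step
last = tf.keras.layers.LSTM(128, return_sequences=False)(x)  # last step only

print(seq.shape)   # (1, 4, 128) -- the full sequence a second LSTM can consume
print(last.shape)  # (1, 128)    -- a single element, not the sequence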