# Course 5, Week 3, Assignment 1, Exercise 2

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
in
     34
     35
---> 36 modelf_test(modelf)

in modelf_test(target)
     11
     12
---> 13     model = target(Tx, Ty, n_a, n_s, len_human_vocab, len_machine_vocab)
     14
     15     print(summary(model))

in modelf(Tx, Ty, n_a, n_s, human_vocab_size, machine_vocab_size)
     37
     38         # Step 2.A: Perform one step of the attention mechanism to get back the context vector at step t (≈ 1 line)
---> 39         context = one_step_attention(a, s0)
     40
     41         # Step 2.B: Apply the post-attention LSTM cell to the "context" vector.

in one_step_attention(a, s_prev)
     22     concat = concatenator([a, s_prev])
     23     # Use densor1 to propagate concat through a small fully-connected neural network to compute the "intermediate energies" variable e. (≈1 lines)
---> 24     e = densor1(concat)
     25     # Use densor2 to propagate e through a small fully-connected neural network to compute the "energies" variable energies. (≈1 lines)
     26     energies = densor2(e)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
    925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
    927
    928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
   1090       # TODO(reedwm): We should assert input compatibility after the inputs
   1091       # are casted, not before.
-> 1092       input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
   1093       graph = backend.get_graph()
   1094       # Use `self._name_scope()` to avoid auto-incrementing the name.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    214                 ' incompatible with the layer: expected axis ' + str(axis) +
    215                 ' of input shape to have value ' + str(value) +
--> 216                 ' but received input with shape ' + str(shape))
    217     # Check shape.
    218     if spec.shape is not None:

ValueError: Input 0 of layer dense is incompatible with the layer: expected axis -1 of input shape to have value 128 but received input with shape [None, 30, 192]
```

The error is being thrown by the `e = densor1(concat)` line in the `one_step_attention()` function.
Perhaps the problem is in the line where you compute the `s_prev` value.
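For reference, here is a minimal NumPy sketch of the shape arithmetic at the concat step, assuming the assignment's usual dimensions (`Tx = 30`, `n_a = 32`, `n_s = 64` — an assumption on my part). With those values, the concatenated tensor's last axis should be `2*n_a + n_s = 128`, which is exactly what `densor1` expects; if `s_prev` has the wrong shape, the last axis lands on some other value, like the 192 in the error above.

```python
import numpy as np

# Assumed dimensions (matching the assignment's defaults, m = 1 example)
Tx, n_a, n_s = 30, 32, 64

a = np.zeros((1, Tx, 2 * n_a))   # Bi-LSTM output: (m, Tx, 2*n_a)
s_prev = np.zeros((1, n_s))      # post-attention hidden state: (m, n_s)

# What RepeatVector(Tx) produces: copy s_prev along a new time axis
s_prev_rep = np.repeat(s_prev[:, None, :], Tx, axis=1)   # (m, Tx, n_s)

# What Concatenate(axis=-1) produces: last axis is 2*n_a + n_s
concat = np.concatenate([a, s_prev_rep], axis=-1)

print(concat.shape)  # (1, 30, 128)
```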

I am seeing a similar error. The previous step of building the attention layer passed all tests, but I am seeing this error now while building the model.

``````
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-53-22936cc05244> in <module>
35
36
---> 37 modelf_test(modelf)

<ipython-input-53-22936cc05244> in modelf_test(target)
12
13
---> 14     model = target(Tx, Ty, n_a, n_s, len_human_vocab, len_machine_vocab)
15
16     print(summary(model))

<ipython-input-52-d95c19cee99c> in modelf(Tx, Ty, n_a, n_s, human_vocab_size, machine_vocab_size)
41
42         # Step 2.A: Perform one step of the attention mechanism to get back the context vector at step t (≈ 1 line)
---> 43         context = one_step_attention(a, s)
44
45         # Step 2.B: Apply the post-attention LSTM cell to the "context" vector. (≈ 1 line)

<ipython-input-50-736d4a97e314> in one_step_attention(a, s_prev)
22     concat = concatenator([a, s_prev])
23     # Use densor1 to propagate concat through a small fully-connected neural network to compute the "intermediate energies" variable e. (≈1 lines)
---> 24     e = densor1(concat)
25     # Use densor2 to propagate e through a small fully-connected neural network to compute the "energies" variable energies. (≈1 lines)
26     energies = densor2(e)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
927
928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
1090       # TODO(reedwm): We should assert input compatibility after the inputs
1091       # are casted, not before.
-> 1092       input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
1093       graph = backend.get_graph()
1094       # Use `self._name_scope()` to avoid auto-incrementing the name.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
214                 ' incompatible with the layer: expected axis ' + str(axis) +
215                 ' of input shape to have value ' + str(value) +
--> 216                 ' but received input with shape ' + str(shape))
217     # Check shape.
218     if spec.shape is not None:

ValueError: Input 0 of layer dense_2 is incompatible with the layer: expected axis -1 of input shape to have value 128 but received input with shape [None, 30, 124]
``````

Would appreciate any pointers.

Thank you

As discussed in my reply on this thread from three years ago, the issue is likely in how you’re setting s_prev in the one_step_attention() function.

I think I have that right, actually: `s` gets updated by the post-attention LSTM, but on the first pass through the for loop it is `s0`. Have I not understood that correctly?

Try restarting the kernel and clearing all output, then run all the cells again.

This often clears issues regarding the use of global objects inside the notebook.

This is extremely important throughout Course 5, because the LSTM objects are almost always globals.

Running a cell that modifies a global object will get the notebook into a sad state unless the kernel is restarted.
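As a toy illustration of why stale globals cause exactly this kind of shape error, the hypothetical class below (plain Python, not Keras) mimics how a Keras `Dense` layer is built lazily: its expected input size is fixed on first use, so re-running cells against an old instance keeps the old expectation until the object is recreated by a kernel restart.

```python
# Hypothetical stand-in for a Keras Dense layer: it "builds" itself on
# first call, freezing the input size it will accept from then on.
class ToyDense:
    def __init__(self, units):
        self.units = units
        self.built_for = None           # set lazily, like Keras layer building

    def __call__(self, last_axis):
        if self.built_for is None:
            self.built_for = last_axis  # first call fixes the expected shape
        if last_axis != self.built_for:
            raise ValueError(
                f"expected axis -1 to have value {self.built_for} "
                f"but received {last_axis}")
        return self.units

densor1 = ToyDense(10)   # a global, like the notebook's densor1
densor1(128)             # first run: now built for last-axis 128

# After editing the code so concat has a different last axis, the stale
# global still insists on 128 -- only recreating it (kernel restart) helps:
try:
    densor1(192)
except ValueError as err:
    print(err)
```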