Modelf() error

**Hi, when I run modelf_test(), I get an assertion with no error string. Here's the full stack:**
```
AssertionError                            Traceback (most recent call last)
Cell In[66], line 36
     31 assert len(model.outputs) == 10, f"Wrong output shape. Expected 10 != {len(model.outputs)}"
     33 comparator(summary(model), expected_summary)
---> 36 modelf_test(modelf)

Cell In[66], line 13, in modelf_test(target)
      9 n_s = 64
     10 len_human_vocab = 37
---> 13 model = target(Tx, Ty, n_a, n_s, len_human_vocab)
     15 print(summary(model))
     18 expected_summary = [['InputLayer', [(None, 30, 37)], 0],
     19                     ['InputLayer', [(None, 64)], 0],
     20                     ['Bidirectional', (None, 30, 64), 17920],
    (...)
     28                     ['LSTM', [(None, 64), (None, 64), (None, 64)], 33024, [(None, 1, 64), (None, 64), (None, 64)], 'tanh'],
     29                     ['Dense', (None, 11), 715, 'softmax']]

Cell In[65], line 58, in modelf(Tx, Ty, n_a, n_s, human_vocab_size, machine_vocab_size)
     55 outputs.append(out)
     57 # Step 3: Create model instance taking three inputs and returning the list of outputs. (≈ 1 line)
---> 58 model = Model(inputs=[X, s, c], outputs=outputs)
     60 ### END CODE HERE ###
     62 return model

File /usr/local/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py:242, in Model.__new__(cls, *args, **kwargs)
    239 if is_functional_model_init_params(args, kwargs) and cls == Model:
    240   # Functional model
    241   from tensorflow.python.keras.engine import functional  # pylint: disable=g-import-not-at-top
--> 242   return functional.Functional(*args, **kwargs)
    243 else:
    244   return super(Model, cls).__new__(cls, *args, **kwargs)

File /usr/local/lib/python3.8/site-packages/tensorflow/python/training/tracking/base.py:457, in no_automatic_dependency_tracking.<locals>._method_wrapper(self, *args, **kwargs)
    455 self._self_setattr_tracking = False  # pylint: disable=protected-access
    456 try:
--> 457   result = method(self, *args, **kwargs)
    458 finally:
    459   self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

File /usr/local/lib/python3.8/site-packages/tensorflow/python/keras/engine/functional.py:115, in Functional.__init__(self, inputs, outputs, name, trainable)
    108 @trackable.no_automatic_dependency_tracking
    109 def __init__(self, inputs=None, outputs=None, name=None, trainable=True):
    110   # generic_utils.validate_kwargs(
    111   #     kwargs, {'name', 'trainable'},
    112   #     'Functional models may only specify `name` and `trainable` keyword '
    113   #     'arguments during initialization. Got an unexpected argument:')
    114   super(Functional, self).__init__(name=name, trainable=trainable)
--> 115   self._init_graph_network(inputs, outputs)

File /usr/local/lib/python3.8/site-packages/tensorflow/python/training/tracking/base.py:457, in no_automatic_dependency_tracking.<locals>._method_wrapper(self, *args, **kwargs)
    455 self._self_setattr_tracking = False  # pylint: disable=protected-access
    456 try:
--> 457   result = method(self, *args, **kwargs)
    458 finally:
    459   self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

File /usr/local/lib/python3.8/site-packages/tensorflow/python/keras/engine/functional.py:184, in Functional._init_graph_network(self, inputs, outputs)
    181 layer, node_index, tensor_index = x._keras_history  # pylint: disable=protected-access
    182 # It's supposed to be an input layer, so only one node
    183 # and one tensor output.
--> 184 assert node_index == 0
    185 assert tensor_index == 0
    186 self._input_layers.append(layer)

AssertionError:
```

In addition, there is this warning at the beginning:

```
WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_7" was not an Input tensor, it was generated by layer lstm.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: lstm/PartitionedCall_69:2
WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_7" was not an Input tensor, it was generated by layer lstm.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: lstm/PartitionedCall_69:3
```

Check your Step 3:
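The warning spells out the rule Keras is enforcing: every tensor passed to `Model(inputs=...)` must come from `tf.keras.Input`, never from the output of another layer. Here's a minimal sketch of the wrong and right patterns (the shapes and names are placeholders, not the assignment's exact code):

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, LSTM
from tensorflow.keras.models import Model

n_s = 8                                   # hypothetical state size
x = Input(shape=(3, n_s))                 # a proper Input tensor
_, h, c = LSTM(n_s, return_state=True)(x)

# Wrong: h and c were generated by the LSTM layer, not by Input, so
# Model(inputs=[x, h, c], ...) trips the `assert node_index == 0` above.

# Right: declare a fresh Input tensor for every model input, and feed
# them into the layer through initial_state instead.
s0 = Input(shape=(n_s,), name="s0")
c0 = Input(shape=(n_s,), name="c0")
out, _, _ = LSTM(n_s, return_state=True)(x, initial_state=[s0, c0])
model = Model(inputs=[x, s0, c0], outputs=out)
```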


Thanks, Saif.
Now I have this error:
```
[['InputLayer', [(None, 30, 37)], 0],
 ['InputLayer', [(None, 64)], 0],
 ['Bidirectional', (None, 30, 64), 17920],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['Concatenate', (None, 30, 128), 0],
 ['Dense', (None, 30, 10), 1290, 'tanh'],
 ['Dense', (None, 30, 1), 11, 'relu'],
 ['Activation', (None, 30, 1), 0],
 ['Dot', (None, 1, 64), 0],
 ['LSTM', [(None, 64), (None, 64), (None, 64)], 33024, (None, 1, 64), 'tanh'],
 ['InputLayer', [(None, 64)], 0],
 ['Dense', (None, 11), 715, 'softmax']]
```
Test failed at index 9
Expected value

`['InputLayer', [(None, 64)], 0]`

does not match the input value:

`['LSTM', [(None, 64), (None, 64), (None, 64)], 33024, (None, 1, 64), 'tanh']`

I have no idea why index 9 does not match.

Also, I saw this comment at step 2.B: `#Remember: s = None`

I’m confused about what parameters to pass to post_activation_LSTM_cell()

The value of post_activation_LSTM_cell is an "instantiated" LSTM layer, so you invoke it the usual way we've seen several times up to this point in DLS C5, with two arguments: inputs and initial_state. Then you just have to make sure those have the appropriate values. They give you some pretty helpful hints in the comments there.
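As a hedged sketch of that call-and-update pattern (sizes and the fixed `context` are placeholders standing in for the assignment's one_step_attention output, not its actual code):

```python
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

Ty, n_s = 3, 8                        # hypothetical sizes
context = Input(shape=(1, n_s))       # stand-in for one_step_attention's output
s0 = Input(shape=(n_s,), name="s0")   # initial hidden state, a real Input
c0 = Input(shape=(n_s,), name="c0")   # initial cell state, a real Input

post_activation_LSTM_cell = LSTM(n_s, return_state=True)
output_layer = Dense(11, activation="softmax")

s, c = s0, c0                         # start the loop from the Input tensors
outputs = []
for t in range(Ty):
    # Pass the context as inputs and the *current* [s, c] as initial_state,
    # then overwrite s and c with the states the layer returns.
    s, _, c = post_activation_LSTM_cell(context, initial_state=[s, c])
    outputs.append(output_layer(s))

model = Model(inputs=[context, s0, c0], outputs=outputs)
```

Because `s` and `c` are reassigned each iteration, only the original `s0`/`c0` Input tensors ever reach `Model(inputs=...)`, which is exactly what the earlier assertion was checking.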
