C5W1 Assignment 3 djmodel() - AttributeError: 'NoneType' object has no attribute 'op'

Hello,

I get the error below every time I try to create the model (using model = djmodel(Tx=30, LSTM_cell=LSTM_cell, densor=densor, reshaper=reshaper)). I have checked every step of my code many times and read every post about this assignment, but I still cannot figure out the origin of the problem…

Can anyone help with this?

Thanks in advance,
Anatole


AttributeError                            Traceback (most recent call last)
in
----> 1 model = djmodel(Tx=30, LSTM_cell=LSTM_cell, densor=densor, reshaper=reshaper)

in djmodel(Tx, LSTM_cell, densor, reshaper)
     51
     52     # Step 3: Create model instance
---> 53     model = Model(inputs=[X, a0, c0], outputs=outputs)
     54
     55     ### END CODE HERE ###

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in __new__(cls, *args, **kwargs)
    240       # Functional model
    241       from tensorflow.python.keras.engine import functional  # pylint: disable=g-import-not-at-top
--> 242       return functional.Functional(*args, **kwargs)
    243     else:
    244       return super(Model, cls).__new__(cls, *args, **kwargs)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/functional.py in __init__(self, inputs, outputs, name, trainable)
    113     #     'arguments during initialization. Got an unexpected argument:')
    114     super(Functional, self).__init__(name=name, trainable=trainable)
--> 115     self._init_graph_network(inputs, outputs)
    116
    117   @trackable.no_automatic_dependency_tracking

/opt/conda/lib/python3.7/site-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/functional.py in _init_graph_network(self, inputs, outputs)
    140
    141     if any(not hasattr(tensor, '_keras_history') for tensor in self.outputs):
--> 142       base_layer_utils.create_keras_history(self._nested_outputs)
    143
    144     self._validate_graph_inputs_and_outputs()

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer_utils.py in create_keras_history(tensors)
    189       the raw Tensorflow operations.
    190   """
--> 191   _, created_layers = _create_keras_history_helper(tensors, set(), [])
    192   return created_layers
    193

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
    224           'op wrapping. Please wrap these ops in a Lambda layer: '
    225           '\n\n\n{example}\n\n'.format(example=example))
--> 226     op = tensor.op  # The Op that created this Tensor.
    227     if op not in processed_ops:
    228       # Recursively set _keras_history.

AttributeError: 'NoneType' object has no attribute 'op'

Hi @Anatole and welcome to Discourse. I would check whether X, a0, c0, and outputs are properly initialized as tensors. My guess is that one of them was not initialized and therefore does not have the .op attribute that TensorFlow complains about.

Hello @yanivh, thank you for your fast answer. X, a0, and c0 are correctly initialized as tensors; however, following the instructions, I initialized 'outputs' as an empty list (and outputs.append() is used to fill it with the softmax outputs). Should 'outputs' be initialized as an empty tensor instead? I tried that briefly, but there doesn't seem to be an obvious way to create an empty tensor of unknown size…

As explained in the notebook, outputs should be initialized as an empty list (as you originally did). But within the loop over t you should append to it the output of the LSTM cell at the current time step. The output you append to outputs is a tensor, so it has the .op attribute.
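To make that concrete, here is a minimal generic sketch of the pattern being described (hypothetical dimensions and layer names, not the graded assignment code): the per-time-step outputs are ordinary Keras tensors, they get appended to a plain Python list, and that list is handed directly to Model().

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, LSTM, Dense, Lambda, Reshape
from tensorflow.keras.models import Model

Tx, n_values, n_a = 30, 90, 64                  # hypothetical dimensions
lstm_cell = LSTM(n_a, return_state=True)        # one cell, shared across time steps
densor = Dense(n_values, activation='softmax')  # shared output layer

X = Input(shape=(Tx, n_values))
a0 = Input(shape=(n_a,))
c0 = Input(shape=(n_a,))
a, c = a0, c0

outputs = []                                    # start from an EMPTY list
for t in range(Tx):
    # select time step t (t=t binds the current loop value inside the Lambda)
    x = Lambda(lambda z, t=t: z[:, t, :])(X)
    x = Reshape((1, n_values))(x)               # LSTM expects (batch, 1, n_values)
    a, _, c = lstm_cell(x, initial_state=[a, c])
    outputs.append(densor(a))                   # each element is a Keras tensor (has .op)

model = Model(inputs=[X, a0, c0], outputs=outputs)  # a list of tensors is fine here
```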

Understood. It appears that my implementation is correct: X, a0, and c0 are initialized as tensors and 'outputs' is a list of tensors. However, the error persists… Do you have another lead on what could be the problem?

From going over the notebook again, nothing comes to mind. I would try to play with outputs. Maybe instead of building it in a loop, just put in it the densor output for a single time step, using a0 and c0 directly. Build something super simple and see what you get.

Following your advice, I bypassed the loop and fed a0 to densor(), but the error was still there. I then used 'out' directly instead of 'outputs' when creating the model (Model(inputs=[X, a0, c0], outputs=out)), and in that case I don't get the error.

It appears that Model cannot deal with lists of tensors, which doesn't make sense to me, because that is the instructed implementation and apparently I am the only one experiencing this problem.

I am not very skilled in Python, but I guess I should try stacking the tensors in something other than a list?

I would do some more basic debugging, which may give you some more understanding. For example, in tf.keras.Model | TensorFlow Core v2.4.1 there are a few examples you can try out, even in the notebook you are working on. Play around with them and use a list of outputs to see how it works. I believe the basic examples in that doc should work for you and may guide you toward fixing something in your implementation.
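For instance, a stripped-down experiment along those lines (hypothetical shapes, just to confirm that Model accepts a plain Python list of output tensors) could look like this:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inp = Input(shape=(3,))
hidden = Dense(4, activation='relu')(inp)

# Two output tensors collected in an ordinary list, as in the assignment's loop
outputs = [Dense(2, activation='softmax')(hidden),
           Dense(2, activation='softmax')(hidden)]

model = Model(inputs=inp, outputs=outputs)
model.summary()  # builds fine: a list of Keras tensors is a valid outputs argument
```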

I found the problem: I was initializing 'outputs' as a list containing None instead of an empty list… Thank you @yanivh for your help

Found the error. I had initialized the outputs list as
outputs = [None]
I had double-clicked on "None" and then typed [. Autocomplete wrapped the "None" in brackets, producing [None], and this problem occurred.
Thank you for the replies.
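For anyone hitting the same thing, here is a short sketch of the failure mode (hypothetical shapes): any None left inside the outputs list reaches Keras' graph-building code, which calls .op on it and raises exactly this AttributeError.

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inp = Input(shape=(3,))
out = Dense(2)(inp)

outputs = [None]      # what the autocomplete produced
outputs.append(out)   # list is now [None, <tensor>]
# Model(inputs=inp, outputs=outputs)
#   -> AttributeError: 'NoneType' object has no attribute 'op'

outputs = []          # correct: start from a genuinely empty list
outputs.append(out)
model = Model(inputs=inp, outputs=outputs)  # builds without error
```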
