Improvise_a_Jazz_Solo_with_an_LSTM_Network_v4: AttributeError: 'NoneType' object has no attribute 'op'

Hi All,
I got stuck on a strange error. Please see the details below. Any hints / suggestions?

Thanks!
Daniel.


AttributeError Traceback (most recent call last)
in
1 ### YOU CANNOT EDIT THIS CELL
2
----> 3 model = djmodel(Tx=30, LSTM_cell=LSTM_cell, densor=densor, reshaper=reshaper)

in djmodel(Tx, LSTM_cell, densor, reshaper)
51
52 # Step 3: Create model instance
---> 53 model = Model(inputs=[X, a0, c0], outputs=outputs)
54
55 ### END CODE HERE ###

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in __new__(cls, *args, **kwargs)
240 # Functional model
241 from tensorflow.python.keras.engine import functional # pylint: disable=g-import-not-at-top
---> 242 return functional.Functional(*args, **kwargs)
243 else:
244 return super(Model, cls).__new__(cls, *args, **kwargs)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
455 self._self_setattr_tracking = False # pylint: disable=protected-access
456 try:
---> 457 result = method(self, *args, **kwargs)
458 finally:
459 self._self_setattr_tracking = previous_value # pylint: disable=protected-access

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/functional.py in __init__(self, inputs, outputs, name, trainable)
113 # 'arguments during initialization. Got an unexpected argument:')
114 super(Functional, self).__init__(name=name, trainable=trainable)
---> 115 self._init_graph_network(inputs, outputs)
116
117 @trackable.no_automatic_dependency_tracking

/opt/conda/lib/python3.7/site-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
455 self._self_setattr_tracking = False # pylint: disable=protected-access
456 try:
---> 457 result = method(self, *args, **kwargs)
458 finally:
459 self._self_setattr_tracking = previous_value # pylint: disable=protected-access

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/functional.py in _init_graph_network(self, inputs, outputs)
140
141 if any(not hasattr(tensor, '_keras_history') for tensor in self.outputs):
---> 142 base_layer_utils.create_keras_history(self._nested_outputs)
143
144 self._validate_graph_inputs_and_outputs()

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer_utils.py in create_keras_history(tensors)
189 the raw Tensorflow operations.
190 """
---> 191 _, created_layers = _create_keras_history_helper(tensors, set(), )
192 return created_layers
193

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
224 'op wrapping. Please wrap these ops in a Lambda layer: '
225 '\n\n\n{example}\n\n'.format(example=example))
---> 226 op = tensor.op # The Op that created this Tensor.
227 if op not in processed_ops:
228 # Recursively set _keras_history.

AttributeError: 'NoneType' object has no attribute 'op'

At first guess, I’d say in the djmodel() function, you have not set the “outputs” variable correctly.

Second guess would be a problem in the X, a0, or c0 variables.

In addition to Tom’s suggestions, also do a quick scan of the “YOUR CODE HERE” sections of djmodel and make sure you filled in every place that says “None”. Those are placeholders, which means you need to replace them with real code.

Thanks Tom, Paul, found it. I made a silly mistake when initializing outputs. I was asked to initialize an empty list; what I actually did was initialize a list with one element equal to None :face_with_hand_over_mouth:
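For anyone hitting the same traceback: here is a minimal pure-Python illustration of the difference (no TensorFlow needed). The `step_*` strings are just hypothetical stand-ins for the per-timestep tensors that djmodel appends; the point is that a leftover None ends up as the first element of outputs, and Keras later calls `.op` on it and crashes.

```python
# Bug: the placeholder None survives inside the brackets.
outputs_buggy = [None]   # e.g. typing [] around the selected "None"
# Fix: the placeholder is replaced by a genuinely empty list.
outputs_ok = []

for t in range(3):
    out = f"step_{t}"          # stand-in for densor(a) at time step t
    outputs_buggy.append(out)
    outputs_ok.append(out)

print(outputs_buggy)  # [None, 'step_0', 'step_1', 'step_2']
print(outputs_ok)     # ['step_0', 'step_1', 'step_2']
```

When `Model(inputs=..., outputs=outputs_buggy)` walks that list, the first element is None instead of a tensor, which is exactly where `tensor.op` raises `AttributeError: 'NoneType' object has no attribute 'op'`.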

I had the same problem. I think this is a Jupyter UI problem. When you select the None with the mouse and then type [], it will surround the None rather than replace it.

The UI is trying to help you with parens and brackets, but unfortunately its behavior sometimes does not align with our intentions. It’s particularly annoying when you try to add parens or brackets in the middle of an existing expression. All the more reason why it’s important to read and understand the implications of what we just typed. :smile: