Jazz music problem - first exercise

I am using x = X[:, t, :], but apparently this is wrong. If X is a 3D array with shape (m, Tx, n_values), then how do you slice out a vector at time t? This is what I get:

X =  (None, 30, 90)
x dim =  (None, 90)
x reshaped =  (None, 1, 90)


That is correct.

Then you use reshaper().
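For reference, the two steps look roughly like this in numpy terms. This is a minimal sketch: the values of m, Tx and n_values are placeholders matching the shapes printed above, and numpy's reshape stands in for the shared Keras reshaper layer used in the notebook.

```python
import numpy as np

# Placeholder dimensions matching the thread: m samples, Tx = 30 timesteps,
# 90 possible note values per timestep.
m, Tx, n_values = 4, 30, 90
X = np.zeros((m, Tx, n_values))

t = 0
x = X[:, t, :]                    # slice out timestep t -> shape (m, 90)
x = x.reshape((m, 1, n_values))   # add back a length-1 time axis -> (m, 1, 90)
print(x.shape)                    # (4, 1, 90)
```

In the actual assignment the batch dimension is None (unknown at graph-construction time) rather than a concrete number, which is why the printed shapes show (None, 90) and (None, 1, 90).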

What are you using for ‘t’?

The t in the for loop: for t in range(Tx)

It might also help to see the actual exception trace you are getting.


I added a print statement in my loop (right after the reshaper call) and here’s what I see:

x.shape (None, 1, 90)
x.shape (None, 1, 90)
x.shape (None, 1, 90)

So it looks like your “reshaped” value is the correct shape.

That is correct.

Here it is with my printed dimensions in front.

X =  (None, 30, 90)
x dim =  (None, 90)
x reshaped =  (None, 1, 90)

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-11-6e06a9870a31> in <module>
      1 ### YOU CANNOT EDIT THIS CELL
      2 
----> 3 model = djmodel(Tx=30, LSTM_cell=LSTM_cell, densor=densor, reshaper=reshaper)

<ipython-input-10-8891f9c27f39> in djmodel(Tx, LSTM_cell, densor, reshaper)
     46         print("x reshaped = ", x.shape)
     47         # Step 2.C: Perform one step of the LSTM_cell
---> 48         _, a, c = LSTM_cell(inputs=x, initial_state=[a0, c0])
     49         # Step 2.D: Apply densor to the hidden state output of LSTM_Cell
     50         out = densor(a)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py in __call__(self, inputs, initial_state, constants, **kwargs)
    707       # Perform the call with temporarily replaced input_spec
    708       self.input_spec = full_input_spec
--> 709       output = super(RNN, self).__call__(full_input, **kwargs)
    710       # Remove the additional_specs from input spec and keep the rest. It is
    711       # important to keep since the input spec was populated by build(), and

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
    925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
    927 
    928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
   1090       # TODO(reedwm): We should assert input compatibility after the inputs
   1091       # are casted, not before.
-> 1092       input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
   1093       graph = backend.get_graph()
   1094       # Use `self._name_scope()` to avoid auto-incrementing the name.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    156                      str(len(input_spec)) + ' inputs, '
    157                      'but it received ' + str(len(inputs)) +
--> 158                      ' input tensors. Inputs received: ' + str(inputs))
    159   for input_index, (x, spec) in enumerate(zip(inputs, input_spec)):
    160     if spec is None:

ValueError: Layer lstm expects 7 inputs, but it received 3 input tensors. Inputs received: [<tf.Tensor 'reshape/Reshape_2:0' shape=(None, 1, 90) dtype=float32>, <tf.Tensor 'a0_2:0' shape=(None, 64) dtype=float32>, <tf.Tensor 'c0_2:0' shape=(None, 64) dtype=float32>]

It says it is expecting 7 input tensors for lstm, but the instructions seem to indicate 3.

One problem is that you are invoking LSTM_cell incorrectly. The number of parameters looks correct, but your inputs will be the same on every loop iteration, at least the values of a and c. That's not how RNNs work, right? You are learning the a and c values on the fly. :nerd_face:
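To illustrate the state-threading point with a toy example (pure Python; a running sum stands in for the LSTM's (a, c) state, and none of these names come from the assignment):

```python
# Toy "cell": each step must consume the *previous* step's state,
# not the initial state. A running sum stands in for (a, c).
def cell(x_t, state):
    return state + x_t

xs = [1.0, 2.0, 3.0]
s0 = 0.0

# Wrong: feeding the initial state s0 on every iteration.
wrong = [cell(x, s0) for x in xs]   # [1.0, 2.0, 3.0] -- no memory of the past

# Right: thread the updated state through the loop.
s = s0
right = []
for x in xs:
    s = cell(x, s)                  # state carries forward to the next step
    right.append(s)                 # [1.0, 3.0, 6.0]
```

The same pattern applies inside djmodel: the state returned by LSTM_cell at step t must be fed back in at step t+1.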

Nope.


OK, of course that makes sense. But I changed a0 and c0 to a and c in the LSTM call, and I am still not having success.

Still the same error trace?

No, here it is:

Test failed at index 1 
 Expected value 

 ['TensorFlowOpLayer', [(None, 90)], 0] 

 does not match the input value: 

 ['TensorFlowOpLayer', [(30, 90)], 0]

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-10-2ad560a25ebc> in <module>
      3 # UNIT TEST
      4 output = summary(model)
----> 5 comparator(output, djmodel_out)

~/work/W1A3/test_utils.py in comparator(learner, instructor)
     26                   "\n\n does not match the input value: \n\n",
     27                   colored(f"{b}", "red"))
---> 28             raise AssertionError("Error in test")
     29     print(colored("All tests passed!", "green"))
     30 

AssertionError: Error in test

It looks like you are hard-coding the "samples" dimension. That should just be driven by the inputs, and the batches can be different sizes, right?

Where would I be hard-coding the samples dimension? I don’t ever specify m or a value for m.

Here are my printed values for x and x reshaped throughout the Tx range:

X =  (None, 30, 90)
x dim =  (None, 90)
x reshaped =  (None, 1, 90)
(the same two lines repeat for all 30 timesteps)

Sorry, 30 is T_x, so maybe my analysis is wrong. But the “None” there that you are not matching is the samples dimension. So how could that end up being T_x?
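For what it's worth, one hypothetical way a (30, 90) shape can show up where (None, 90) is expected is slicing the batch axis instead of the time axis. This is just a numpy sketch of that possibility, not a claim about this particular code:

```python
import numpy as np

# Hypothetical shapes matching the thread: batch of 4, Tx = 30, 90 values.
X = np.zeros((4, 30, 90))

print(X[:, 0, :].shape)   # (4, 90)  -> batch axis preserved (correct)
print(X[0, :, :].shape)   # (30, 90) -> Tx leaks into the leading axis (wrong)
```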

Maybe this is a creative enough error that we need to look at your code in order to understand what is happening. :nerd_face: Please check your DMs for a message from me.

To close the loop on the public thread, it looks like the code is fine. Doing the sequence:

Kernel -> Restart and Clear Output
Cell -> Run All Above

got the tests for the djmodel function to pass. So the theory must be that the runtime state of the notebook was somehow out of sync.

Just as a general matter, it never hurts to try the above. At worst it costs you a couple of mouse clicks and the wait time for it to run, and in a surprising number of cases just re-running everything in order from a clean start fixes the problem.

Great! I will use that in the future. Thanks again!