When I do x = X[:,t,:] and send that to the reshape layer with x = reshaper(x), the model summary shows a ton of tf_op_layer_strided_slice_364 layers, which causes the tests to fail since index 2 should be a reshape layer. Any idea how to fix this?
Please post a screen capture image that shows your model summary.
That says your code is wrong because your layers don't come out the right way. They give you the summary of the model you need to match, so print the summary of your model and compare the two. That should point out where to look for the problem, right? Where does your model go off the rails?
I used logic like this:
print("Generated model:")
for index, a in enumerate(summary(model)):
    print(f"layer {index}: {a}")
print("Expected model:")
for index, a in enumerate(djmodel_out):
    print(f"layer {index}: {a}")
This is how debugging works: you start from the error message. What is it telling you? Your layers are wrong. OK, then the next question is, "What are your layers and what should they be?" Then, "OK, why did they turn out wrong?"
Please post another screen capture image, this time showing the first few lines in the model summary.
The error message refers to this part of the model:
Did you complete Step 2.B?
It’s the call to reshaper().
Also, did you use the correct variable name? ‘x’ and ‘X’ are different variables.
First is x = X[:,t,:], then x = reshaper(x).
I'm not sure why this is making so many strided slice layers.
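For reference, here is the shape arithmetic of that slice-then-reshape pair, illustrated with plain NumPy (the batch size, Tx, and n_values below are made-up numbers, not the assignment's exact values):

```python
import numpy as np

# X has shape (batch, Tx, n_values); batch=2, Tx=30, n_values=90 are
# placeholder dimensions chosen just for this illustration.
X = np.zeros((2, 30, 90))

t = 0
x = X[:, t, :]              # (2, 90): timestep t for every batch item
                            # (in Keras, this slice is what shows up as a
                            # strided_slice op-layer in the summary)
x = x.reshape((-1, 1, 90))  # (2, 1, 90): what Reshape((1, n_values)) yields
```

A strided_slice layer appearing once per timestep is expected, since X[:,t,:] is executed inside the loop; the failure usually means reshaper() was never applied to the slice.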
Summary for those who find this thread later:
- LSTM_cell() must be provided with the initial_state. See the instructions in Step 2.C.
- Since the LSTM_cell() object has global scope, any time you modify any of the models in this notebook, you must restart the kernel and run all of the cells again (“Kernel → Restart and run all”).
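To make the summary concrete, here is a minimal sketch of the shared-layer loop with the state passed explicitly. The dimensions (n_a, n_values, Tx) are placeholders, and the real notebook applies a Dense layer to each output, which is omitted here:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, LSTM, Reshape

# Placeholder sizes for illustration only (not the assignment's values).
n_a, n_values, Tx = 64, 90, 5

# Shared layers with global scope: defined ONCE, reused every timestep.
# This is why the kernel must be restarted after editing any model.
LSTM_cell = LSTM(n_a, return_state=True)
reshaper = Reshape((1, n_values))

X = Input(shape=(Tx, n_values))
a0 = Input(shape=(n_a,))
c0 = Input(shape=(n_a,))

a, c = a0, c0
outputs = []
for t in range(Tx):
    x = reshaper(X[:, t, :])                       # (None, 1, n_values)
    _, a, c = LSTM_cell(x, initial_state=[a, c])   # Step 2.C: pass the state
    outputs.append(a)                              # (real notebook: densor(a))

model = tf.keras.Model(inputs=[X, a0, c0], outputs=outputs)
```

Omitting initial_state=[a, c] makes the cell reset its state each timestep, which breaks the recurrence even though the code still runs.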