Week 1: My model differs from the djmodel in the checker

I cannot figure out what’s wrong with my djmodel. The output of:


shows that layers 0-6 of my model are correct, but layer 7 isn’t:

[['InputLayer', [(None, 30, 90)], 0], ['TensorFlowOpLayer', [(None, 90)], 0], ['Reshape', (None, 1, 90), 0], ['InputLayer', [(None, 64)], 0], ['InputLayer', [(None, 64)], 0], ['TensorFlowOpLayer', [(None, 90)], 0], ['LSTM', [(None, 64), (None, 64), (None, 64)], 39680, [(None, 1, 90), (None, 64), (None, 64)], 'tanh'], ['Reshape', (None, 1, 90), 0]]

[['InputLayer', [(None, 30, 90)], 0], ['TensorFlowOpLayer', [(None, 90)], 0], ['Reshape', (None, 1, 90), 0], ['InputLayer', [(None, 64)], 0], ['InputLayer', [(None, 64)], 0], ['TensorFlowOpLayer', [(None, 90)], 0], ['LSTM', [(None, 64), (None, 64), (None, 64)], 39680, [(None, 1, 90), (None, 64), (None, 64)], 'tanh'], ['TensorFlowOpLayer', [(None, 90)], 0]]

What might I be doing wrong? By the way, what is TensorFlowOpLayer?

OK. I had used Reshape((1, n_values))(x) instead of reshaper(x). Now I pass the test, but I am not sure I understand the difference between the two versions… Also, I am still wondering what TensorFlowOpLayer is and where it came from.

Let me piggyback on this with a similar-sounding problem:
My model's output and djmodel_out differed in that mine showed 30 lines of TensorFlowOpLayer after (apparently) slicing X (step 2A).

Ah, to be clear, I’m failing the unit test as such:

Test failed at index 2
Expected value

['TensorFlowOpLayer', [(None, 90)], 0]

does not match the input value:

['Reshape', (None, 1, 90), 0]

In the LSTM layer, your initial_state is incorrect.

Please edit your posts to remove the code. Posting solution code violates the course Honor Code.

It is not likely that I will get an answer to my question once the discussion has shifted to your similar-sounding but different question. Please submit a separate question instead of hijacking someone else's thread.

I am not a TensorFlow expert, so I cannot say for sure. It appears to be an internal housekeeping layer that TF inserts for some layer types.
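To add a bit of detail: when you apply a raw TensorFlow operation (such as slicing with `X[:, t, :]`) to a symbolic Keras tensor, Keras wraps that op in a layer so it can live in the model graph. A minimal sketch that reproduces this (assuming TF 2.x; the wrapper's class name varies by version, e.g. `TensorFlowOpLayer` in TF 2.1 and `SlicingOpLambda`/`TFOpLambda` in later releases):

```python
import tensorflow as tf

# Slice a symbolic Keras tensor with raw TF indexing instead of a Keras layer.
X = tf.keras.Input(shape=(30, 90))
x = X[:, 0, :]  # a raw TF op, not an explicit Keras layer call

model = tf.keras.Model(inputs=X, outputs=x)

# Keras wraps the raw slicing op in an auto-generated layer; depending on
# the TF version it is named TensorFlowOpLayer, SlicingOpLambda, etc.
print([type(layer).__name__ for layer in model.layers])
```

That is why slicing X inside the loop produces those extra `TensorFlowOpLayer` entries in the comparator output.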

Thank you. Why did my fix (please see the original question and my own reply to it) make a difference?

I don't know; I haven't tried them both yet.
At first reading, it seems they should give the same results.

But they don’t… Please let me know if you can figure it out.

In an RNN model, every time step shares the same parameters (weights); that is, each time step reuses the same components, such as the LSTM cell, the Dense layer object, and the Reshape layer object. Therefore, in the exercise, all of these components are defined outside the loop over the Tx time steps. If they were defined inside the loop, they would be created Tx times and no parameters would be shared across time steps.
So Reshape((1, n_values))(x) creates Tx separate Reshape layer objects, which is not what we want. Although the Reshape layer only reshapes its input and does not learn any parameters, creating it inside the loop is still bad practice.
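The difference can be made visible by counting layer objects. A minimal sketch (Tx here is a made-up small number, not the assignment's value):

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Reshape

Tx, n_values = 3, 90
X = Input(shape=(Tx, n_values))

# Shared: one Reshape object, created once outside the loop and reused.
reshaper = Reshape((1, n_values))
shared_out = [reshaper(X[:, t, :]) for t in range(Tx)]

# Per-step: a brand-new Reshape object is created on every iteration.
per_step_out = [Reshape((1, n_values))(X[:, t, :]) for t in range(Tx)]

shared_model = tf.keras.Model(X, shared_out)
per_step_model = tf.keras.Model(X, per_step_out)

# The shared model contains a single Reshape layer; the per-step model
# contains Tx of them. For a layer with weights (LSTM, Dense), only the
# shared version ties its parameters across time steps.
print(sum(isinstance(l, Reshape) for l in shared_model.layers))    # 1
print(sum(isinstance(l, Reshape) for l in per_step_model.layers))  # 3
```

This is exactly what the layer comparator in the checker detects: the two graphs compute the same values here, but their layer lists differ.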


Hi Michael, how did you resolve your problem?

I just investigated what TMosh (mentor) suggested:

I’ll add that I put additional effort into understanding the values of various variables as djmodel progresses through the for loop.

I am facing the same problem. How do I solve this problem?

After following a few discussions, I noticed a few posts from mentors saying that the LSTM cell needs to be initialized with a and c. I see that the a and c variables are initialized with a0 and c0 before the for loop, but they are not linked to LSTM_cell. If this is the problem, can you please tell me how to initialize LSTM_cell with the initial state? Thanks.

{edited the reply …}


Hi Michael,

Did you check your LSTM layer? The initial state should be initial_state=[a, c] rather than a0 and c0, so that each step receives the states produced by the previous step.
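To illustrate the state-threading pattern without giving away the assignment's exact code, here is a minimal sketch (shapes, Tx, and variable names are assumptions): a0 and c0 seed the states once before the loop, and every LSTM call receives the states produced by the previous step.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, LSTM, Reshape

n_a, n_values, Tx = 64, 90, 3

# Shared layer objects, created once outside the loop.
LSTM_cell = LSTM(n_a, return_state=True)
reshaper = Reshape((1, n_values))

X = Input(shape=(Tx, n_values))
a0 = Input(shape=(n_a,))
c0 = Input(shape=(n_a,))

a, c = a0, c0  # seed the states once, before the loop
for t in range(Tx):
    x = reshaper(X[:, t, :])
    # Pass the states from the previous step (a, c), not a0/c0,
    # at every iteration; then capture the updated states.
    a, _, c = LSTM_cell(x, initial_state=[a, c])

model = tf.keras.Model([X, a0, c0], a)
batch = 2
out = model([np.zeros((batch, Tx, n_values), np.float32),
             np.zeros((batch, n_a), np.float32),
             np.zeros((batch, n_a), np.float32)])
print(out.shape)  # (2, 64)
```

If you pass initial_state=[a0, c0] inside the loop instead, every time step restarts from the initial state and the recurrence is broken, which is what the checker flags.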
