Dear all,
I have run into a problem that is not addressed in the other threads. I hope this is the right place to ask about it.
I'm having a little difficulty understanding what is meant by 'shape' in different parts of the text. The instructions for "Inputs (given)" say: "The 'shape' parameter takes a tuple that does not include the batch dimension ('m')." But then the comments say: "# Step 2.B: Use reshaper to reshape x to be (1, n_values) (≈1 line)". So is 'm' simply left unmentioned there, or should it really be removed this time? If I remove it, what I get is just
Tensor("Slice:0", shape=(None, 1, 90), dtype=float32)
Tensor("Squeeze:0", shape=(1, 90), dtype=float32)
…
…
ValueError: Shape (90, 1) must have rank at least 3
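For context (and as a sanity check on my own reading): Keras's `Reshape` layer documents that `target_shape` excludes the batch axis, so `Reshape((1, n_values))` should map `(m, n_values)` to `(m, 1, n_values)` and stay rank 3, which is what the LSTM step needs. A NumPy sketch of the shapes involved, assuming `n_values = 90` from the printed tensors (the batch size `m = 4` here is made up):

```python
import numpy as np

n_values, m = 90, 4  # n_values from the printed shapes; m = 4 is made up

x = np.zeros((m, n_values))        # one time step sliced out: (m, n_values)
# Keras's Reshape((1, n_values)) keeps the batch axis implicitly;
# in plain NumPy we keep it explicitly with -1:
x_keras_style = x.reshape(-1, 1, n_values)
print(x_keras_style.shape)         # (4, 1, 90) -- rank 3, as the LSTM expects

x_wrong = np.zeros((n_values, 1))  # (90, 1) -- rank 2, matches the ValueError
print(x_wrong.ndim)                # 2
```

So the tuple passed to the layer leaves 'm' out, but the batch axis is still there at runtime.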
If I don't remove it, the cell runs through with
Tensor("Slice:0", shape=(None, 1, 90), dtype=float32)
Tensor("reshape/Reshape:0", shape=(None, 1, 90), dtype=float32)
But then the # UNIT TEST cell raises:
AssertionError                            Traceback (most recent call last)
in
      1 # UNIT TEST
      2 output = summary(model)
----> 3 comparator(output, djmodel_out)

~/work/W1A3/test_utils.py in comparator(learner, instructor)
     16 def comparator(learner, instructor):
     17     if len(learner) != len(instructor):
---> 18         raise AssertionError("Error in test. The lists contain a different number of elements")
     19     for index, a in enumerate(instructor):
     20         b = learner[index]

AssertionError: Error in test. The lists contain a different number of elements
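In case it helps anyone debugging the same failure: the comparator fails on its very first check, which only compares the number of entries in the two model summaries, so the message means my model has a different number of layers than the instructor's. A minimal sketch of that check (the layer lists here are made-up placeholders, not the real djmodel_out):

```python
# Hypothetical layer summaries: the learner model is missing one layer,
# so the lengths differ and the first check in comparator fires.
learner = ["InputLayer", "TFOpLambda", "Reshape"]
instructor = ["InputLayer", "TFOpLambda", "Reshape", "LSTM"]

def comparator(learner, instructor):
    # Same first check as in test_utils.py: compare element counts.
    if len(learner) != len(instructor):
        raise AssertionError(
            "Error in test. The lists contain a different number of elements")

try:
    comparator(learner, instructor)
except AssertionError as e:
    print(e)  # Error in test. The lists contain a different number of elements
```

So the shape warnings above may be a red herring; the list-length mismatch points at a missing or extra layer in the model itself.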
Btw: I'm restarting the kernel after every minor change I make in this exercise, after what I read about the next exercises.