Week 1 Assignment 3, Exercise 3: Display Lambda layers as standard TensorFlowOpLayers

Hi!
I’m sorry to post yet another question on this topic. I know a lot of questions have been asked about this particular exercise in the past, but I haven’t found an answer to my specific problem, despite scrolling through those long threads for hours.

The issue is in the validation of my inference model. All tests pass until the comparator function, where I get this error:

Outputs =  50
Single output shape =  (None, 90)
len(pred) =  50
pred[0].shape =  (1, 90)
Test failed at index 5 
 Expected value 

 ['Lambda', [(None,)], 0] 

 does not match the input value: 

 ['TensorFlowOpLayer', [(None,)], 0]
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-41-8f5b8c69817f> in <module>
     37 
     38 inference_summary = summary(inference_model)
---> 39 comparator(inference_summary, music_inference_model_out)

~/work/W1A3/test_utils.py in comparator(learner, instructor)
     26                   "\n\n does not match the input value: \n\n",
     27                   colored(f"{a}", "red"))
---> 28             raise AssertionError("Error in test")
     29     print(colored("All tests passed!", "green"))
     30 

AssertionError: Error in test

My loop over Ty does this:

  • Compute LSTM_cell
  • Apply the Dense layer (using densor)
  • Store the output of the densor in outputs
  • Use 2 separate Lambda layers to get the argmax / one_hot
  • Apply RepeatVector to the Lambda output to get the right shape.
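To make the symptom reproducible, the loop above can be sketched as a minimal, self-contained model. The sizes, `Ty = 3`, and names like `LSTM_cell`/`densor` are assumptions for illustration (matching the shapes in the error output), not the assignment’s exact code:

```python
import tensorflow as tf
from tensorflow.keras.layers import LSTM, Dense, Lambda, RepeatVector, Input

n_values, n_a, Ty = 90, 64, 3  # hypothetical sizes; 90 matches the shapes above
LSTM_cell = LSTM(n_a, return_state=True)
densor = Dense(n_values, activation="softmax")

x0 = Input(shape=(1, n_values))
a0 = Input(shape=(n_a,))
c0 = Input(shape=(n_a,))
x, a, c = x0, a0, c0
outputs = []
for t in range(Ty):
    a, _, c = LSTM_cell(x, initial_state=[a, c])  # one LSTM step
    out = densor(a)                               # shape (None, 90)
    outputs.append(out)
    # the two wrappers that show up in the summary as 'Lambda' layers:
    x = Lambda(lambda z: tf.math.argmax(z, axis=-1))(out)  # (None,)
    x = Lambda(lambda z: tf.one_hot(z, n_values))(x)       # (None, 90)
    x = RepeatVector(1)(x)                                 # (None, 1, 90)

inference_model = tf.keras.Model(inputs=[x0, a0, c0], outputs=outputs)
```

This builds and predicts fine — the shapes all match — which is why every test passes until the comparator checks the layer class names.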

In order to debug my problem, I added a cell in the notebook to print the expected vs. actual layers of my model:

print("Reality:")
print([[f"{l.__class__.__name__}, {l.output_shape}, {l.count_params()}"] for l in inference_model.layers])
print("\n\n\nExpectation:")
print(music_inference_model_out)

I can indeed see, in these lists, that my Lambda layers show up as Lambda layers:

[...], ['Lambda, (None,), 0'], ['Lambda, (None, 90), 0'], ['RepeatVector, (None, 1, 90), 0'], [...]

While in the expected output, it is supposed to show up as:

[...], ['TensorFlowOpLayer', [(None,)], 0], ['TensorFlowOpLayer', [(None, 90)], 0], ['RepeatVector', (None, 1, 90), 0, 1], [...]

My understanding is that TensorFlowOpLayers are pre-defined, specific layers in Keras, as opposed to Lambda layers, which have no pre-defined behavior (they execute whatever lambda function is given to them).

I believe I have run out of ideas on how to get unstuck. I noticed that in the expected output, the 2 TensorFlowOpLayers corresponding to my argmax / one_hot Lambda layers have their shapes wrapped in brackets (i.e. [(None,)] instead of (None,)). Maybe I am missing something and should be reshaping the tensors for some reason, but nothing I’ve tried has helped. Also, I believe I’m following the instructions to the letter, and I don’t see why I should add extra steps to the function.

One last thing I tried, based on answers to other similar questions, was restarting the kernel and re-running all the cells. But that didn’t help either.

What am I missing?

Instead of Lambda layers, use tf.math.argmax and tf.one_hot.
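A quick eager-mode sanity check of that op pair — the toy tensor below is hypothetical, standing in for one row of the densor output:

```python
import tensorflow as tf

# a fake densor output for one batch item: probabilities over 3 values
out = tf.constant([[0.1, 0.7, 0.2]])

idx = tf.math.argmax(out, axis=-1)  # drops the last axis -> shape (1,)
x = tf.one_hot(idx, depth=3)        # restores it as a one-hot -> shape (1, 3)

print(idx.numpy())  # [1]
print(x.numpy())    # [[0. 1. 0.]]
```

Called directly on the symbolic tensors inside the model, these ops get wrapped into op layers by Keras automatically, which is what the comparator expects.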


Hello, @bmarques,

Thanks for your detailed analysis. I think the assignment simply did not expect Lambda layers around tf.one_hot and tf.math.argmax. In TensorFlow, you can just call these functions directly, without a Lambda.

For example, you can do this:

x = tf.math.add(input_1, 1)  # instead of x = tf.keras.layers.Lambda(lambda x: tf.math.add(x, 1))(input_1)
x = tf.math.add(x, 2)
...

Cheers,
Raymond


Tip:

  • Don’t use Lambda layers unless the instructions specifically say to.

Oh no… Thanks so much!

I initially wasn’t using Lambdas, but I got stuck on an error which led me to believe that I needed to use them in order to run such functions in the layers of my model… I probably fixed whatever caused the initial problem, but did not figure out that these Lambdas were not required after all… :person_facepalming:

Thanks for your comments, it fixed my problem entirely!