Week 1 Assignment 3 WARNING:tensorflow

Why do I get this warning?

WARNING:tensorflow:Functional inputs must come from tf.keras.Input (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_5" was not an Input tensor, it was generated by layer repeat_vector_49.
Note that input tensors are instantiated via tensor = tf.keras.Input(shape).
The tensor that caused the issue was: repeat_vector_49/Tile:0


There appears to be an error in your music_inference_model() function.
It might be where you are using the RepeatVector() function.
Or it might be that you are using the wrong range for the ‘t’ variable in that for-loop.

For t, I use “for t in range(Ty)” and it seems correct.

For RepeatVector(), my code for steps 2D and 2E is
{mentor edit, code removed}

The shapes of x after each line of code are, respectively,
(None,)
(None, 90)
(None, 1, 90).

I don’t know what is wrong here.

I think “out.shape[1]” is wrong.
You’re supposed to use depth=n_values there.
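
For reference, here is a minimal sketch of that step (illustrative stand-in values only, not your removed code). The key point is that the one-hot depth has to be the vocabulary size n_values, which is 90 here and matches the (None, 90) shape you listed:

```python
import tensorflow as tf
from tensorflow.keras.layers import RepeatVector

n_values = 90                            # vocabulary size, matching the (None, 90) shape above
out = tf.random.uniform((2, n_values))   # stand-in for one time step of softmax output (batch of 2)

x = tf.math.argmax(out, axis=-1)         # shape (2,)
x = tf.one_hot(x, depth=n_values)        # shape (2, 90); depth must be n_values, not out.shape[1]
x = RepeatVector(1)(x)                   # shape (2, 1, 90), ready to feed the next time step
```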

I still got a similar warning after replacing that line with x = tf.one_hot(x, depth=n_values):

WARNING:tensorflow:Functional inputs must come from tf.keras.Input (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_16" was not an Input tensor, it was generated by layer repeat_vector_649.
Note that input tensors are instantiated via tensor = tf.keras.Input(shape).
The tensor that caused the issue was: repeat_vector_649/Tile:0

Also, something is wrong with this line:
---> 67 inference_model = Model(inputs=[x, a0, c0], outputs=outputs)

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_19:0", shape=(None, 1, 90), dtype=float32) at layer "lstm". The following previous layers were accessed without issue:

Can you give me some hints?

Sounds like there is a problem with how you declared one of the input layers.

inputs=[x, a0, c0]

That’s not correct.
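
The underlying Keras rule is that Model(inputs=...) must be given the original tf.keras.Input tensors. If the variable x is reassigned inside the loop (for example to a RepeatVector output), it no longer refers to the Input layer, which is exactly what the warning and the "Graph disconnected" error are complaining about. Here is a toy illustration of the pattern with made-up names (not the assignment code): keep a separate reference to the Input you created and pass that to Model.

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, RepeatVector
from tensorflow.keras.models import Model

x0 = Input(shape=(4,))                 # the real Input tensor -- keep a reference to it
x = x0
h = Dense(8, activation="relu")(x)     # the graph starts from x0
x = RepeatVector(1)(h)                 # x now points at a layer output, not an Input
out = Dense(1)(h)

# Wrong: x is no longer an Input tensor, so Keras warns that it "was not an
# Input tensor" and then fails with "Graph disconnected", because `out`
# depends on x0, which is missing from the inputs list.
# model = Model(inputs=[x], outputs=out)

# Correct: pass the original Input tensor the graph actually starts from.
model = Model(inputs=[x0], outputs=out)
model.summary()
```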


It works now. Thanks

Thank you, this bug was driving me crazy in my code!