Do not have the same size and shape

I am trying to build a Variational Autoencoder using AutoGraph and a TPU on Colab. I get an error about different size and shape in the distributed_train_step function. Here are my model architecture, loss function, and the error for detail. Can you help me solve it? :frowning:

Hello huuvuot,
Welcome to the Discourse community. In my reply I will do my best to share how you could resolve your issue.

The error you are encountering in the distributed_train_step function of your Variational Autoencoder using AutoGraph and a TPU on Colab is likely due to a shape or size mismatch in the input data.

To debug the error, you can try printing the shapes and sizes of the input data at various stages of the training process to identify where the mismatch is occurring.
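For example, something along these lines (a minimal sketch with a made-up dataset; `train_dataset` here is just a stand-in for whatever pipeline your notebook actually builds):

```python
import tensorflow as tf

# Stand-in dataset; replace with the notebook's real input pipeline.
train_dataset = tf.data.Dataset.from_tensor_slices(
    tf.random.normal([1000, 28, 28, 1])).batch(64)

# The element spec shows the static shape; iterating shows the actual
# per-batch shape (the last batch is often smaller than the rest).
print(train_dataset.element_spec)
for step, batch in enumerate(train_dataset):
    print(step, batch.shape)

# Inside a tf.function, use tf.print / tf.shape to see runtime shapes.
@tf.function
def train_step(batch):
    tf.print("runtime batch shape:", tf.shape(batch))
    return batch
```

A final batch that is smaller than the others is a frequent cause of this kind of mismatch on TPUs; passing drop_remainder=True to batch() is one way to rule that out.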

You can also try using the tf.debugging.check_numerics function to check for NaN or Inf values in the input data.
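For instance (a small sketch; the tensor here is just a random stand-in for a real batch):

```python
import tensorflow as tf

batch = tf.random.normal([64, 28, 28, 1])  # stand-in for a real input batch

# Raises an InvalidArgumentError if the tensor contains NaN or Inf values.
batch = tf.debugging.check_numerics(batch, message="NaN/Inf in input batch")
```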

Please feel free to share here if you have any further issues with this. I will do my best to help you figure out and solve the issue you are having.
Best,
Can Koz

Hello Can Koz,
Thanks for your reply. I tried your solution, and the input shape is correct, but I got another error. I searched Google for the new error, but no solution works for it.

Dear @Community-Team,
I cannot think of any further clues to help huuvuot. Could this be a notebook-related issue? @TMosh, is it possible for you to also take a look into this?
Regards,
Can

Dear @canxkoz, @Community-Team,
Here is my Colab notebook link. I hope it will be useful for you. Thank you very much. https://colab.research.google.com/drive/1j1hMVumILXVC9fWuVbcGoe7-ZLyn8-MZ?usp=sharing

Sorry, I’m not an expert on the Colab platform.

It’s possible your Colab environment is not using a compatible set of packages.

If any of the code you’re using came from a Coursera notebook, you have to be extremely careful to use the same versions of the packages and tools. Coursera’s platform does not use the newest versions.
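A quick way to compare is to print the versions your Colab runtime actually has (the pinned version in the comment is a placeholder; substitute whatever the course notebook specifies):

```python
# Run in a Colab cell to see what is actually installed.
import sys
import tensorflow as tf

print(sys.version)
print(tf.__version__)

# To pin a package to the version the course notebook expects, e.g.:
# !pip install "tensorflow==<version used by the assignment>"
```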


Also, dealing with distributed training is a tricky process; those packages are constantly being upgraded, and they require a specific dataset split and preparation.

So not only do you have to stay up to date with the libraries, but you also need a good understanding of how each distribution strategy needs to be set up.
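For TPUs specifically, here is a minimal sketch of the usual setup. Everything in it is a placeholder (the random images, the tiny Sequential model, the batch size), not your actual VAE; the point is the structure: connect to the TPU, build the model and optimizer inside the strategy scope, batch with drop_remainder=True so every batch has the same shape, and scale the per-replica loss by the global batch size.

```python
import tensorflow as tf

# Connect to the Colab TPU and build a TPUStrategy.
# tpu="" works on classic Colab TPU runtimes; newer TPU VM runtimes may need tpu="local".
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)  # tf.distribute.experimental.TPUStrategy on older TF

GLOBAL_BATCH_SIZE = 64

# drop_remainder=True keeps every batch the same size; a smaller final
# batch is a common cause of "do not have the same size and shape" errors on TPUs.
images = tf.random.normal([1024, 28, 28, 1])  # stand-in for the real data
dataset = (tf.data.Dataset.from_tensor_slices(images)
           .batch(GLOBAL_BATCH_SIZE, drop_remainder=True))
dist_dataset = strategy.experimental_distribute_dataset(dataset)

with strategy.scope():
    # Tiny stand-in for the real VAE; build the model and optimizer
    # inside the strategy scope so their variables are replicated.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(28 * 28),
        tf.keras.layers.Reshape((28, 28, 1)),
    ])
    optimizer = tf.keras.optimizers.Adam()

@tf.function
def distributed_train_step(dist_batch):
    def step_fn(batch):
        with tf.GradientTape() as tape:
            recon = model(batch, training=True)
            # Per-example loss, averaged over the *global* batch size so
            # gradients are scaled correctly across replicas.
            per_example_loss = tf.reduce_mean(
                tf.square(batch - recon), axis=[1, 2, 3])
            loss = tf.nn.compute_average_loss(
                per_example_loss, global_batch_size=GLOBAL_BATCH_SIZE)
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        return loss

    per_replica_losses = strategy.run(step_fn, args=(dist_batch,))
    return strategy.reduce(
        tf.distribute.ReduceOp.SUM, per_replica_losses, axis=None)

for batch in dist_dataset:
    print(float(distributed_train_step(batch)))
```

If your own step already looks like this, comparing it line by line (especially the batching and the loss scaling) is usually the fastest way to find where the shapes diverge.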