What is the engineering reason behind TensorFlow Variables not reshaping their input like Constants do?

Link to classroom item (lecture video)

In the lecture video, it was mentioned and reinforced that, unlike TensorFlow Constants, TensorFlow Variables cannot automatically reshape their inputs to fit new dimensions. For example, we can supply the list [1,2,3,4] to a Constant initialized with shape [2,2], but we cannot do so for a TensorFlow Variable, which must take [[1,2], [3,4]] instead.
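
A minimal sketch of the behaviour being described, assuming TensorFlow 2.x (the `shape` arguments of `tf.constant` and `tf.Variable` are the parts being compared):

```python
import tensorflow as tf  # assuming TF 2.x

# A constant will reshape a flat list to the requested shape.
c = tf.constant([1, 2, 3, 4], shape=[2, 2])
print(c.shape)  # (2, 2)

# A variable will not: its `shape` argument only validates the
# initial value, so a mismatched shape raises a ValueError.
try:
    v = tf.Variable([1, 2, 3, 4], shape=[2, 2])
except ValueError as err:
    print("Variable rejected the flat list:", err)

# Supplying an initial value that already has the target shape works.
v = tf.Variable([[1, 2], [3, 4]])
print(v.shape)  # (2, 2)
```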

I assume there is a reason behind this inconsistent behavior, and some sort of engineering benefit to setting up the library this way. Does anyone know why?

Hi @perry256

It is not about inconsistent behaviour of tf.Variable, but about the ability to read and modify the value of the tensor. Kindly go through the link below.
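
For illustration, a short sketch (assuming TensorFlow 2.x) of what "read and modify" means for a variable, and why its shape stays fixed once its storage is allocated:

```python
import tensorflow as tf  # assuming TF 2.x

v = tf.Variable([[1, 2], [3, 4]])

# Variables can be read and updated in place.
v.assign([[5, 6], [7, 8]])       # OK: same shape as the stored value
v.assign_add([[1, 1], [1, 1]])   # in-place addition
print(v.numpy())

# Assigning a value of a different shape fails, because the
# variable's storage was allocated for a fixed shape.
try:
    v.assign([1, 2, 3, 4])
except ValueError as err:
    print("Shape mismatch:", err)
```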

While using tf.Variable, one needs to define the shape it should take explicitly; otherwise it will always take the shape of the initial value.
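
A sketch of both options, assuming TensorFlow 2.x: either reshape the initial value yourself, or declare the variable with an unspecified shape so that later assignments may change it.

```python
import tensorflow as tf  # assuming TF 2.x

# Option 1: do the reshape yourself before creating the variable.
v1 = tf.Variable(tf.reshape([1, 2, 3, 4], [2, 2]))
print(v1.shape)  # (2, 2)

# Option 2: declare an unspecified shape, so assignments of any shape
# are accepted; otherwise the variable keeps its initial value's shape.
v2 = tf.Variable([1, 2, 3, 4], shape=tf.TensorShape(None))
v2.assign([[1, 2], [3, 4]])
print(v2.numpy().shape)  # (2, 2)
```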

Regards
DP
