Difference between tf.constant() and tf.Variable()

Hello, I have a naive question about the difference between two constructs in TensorFlow.
The notebook says that we can modify the state of a tf.Variable but cannot change the state of a tf.constant.
Still, I don’t see the point: even if a tensor was defined as a constant, we can still redefine it later in the code and reassign the name to a new value.
Thanks for any help in understanding!

Hi @Viktoriia,

I don’t know the exact context of your question, because you don’t mention which notebook that comment appears in. But in general, using constants, where appropriate, has several benefits: it improves code readability, makes programs easier to maintain, guards against accidentally changing a value, and in some cases helps with performance optimization. For TensorFlow in particular, I assume it offers the same benefits, especially the last one.

Hello @Diego! Thanks for the quick response. My question is about the first lab of C2 Week 3, in the TensorFlow environment. It’s simple material, I’m sure, but it looks confusing with this new version of TF. In Exercise 2.1, the notebook outputs ‘All tests passed’ whether I define W, X and b as Variables or as constants. I would like to understand the difference and when to use each. Could you please explain?

Hi! I also found this point unclear when I went through this tutorial.

My understanding now is that Variables should be used for mutable tensors, in other words tensors that are supposed to change over time.

For example, Ws and bs are going to change during gradient descent, so they should be Variables.

On the contrary, X should be immutable because the values of a given training example are not going to be modified later. So it should be a constant.
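To make this concrete, here is a minimal sketch of one gradient-descent step (the values and the learning rate 0.1 are my own illustration, not taken from the lab): W and b must be Variables so they can be updated in place, while the training data X and label y stay constants.

```python
import tensorflow as tf

# Trainable parameters: these change during gradient descent,
# so they are tf.Variables.
W = tf.Variable(3.0)
b = tf.Variable(1.0)

# Training data: never modified, so constants.
X = tf.constant(2.0)
y = tf.constant(10.0)

with tf.GradientTape() as tape:
    y_hat = W * X + b
    loss = (y_hat - y) ** 2

dW, db = tape.gradient(loss, [W, b])

# assign_sub mutates the Variables in place -- impossible with constants.
W.assign_sub(0.1 * dW)
b.assign_sub(0.1 * db)
```

Note that tf.GradientTape only tracks Variables automatically; that is another reason the trainable parameters are Variables while the data is not.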

I guess that tf.constant protects you against modifying these values by mistake in the code. TF will certainly raise an exception in this case, so you can see the problem and debug immediately.
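You can check that behaviour directly; in this small sketch, a Variable accepts assign(), while a constant (which simply has no assign() method) raises an error on the same call:

```python
import tensorflow as tf

# A tf.Variable can be updated in place with assign():
w = tf.Variable(1.0)
w.assign(2.0)          # OK: w now holds 2.0

# A tf.constant is immutable: it has no assign() method,
# so the same call raises an AttributeError.
x = tf.constant(1.0)
try:
    x.assign(2.0)
except AttributeError as e:
    print("Cannot assign to a constant:", e)
```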

HTH, Colin


Hello @Colin !
That definitely makes sense ! Thank you for your kind reply.
What I still do not get is that we can still create a new variable with the same name X and set it to a different value.
What comes to mind is the analogy with tuples: we cannot modify a particular element of a tuple, but we can still create a new tuple with the same name.


Aha, good point! In fact, in Python creating a variable is called name binding. If you create a variable called X, it means that you put the label X on an object, say a tuple or a string, both of which are immutable.

You can do :

a = (0, 1)
a = 'hello'
a = (1,2,3)

You’re just moving the label a around, but none of these immutable objects was modified in the process 🙂

I guess tf constants work in the exact same way.
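A quick sketch confirming the analogy with TensorFlow tensors: rebinding the name is always allowed and never touches the original tensor, whereas mutating a constant element-wise is what fails.

```python
import tensorflow as tf

# Rebinding the name X is fine: it just points the label X at a new
# object; the original constant is not modified.
X = tf.constant([1.0, 2.0])
X = tf.constant([3.0, 4.0])   # a brand-new constant

# Trying to mutate a constant in place is what raises an error:
Y = tf.constant([1.0, 2.0])
try:
    Y[0] = 5.0                # item assignment is not supported
except TypeError as e:
    print("Constants are immutable:", e)
```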


Hi @Viktoriia, you and @Colin make good points about the difference between constants and variables.

I’d like to add that in TensorFlow there is more to tf.Variable than meets the eye; it is not just a space to store a tensor. You can read more about it in the tf.Variable guide and the tf.constant documentation.

Regarding the exercise, I think the purpose was to become familiar with the way tensors are initialized and how operations are performed on them. So whether you use constants or Variables to store the tensors, the operations are performed on those tensors and the result is the same, hence ‘All tests passed’ in both situations.
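That last point is easy to verify in a small sketch (the shapes here are my own illustration, not the lab’s): the same op applied to a Variable and to a constant holding identical values produces identical results.

```python
import tensorflow as tf

W_var = tf.Variable([[2.0]])     # mutable container
W_const = tf.constant([[2.0]])   # immutable tensor, same value
X = tf.constant([[3.0]])

# tf ops read the Variable's current value, so both calls
# compute exactly the same product.
out_var = tf.matmul(W_var, X)
out_const = tf.matmul(W_const, X)

same = bool(tf.reduce_all(out_var == out_const).numpy())
print(same)
```

This is why the exercise’s tests pass either way: the grader only checks the result of the operations, not how the inputs were created.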