Coffee Roasting in TensorFlow Lab - initial weights?

In this lab C2_W1_Lab02_CoffeeRoasting_TF we have:


Dense performs a sigmoid activation in each layer, which requires weights W and b.
Where do the initial weights come from in the first Dense call here, which operates on the input x (Layer 0)?
Previously we initialized the weights to some random numbers, but I don't see where the weights are initialized here.

Thank you!

Good question. In the tensorflow site for the Dense class of layers:

There are a few options for the kernel (weights) and bias initializers, and there is a default that is used if nothing is specified.
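To make the default concrete, here is a rough NumPy sketch of what the 'glorot_uniform' kernel initializer does (this is my own illustration, not TensorFlow's actual source; the function name is made up):

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, seed=0):
    # Glorot/Xavier uniform: sample from U(-limit, limit)
    # with limit = sqrt(6 / (fan_in + fan_out)).
    rng = np.random.default_rng(seed)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# A Dense(3) layer applied to 2 input features gets a (2, 3) kernel;
# the bias defaults to zeros.
W = glorot_uniform(2, 3)
b = np.zeros(3)
print(W.shape, b.shape)  # (2, 3) (3,)
```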


thank you, that makes sense.

But then for the second Dense we don't specify the kernel either, so by default, it seems to me, it would use the same default kernel as the first Dense, but shouldn't it be using the one created by the first Dense?
Does Sequential somehow know to pass the kernel updated in one Dense to the next Dense?
Thank you!

Each layer may be initialized with a similar method, such as "glorot_uniform", but the weights and biases evolve differently for each layer, so I am not sure I understand what you mean by passing the updated kernel to the next layer. Each layer has its own set of weights.
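To illustrate, here is a small NumPy sketch (not TensorFlow's actual code) in which both layers use the same initialization method but each draws its own, independent set of weights. The shapes assume the lab's 2-feature input and Dense(3)/Dense(1) layers:

```python
import numpy as np

rng = np.random.default_rng(0)

def glorot_uniform(fan_in, fan_out):
    # The same initialization METHOD is used for every layer...
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# ...but each layer owns a separate kernel and bias.
W1, b1 = glorot_uniform(2, 3), np.zeros(3)   # layer 1: 2 inputs -> 3 units
W2, b2 = glorot_uniform(3, 1), np.zeros(1)   # layer 2: 3 inputs -> 1 unit

# The two kernels are drawn independently; neither is "passed" to the other.
print(W1.shape, W2.shape)  # (2, 3) (3, 1)
```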

Layer 1 uses the kernel from Layer 0 (kernel_initializer='glorot_uniform').

Layer 2 uses Kernel created by Layer 1.
But in the second Dense call we don't specify kernel_initializer='Layer1_kernel' (not sure about the correct syntax; just trying to convey that it is the kernel generated by Layer 1). So, if we don't specify kernel_initializer, by default the second Dense should also use kernel_initializer='glorot_uniform', correct?
But it doesn't; it uses the correct kernel (from Layer 1). So how does Layer 2 know to use the correct kernel if we don't specify it in the call?
Does the question make sense?
Thank you

From reading the TF “Dense” documentation, it looks to me like any Dense layer uses glorot_uniform to initialize the weights, unless you specify something else.

Can you post some data that shows this?

No data - but from the lectures, Layer 2 uses the weights (kernel) from Layer 1, not the default 'glorot_uniform'.

So, how do we pass the weights (kernel) from Layer 1 to Layer 2 if we don't specify it in the second Dense call (in which case Dense is supposed to use the default, 'glorot_uniform', instead of the weights from Layer 1)?
I’m sorry if I am not being clear…

In other words, which weights does the second Dense use, and how does it get them?

Hello @Svetlana_Verthein,

In the screenshot from your first post, there are 2 calls to tf.keras.layers.Dense. As @gent.spah pointed out, a tf.keras.layers.Dense uses 'glorot_uniform' by default to initialize the weights. Therefore,

  1. both Dense layers will be initialized using the same method, 'glorot_uniform'
  2. since one Dense is initialized after the other, even though they both use the same method, their initialized weights will be different
  3. the weights of both Dense layers are initialized according to 'glorot_uniform', and it is NOT true that the second Dense waits for the first Dense to pass its weights. There is no passing of weights between layers during weight initialization.


Tensorflow does not pass weights from Layer 1 to Layer 2.

Calling 'glorot_uniform' initializes the weights randomly. This is true for both the first and the second Dense.

Could you please share the source of this? Perhaps the video name and timestamp, or which lab and which section of the lab?



Thank you, Raymond, everything is clear to me now. I see where I was confused - for some reason I thought we were passing not just the activation values but also the weights from the previous layer. Now I see I was obviously wrong, and the lectures don't say that.
I really appreciate your patience and clarity of explanation!

You are very welcome, @Svetlana_Verthein! :slight_smile:


I had a similar question: how are the initial weights for each layer determined? This thread helped. Thanks!