TensorFlow 1st Lab - Neurons and Layers


This is the first Python lab on TensorFlow, and it feels like a lot to grasp in the first TF lesson.

linear_layer = tf.keras.layers.Dense(units=1, activation='linear')
→ Please explain what Dense is and what units are. What are we doing here, and why choose a linear activation?

a1 = linear_layer(X_train[0].reshape(1,1))
→ I thought X_train was already in (1,1) shape.

linear_layer.set_weights([set_w, set_b])
→ This is confusing. Please explain why we are passing an array of weights.

a1 = linear_layer(X_train[0].reshape(1,1))
print(a1)
alin = np.dot(set_w, X_train[0].reshape(1,1)) + set_b

→ I am lost here as to why we are doing this.

model = Sequential(
    [
        tf.keras.layers.Dense(1, input_dim=1, activation='sigmoid', name='L1')
    ]
)

→ Why wasn't Sequential used for the linear one?

Let me know if I am asking too many questions.
This is the first lab, and I am confused about its objective, and about how and why this path was chosen to build the neural network.

Thanks


Hello @Venkat_Subramani,

I will connect the lab's relevant content to your questions, with some additions. It will take some time to think it through and understand, but it is also a good chance to do so :wink:

To start, please be reminded that the purpose of the lab is to introduce the ideas of a neural network, a neural network layer, and a neuron. A neural network can contain any number of layers, and a layer can contain any number of neurons.

In particular, a network with one layer that contains only one neuron, which is what the lab is showing, is actually a linear regression formula. Moreover, if we add a sigmoid activation to that only neuron, it becomes a logistic regression formula.

In other words, the lab wants to connect our existing knowledge about linear/logistic regression with neurons!
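As a rough sketch in plain NumPy (with hypothetical weight, bias, and input values, not the lab's actual ones), the two formulas a single neuron can represent look like this:

```python
import numpy as np

# Hypothetical weight, bias, and input -- not the lab's actual values
w, b = 2.0, -1.0
x = 3.0

# One neuron with a linear activation computes w*x + b: linear regression
a_linear = w * x + b

# The same neuron with a sigmoid activation applied: logistic regression
a_logistic = 1.0 / (1.0 + np.exp(-(w * x + b)))

print(a_linear)    # 5.0
print(a_logistic)  # sigmoid(5.0), about 0.993
```

This is all a `Dense(units=1)` layer does: `units` is the number of neurons, and the activation decides whether the raw `w*x + b` is returned as-is (linear) or squashed through a sigmoid.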

Before you continue, I recommend you read the lab's introductory paragraph, which states the objective again.

With my starting comment in mind, see if you can make sense of the following graph:

We are dealing with X_train[0] here, not X_train. Please check the shape of X_train[0] instead.
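To illustrate with a minimal NumPy sketch (hypothetical data, shaped like the lab's one-feature `X_train`):

```python
import numpy as np

# Hypothetical training data with the lab's (examples, 1 feature) layout
X_train = np.array([[1.0], [2.0]])

print(X_train.shape)     # (2, 1): the full array is 2D
print(X_train[0].shape)  # (1,): indexing one example drops a dimension

# reshape(1, 1) restores the 2D (batch, feature) shape a Dense layer expects
print(X_train[0].reshape(1, 1).shape)  # (1, 1)
```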

Again, with my starting comment in mind, see if you can make sense of the following graph:

Simply not necessary. It is good for us to see how one thing can be achieved in different ways. Remember, one of the lab's jobs is to show us possibilities. Now you know we can do it in two different ways, and you can verify the other way yourself for the case of logistic regression :wink:
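Here is a sketch of the two equivalent routes for the logistic case (hypothetical weight and bias values; assumes TensorFlow is installed):

```python
import numpy as np
import tensorflow as tf

set_w = np.array([[2.0]])   # hypothetical weight
set_b = np.array([-4.5])    # hypothetical bias
x = np.array([[1.0]])

# Route 1: call a Dense layer directly, as the lab does for the linear case
layer = tf.keras.layers.Dense(units=1, activation='sigmoid')
layer(x)  # call once so the layer builds and its weights exist
layer.set_weights([set_w, set_b])
a1 = layer(x).numpy()

# Route 2: wrap the same kind of layer in a Sequential model
model = tf.keras.Sequential([tf.keras.layers.Dense(units=1, activation='sigmoid')])
model(x)  # call once so the model builds
model.layers[0].set_weights([set_w, set_b])
a2 = model(x).numpy()

# Both compute sigmoid(2.0 * 1.0 - 4.5)
print(a1, a2)
```

Both routes give the same number; `Sequential` only becomes genuinely useful once you stack more than one layer.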

Cheers,
Raymond

PS: I edited your post to format the code.


Thank you so much, Raymond!!!


No problem, @Venkat_Subramani!

Hey @rmwkwok, so X_train[0] is [1.], and when I reshape it I get [[1.]].
My question is: why do we use the latter? Is it that this example might happen to work without reshaping but would need it once we have a decent number of inputs, or is there another reason?

Hello @Tarun_Kumar_S,

If we were still talking about this line, then it won't work without the reshape. TensorFlow's Dense expects a 2D array. You can give it a try! :wink:
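A quick check (assumes TensorFlow is installed) shows what happens with and without the reshape:

```python
import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Dense(units=1)
x0 = np.array([1.0])  # 1D, shape (1,), like X_train[0]

# Passing the 1D array raises an error: Dense requires input of rank >= 2
try:
    layer(x0)
    worked_without_reshape = True
except Exception:
    worked_without_reshape = False

# After reshape(1, 1) the input is 2D (1 example, 1 feature) and it works
out = layer(x0.reshape(1, 1))
print(worked_without_reshape, out.shape)
```

So the reshape is not about the size of the dataset; even for a single example, Dense wants the explicit (batch, features) layout.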

Cheers,
Raymond
