Why does the final layer become linear in the softmax lab?

I was doing the softmax lab, and I am a little confused as to why the final layer has become linear and in what sense it is linear.

No, a layer does not mysteriously “become” linear when you set its activation function to ‘linear’: that setting simply means no nonlinearity is applied, so the layer outputs z = wx + b as-is. Similarly, a layer is a “relu” layer only in the sense that ReLU is the function applied to its output. If you have questions about the purpose of activation functions, please watch the Course 2 Week 2 video “Why do we need activation functions?”.
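
In Keras terms, the activation setting just selects the function applied to the layer’s pre-activation output z. Here is a tiny sketch (the values of z are made up, purely for illustration) showing that ‘linear’ leaves z unchanged while ‘relu’ zeroes out the negatives:

```python
import tensorflow as tf

# Made-up pre-activation values z, purely for illustration.
z = tf.constant([[-2.0, 0.5, 3.0]])

tf.keras.activations.linear(z)  # [-2.0, 0.5, 3.0] -- z passes through unchanged
tf.keras.activations.relu(z)    # [ 0.0, 0.5, 3.0] -- negatives clipped to zero
```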

As for why we use ‘linear’ for the last layer, please watch the Course 2 Week 2 video “Improved implementation of softmax”.
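
In case a concrete picture helps alongside the video, here is a minimal sketch of the pattern it describes (the layer sizes below are hypothetical): the last Dense layer stays ‘linear’ so it outputs raw logits, and the softmax is folded into the loss via from_logits=True, which is more numerically stable than computing softmax inside the model.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(25, activation='relu'),    # hypothetical hidden sizes
    tf.keras.layers.Dense(15, activation='relu'),
    tf.keras.layers.Dense(10, activation='linear'),  # outputs logits; no softmax here
])

# from_logits=True tells the loss to apply softmax internally, in a
# numerically stable way, rather than the model applying it.
model.compile(
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
)

# At prediction time, apply softmax to the logits to recover probabilities:
# probabilities = tf.nn.softmax(model(X))
```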

Cheers,
Raymond

PS: Sorry, I mis-edited your post.