DeepLearning.AI
Why "logit" stands for the output of linear activation function
Course Q&A
Machine Learning Specialization
Advanced Learning Algorithms
week-module-2
rmwkwok
September 5, 2022, 5:37am
If you would like to read some maths, you might check out this discussion.
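To illustrate the terminology in the thread title: the raw output of the final linear layer is called a "logit" because the logit (log-odds) function is the inverse of the sigmoid, so applying it to the sigmoid's probability recovers the linear output exactly. A minimal sketch (the function names here are just illustrative, not from any course code):

```python
import math

def sigmoid(z):
    # Logistic function: maps a real-valued logit to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def logit(p):
    # Log-odds function: the inverse of the sigmoid.
    return math.log(p / (1.0 - p))

z = 2.0            # raw linear-layer output, i.e. a "logit"
p = sigmoid(z)     # corresponding probability
# Applying logit() to the probability recovers the original linear output.
assert abs(logit(p) - z) < 1e-9
```

This inverse relationship is also why frameworks such as TensorFlow accept the raw linear output via `from_logits=True` and apply the activation internally for better numerical stability.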