DeepLearning.AI
Why is it important to update both parameters at the same time?
Course Q&A
Machine Learning Specialization
Supervised ML: Regression and Classification
week-module-1
rmwkwok
June 22, 2022, 1:39pm
Hey @Elemento,

Here's the derivation for the case of a single w; the same logic applies to more parameters.
[Attached image: handwritten derivation of the simultaneous update]
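The point of the derivation can also be seen in code. Below is a minimal sketch (assuming a one-variable linear regression with a mean-squared-error cost, as in the course) contrasting the correct simultaneous update, where both partial derivatives are evaluated at the same (w, b), with an incorrect sequential update, where b's gradient is computed after w has already moved:

```python
import numpy as np

def gradients(w, b, x, y):
    """Partial derivatives of the MSE cost J(w, b) with respect to w and b."""
    err = w * x + b - y
    return np.mean(err * x), np.mean(err)

def step_simultaneous(w, b, x, y, alpha):
    """Correct: both gradients are computed at the SAME (w, b)."""
    dj_dw, dj_db = gradients(w, b, x, y)
    return w - alpha * dj_dw, b - alpha * dj_db

def step_sequential(w, b, x, y, alpha):
    """Incorrect: b's gradient is evaluated at the already-updated w,
    so b descends a different cost surface than w did."""
    dj_dw, _ = gradients(w, b, x, y)
    w = w - alpha * dj_dw               # w moves first...
    _, dj_db = gradients(w, b, x, y)    # ...and b's gradient sees the new w
    return w, b - alpha * dj_db

# Toy data (assumed for illustration): true line y = 2x + 1
x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 5.0, 7.0])

ws, bs = step_simultaneous(0.0, 0.0, x, y, alpha=0.1)
wq, bq = step_sequential(0.0, 0.0, x, y, alpha=0.1)
print(ws, bs)  # simultaneous update
print(wq, bq)  # sequential update: same w, but a different b
```

After a single step the two schemes already disagree on b, which is why the lectures insist on computing all partial derivatives before updating any parameter.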