DeepLearning.AI
Derivation of the gradients of W^[2] and b^[2] in the one-hidden-layer network
Course Q&A › Deep Learning Specialization › Neural Networks and Deep Learning › week-module-3, coursera-platform
saifkhanengr · March 17, 2025, 1:36pm
Check this YouTube guide by Eddy Shyu, and this explanation of the chain rule.
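
For reference, here is a sketch of what the chain rule yields for these gradients, using the course's notation for a sigmoid output unit with cross-entropy loss (this is my reconstruction of the standard derivation, not the linked video's exact steps):

```latex
\[
z^{[2]} = W^{[2]} a^{[1]} + b^{[2]}, \qquad
a^{[2]} = \sigma\!\left(z^{[2]}\right), \qquad
\mathcal{L} = -\left( y \log a^{[2]} + (1-y)\log\!\left(1 - a^{[2]}\right) \right)
\]
\[
\frac{\partial \mathcal{L}}{\partial W^{[2]}}
= \frac{\partial \mathcal{L}}{\partial a^{[2]}}
  \cdot \frac{\partial a^{[2]}}{\partial z^{[2]}}
  \cdot \frac{\partial z^{[2]}}{\partial W^{[2]}}
= \left(a^{[2]} - y\right) a^{[1]\,T},
\qquad
\frac{\partial \mathcal{L}}{\partial b^{[2]}} = a^{[2]} - y
\]
```

The middle factor simplifies because $\partial \mathcal{L} / \partial a^{[2]} = -\left(\frac{y}{a^{[2]}} - \frac{1-y}{1-a^{[2]}}\right)$ and $\sigma'(z) = \sigma(z)\,(1-\sigma(z))$ multiply out to $a^{[2]} - y$.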
Related topics

| Topic | Replies | Views | Activity |
| --- | --- | --- | --- |
| W3_A1_Derivative for hidden neural layers (Backprop) — Neural Networks and Deep Learning, coursera-platform | 5 | 608 | February 9, 2023 |
| Week 3, "Gradient Descent for Neural Networks" — Neural Networks and Deep Learning, week-module-3, coursera-platform | 10 | 475 | March 25, 2024 |
| W3_A1_Ex-6_What's the link between dz[1] and w[2] equation? — Neural Networks and Deep Learning, coursera-platform | 1 | 584 | October 23, 2022 |
| W2_A2_Calculation of Partial derivatives — Neural Networks and Deep Learning, coursera-platform | 12 | 1013 | July 24, 2023 |
| Course 1: Week 3 (backpropagation intuition) — Neural Networks and Deep Learning, coursera-platform | 21 | 5294 | April 27, 2022 |