Applied batch norm in the W3 programming exercise?

Hi,
At the end of the W3 exercise, the quick recap is:

  • Used tf.Variable to modify your variables
  • Applied TensorFlow decorators and observed how they sped up your code
  • Trained a Neural Network on a TensorFlow dataset
  • Applied batch normalization for a more robust network

I understand how to implement the mini-batch algorithm, but I didn't see where batch normalization is actually applied anywhere in the exercise.
Thanks for your help


Hi Fredo,

Thanks for the feedback! I think you found an issue.
The batch normalization should happen in the implementation of forward_propagation, where you would compute Znorm before passing it on to compute A[n].
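
For illustration only, here is a minimal sketch of what that could look like for a single layer, assuming the exercise's parameter naming (W1, b1) and its TensorFlow style. The batch-norm step (standardizing Z with tf.nn.moments before the activation) is my own illustration of the idea being discussed, not the graded notebook's solution:

```python
import tensorflow as tf

def forward_one_layer_with_batchnorm(X, parameters, epsilon=1e-7):
    """
    Hypothetical sketch: one layer of forward propagation where the
    pre-activation Z1 is batch-normalized (Znorm) before the activation.
    """
    W1, b1 = parameters['W1'], parameters['b1']

    # Linear step: Z1 = W1 X + b1
    Z1 = tf.math.add(tf.linalg.matmul(W1, X), b1)

    # Batch normalization: standardize Z1 across the mini-batch dimension
    mean, variance = tf.nn.moments(Z1, axes=[1], keepdims=True)
    Z1norm = (Z1 - mean) / tf.math.sqrt(variance + epsilon)

    # The activation is applied to the normalized pre-activation
    A1 = tf.keras.activations.relu(Z1norm)

    return A1
```

Normalizing the pre-activation this way makes each layer less sensitive to the scale of its inputs, which is what the recap means by "a more robust network."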
