Week 04 - Building your Deep Neural Network for Regression

Hi all, the programming exercise for week 4 is for classification problems.

I want to implement my own deep neural network that can be used for regression. I intend to use ‘relu’ activations for the first L-1 layers and a ‘linear’ activation for the L-th layer. How will my code change from the programming exercise? More specifically:

  1. If I use the mean squared error as the loss function, what will be the expression for dAL to be used for the backpropagation?
  2. How do I calculate the dAprev, dW and db for a layer with ‘linear’ activation?
  3. What other things should I change from the programming exercise in order to use it for regression problems?

Thanks in advance.


  1. As the cost is J = \frac{1}{2m}\sum_{i=1}^{m}\left(\hat{y}^{(i)} - y^{(i)}\right)^2, the derivative dAL will be (AL - Y). (The \frac{1}{m} factor is applied inside linear_backward, as in the assignment.)
  2. Same as in the assignment: for a linear activation g(z) = z, g'(z) = 1, so dZ = dA, and dAprev, dW, and db then follow from linear_backward unchanged.
  3. Use MSE as the cost, and an activation other than sigmoid/softmax (here, linear) for the last layer.
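Points 1 and 2 can be sketched together in NumPy. This is a minimal, hypothetical sketch, assuming the assignment's convention that the 1/m factor lives inside linear_backward; the function name linear_backward and its cache layout are borrowed from the exercise for illustration.

```python
import numpy as np

def linear_backward(dZ, cache):
    """Backward step through Z = W A_prev + b, as in the assignment."""
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = (1.0 / m) * dZ @ A_prev.T
    db = (1.0 / m) * np.sum(dZ, axis=1, keepdims=True)
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

np.random.seed(1)
A_prev = np.random.randn(3, 5)   # activations from layer L-1 (3 units, 5 examples)
W = np.random.randn(1, 3)
b = np.zeros((1, 1))
Y = np.random.randn(1, 5)        # regression targets

# Forward through the last layer with a linear activation
ZL = W @ A_prev + b
AL = ZL                          # linear activation: AL = ZL

# Backward: for MSE cost J = (1/(2m)) * sum((AL - Y)**2),
# dAL = AL - Y; and since g(z) = z has g'(z) = 1, dZL = dAL.
dAL = AL - Y
dZL = dAL
dA_prev, dW, db = linear_backward(dZL, (A_prev, W, b))
```

The rest of the backward pass (the relu layers) stays exactly as in the exercise; only the very first gradient fed into it changes.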

Hi, thank you for your response. I have some results using (a) the two-layer neural network (from the week 3 exercise) and (b) the L-layer neural network. But the results of the two are different even though I am using the same architecture. I am using a Matlab implementation of my code, which I based on the programming exercises.

Using two-layer neural network code

Using L-layer neural network code

What am I missing out here?

Oh, I figured it out. The second graph uses relu for the first L-1 layers while the first graph uses sigmoid, hence the difference.

I think the result is already fine.