I am running the training part of C2W3_Assignment and the hint says that I need 3 linear layers and one ReLU activation. So altogether it should be 4 layers if I am not mistaken.
The hint says that the code should be five lines. What does that fifth line do? In the test part, the third line also checks whether my network has more than 4 layers.
I have the feeling that I need a fifth layer but this is not what I get from the exercise description.
Could you help me please?
Thank you in advance!
No, you don’t need to add an additional layer.
Let me explain the description to you. It says:
You need 3 linear layers and should use ReLU activations. ReLU is not typically counted as a separate layer in neural networks; it is an activation function applied to a layer's output. The instruction asks you to add 3 layers (3 linear layers) and apply the ReLU activation function between them.
So, for example, suppose I were given an instruction that said I need to define 2 linear layers with ReLU() activations, and that it should be 3 lines. I would start by defining a linear layer. On the next line I would define the ReLU() activation; since it is a sequential model (nn.Sequential), the activation is applied to the output of the first linear layer, and its result becomes the input of the next layer. The third line would then define one more linear layer. That satisfies all the requirements.
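As a minimal sketch of that 2-layer example (the feature sizes here are made up for illustration, not the assignment's actual dimensions):

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only; your assignment will specify its own.
in_features, hidden, out_features = 10, 32, 2

# Three "lines" inside nn.Sequential, but only 2 linear layers:
# the ReLU is an activation applied to the first layer's output,
# not a layer with its own weights.
model = nn.Sequential(
    nn.Linear(in_features, hidden),
    nn.ReLU(),
    nn.Linear(hidden, out_features),
)

x = torch.randn(4, in_features)
print(model(x).shape)  # torch.Size([4, 2])
```

Extending the same pattern to 3 linear layers with a ReLU after each of the first two gives you the five lines the hint is counting.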
Use this and think about your case. Hope this helps. If not, feel free to post your questions.
Thank you very much! It is working now!