Algorithm for CNN

Hello, I’m a newbie with convolutional neural networks, and I need someone to explain this algorithm to me:
[image: hyparameter_CNN_Model]

Perhaps the source where you got that table also includes a description of what task the model is designed to perform.

No, I found nothing; it’s just an article. But I can’t figure out why they use Leaky ReLU + ReLU after the FC layer.

To understand what you show there, I think your best bet is to take DLS C4, which is all about convolutional networks from the ground up. It’s too much to answer here: there are 4 weeks of lectures and 8 programming assignments to cover all that material.

ReLU and Leaky ReLU are perfectly valid choices for activation functions. You commonly see them used in lots of different network architectures, including DNNs, CNNs and even RNNs. They are the “minimalist” activation functions, in that they are very inexpensive to compute. But they don’t always work: in particular, ReLU has the “dead neuron” problem for z < 0, where the gradient is exactly zero, so those neurons stop learning. You try them first, and if they work, you are “good to go”. If they don’t, then you graduate to more complex and expensive functions, like tanh, sigmoid and swish, which are based on exponentials, which is why they cost more to compute. There is never any guarantee that you know which activations to use in a given network without trying them to see what happens.
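Here’s a minimal NumPy sketch of the two functions (the alpha = 0.01 default is just a common choice, not something taken from your table), to show how cheap they are and where the “dead” region sits:

```python
import numpy as np

def relu(z):
    # ReLU: max(0, z). The gradient is exactly 0 for z < 0,
    # which is the source of the "dead neuron" problem.
    return np.maximum(0, z)

def leaky_relu(z, alpha=0.01):
    # Leaky ReLU: a small slope alpha for z < 0 keeps the
    # gradient nonzero there, so neurons can recover.
    return np.where(z > 0, z, alpha * z)

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(z))        # [0.  0.  0.  1.5]
print(leaky_relu(z))  # [-0.02  -0.005  0.  1.5]
```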

Ah, ok, sorry, I had not actually looked at the details of the network you show. I agree that I do not understand why they would put a Leaky ReLU layer directly after a ReLU layer. That literally makes no sense: the second layer is a no-op, since ReLU’s output is never negative. It would do no harm, but it’s just a waste. Well, unless there are more details that they aren’t showing in that table, e.g. if the network is not “simply connected”.
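To make that concrete, here’s a quick NumPy check (again just a sketch, with a typical alpha): since ReLU’s output is always >= 0 and Leaky ReLU is the identity for nonnegative inputs, composing them changes nothing:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)

z = np.random.randn(5)
a = relu(z)  # every entry of a is >= 0

# Leaky ReLU applied after ReLU returns its input unchanged
print(np.allclose(leaky_relu(a), a))  # True
```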
