MobileNet vs ResNet

Q1: Why don't we remove the ReLU after the skip-connection addition in ResNet50, as MobileNetV2 does, for better performance?

Q2: And why doesn't MobileNetV2 use a convolution layer in the skip connection to match dimensions when the layer dimensions change, the way ResNet does to match the output channels?

Hi @tarunsaxena1000

In ResNet50, the ReLU after the addition introduces extra non-linearity. MobileNetV2, by contrast, uses a linear bottleneck: each block projects down to a low-dimensional tensor, and applying ReLU to that narrow representation would zero out values and destroy information. So the addition is left linear to preserve important features.
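For illustration, here's a minimal PyTorch sketch (not the official implementations; the channel sizes and the `ResNetStyleBlock` / `MobileNetV2StyleBlock` names are made up) contrasting where the two designs place the activation relative to the skip addition:

```python
import torch
import torch.nn as nn

class ResNetStyleBlock(nn.Module):
    """ResNet style: ReLU applied AFTER the skip addition."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # ReLU on the sum: extra non-linearity, but negative values
        # of (out + x) are zeroed out.
        return self.relu(out + x)

class MobileNetV2StyleBlock(nn.Module):
    """MobileNetV2 style: linear bottleneck, NO activation after the addition."""
    def __init__(self, channels, expansion=6):
        super().__init__()
        hidden = channels * expansion
        self.expand = nn.Sequential(        # 1x1 expansion, ReLU6 inside the block
            nn.Conv2d(channels, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
        )
        self.depthwise = nn.Sequential(     # 3x3 depthwise conv
            nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
        )
        self.project = nn.Sequential(       # linear 1x1 projection: no ReLU here,
            nn.Conv2d(hidden, channels, 1, bias=False),  # so the narrow tensor
            nn.BatchNorm2d(channels),                    # keeps its information
        )

    def forward(self, x):
        out = self.project(self.depthwise(self.expand(x)))
        return out + x  # plain addition, no ReLU afterwards

x = torch.randn(1, 32, 56, 56)
print(ResNetStyleBlock(32)(x).shape)       # torch.Size([1, 32, 56, 56])
print(MobileNetV2StyleBlock(32)(x).shape)  # torch.Size([1, 32, 56, 56])
```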

On the skip connections: MobileNetV2 prioritizes efficiency and low computational cost, so it only adds a skip connection when the input and output shapes already match (stride 1, same number of channels); when they don't, it simply omits the skip rather than paying for a projection. ResNet, with its emphasis on deep residual learning, keeps a shortcut in every block and uses a 1x1 convolution to match dimensions when they change.
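A sketch of the two dimension-matching policies (again just an illustration under the assumptions above, not the reference code):

```python
import torch
import torch.nn as nn

def resnet_shortcut(in_ch, out_ch, stride):
    """ResNet policy: if shapes change, match them with a 1x1 conv projection."""
    if stride != 1 or in_ch != out_ch:
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=1, stride=stride, bias=False),
            nn.BatchNorm2d(out_ch),
        )
    return nn.Identity()

def mobilenetv2_use_skip(in_ch, out_ch, stride):
    """MobileNetV2 policy: use an identity skip only when input and output
    shapes already match; otherwise there is no skip connection at all,
    saving the parameters and FLOPs a projection would cost."""
    return stride == 1 and in_ch == out_ch

x = torch.randn(1, 64, 56, 56)
print(resnet_shortcut(64, 128, 2)(x).shape)  # projected: [1, 128, 28, 28]
print(mobilenetv2_use_skip(64, 128, 2))      # False -> no skip connection
print(mobilenetv2_use_skip(64, 64, 1))       # True  -> identity skip
```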

Hope this helps!


If removing the ReLU after each bottleneck block helps preserve information and avoid losing important features, why shouldn't we do the same in ResNet? Wouldn't it help ResNet too?

It's the aim that matters. The ReLU after the addition gives ResNet additional non-linearity, which suits its goal of maximizing representational power. MobileNetV2, however, aims to be an efficient model that runs on smaller devices, so it preserves information in its narrow bottleneck layers while keeping the model as accurate as possible within that budget. Many factors go into designing an architecture.


Thank you for your timely responses, @Alireza_Saei @lukmanaj. So basically you're saying that the more non-linearity we have, the better the network can be, and in MobileNet we trade some of this accuracy for a computationally cheaper network?

You’re welcome! Happy to help :raised_hands:

Yes, more non-linearity can enhance a neural network's performance by allowing the model to learn more complex representations of the data.

In MobileNet, we trade some accuracy for a computationally efficient network suitable for mobile and embedded devices.
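To make the efficiency side of that trade concrete, here is a small sketch (channel counts are made up) comparing the parameter count of a standard 3x3 convolution with the depthwise-separable pair MobileNet builds on:

```python
import torch.nn as nn

# Parameters of a standard 3x3 convolution vs a depthwise-separable
# equivalent (depthwise 3x3 + pointwise 1x1), the core trick behind
# MobileNet's efficiency. Channel counts here are illustrative.
cin, cout, k = 128, 128, 3

standard  = nn.Conv2d(cin, cout, k, padding=1, bias=False)
depthwise = nn.Conv2d(cin, cin, k, padding=1, groups=cin, bias=False)
pointwise = nn.Conv2d(cin, cout, 1, bias=False)

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(standard))                      # 147456
print(count(depthwise) + count(pointwise))  # 17536, roughly 8x fewer
```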
