C4W2: Why is it easier for a residual block to learn the identity function?

This is about the lecture topic "Why ResNets Work?".
Q1: At 3:47 in the video, I couldn't understand the sentence about it being easy for a residual block to learn the identity function. Could someone please explain it in another way?
Q2: About half a minute after 3:47, it is said that the residual block has little impact on the neural network. If that is so, I don't understand why we add the residual block at all; I hope someone can explain this in another way too.
Thanks very much.
When I create a post, the website limits the number of words I can enter, so it may have sent the wrong version. I would also ask the administrators to correct this if I have posted in the wrong area.


This is the slide:

Hey @boe,
You will find an excellent explanation for your query in this thread. Please go through it once and let me know if it helps.
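In short, the lecture's argument is: with the skip connection, the block computes a[l+2] = g(z[l+2] + a[l]) = g(W[l+2] a[l+1] + b[l+2] + a[l]). If regularization (e.g. weight decay) drives W[l+2] and b[l+2] towards zero, then a[l+2] = g(a[l]) = a[l], since a[l] is already non-negative coming out of a ReLU. So the block only needs its weights to shrink to zero to behave as the identity, which is much easier than a plain layer having to learn W = I. That also answers your Q2: at worst the extra block does no harm, and at best it learns something useful on top of the identity. Below is a minimal NumPy sketch of this idea; the function names and dimensions are just for illustration, not taken from the course code.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def residual_block(a_l, W1, b1, W2, b2):
    """Two-layer residual block: a[l+2] = relu(W2 @ relu(W1 @ a[l] + b1) + b2 + a[l])."""
    a1 = relu(W1 @ a_l + b1)   # first layer of the block
    z2 = W2 @ a1 + b2          # second layer, pre-activation
    return relu(z2 + a_l)      # skip connection adds a[l] before the final ReLU

n = 4
a_l = relu(np.random.randn(n))  # activations are non-negative (output of a previous ReLU)
W1, b1 = np.random.randn(n, n), np.random.randn(n)

# Suppose weight decay has shrunk the second layer's parameters to zero:
W2, b2 = np.zeros((n, n)), np.zeros(n)

out = residual_block(a_l, W1, b1, W2, b2)
print(np.allclose(out, a_l))    # True: the block computes the identity
```

Note that even with arbitrary weights in the first layer, zeroing out the second layer is enough to make the whole block act as the identity, because the skip connection carries a[l] straight through. A plain (non-residual) block would instead have to learn weights that exactly reproduce its input, which is much harder.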

Now as for this, I am not exactly sure what issue you are facing. By "website", are you referring to Discourse or something else? If you are referring to Discourse, where exactly does it limit you? I am assuming there might be a limit on the title only, and not on the body section. Please let us know a bit more about this, so that we can help you out.

Cheers,
Elemento

This is a picture of the website. I can only see Course 4, but not the specific week. If I could see it, I could be more sure whether I posted in the right section.

Hey @boe,
This is not an issue on your side; it is just how the Discourse section for the Deep Learning Specialization was created. Since it was one of the first specializations to move its support from Coursera to Discourse, only a course-wise split was set up at the time, hence the result. Specializations added to Discourse afterwards, such as the newly launched Machine Learning Specialization, have a week-wise split as well. For more clarity, you can always add the week in the title, as you have done in this thread. I hope this helps.

Cheers,
Elemento