GlobalAveragePooling2D

Why are we using GlobalAveragePooling2D() in the lab “Transfer Learning with ResNet50”?
Can anybody explain what GlobalAveragePooling2D() is, and what the benefit of using it here is?

The answer at this link gives a pretty good explanation of GlobalAveragePooling2D():

and I quote:
“GlobalAveragePooling2D does something different. It applies average pooling on the spatial dimensions until each spatial dimension is one, and leaves other dimensions unchanged. In this case values are not kept as they are averaged. For example a tensor (samples, 10, 20, 1) would be output as (samples, 1, 1, 1), assuming the 2nd and 3rd dimensions were spatial (channels last).”
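The operation the quote describes can be sketched in plain NumPy: global average pooling is just a mean over the spatial axes. Note that Keras's GlobalAveragePooling2D drops the collapsed spatial dimensions by default, so the output is `(samples, channels)` rather than `(samples, 1, 1, channels)`; the shapes below follow the NumPy sketch, not any particular Keras version.

```python
import numpy as np

# A batch of 4 feature maps, each 10x20 spatially, with 3 channels
# (channels-last layout, as in the quote).
x = np.random.rand(4, 10, 20, 3)

# Global average pooling: average over the spatial axes (height, width),
# collapsing each 10x20 feature map to a single value per channel.
pooled = x.mean(axis=(1, 2))

print(pooled.shape)  # (4, 3): one averaged value per sample, per channel
```

Each output entry is the mean of one entire spatial feature map, so no trainable parameters are involved.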

Here is a nice description of its benefits:
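One benefit is easy to see with back-of-the-envelope arithmetic: pooling before the classification head shrinks the number of weights that head needs. The numbers below assume ResNet50's final convolutional feature map is 7x7x2048 (its standard shape for 224x224 inputs) and a hypothetical 10-class head.

```python
# Assumed final ResNet50 feature-map shape for 224x224 inputs.
h, w, c = 7, 7, 2048
num_classes = 10  # hypothetical number of output classes

# Features fed to a Dense layer if we Flatten vs. globally average pool:
flatten_features = h * w * c  # 100352
gap_features = c              # 2048

# Weight count (ignoring biases) of a Dense head on each:
dense_after_flatten = flatten_features * num_classes  # 1,003,520
dense_after_gap = gap_features * num_classes          # 20,480

print(dense_after_flatten // dense_after_gap)  # 49x fewer weights with pooling
```

Fewer weights in the new head means less to train during transfer learning and less risk of overfitting a small dataset; the averaging also makes the head independent of the input's spatial size.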