I can’t understand the documentation for this point

Integer, tuple of integers, or None. The axis or axes that should have a separate mean and variance for each index in the shape. For example, if shape is `(None, 5)` and `axis=1`, the layer will track 5 separate mean and variance values for the last axis. If `axis` is set to `None`, the layer will normalize all elements in the input by a scalar mean and variance. When `-1`, the last axis of the input is assumed to be a feature dimension and is normalized per index. Note that in the specific case of batched scalar inputs where the only axis is the batch axis, the default will normalize each index in the batch separately. In this case, consider passing `axis=None`. Defaults to `-1`.
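To make that concrete, here is a small sketch of what the two settings mean, using NumPy in place of the layer's internal statistics (the array values are made up for illustration). With `axis=-1` the layer keeps one mean/variance per index of the last axis and reduces over everything else; with `axis=None` it keeps a single scalar pair:

```python
import numpy as np

# Hypothetical batch: 4 samples, 5 features, i.e. shape (None, 5) -> (4, 5)
x = np.array([[1., 2., 3., 4., 5.],
              [2., 3., 4., 5., 6.],
              [3., 4., 5., 6., 7.],
              [4., 5., 6., 7., 8.]])

# axis=-1 (the default): 5 separate statistics, one per feature column,
# computed by reducing over the remaining (batch) axis.
per_feature_mean = x.mean(axis=0)   # shape (5,)
per_feature_var = x.var(axis=0)     # shape (5,)

# axis=None: one scalar mean and variance over all elements.
scalar_mean = x.mean()              # a single number
scalar_var = x.var()                # a single number

print(per_feature_mean)  # five values, one per feature
print(scalar_mean)       # one value for the whole array
```

So "track 5 separate mean and variance values" in the docs just means the reduction skips the axis you named, leaving one statistic per index of that axis.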

The Keras documentation is not much help here: it’s written by people who already know the answers, so it doesn’t work well for learning.

Here is an explanation I found on the internet:

Oddly, this Stack Overflow post has exactly the same title as you chose for this thread!

I read that article before and I still didn’t get it.

Can anyone explain it in a simpler way?

Does `axis` mean that I will get the mean and variance of the last feature in the input to use in the normalization?

Axis refers to a dimension index of an array.

Example: a 3D array has three axes, numbered 0, 1, and 2.

I think the best way to investigate this is to create a small 3D array, so you can compute the means yourself, and then do some experiments with the `axis=` parameter.
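Here is one version of that experiment, sketched with NumPy (the array contents are arbitrary, just small enough to check by hand). One thing to watch out for: `np.mean(axis=k)` *reduces over* axis `k`, while the Keras layer's `axis=k` means "keep a separate statistic per index of axis `k`" and reduce over all the others:

```python
import numpy as np

# A small 3D array whose means are easy to verify by hand:
# shape (2, 3, 4), so it has axes 0, 1, and 2.
a = np.arange(24, dtype=float).reshape(2, 3, 4)

# NumPy convention: reducing over an axis removes that dimension.
print(a.mean(axis=0).shape)  # (3, 4)
print(a.mean(axis=1).shape)  # (2, 4)
print(a.mean(axis=2).shape)  # (2, 3)

# The Keras Normalization layer's axis=-1 corresponds to reducing
# over every *other* axis, leaving one mean per last-axis index:
per_last_axis = a.mean(axis=(0, 1))  # shape (4,)
print(per_last_axis)                 # [10. 11. 12. 13.]
```

Printing the shapes first, before looking at the values, is a good habit: the shape of the result tells you immediately which axes were reduced away.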

I will do that and share my results.