What's the difference between mu and sigma in the code?

In the code, mu and sigma both come from the same kind of layer (a Dense layer with 20 units), and it is said that mu is the mean and sigma is the standard deviation, but no mean or standard deviation is actually computed anywhere.

I have passed the assignment, but the generated images in the plot look like this:

In the code, the probabilistic encoder outputs two vectors, mu and sigma, which parameterize the latent space before anything is passed on to the probabilistic decoder. The key point is that mu and sigma are not computed as statistics of the data; they are *learned outputs* of two Dense layers, trained so that together they describe a Gaussian distribution over the latent space for each input. A latent vector z is then sampled using the reparameterization trick: draw epsilon (Gaussian noise), and compute z = mu + sigma * epsilon. It is this sampled z, not mu or sigma alone, that is passed through the decoder's dense layers to produce the output.
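A minimal NumPy sketch of this sampling step may make it concrete. The weights and layer sizes here are hypothetical stand-ins (the assignment's actual encoder is a trained Keras model); the point is only to show that mu and sigma are plain Dense-layer outputs combined with noise via z = mu + sigma * epsilon:

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, W, b):
    # A single fully connected (Dense) layer: x @ W + b
    return x @ W + b

hidden_dim = 20   # matches the 20-unit layer mentioned in the question
latent_dim = 2    # hypothetical latent size for illustration

# Hypothetical (untrained) weights for the two encoder heads
W_mu = rng.normal(size=(hidden_dim, latent_dim))
b_mu = np.zeros(latent_dim)
W_logvar = rng.normal(size=(hidden_dim, latent_dim))
b_logvar = np.zeros(latent_dim)

h = rng.normal(size=(1, hidden_dim))        # encoder hidden activations for one input

mu = dense(h, W_mu, b_mu)                   # learned mean of q(z|x)
log_var = dense(h, W_logvar, b_logvar)      # learned log-variance of q(z|x)
sigma = np.exp(0.5 * log_var)               # standard deviation, always positive

# Reparameterization trick: sample z = mu + sigma * epsilon
epsilon = rng.standard_normal(mu.shape)     # Gaussian noise
z = mu + sigma * epsilon                    # this z is what the decoder receives
```

Note that predicting log-variance and exponentiating is a common trick to keep sigma positive; the sampled z (not mu or sigma on its own) is what gets fed into the decoder's Dense layers.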