Can you please explain the highlighted part?

density ratio between the model distribution and target?

Thank You

Varun

It means the output distribution of the decoder does not match that of the target.

The density ratio between the model distribution and the target distribution gives a measure of the distance/similarity between them. The learning task of the discriminator is to increase the distance between the fake and real distributions, while the generator's task is the opposite.

So, in statistics, the density ratio gives an estimate of the correction factor required to make one distribution (fake, in the GAN's case) equal to another distribution (real, in the GAN's case).

Mathematically, it is represented as **r = p(x) / q(x)**, where **r** is the density ratio and **p(x)** and **q(x)** are the two distributions in question.
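To make this concrete, here is a minimal sketch (my own illustrative example, not from any paper) that evaluates the density ratio between two Gaussians, treating one as the "real" distribution and the other as the "fake" one:

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-3, 3, 7)
p = gauss_pdf(x, 0.0, 1.0)   # "real" distribution p = N(0, 1)
q = gauss_pdf(x, 1.0, 1.0)   # "fake" distribution q = N(1, 1)
r = p / q                    # density ratio r(x) = p(x) / q(x)
```

Wherever r(x) > 1, the real distribution puts more mass than the fake one, and vice versa; r(x) = 1 everywhere would mean the two distributions match exactly.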

I would suggest reading about KL divergence, JS divergence, and f-divergence to understand how the density ratio is used. These divergences incorporate the idea of the density ratio to estimate the similarity between two distributions.
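For instance, the KL divergence is just the expected log density ratio under p: KL(p‖q) = E_{x~p}[log p(x)/q(x)]. A quick Monte Carlo check of this (again my own toy example with two Gaussians, where the closed-form answer is 0.5):

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# KL(p || q) = E_{x~p}[ log p(x)/q(x) ]: the expected log density ratio.
samples = rng.normal(0.0, 1.0, size=200_000)   # draws from p = N(0, 1)
log_ratio = np.log(gauss_pdf(samples, 0.0, 1.0) / gauss_pdf(samples, 1.0, 1.0))
kl_estimate = log_ratio.mean()
# Closed form for N(0,1) vs N(1,1): (mu_p - mu_q)^2 / 2 = 0.5
```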

Hello, @starboy

Link: GAN — Spectral Normalization. GAN is vulnerable to mode collapse and… | by Jonathan Hui | Medium

Spectral normalization is a normalization technique used for generative adversarial networks to stabilize the training of the discriminator. Spectral normalization has the practical property that the Lipschitz constant is the only hyper-parameter to be tuned.
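A rough numpy sketch of the idea (illustrative only; real implementations reuse the power-iteration vectors across training steps): divide a weight matrix by an estimate of its largest singular value, its spectral norm, so the layer is approximately 1-Lipschitz.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_normalize(W, n_iter=100):
    """Normalize W by its spectral norm, estimated via power iteration."""
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v          # estimate of the largest singular value
    return W / sigma

W = rng.normal(size=(4, 3))
W_sn = spectral_normalize(W)   # largest singular value of W_sn is ~1
```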

The highlighted line simply means that…

Training the discriminator gives us a good estimator, one that improves over time, that can distinguish between the generated images and the target images. Since the generated and target images each follow some distribution, the discriminator distinguishes between them based on the density ratio of those distributions.
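The link between the discriminator and the density ratio can be sketched as follows: for the standard GAN loss, the optimal discriminator is D*(x) = p(x) / (p(x) + q(x)), so the density ratio can be read off from it as r(x) = D*(x) / (1 − D*(x)). A toy illustration with two Gaussians standing in for the real and fake distributions (my own example):

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-3, 3, 7)
p = gauss_pdf(x, 0.0, 1.0)   # real distribution
q = gauss_pdf(x, 1.0, 1.0)   # generated ("fake") distribution
D = p / (p + q)              # optimal discriminator output D*(x)
r = D / (1 - D)              # recovers the density ratio p(x) / q(x)
```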

Hope this answer helps

Hi @starboy,

Sorry for the delay, but can you please elaborate on which part of my answer you did not understand?

Is it the way the two distributions (p(x) and q(x)) are compared for similarity, or is it about the reading material I posted in the answer? Or should I explain the density ratio further using KL divergence?

Feel free to post your query.

I understand now.

Thank You.