Is it that mode collapse does not happen in unconditional generation?
Sorry, I’m not sure what you mean by “unconditional generation”. “Mode collapse” is a concept specific to GANs, if that’s what you’re asking. It is a side effect of the competition between the generator and the discriminator, which is the whole point of the “Adversarial” in Generative Adversarial Networks. You don’t have that situation in non-adversarial models like fully connected feed-forward networks, convolutional nets, or RNNs, which simply take an input and are trained to produce an output by minimizing a cost function.
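To make that competition concrete, here is a minimal sketch of the standard GAN losses in plain Python. The function names (`d_loss`, `g_loss`) are illustrative, not from any library, and `D(x)` is assumed to be the discriminator’s estimated probability that `x` is real:

```python
import math

# Sketch of the standard GAN objectives, assuming D(x) is the
# discriminator's estimated probability that x is real.
# Names (d_loss, g_loss) are illustrative, not from a library.

def d_loss(d_real, d_fake):
    # The discriminator is pushed toward D(real) -> 1 and D(fake) -> 0.
    return -(math.log(d_real) + math.log(1.0 - d_fake))

def g_loss(d_fake):
    # Non-saturating generator loss: the generator is rewarded
    # whenever its samples fool the discriminator (D(fake) -> 1).
    return -math.log(d_fake)

# Mode collapse in a nutshell: a generator that emits one highly
# convincing sample over and over gets a low loss, because nothing
# in g_loss measures the diversity of its outputs.
print(g_loss(0.9))  # collapsed but convincing: low generator loss
print(g_loss(0.1))  # diverse but unconvincing: high generator loss
```

Nothing in the generator’s objective penalizes producing the same convincing sample repeatedly, which is exactly the loophole mode collapse exploits.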
Of course, there are other kinds of training problems in the non-adversarial case, such as vanishing or exploding gradients. And you can hit those same problems in the GAN case too, on either the generator or the discriminator individually. So GANs add a new “layer” of complexity on top of the usual difficulty of training neural nets.
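As a quick illustration of the vanishing-gradient problem mentioned above, here is a sketch in plain Python. It assumes sigmoid activations and, for simplicity, weights of 1, so only the activation derivative matters:

```python
import math

def sigmoid_derivative(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)),
    # which peaks at 0.25 when x = 0.
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)

# Backpropagating through n sigmoid layers multiplies the gradient by
# the local derivative at each layer. Even in the best case (x = 0,
# weight = 1) that factor is 0.25, so the gradient shrinks
# geometrically with depth.
grad = 1.0
for _ in range(20):
    grad *= sigmoid_derivative(0.0)

print(grad)  # 0.25**20, on the order of 1e-12: almost no learning signal
```

After 20 layers the surviving gradient is about 9e-13, which is why very deep sigmoid networks were hard to train before ReLUs, residual connections, and careful initialization.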
If I’ve completely missed the point of your question, please let me know.