Discrete data generation

Dear all,

I hope this is the right place to ask this, as I mostly see questions related to the GAN specialization.

I joined this specialization because I would like to create a GAN that can output strings of text, like a single word composed of alphanumeric and special characters.

I saw in the first course that ProteinGAN can do something like that, but the notebook does not go into training the model because of the time it takes to train the GAN.

I have seen papers using one-hot encoding to represent each character, and I’d like to get an opinion on that approach.

Thanks!

1 Like

Hi @Barb,
If you want to generate text, you need some knowledge of sequence models, especially LSTMs. These topics are covered in the last course of the Deep Learning Specialization.
Specifically, it contains a programming assignment – Dinosaur Island: Character-Level Language Modeling – that guides you through generating dinosaur-like names.
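
To give a flavor of that assignment: you train a small recurrent model on a list of names, then sample new ones character by character. Below is a minimal, untested PyTorch sketch of the idea (the assignment itself builds the RNN from scratch, and all names and sizes here are made up by me):

```python
import torch
import torch.nn as nn

class CharLSTM(nn.Module):
    """Minimal character-level language model: predicts the next character."""
    def __init__(self, vocab_size: int, hidden_size: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        # x: (batch, seq_len) tensor of character indices
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state  # logits over the next character

# Sample a name one character at a time (the model is untrained here,
# so the output is noise; after training it produces name-like strings).
model = CharLSTM(vocab_size=28)              # e.g. 26 letters + 2 special tokens
token = torch.zeros(1, 1, dtype=torch.long)  # hypothetical "start" token at index 0
state = None
for _ in range(10):
    logits, state = model(token, state)
    probs = torch.softmax(logits[:, -1], dim=-1)
    token = torch.multinomial(probs, num_samples=1)  # sample the next character
```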

2 Likes

Hi @kevinjiang,

Thanks for the quick answer. I’m looking into taking this specialization as well, since I’m almost finished with GANs.

Without going into too much detail, do you think I will be able to combine GANs and LSTM networks in some way to generate what I need?

Thanks

Hi @Barb, I am not sure how much this will help you, but you can check out the papers below on text-generation GANs:

General Text Generation

  • SeqGAN - SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient
  • LeakGAN - Long Text Generation via Adversarial Training with Leaked Information
  • MaliGAN - Maximum-Likelihood Augmented Discrete Generative Adversarial Networks
  • JSDGAN - Adversarial Discrete Sequence Generation without Explicit Neural Networks as Discriminators
  • RelGAN - RelGAN: Relational Generative Adversarial Networks for Text Generation
  • DPGAN - DP-GAN: Diversity-Promoting Generative Adversarial Network for Generating Informative and Diversified Text
  • DGSAN - DGSAN: Discrete Generative Self-Adversarial Network
  • CoT - CoT: Cooperative Training for Generative Modeling of Discrete Data

Category Text Generation

  • SentiGAN - SentiGAN: Generating Sentimental Texts via Mixture Adversarial Networks
  • CatGAN (ours) - CatGAN: Category-aware Generative Adversarial Networks with Hierarchical Evolutionary Learning for Category Text Generation
9 Likes

Hi @RC_Stark,

Thanks for the pointers, I’ve seen some of them and I’ll look into the others.

I haven’t read it yet, but does CatGAN provide open-source code for reuse?

Best regards,

This GitHub repo, williamSYSU/TextGAN-PyTorch (a PyTorch framework for GAN-based text generation), has implementations of all the works I mentioned.

3 Likes

That’s awesome news, I will look into it!

Thank you very much 🙂

@Barb You will benefit more if you try to code it yourself first and then look at the implementations.

Hi @Barb,
With respect to your question, combining GANs and LSTMs is a long story. The best approach is definitely to read some papers and learn from others’ open-source implementations.
On that point, I think @RC_Stark has given you all you need. Thanks for your detailed list, @RC_Stark. ;D
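
Just to give a flavor of one common pattern (a rough, untested sketch; the names and sizes are made up and not taken from any particular paper): use an LSTM as the generator and feed its softmax outputs directly to an LSTM discriminator, so gradients can flow despite the vocabulary being discrete.

```python
import torch
import torch.nn as nn

class LSTMGenerator(nn.Module):
    """Maps a noise vector to a sequence of per-character distributions."""
    def __init__(self, noise_dim: int, vocab_size: int, seq_len: int, hidden: int = 128):
        super().__init__()
        self.seq_len, self.vocab_size = seq_len, vocab_size
        self.init_h = nn.Linear(noise_dim, hidden)       # noise -> initial hidden state
        self.lstm = nn.LSTM(vocab_size, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, z):
        h = self.init_h(z).unsqueeze(0)                  # (1, batch, hidden)
        state = (h, torch.zeros_like(h))
        x = z.new_zeros(z.size(0), 1, self.vocab_size)   # "start" input
        outputs = []
        for _ in range(self.seq_len):
            out, state = self.lstm(x, state)
            probs = torch.softmax(self.head(out), dim=-1)
            outputs.append(probs)
            x = probs        # feed the soft output back in as the next input
        return torch.cat(outputs, dim=1)                 # (batch, seq_len, vocab)

class LSTMDiscriminator(nn.Module):
    """Classifies sequences of (soft) one-hot vectors as real or fake."""
    def __init__(self, vocab_size: int, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(vocab_size, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        _, (h, _) = self.lstm(x)
        return self.head(h[-1])                          # one logit per sequence

# Usage: generate soft sequences from noise and score them.
G = LSTMGenerator(noise_dim=32, vocab_size=40, seq_len=12)
D = LSTMDiscriminator(vocab_size=40)
fake = G(torch.randn(8, 32))   # (8, 12, 40)
score = D(fake)                # (8, 1)
```

Feeding the softmax output straight to the discriminator keeps everything differentiable; the papers @RC_Stark listed tackle the same discreteness problem in more principled ways (e.g. SeqGAN with policy gradients, RelGAN with a Gumbel-softmax relaxation).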

2 Likes

@RC_Stark Agreed; so far I’ve found looking at other people’s implementations quite stressful because of the size of the code.

@kevinjiang Thanks, I was just asking because the GAN specialization pointed out that GAN components can be combined with a VAE, etc., so I thought some combination might be doable.

In any case, I’ll try to build my own GAN progressively and then change components like a mechanic.

Thank you all for the reading suggestions and the github, this is much appreciated.

1 Like

Hello @Barbillon,

The goal is not simply to duplicate what someone else has already done. The primary goal is to understand the mechanisms behind each component, the why and the how, so that you can better judge when you can take an element of someone else’s open-source code and adapt it to your own needs.

Very good sources have been given to you. Keep learning!

2 Likes

Hi @Barb,

Generating text with a GAN is an interesting topic.
As @kevinjiang mentioned, you can use sequence models to generate text, such as GPT, BERT, XLNet, and more.
For generating text with a GAN, apart from the papers @RC_Stark mentioned, you can have a look at PassGAN as well: it builds a variant of a GAN and uses it to guess passwords, which are strings of characters, symbols, and numbers.
Here are their paper and open-source code.

Hope it helps!

1 Like

Hi @Barb ,

You raised an interesting question. GANs are primarily used as generative models, but applying them to text is still an active area of research. In text generation, each output step needs to condition on what was generated before, and that is where sequence and attention models work best. You can look at attention models such as BERT, T5, GPT, and so on; you can also try to use a GAN for text generation, where a sequence or attention mechanism might come in handy.

I hope this answers your question.

Thanks!

1 Like

@fangyiyu Thanks for the input. I think PassGAN is the closest to my topic; I’ll have a look.
@albirahman I see. I’ve seen BERT and GPT, but I have to emphasize that what I need is not an NLP generator, but a generator that can output a single string based on training data.
I’ll also investigate LSTM networks; they seem like a potential match.

Edit: By “single string” I mean one that does not carry any linguistic meaning.

2 Likes

I suggest reading the following: “Text-To-Text Generative Adversarial Networks” (IEEE Xplore) and “A Text GAN for Language Generation with Non-Autoregressive Generator” (OpenReview).

1 Like

Hi everyone,

Small update on my understanding, if anyone can confirm:

Each discrete character is one-hot encoded, and the one-hot vectors are concatenated to make a tensor.

Then, when the generator produces a fake, an argmax is applied to each “character” position of the output, because the generator will not nicely produce one-hot vectors.
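
In code, my understanding looks roughly like this (a minimal sketch; the alphabet, names, and lengths are placeholders I made up):

```python
import torch
import torch.nn.functional as F

# Hypothetical alphabet: the characters my strings may contain.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789!@#_"
VOCAB_SIZE = len(ALPHABET)  # 40

def encode(s: str) -> torch.Tensor:
    """One-hot encode a string into a (len(s), VOCAB_SIZE) tensor."""
    indices = torch.tensor([ALPHABET.index(c) for c in s])
    return F.one_hot(indices, num_classes=VOCAB_SIZE).float()

def decode(output: torch.Tensor) -> str:
    """Map generator output of shape (seq_len, VOCAB_SIZE) back to a string.
    argmax picks the most likely character at each position, since the
    generator outputs a distribution rather than a clean one-hot vector.
    Note: argmax is not differentiable, so it only serves to read out
    samples; training needs a workaround such as SeqGAN's policy
    gradients or RelGAN's Gumbel-softmax."""
    return "".join(ALPHABET[i] for i in output.argmax(dim=-1).tolist())

# Round trip: a "perfect" generator output decodes back to the string.
x = encode("pass_123")
print(x.shape)    # torch.Size([8, 40])
print(decode(x))  # pass_123
```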

This is what I understood so far.

@cvetko.tim Thank you for the suggestion