This repository contains the code for the text generation results of the paper:
OptiGAN: Generative Adversarial Networks for Goal Optimized Sequence Generation, The 2020 International Joint Conference on Neural Networks (IJCNN), 2020.
This project uses Python 3.6.x, with the following library dependencies:
The experiments folders contain the scripts for launching the different experiments.
For example, to reproduce the COCO Image Captions experiments, run:
cd real/experiments
python coco_lstmgan_pg_baseline_mle_gan.py [job_id] [gpu_id]
or, for the EMNLP2017 WMT News experiments:
cd real/experiments
python3 emnlp_small_lstmgan_pg_baseline_mle_gan.py [job_id] [gpu_id]
Note: replace [job_id] and [gpu_id] with appropriate numerical values, for example 0 and 0.
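As a rough point of reference, each experiment script is launched with these two positional arguments. The sketch below shows one plausible way such a script could consume them; the argument handling and variable names here are assumptions for illustration, not the repository's actual code.

# Hypothetical sketch of how an experiment script might read [job_id] and [gpu_id].
# The actual scripts in real/experiments may parse and use these values differently.
import os
import sys

def main():
    # [job_id] would typically select a hyper-parameter configuration to run,
    # and [gpu_id] which GPU to use (both assumptions for this illustration).
    job_id = int(sys.argv[1])  # e.g. 0
    gpu_id = int(sys.argv[2])  # e.g. 0

    # Restrict the deep-learning framework to the chosen GPU before it is imported.
    os.environ['CUDA_VISIBLE_DEVICES'] = str(gpu_id)

    print('Running job {} on GPU {}'.format(job_id, gpu_id))
    # ... launch training/evaluation for the selected configuration ...

if __name__ == '__main__':
    main()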
To cite this work, please use:
@INPROCEEDINGS{9206842,
author={M. {Hossam} and T. {Le} and V. {Huynh} and M. {Papasimeon} and D. {Phung}},
booktitle={2020 International Joint Conference on Neural Networks (IJCNN)},
title={{OptiGAN: Generative Adversarial Networks for Goal Optimized Sequence Generation}},
year={2020},
volume={},
number={},
pages={1-8}
}
This code builds on RelGAN and the earlier benchmarking platform Texygen.