TensorFlow implementation of an adversarial autoencoder for MNIST
An implementation of the adversarial autoencoder (AAE) for MNIST described in the paper:
The paper suggests various ways of using the AAE.
Only results for 'Incorporating Label Information in the Adversarial Regularization' are given here.
Three types of prior distribution are considered.
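The three prior choices can be sketched roughly as follows. This is an illustrative sketch only: the function name `sample_prior`, the mixture-component spacing, and the noise scales are assumptions, not the repository's actual code.

```python
import numpy as np

def sample_prior(prior_type, n_samples, n_labels=10, rng=None):
    """Sample a batch of 2-D latent codes from one of the three priors.

    Hypothetical sketch; shapes and scale constants are assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    if prior_type == "normal":
        # Standard 2-D Gaussian.
        return rng.normal(0.0, 1.0, size=(n_samples, 2))
    if prior_type == "mixGaussian":
        # Mixture of n_labels Gaussians placed on a circle,
        # one component per class label.
        labels = rng.integers(0, n_labels, size=n_samples)
        angles = 2.0 * np.pi * labels / n_labels
        centers = 4.0 * np.stack([np.cos(angles), np.sin(angles)], axis=1)
        return centers + rng.normal(0.0, 0.5, size=(n_samples, 2))
    if prior_type == "swiss_roll":
        # 2-D swiss roll: the radius grows with the angle.
        t = 1.5 * np.pi * (1.0 + 2.0 * rng.uniform(size=n_samples))
        return 0.25 * np.stack([t * np.cos(t), t * np.sin(t)], axis=1)
    raise ValueError(f"unknown prior_type: {prior_type}")
```

Whichever prior is chosen, the adversarial regularizer pushes the encoder's output distribution toward it.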
The following graphs can be obtained with the command:
python test_prior_type.py --prior_type <type>
Label information is leveraged to better regularize the hidden code, as in Figure 4 of the paper.
The following results can be reproduced with the command:
python run_main.py --prior_type mixGaussian
The following results can be reproduced with the command:
python run_main.py --prior_type swiss_roll
The following results can be reproduced with the command:
python run_main.py --prior_type normal
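In the label-conditioned variant, the AAE paper provides a one-hot class label to the discriminator alongside the latent code, so each class is steered toward its own region of the prior. A minimal sketch of that input construction (`discriminator_input` is a hypothetical helper, not from this repository):

```python
import numpy as np

def discriminator_input(z, labels, n_labels=10):
    # One-hot encode the class labels and concatenate them with the
    # 2-D code z, so the discriminator can enforce a per-class prior.
    one_hot = np.eye(n_labels)[labels]
    return np.concatenate([z, one_hot], axis=1)
```

The same conditioning is applied to both the sampled prior codes and the encoder's outputs, so the discriminator compares them class by class.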
python run_main.py --prior_type <type>
Required:
--prior_type
: The type of prior distribution. Choices: mixGaussian, swiss_roll, normal. Default: mixGaussian
Optional:
--results_path
: File path of output images. Default: results
--n_hidden
: Number of hidden units in MLP. Default: 1000
--learn_rate
: Learning rate for Adam optimizer. Default: 1e-3
--num_epochs
: The number of epochs to run. Default: 20
--batch_size
: Batch size. Default: 128
--PRR
: Boolean for plot-reproduce-result. Default: True
--PRR_n_img_x
: Number of images along x-axis. Default: 10
--PRR_n_img_y
: Number of images along y-axis. Default: 10
--PRR_resize_factor
: Resize factor for each displayed image. Default: 1.0
--PMLR
: Boolean for plot-manifold-learning-result. Default: True
--PMLR_n_img_x
: Number of images along x-axis. Default: 15
--PMLR_n_img_y
: Number of images along y-axis. Default: 15
--PMLR_resize_factor
: Resize factor for each displayed image. Default: 1.0
--PMLR_z_range
: Range for the uniformly distributed latent vector. Default: 3.0
--PMLR_n_samples
: Number of samples used to estimate the distribution of labeled data. Default: 10000
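The flags above could be wired with `argparse` along these lines. This parser is a sketch that follows the listed defaults, not the actual `run_main.py` code; the `_bool` helper is an assumption, since `argparse` does not coerce the string "False" to a boolean on its own.

```python
import argparse

def _bool(s):
    # Convert a command-line string to a boolean explicitly.
    return s.lower() in ("true", "1", "yes")

def parse_args(argv=None):
    # Hypothetical parser mirroring the option list above
    # (a subset of flags is shown for brevity).
    p = argparse.ArgumentParser()
    p.add_argument("--prior_type", default="mixGaussian",
                   choices=["mixGaussian", "swiss_roll", "normal"])
    p.add_argument("--results_path", default="results")
    p.add_argument("--n_hidden", type=int, default=1000)
    p.add_argument("--learn_rate", type=float, default=1e-3)
    p.add_argument("--num_epochs", type=int, default=20)
    p.add_argument("--batch_size", type=int, default=128)
    p.add_argument("--PRR", type=_bool, default=True)
    p.add_argument("--PMLR", type=_bool, default=True)
    p.add_argument("--PMLR_z_range", type=float, default=3.0)
    return p.parse_args(argv)
```

For example, `python run_main.py --prior_type normal --batch_size 64` would override those two options and leave the rest at their defaults.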
This implementation has been tested with TensorFlow 1.2.1 on Windows 10.