A Torch implementation of Deep Convolutional Generative Adversarial Networks (DCGAN), http://arxiv.org/abs/1511.06434
Install the required Torch packages:

```bash
luarocks install optnet
luarocks install cudnn
```
Optionally, to display images during training and generation, install the `display` package:

```bash
luarocks install https://raw.githubusercontent.com/szym/display/master/display-scm-0.rockspec
```

Start the display server with:

```bash
th -ldisplay.start
```
You can then see training progress in your browser window.
To train on CelebA faces:

```bash
mkdir celebA; cd celebA
```

Download `img_align_celeba.zip` from http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html under the link "Align&Cropped Images", then:

```bash
unzip img_align_celeba.zip; cd ..
DATA_ROOT=celebA th data/crop_celebA.lua
DATA_ROOT=celebA dataset=folder th main.lua
```
The LSUN dataset is shipped as an LMDB database. First, install LMDB on your system:

```bash
# OSX
brew install lmdb
# Ubuntu / Debian
sudo apt-get install liblmdb-dev
```

Then install a couple of Torch packages:

```bash
luarocks install lmdb.torch
luarocks install tds
```
Download `bedroom_train_lmdb` from the LSUN website. Generate an index file:

```bash
DATA_ROOT=[path_to_lmdb] th data/lsun_index_generator.lua
```

Then start training:

```bash
DATA_ROOT=[path_to_lmdb] dataset=lsun th main.lua
```
The LSUN data loader is hardcoded to the bedroom class. To train on other classes, change the hardcoded class name in the data loader to another LSUN class.
To train on your own dataset, create a folder (for example `myimages`) containing a subfolder `images`, and place all your images inside it. Then run:

```bash
DATA_ROOT=myimages dataset=folder th main.lua
```
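The expected layout for `dataset=folder` can be created like this (a small sketch using Python's standard `pathlib`; the folder names simply mirror the example above):

```python
from pathlib import Path

# Dataset root passed as DATA_ROOT, with an images/ subfolder holding the files.
root = Path("myimages")
(root / "images").mkdir(parents=True, exist_ok=True)

# Copy or symlink your image files into myimages/images/ here.
print(sorted(p.name for p in root.iterdir()))  # ['images']
```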
To train on ImageNet, download the dataset by following the instructions on the ImageNet website, then run:

```bash
DATA_ROOT=[PATH_TO_IMAGENET]/train dataset=folder th main.lua
```
All training options (from `main.lua`):

```lua
opt = {
   dataset = 'lsun',       -- imagenet / lsun / folder
   batchSize = 64,
   loadSize = 96,
   fineSize = 64,
   nz = 100,               -- # of dim for Z
   ngf = 64,               -- # of gen filters in first conv layer
   ndf = 64,               -- # of discrim filters in first conv layer
   nThreads = 1,           -- # of data loading threads to use
   niter = 25,             -- # of iter at starting learning rate
   lr = 0.0002,            -- initial learning rate for adam
   beta1 = 0.5,            -- momentum term of adam
   ntrain = math.huge,     -- # of examples per epoch. math.huge for full dataset
   display = 1,            -- display samples while training. 0 = false
   display_id = 10,        -- display window id
   gpu = 1,                -- gpu = 0 is CPU mode. gpu = X is GPU mode on GPU X
   name = 'experiment1',
   noise = 'normal',       -- uniform / normal
   epoch_save_modulo = 1,  -- save a checkpoint every # epochs
}
```
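The `noise` option selects the distribution the latent vector Z is drawn from. A minimal NumPy sketch of the two choices (illustrative only; the training code samples these tensors in Torch, and the function name here is made up):

```python
import numpy as np

NZ, BATCH_SIZE = 100, 64  # match the nz and batchSize defaults above

def sample_noise(mode, batch_size=BATCH_SIZE, nz=NZ,
                 rng=np.random.default_rng(0)):
    """Draw a batch of latent vectors, mirroring the 'uniform' / 'normal' option."""
    if mode == "uniform":
        return rng.uniform(-1.0, 1.0, size=(batch_size, nz))  # U(-1, 1)
    if mode == "normal":
        return rng.normal(0.0, 1.0, size=(batch_size, nz))    # N(0, 1)
    raise ValueError(f"unknown noise mode: {mode}")

z = sample_noise("normal")
print(z.shape)  # (64, 100)
```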
The generate script can operate in CPU or GPU mode. To run it on the CPU:

```bash
gpu=0 net=[checkpoint-path] th generate.lua
```

To run it on a GPU:

```bash
gpu=1 net=[checkpoint-path] th generate.lua
```
## 2.1. Generate samples of 64x64 pixels

```bash
gpu=0 batchSize=64 net=celebA_25_net_G.t7 th generate.lua
```

The `batchSize` parameter controls the number of images to generate. If you have the `display` server running, the generated image will be shown there. The image is also saved to `generation1.png` in the same folder.
## 2.2. Generate large artsy images (tried up to 4096 x 4096 pixels)

```bash
gpu=0 batchSize=1 imsize=10 noisemode=linefull net=bedrooms_4_net_G.t7 th generate.lua
```

The `imsize` parameter controls the size of the output image: the larger `imsize`, the larger the output.
## 2.3. Walk in the space of samples

```bash
gpu=0 batchSize=16 noisemode=line net=bedrooms_4_net_G.t7 th generate.lua
```

The `batchSize` parameter controls how big of a step you take: samples are generated along a line in Z space, so more samples along the same line means smaller steps between neighbors.
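Assuming `noisemode=line` places `batchSize` latent vectors evenly along a segment between two points in Z space (my reading of the behavior described above, not the repo's exact code), the step size between consecutive samples shrinks as `batchSize` grows. A NumPy sketch:

```python
import numpy as np

def latent_walk(z_start, z_end, batch_size):
    """Evenly spaced points on the segment from z_start to z_end."""
    t = np.linspace(0.0, 1.0, batch_size)[:, None]  # (batch_size, 1) blend weights
    return (1.0 - t) * z_start + t * z_end          # (batch_size, nz)

rng = np.random.default_rng(0)
z0 = rng.uniform(-1, 1, 100)
z1 = rng.uniform(-1, 1, 100)
walk = latent_walk(z0, z1, batch_size=16)

# The per-step move is the segment length divided by (batch_size - 1),
# so a larger batchSize means smaller steps between neighboring samples.
step = np.linalg.norm(walk[1] - walk[0])
print(walk.shape, round(step, 4))
```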
## 2.4. Vector arithmetic in the space of samples

```bash
net=[modelfile] gpu=0 qlua arithmetic.lua
```
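The idea behind the arithmetic demo, as described in the DCGAN paper: averaged latent vectors can be added and subtracted, and the result decodes to an image combining the corresponding concepts (e.g. smiling woman - neutral woman + neutral man ≈ smiling man). A NumPy sketch of the vector math alone (decoding through the generator is omitted; the variable names are illustrative, and here the "averaged" vectors are just random stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)
nz = 100

# Stand-ins for the averaged latent vectors of three groups of samples
# (in the paper, each is the mean z over several exemplar images).
z_smiling_woman = rng.normal(size=(3, nz)).mean(axis=0)
z_neutral_woman = rng.normal(size=(3, nz)).mean(axis=0)
z_neutral_man   = rng.normal(size=(3, nz)).mean(axis=0)

# Arithmetic in latent space; feeding z_result to the generator would
# ideally produce an image of a smiling man.
z_result = z_smiling_woman - z_neutral_woman + z_neutral_man
print(z_result.shape)  # (100,)
```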