TensorFlow implementation of the paper Image Style Transfer Using Convolutional Neural Networks.
This is an easy-to-understand, single-file implementation of neural style transfer.
Content and style losses and layers are as in the paper.
Variational loss for preventing high frequency artifacts is added.
The Adam optimizer is used instead of L-BFGS (see this); RMSprop and SGD do not give visually appealing results.
Fine-tuning of the optimizer hyperparameters, the alpha/beta ratio, and the variational loss weight is needed for the desired output.
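The objective combining the three losses above can be sketched as follows. This is a minimal NumPy illustration of the math only (the repository itself uses TensorFlow and VGG feature maps), and every function and variable name here is hypothetical:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (H, W, C) feature map, normalized by H*W."""
    h, w, c = features.shape
    flat = features.reshape(h * w, c)
    return flat.T @ flat / (h * w)

def content_loss(gen_feats, content_feats):
    """Squared error between generated and content feature maps."""
    return 0.5 * np.sum((gen_feats - content_feats) ** 2)

def style_loss(gen_feats, style_feats):
    """Squared error between Gram matrices of generated and style features."""
    g_gen, g_style = gram_matrix(gen_feats), gram_matrix(style_feats)
    c = gen_feats.shape[-1]
    return np.sum((g_gen - g_style) ** 2) / (4 * c ** 2)

def total_variation_loss(image):
    """Penalizes high-frequency artifacts via neighboring-pixel differences."""
    dh = image[1:, :, :] - image[:-1, :, :]
    dw = image[:, 1:, :] - image[:, :-1, :]
    return np.sum(dh ** 2) + np.sum(dw ** 2)

def total_loss(image, gen_feats, content_feats, style_feats,
               alpha=1e-4, tv_weight=30.0):
    """alpha is the content/style ratio (beta folded in as 1)."""
    return (alpha * content_loss(gen_feats, content_feats)
            + style_loss(gen_feats, style_feats)
            + tv_weight * total_variation_loss(image))
```

In practice each style layer contributes its own Gram-matrix term and the terms are summed, as in the paper.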
- initialization: content image (default) or white noise
- iterations: 1000 (500 is enough to get visually appealing output)
- maximum image height (width): 512
- alpha/beta: 1e-4 (highly experimental; depends on the content and style images)
- variation loss weight: 30
- VGG weight scaling as in Section 2 of the paper
- argument parsing
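A minimal sketch of what the argument parsing might look like, with flags mirroring the knobs listed above. The flag names and defaults here are assumptions for illustration, not the repository's actual interface:

```python
import argparse

def build_parser():
    # Hypothetical flags; names and defaults follow the list above
    parser = argparse.ArgumentParser(description="Neural style transfer")
    parser.add_argument("--content", required=True,
                        help="path to the content image")
    parser.add_argument("--style", required=True,
                        help="path to the style image")
    parser.add_argument("--init", choices=["content", "noise"],
                        default="content",
                        help="initialize from the content image or white noise")
    parser.add_argument("--iterations", type=int, default=1000)
    parser.add_argument("--max-size", type=int, default=512,
                        help="maximum image height (width)")
    parser.add_argument("--alpha-beta", type=float, default=1e-4,
                        help="content/style loss ratio")
    parser.add_argument("--tv-weight", type=float, default=30.0,
                        help="total variation loss weight")
    return parser
```

Running with only `--content` and `--style` would then use the defaults listed above.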