Perceptual losses for real-time style transfer and super-resolution
The image transformation network is presented below. For a given style image, the network is trained on the MS-COCO dataset to minimize perceptual loss while being regularized by total variation. Perceptual loss is defined as the combination of the feature reconstruction loss and the style reconstruction loss, computed from pretrained layers of VGG16. The feature reconstruction loss is the mean squared error between feature representations, while the style reconstruction loss is the squared Frobenius norm of the difference between the Gram matrices of the feature maps.
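The loss terms described above can be sketched in NumPy. This is a minimal illustration, not the repository's code: the function names and the C×H×W feature-map layout are assumptions, and the normalization of the Gram matrix follows the convention in Johnson et al.

```python
import numpy as np

def gram_matrix(features):
    # features: (C, H, W) feature map -> (C, C) Gram matrix,
    # normalized by C*H*W.
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def feature_reconstruction_loss(f_out, f_target):
    # Mean squared error between feature representations.
    return np.mean((f_out - f_target) ** 2)

def style_reconstruction_loss(f_out, f_style):
    # Squared Frobenius norm of the Gram-matrix difference.
    diff = gram_matrix(f_out) - gram_matrix(f_style)
    return np.sum(diff ** 2)

def total_variation(img):
    # Total variation regularizer on a (C, H, W) image:
    # sum of absolute differences between neighboring pixels.
    dh = np.abs(img[:, 1:, :] - img[:, :-1, :]).sum()
    dw = np.abs(img[:, :, 1:] - img[:, :, :-1]).sum()
    return dh + dw
```

In training, the feature and style terms would be evaluated on activations from fixed VGG16 layers, and the total variation term on the network's output image, with the combined loss backpropagated only through the transformation network.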
Train
You can train a model for a given style image with the following command:
$ python style.py train --style-image "path_to_style_image" --dataset "path_to_coco"
$ python style.py train --style-image style_imgs/mosaic.jpg --dataset coco --gpu 1 --visualize 1
Evaluation
You can stylize an image with a pretrained model using the following command. Pretrained models for mosaic.jpg and udine.jpg are provided.
$ python style.py transfer --model-path "path_to_pretrained_model_image" --source "path_to_source_image" --target "name_of_target_image"
You can also specify whether you would like to run on a GPU:
For example, to transfer the style of mosaic.jpg onto maine.jpg using a GPU, I would use:
$ python style.py transfer --model-path model/mosaic.model --source content_imgs/maine.jpg --target maine_mosaic.jpg --gpu 1
Mosaic
Model trained on mosaic.jpg applied to a few images:
And below is a GIF showing how the output changes during the training process. Notably, the network generates qualitatively appealing output within 1000 iterations.
Udine
Model trained on udine.jpg applied to a few images:
And here is a GIF showing how the output changes during the training process. Again, the network generates qualitatively appealing output within 1000 iterations.