More Efficient Style Transfer Algorithm
Written by Lucy Black   
Sunday, 19 August 2018

Researchers from NVIDIA and the University of California, Merced have developed a new, more efficient deep learning-based algorithm for style transfer, aimed at making the technique more widely usable for achieving photorealistic results.


You may think that style transfer is already doing quite well in finding its way not only into art, but also into other areas such as cuisine. In case you haven't come across the use of algorithms in this way, Xueting Li, Sifei Liu, Jan Kautz and Ming-Hsuan Yang explain what style transfer does, how it does it, and some of the limitations of current approaches in their research paper:

Given a random pair of images, an arbitrary style transfer method extracts the feel from the reference image to synthesize an output based on the look of the other content image. Recent arbitrary style transfer methods transfer second order statistics from reference image onto content image via a multiplication between content image features and a transformation matrix, which is computed from features with a pre-determined algorithm. These algorithms either require computationally expensive operations, or fail to model the feature covariance and produce artifacts in synthesized images.
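
To make the idea of a transformation matrix concrete, the sketch below (not the authors' code, and assuming features taken from a pretrained encoder such as VGG) shows the classic whitening-and-coloring way of transferring second-order statistics, built around the eigendecomposition that makes such methods expensive:

    import torch

    def linear_style_transfer(fc, fs, eps=1e-5):
        # fc, fs: content and style feature maps of shape (C, H, W),
        # assumed to come from a pretrained encoder such as VGG.
        C, H, W = fc.shape
        fc = fc.reshape(C, -1)
        fs = fs.reshape(C, -1)
        mc, ms = fc.mean(1, keepdim=True), fs.mean(1, keepdim=True)
        fc, fs = fc - mc, fs - ms                        # center the features
        cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * torch.eye(C)
        cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * torch.eye(C)
        # Whitening and coloring via eigendecomposition: the costly,
        # GPU-unfriendly step that the new method learns to avoid.
        ec, vc = torch.linalg.eigh(cov_c)
        es, vs = torch.linalg.eigh(cov_s)
        whiten = vc @ torch.diag(ec.clamp_min(eps).rsqrt()) @ vc.T
        color = vs @ torch.diag(es.clamp_min(eps).sqrt()) @ vs.T
        T = color @ whiten                               # the transformation matrix
        fd = T @ fc + ms                                 # transfer second-order statistics
        return fd.reshape(C, H, W)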

Having analyzed arbitrary style transfer algorithms and their extensions, the team derived the form of the transformation matrix theoretically and adopted an approach that learns the transformation matrix with a feed-forward network. They used NVIDIA TITAN Xp GPUs with the cuDNN-accelerated PyTorch deep learning framework to train a convolutional neural network on 80,000 images of people, scenery, animals, and moving objects, and tested their approach on four style transfer tasks: artistic style transfer, video style transfer, photo-realistic style transfer, and domain adaptation.
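
The learned alternative can be pictured roughly as follows. This is only an illustrative sketch, not the paper's architecture; the layer sizes, names and the 1x1-convolution compression are assumptions. Two small branches compress the content and style features, and a fully connected layer predicts the transformation matrix directly, so no SVD or eigendecomposition is needed at run time:

    import torch
    import torch.nn as nn

    class LearnedTransform(nn.Module):
        # Sketch: predict a CxC transformation matrix from compressed
        # content and style features instead of computing it in closed form.
        def __init__(self, channels=256, hidden=64):
            super().__init__()
            self.compress_c = nn.Conv2d(channels, hidden, kernel_size=1)
            self.compress_s = nn.Conv2d(channels, hidden, kernel_size=1)
            self.fc = nn.Linear(2 * hidden * hidden, channels * channels)
            self.channels = channels

        def forward(self, fc, fs):
            # fc, fs: (N, C, H, W) content / style features from a frozen encoder
            N, C = fc.shape[:2]
            cc = self.compress_c(fc).flatten(2)              # (N, hidden, H*W)
            cs = self.compress_s(fs).flatten(2)
            cov_c = cc @ cc.transpose(1, 2) / cc.shape[-1]   # (N, hidden, hidden)
            cov_s = cs @ cs.transpose(1, 2) / cs.shape[-1]
            feat = torch.cat([cov_c.flatten(1), cov_s.flatten(1)], dim=1)
            T = self.fc(feat).view(N, C, C)                  # learned transformation matrix
            fd = T @ fc.flatten(2)                           # apply it to the content features
            return fd.view_as(fc)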

They concluded:

Our algorithm is highly efficient yet allows a flexible combination of multi-level styles while preserving content affinity during style transfer process. 

The research was presented at SIGGRAPH 2018, where it was explained that the key advance is the use of two lightweight convolutional neural networks to replace GPU-unfriendly computations, such as singular value decomposition (SVD), and to transform the images. Because of this, users can apply different levels of style change in real time. The experimental results demonstrate that the proposed algorithm performs favorably against state-of-the-art methods for style transfer of images and videos.
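
Since the whole transform is a single feed-forward pass, varying the strength of the stylization per frame is cheap. A hypothetical helper, building on the sketch above and assuming a trained decoder that maps features back to an image, might simply blend the transformed and original content features:

    def stylize(decoder, transform, fc, fs, alpha=1.0):
        # alpha in [0, 1] controls how much style is applied; with a
        # feed-forward transform this is cheap enough to run per video frame.
        fd = transform(fc, fs)
        blended = alpha * fd + (1 - alpha) * fc
        return decoder(blended)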

Liu commented:

“Our solution also allows people to alter a video in real time. You can use numerous patterns to find the style that works best for you. I think this will encourage content producers to create more; maybe people who aren’t good at painting will use style transfer to create art."


More Information

New AI Style Transfer Algorithm Allows Users to Create Millions of Artistic Combinations

Learning Linear Transformations for Fast Arbitrary Style Transfer

Related Articles

A Neural Net Creates Movies In The Style Of Any Artist 

Style Transfer Makes A Movie 

Style Transfer Applied To Cooking - The Case Of The French Sukiyaki 

A Neural Net Colorizes Photos


Last Updated ( Sunday, 19 August 2018 )