Art Generation - Facebook AI Research, Google DeepDream, and Ruder's Style Transfer for Video - Deep Learning: Zero to One

Deep Learning: Zero to One

18/04/2017 6:02PM

Episode Synopsis

Justin Johnson, now at Facebook, wrote the original Torch implementation of the Gatys 2015 paper, which combines the content of one image with the style of another using convolutional neural networks. Manuel Ruder's 2016 paper extends this by transferring the style of one image to a whole video sequence, using a computer vision technique called optical flow to keep the stylized frames consistent and stable over time. I used Ruder's implementation to generate a stylized butterfly video, available at https://medium.com/@SamPutnam/deep-learning-zero-to-one-art-generation-b532dd0aa390
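The losses behind these two papers can be sketched compactly. Below is a minimal NumPy illustration (not Johnson's or Ruder's actual code) of the three terms involved: the Gatys content loss compares raw CNN feature maps, the Gatys style loss compares their Gram matrices (channel-wise correlations), and Ruder's temporal loss penalizes differences between a stylized frame and the previous stylized frame warped forward by optical flow. The function names and the toy random "feature maps" are illustrative assumptions; in practice the features come from a pretrained network such as VGG.

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height*width) activation map.
    # The Gram matrix captures channel correlations -- the "style"
    # representation used by Gatys et al. (2015).
    c, n = features.shape
    return features @ features.T / n

def content_loss(gen_feats, content_feats):
    # Mean squared difference between raw feature maps.
    return float(np.mean((gen_feats - content_feats) ** 2))

def style_loss(gen_feats, style_feats):
    # Mean squared difference between Gram matrices.
    return float(np.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2))

def temporal_loss(frame, warped_prev, mask):
    # Ruder et al. (2016): penalize deviation from the previous stylized
    # frame warped by optical flow; `mask` zeroes out occluded pixels
    # where the flow is unreliable.
    return float(np.mean(mask * (frame - warped_prev) ** 2))

# Toy arrays standing in for CNN activations and video frames.
rng = np.random.default_rng(0)
content = rng.normal(size=(8, 64))
style = rng.normal(size=(8, 64))
generated = 0.5 * content + 0.5 * style

total = content_loss(generated, content) + 1e3 * style_loss(generated, style)
```

The actual implementations minimize a weighted sum like `total` with respect to the image pixels via gradient descent; for video, the temporal term is added so each frame is optimized to agree with its flow-warped predecessor.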

