The Marriage of Art and Science
What do you do when your heart and your career veer between art and technology? You create a dream sequence for a film written and directed by Kristen Stewart, of course.
When I was growing up, my parents nurtured my knack for engineering and science. They also introduced me to Star Wars and other movies. I desperately wanted to be artistically involved in such films, but I couldn’t figure out how to get there.
The opportunity came years later when my doctorate in building surgical simulators allowed me to step sideways into film, where computer simulations of physical phenomena were becoming a thing. I ended up at Industrial Light & Magic and lived out my dream of making tools to create fantastic worlds (Star Wars included!). I learned that science could be placed in the service of story; we could build tools to help artists express themselves in entirely new ways.
A NEW TOOL
Years after my stint at ILM, while working alongside Flickr’s research and computer vision teams, my friend and coworker Frank Liu put me onto a technique setting the artificial intelligence world on fire: Neural Style Transfer, which uses neural networks to redraw an image in the style of, for example, an existing painting.
Here’s how it works in broad terms: One image supplies the content, and another supplies the style. A neural network breaks both images down into small patches, then maps each patch of the content image to the most similar patch in the style image and transfers that style patch over. Repeated across the whole image, this process makes the content image appear to have been redrawn in the style of the style image.
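The real technique does its matching on the feature maps of a deep neural network, but the patch-matching idea itself can be sketched in plain pixel space. The toy below (my own illustrative code, not the production tool) replaces each content patch with its nearest style patch by cosine similarity and averages the overlaps:

```python
import numpy as np

def extract_patches(img, size, stride):
    """Collect (size x size) patches and their top-left coordinates."""
    coords = [(y, x)
              for y in range(0, img.shape[0] - size + 1, stride)
              for x in range(0, img.shape[1] - size + 1, stride)]
    patches = np.array([img[y:y + size, x:x + size].ravel() for y, x in coords])
    return patches, coords

def patch_transfer(content, style, size=8, stride=4):
    """One naive pass: swap each content patch for its most similar
    style patch (cosine similarity on raw pixels), blending overlaps."""
    s_patches, _ = extract_patches(style, size, stride)
    s_unit = s_patches / (np.linalg.norm(s_patches, axis=1, keepdims=True) + 1e-8)
    c_patches, coords = extract_patches(content, size, stride)

    out = np.zeros(content.shape, dtype=np.float64)
    weight = np.zeros(content.shape[:2], dtype=np.float64)
    for patch, (y, x) in zip(c_patches, coords):
        p = patch / (np.linalg.norm(patch) + 1e-8)
        best = s_patches[np.argmax(s_unit @ p)]      # nearest style patch
        out[y:y + size, x:x + size] += best.reshape(size, size, -1)
        weight[y:y + size, x:x + size] += 1.0
    return out / weight[..., None]                   # average overlapping patches
```

Matching in pixel space like this mostly just copies colors; doing the same matching on deep features is what lets the real method preserve the content image’s structure while borrowing the style image’s brushwork.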
The technique sparked the imaginations of many. Could a computer be taught to draw expressively? Might this be the first spark of completely computer-driven creativity? Some artists worried that this could set the precedent of removing people from certain types of artistic expression.
I started to tinker with Liu’s working version of Neural Style Transfer, trying to turn this research tool into a reusable, steerable instrument that artists could iterate on to make art. I first redrew the bleak but beautiful Blade Runner in the style of Van Gogh’s The Starry Night, an equally vivid but dark masterpiece.
Then I took the experiment in another direction, mashing together two essentially unlike works: the stark structure of 2001: A Space Odyssey and the colorful, chaotic expressiveness of Picasso’s Les Femmes d’Alger.
HOLLYWOOD CAME KNOCKING
I posted these experimental videos online and forgot about them, but a week later they’d gone viral in the tech and art spheres. Then I heard from Starlight Studios’ David Shapiro, who said that Kristen Stewart had seen the clips and liked them. She was directing her first short film, called Come Swim, and wanted to use the technique for a couple of scenes.
Come Swim is an exploration of impressionism. Kristen had literally painted the concept for the film, and our task was to redraw some of the movie’s key scenes in the same impressionistic style as her painting.
Developing the look for the scenes was challenging. The painting was texturally complex, and its texture details were small. Our initial style transfer used a low-quality photo of the painting, which blew out the paint’s reflective highlights and missed subtle textures. When we instead used a well-lit, high-quality photo as the style image, the Neural Style Transfer more faithfully captured the original painting’s contrasts, colors, and textures.
Once we nailed down the quality and general feel of the result, we had to fine-tune it to match Kristen’s vision for the scenes. To do so, we adjusted one major parameter: the style transfer ratio, which controls the size of the texture blocks transferred from the style image to the content image. Change the size and you create an image that looks more or less impressionistic.
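The ratio itself lives inside the style transfer network, but its visual effect is intuitive: bigger transferred blocks read as broader, looser strokes. A purely illustrative stand-in (not the real mechanism) is to flatten an image into blocks of different sizes and watch detail give way to abstraction:

```python
import numpy as np

def blockify(img, block):
    """Flatten each (block x block) region to its mean color -- a crude,
    purely illustrative stand-in for how larger transferred texture
    blocks read as broader, more impressionistic strokes."""
    out = img.astype(np.float64).copy()
    for y in range(0, img.shape[0], block):
        for x in range(0, img.shape[1], block):
            region = out[y:y + block, x:x + block]
            region[...] = region.mean(axis=(0, 1))   # one flat stroke per block
    return out
```

Running this with block=2 versus block=8 on the same image shows the trade we were making when tuning: small blocks keep the scene legible, large blocks push it toward pure texture.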
For more about our process, read “Bringing Impressionism to Life with Neural Style Transfer in Come Swim,” a case study that Stewart, Shapiro, and I have written and intend to submit to a conference later this year.
GIVE IT A TRY
You’ve probably seen style transfer for still images in apps such as Prisma, Pikazo, and Dreamscope. They rely on predefined styles, so your creative input is limited. I found the technique most satisfying when I tuned the transfer by hand; the artistry came in the choice of style image, aesthetic parameters, and other variables. To really flex your artistic muscle, try the neural style implementations on GitHub, including Frank Liu’s style-transfer and alexjc’s neural-doodle. You’ll have to be comfortable working at the command line, and if you want to do high-resolution style transfers, you’ll need a desktop computer with plenty of GPU horsepower.
AI has an exciting future as a tool to help artists. Beyond the obvious applications in easing repetitive creative work, we can build a new set of tools around expressing one image with another: a kind of supercharged collage. This kind of meta-representation has been confounding and delighting artists for millennia; Socrates asked, “Which is the art of painting designed to be — an imitation of things as they are, or as they appear — of appearance or of reality?” It’s a new way to explore fundamental truths in imagery, and I can’t wait to see where artists and storytellers take it.