Whether or not the art world wants a robotic painter, it's going to get one. Chris Chen, founder of Instapainting, a web service that converts your photographs into paintings, has built a machine that's creating an artwork live on Twitch. Users can punch in commands on the gaming platform for the paintbrush-wielding machine to follow.
Inspired by Twitch Plays Pokemon, a popular stream that allowed people to play the game collectively through the chat, Chen decided to hook up his robot to the platform for a similar collaborative experience. "I wanted [people] to take control of the robot to paint," he gushes. "The idea is to integrate technology into the creation process [of art] so people can watch it being painted in real time."
Painting isn't a new artistic endeavor for robots. From AARON, the first autonomous painting machine, to eDavid and bitPaintr more recently, inventors have been toying with the idea of robotic artists. But unlike AARON, created by Harold Cohen in the '70s, Chen's robot isn't painting with imagination or intent. It's being fed a set of instructions for every stroke and color. It can either follow the lead of an artist who shows it how it's done or take its cues from users on the Internet.
The robot, which cost about $200 to build, made its first mechanical reproduction of an artwork a couple of months ago. When Jean Liang, a digital artist, drew on a Wacom tablet, the robot responded in real time, following the motions of the pen. But it also recorded the artist's movements, allowing it to reproduce the painting autonomously soon after.
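The record-then-replay idea can be sketched in a few lines. This is a hypothetical illustration, not Chen's actual software: the class, method names and event format are invented here, and real tablet and plotter drivers would replace the callbacks.

```python
import time

class StrokeRecorder:
    """Mirror tablet events to a robot live, while logging them for replay."""

    def __init__(self):
        self.events = []  # (timestamp, x, y, pen_down)

    def on_pen_event(self, x, y, pen_down):
        """Called for each live tablet event; in a real setup this would
        also forward the motion to the robot immediately."""
        self.events.append((time.monotonic(), x, y, pen_down))

    def replay(self, move_robot):
        """Re-run the recorded session, preserving the artist's pacing."""
        prev_t = None
        for t, x, y, pen_down in self.events:
            if prev_t is not None:
                time.sleep(t - prev_t)  # wait as long as the artist did
            move_robot(x, y, pen_down)
            prev_t = t
```

Once a session is captured, `replay` can be pointed at the hardware driver (or at a simulator) to reproduce the painting without the artist present.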
Whether or not a robot can be creative is a heated, inconclusive debate. But experiments like Chen's do fall neatly into the category of human-machine collaborations. When he first made the robot available on Twitch, programmers among the viewers took control and managed to write scripts in real time to make the robot paint circles and mash up colors. But Chen wanted to make the collaborative process of robot painting more accessible. "I'm redoing it to make it a valid experiment of collaborative art so regular people wouldn't be inhibited to try and take control," he says. "It's a basic control GUI, so all you have to do is click and it'll move." When users type commands in the chat -- "up 400 right 300 brush 40," for instance -- the robot averages them across all three axes (X, Y and Z) to produce a precise stroke on the canvas.
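A minimal sketch of that control scheme might look like the following. The axis mapping, units and averaging rule are assumptions based on the example command above; the stream's actual protocol may differ.

```python
import re

# Hypothetical mapping from chat keywords to axes. "brush" is treated here
# as the Z axis (brush height); the real robot's convention may differ.
DIRECTIONS = {
    "up": ("y", 1), "down": ("y", -1),
    "right": ("x", 1), "left": ("x", -1),
    "brush": ("z", 1),
}

def parse_command(text):
    """Turn a chat message like 'up 400 right 300 brush 40' into deltas."""
    deltas = {"x": 0, "y": 0, "z": 0}
    for word, value in re.findall(r"([a-z]+)\s+(-?\d+)", text.lower()):
        if word in DIRECTIONS:
            axis, sign = DIRECTIONS[word]
            deltas[axis] += sign * int(value)
    return deltas

def average_commands(commands):
    """Average several users' parsed commands, per axis, into one stroke."""
    n = len(commands)
    return {axis: sum(c[axis] for c in commands) / n for axis in ("x", "y", "z")}
```

Averaging is the same trick Twitch Plays Pokemon popularized for reconciling a crowd's conflicting inputs: no single viewer steers the brush, but the consensus direction does.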
Chen's entire business model for Instapainting is based on a similar sentiment: making art accessible. But he's quick to clarify: "You shouldn't necessarily see it as art, unless your photo is a piece of art. I'd prefer not to add creative input. Not because people can't do that, but because that's not good business." The way he sees it, adding an artist's creative interpretation to a painting will lead to a lot of dissatisfied or fickle customers. So he sticks to replicas of photos exactly as the customers want them. "It's not perfect artwork," he says. "It's perfect painting."
When Instapainting launched in 2014, backed by Y Combinator, there was a small but instant demand. There were people who wanted their favorite pet pictures converted into oil paintings, and a quick Google search turned up Instapainting as an option. At the time, when the service was slowly gaining traction on Reddit, it seemed feasible to have the paintings made domestically. But when art studios in China reached out to Chen with their price lists, he couldn't turn them down. "They offered really good quality," he says. "They were cheaper, too." Soon, the photo-paintings were outsourced to Chinese art studios.
For now, Chen's robotic painter exhibits the possibilities of man-machine collaborations. But eventually, when the AI version of this robot, which the company's blog says is in the works, becomes capable of churning out painted replicas, it could rival the Chinese studios. "If you get a painting right now [that's] different from the [photo], it's not because the artist added a creative input, it's because they made a mistake," says Chen. "We want to offer a service that's close to a printer. Except, right now it's cheaper to have an actual human artist do it rather than a robot."