This is what it looks like when a neural net colorizes photos

The results are spotty, but when they turn out well they're actually pretty impressive.

University of California, Berkeley

We've seen the horrific results of Google's servers taking acid and interpreting photos with DeepDream, but what happens when a neural network does something altogether less terrifying with snapshots? It'll go all Ted Turner and colorize black and white images with what it thinks are the right chroma values based on analyzing countless similar photos. At least that's what a team of University of California at Berkeley researchers experimented with in their paper Colorful Image Colorization (PDF).

The results are all over the map, but a few of the test images -- like the puppy and Monarch butterfly above -- look pretty good. The algorithm works by learning a few common-sense rules (the sky is typically blue, dirt roads are usually brown and have a similar texture) and then "hallucinating" a plausible colorized photo. But the results are far from perfect. For example, the neural net has a hard time coloring within the lines on more complex subjects like vegetables on a plate, and it struggles to keep a heron bright white.
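For the curious, the paper's core trick is to treat color as a classification problem: for each pixel of the grayscale input, the network predicts a probability distribution over a grid of quantized chroma (ab) bins, then decodes a final color with an "annealed mean" that interpolates between the distribution's mean (safe but desaturated) and its mode (vivid but noisy). Here's a minimal sketch of that decoding step in plain NumPy; the function name, array shapes, and the toy inputs are illustrative, not the authors' actual code.

```python
import numpy as np

def annealed_mean(probs, bin_centers, T=0.38):
    """Decode ab chroma values from per-pixel bin distributions.

    probs: (N, Q) array, each row a probability distribution over Q ab bins.
    bin_centers: (Q, 2) array of the (a, b) value at each bin's center.
    T: softmax temperature. T=1 gives the plain expectation; as T -> 0
    the result approaches the most likely bin. (0.38 is the value the
    paper reports works well; treat it as a tunable knob.)
    """
    # Re-sharpen each distribution by exponentiating with 1/T.
    logits = np.log(np.clip(probs, 1e-8, None)) / T
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    sharpened = np.exp(logits)
    sharpened /= sharpened.sum(axis=1, keepdims=True)
    # Expected ab value under the sharpened distribution: (N, 2).
    return sharpened @ bin_centers
```

With T=1 this reduces to the ordinary expected value over the bins, and with a very small T it snaps to the single most probable color -- the interpolation is what keeps the output both spatially smooth and reasonably saturated.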

When it does hit the mark, however, it's impressive. In fact, 20 percent of the folks surveyed in a "colorization Turing test" were fooled into thinking the images weren't monochrome to start. Unless you want to cry yourself to sleep, though, don't look at what the algorithm does with anything from Ansel Adams. Trust me.