Welcome to Time Machines, where we offer up a selection of mechanical oddities, milestone gadgets and unique inventions to test out your tech-history skills.
Today the world can easily be captured in 1s and 0s for our viewing pleasure. The hardware behind this capability all started as a DIY lab project in 1974 to test out some new gear, and the result was a Frankenstein-like device that would eventually lead to world-changing advances in photographic technology. Head on past the break for the full story.
First digital camera
The image showed a clear silhouette, but what should have been detail was just a sea of static. The camera had worked, but it still had a ways to go. After some troubleshooting, Steve Sasson realized that the playback unit had simply jumbled the bit order: only the pure 1s and 0s (black and white) had displayed correctly, leaving the rest of the image a distorted mess. With an hour's worth of tinkering, he fixed the output problem and was able to view the first successful digital photograph.
Sasson was an electrical engineer at Eastman Kodak in 1974 when he started working on his digital camera project. He was testing out a charge-coupled device (CCD) that had just been released that year and decided to build a camera to check its image quality. The unique thing was that it would be all-digital. It wouldn't need any film, which was certainly an interesting development for a company that made its money selling film and photographic paper.
After about a year on the project, Sasson and his team managed to cobble together an 8.5-pound portable camera from spare parts scavenged from all around the company campus. They had snagged a lens from Kodak's Super 8 camera, along with a portable digital cassette recorder, 16 nickel-cadmium batteries and several dozen digital and analog circuits wired together across about six circuit boards. The CCD itself was a digital imaging chip offering a 100 x 100 array of light-sensing photosites (or pixels) -- a mere 0.01 megapixel by today's standards. Light striking the sensor would be converted into electrical signals, which could then be digitized into an image. It took a while, but Sasson and his team had finally brought his digital camera from concept to a rather unconventional-looking reality.
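For the curious, the megapixel figure above is just pixel-counting; a quick back-of-the-envelope sketch (in Python, purely for illustration) confirms the arithmetic:

```python
# The prototype's CCD: a 100 x 100 grid of photosites.
width, height = 100, 100
pixels = width * height            # total photosites: 10,000
megapixels = pixels / 1_000_000    # one megapixel = a million pixels
print(f"{pixels} pixels = {megapixels} megapixel")  # 10000 pixels = 0.01 megapixel
```

Ten thousand photosites sounds quaint now, but it was enough to resolve a recognizable portrait in 1975.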
Until the first real-world test, their only interaction with the device had been through voltage measurements and oscilloscope traces. In December 1975, they headed down the hall to try their photographic skills out on a comely lab technician. With a little coaxing and a click of the shutter, they had captured the snapshot and returned to their lab to view the results. The CCD sent the image data over to a digital cassette recorder bolted onto the side of the camera, a process that took 23 seconds to complete. After a short wait, they ejected the tape, slotted it into their own custom-built playback unit and glanced over at the display to find that, with some fine-tuning, the experiment was a success.
Sasson shared his findings with various groups throughout the company over the course of 1976. His demo, titled "Film-less Photography," certainly raised a few eyebrows at the film-entrenched Eastman Kodak. His co-workers flooded him with questions, but they weren't about the technology so much as its implications. Many inquired about the potential impact this device would have on the market, while others asked why anyone would want to view photographs on a TV. Some were curious about what an "electronic photo album" would look like and how these digital images would be stored. Most importantly, they asked when it would be ready for consumer application.
Sasson was hardly prepared to answer those questions; he was an engineer, not a futurist. Still, he made his best guess based on Moore's Law, suggesting it might be 15 to 20 years before the technology would be ready for the public. That was in 1976. In 1977, Kodak quietly began filing for patents related to digital camera technology, but avoided spreading the news about the potential for film-less photographs; it was primarily in the business of selling film, after all.
Moore's Law proved an accurate compass, with digital cameras appearing on the market by the early '90s. The Dycam Model 1 -- also sold as the Logitech Fotoman FM-1 -- was one of the earliest consumer models to hit shelves, but its QVGA resolution and $1,000 price tag made it a tough sell. As the years went on, though, more digital shooters would land in stores, including 1994's Apple QuickTake 100. The Kodak-manufactured QuickTake was a full-color camera that could connect to Macintosh and Windows PCs, offering both 640 x 480 (0.3 megapixel) and 320 x 240 (0.077 megapixel) resolutions.
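The QuickTake's resolution figures quoted above follow from the same pixel arithmetic as the Kodak prototype's; a short illustrative snippet makes the comparison concrete:

```python
# Megapixel counts for the QuickTake 100's two capture modes,
# alongside the 1975 Kodak prototype for scale.
modes = [("Kodak prototype", 100, 100),
         ("QuickTake high", 640, 480),
         ("QuickTake low", 320, 240)]
for name, w, h in modes:
    mp = w * h / 1_000_000
    print(f"{name}: {w} x {h} = {w * h} pixels ({mp:.3f} MP)")
```

Even the QuickTake's lower-resolution mode packed nearly eight times as many pixels as the camera that started it all two decades earlier.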
While the origins of the digital camera may be traced back to Eastman Kodak, the company never successfully made the crossover into the burgeoning digital market. After finding itself in a growing financial hole, it was forced to sell valuable patents to help clear up its 2012 bankruptcy. A device that had been spawned in its own labs was at the core of Kodak's stubbornly film-dependent downfall.