The original iPad came out on April 3rd, 2010, at a time when most smartphone manufacturers were making the awkward transition from full QWERTY keyboards to touchscreen-only devices. Apple sold 1 million iPads in that first month, and by the end of 2010, that figure had climbed to 15 million.
That same year, the top video games were Fallout: New Vegas, Bayonetta, Red Dead Redemption, Super Meat Boy and StarCraft II. The alpha version of Minecraft was just beginning to generate buzz.
As touchscreen technology advanced, smartphones and tablets became not only ubiquitous in everyday life, but necessary. Nowadays, it's safe to assume every consumer has a screen at their fingertips essentially all the time.
For video game players, this is known as a second screen. The iPad wasn't the beginning of second-screen gaming, but it could have been a tipping point for this idea, inviting developers to innovate on standard input methods as more players than ever had access to additional hardware. It was the right device at the right time. However, despite a few valiant tries, second-screen play didn't really stick on the iPad. Or any other device.
I'll address the obvious rebuttal first -- yes, the Nintendo Switch has a second screen. However, it's truly a single-display console, favoring either the handheld screen or the TV, but never both. Nintendo is alone among video game companies in its willingness to play with hardware design, and it often releases wildly unique devices that change the way people interact with their digital experiences. The Switch is a fine example of this ethos.
However, the Switch is an all-inclusive, proprietary piece of hardware. It doesn't leverage the devices that everyone already has -- the smartphones, tablets and laptops that are always by our sides when playing games, usually with a tab already open to GameFAQs or crafting guides or YouTube walkthroughs. This is the market the iPad could have led.
Nintendo took up the second-screen mantle in early 2002 with the Game Boy Advance link cable for the GameCube. This turned the GBA into a second screen for some GameCube titles, unlocking new modes and displaying stats like health, maps or ammo in titles including Splinter Cell, Phantasy Star Online, The Legend of Zelda: Wind Waker and the original Animal Crossing, and linking handheld games like Pokemon Ruby and Sapphire to their console counterparts.
And, of course, there was the Wii U in 2012. Though it represents a cringeworthy period of Nintendo history, the Wii U served as a necessary predecessor to today's Switch, allowing the company to experiment with its own brand of dual-display gaming and welcoming developers to think in new ways. Rayman Legends was arguably the most successful iteration of second-screen thinking, allowing players to drop into the game and take control of a character via the Wii U GamePad, which sported a touchscreen. Other games took advantage of the Wii U's weird design, including ZombiU, The Legend of Zelda: Wind Waker HD, The Wonderful 101, Deus Ex: Human Revolution and New Super Mario Bros. U. Overall, though, the Wii U was a flop and its screen-embedded controller felt like more of a gimmick than an industry-shifting innovation.
As the Wii U was trying to find its footing, Xbox revealed SmartGlass, an app for Android, iOS and Windows 8 that enabled additional functionality for Xbox 360 (and later Xbox One) games. This is closer to the pure, iPad-driven dream of second-screen gaming, utilizing existing hardware to augment games. In Dead Rising 3, for instance, SmartGlass enabled additional missions, map options and the ability to call in drone support to take out hordes of undead with a single blow. Participating games were still playable in full without second screens, but the app provided fresh ways to interact with digital worlds.
SmartGlass also allowed users to control the Xbox 360 itself from phones and tablets, which was a big deal at a time when Microsoft was pushing non-interactive entertainment options like live and streaming TV. SmartGlass was fun while it lasted, but in the end, few titles took advantage of the tech and the entire thing eventually evolved into the Xbox app. Today, it serves as a hub for buying Xbox games and activating Xbox Game Pass, the company's monthly subscription service.
Of course, the alternative-play industry didn't start with Xbox or even Nintendo. The true granddaddy of second-screen gaming was the Visual Memory Unit for the Sega Dreamcast. Originally released in Japan in 1998, it was a Frankenstein monster mash-up of three things: a memory card, an auxiliary display and a tiny standalone console, complete with directional pad, action buttons and the ability to connect to other VMUs. The device augmented Dreamcast games, displaying useful stats and enabling mini-games in certain titles. For example, Sonic Adventure had Chao Adventure, a Tamagotchi-style experience played entirely on the VMU, while Quake III Arena had a maze game.
Sega dropped out of the console industry in 2001, discontinuing the Dreamcast and VMU. But for most people who got their hands on one, the VMU remains a bright, warm memory of gaming goodness. The most disappointing thing about it was that more games didn't take advantage of its weird feature set.
This seems to be the sticking point with second-screen gaming: developers. Console manufacturers can release hardware with as many screens as they want, and there can be nearly 4.8 billion smartphones and tablets in the wild, but it's up to studios to create software that pushes these devices to their full potential. There have been attempts at leveraging mobile devices in modern games, such as the Destiny 2 Companion App, the Fallout Pip-Boy app and the Grand Theft Auto iFruit experience. These apps generally serve as hubs for tracking progress or expanding the game world, but they tend to be built for phones rather than tablets, and they often feel like afterthoughts tacked onto multimillion-dollar marketing campaigns.
The divide between mobile and console gaming is shrinking as tablets and smartphones become powerful enough to support rich experiences, and developers attempt to tap into a market that's billions of devices deep. In this environment, it feels like a good time to give second-screen game development another go. Nintendo demonstrated the allure of dual-display gaming with the Wii U, though it unraveled that progress with the Switch, which offers two display options but never uses both at once. And now Nintendo is all about the Switch Lite, a cheaper console that doesn't connect to a TV at all. After the failure of the Wii U, the company has apparently ditched the idea of a built-in second screen altogether.
Luckily for Nintendo -- and literally every other video game company -- players nowadays come with their own screens.