Even though it debuted a whole year after Periscope and Meerkat, Facebook Live is by far the dominant force today in mobile live video. Part of the reason for its success is sheer name recognition, but a lot of it also has to do with how hard the company has been pushing it. From day one, you could broadcast and view live videos from the main app, without having to download additional software. What's more, Facebook went so far as to pay news outlets and media companies to use its live video service, which gave it more gravitas and brought it plenty of publicity. A few months later, Facebook gave Live its own discovery section in the app, further boosting its visibility.
From there, Facebook Live videos started going viral. Candace Payne broadcast a video of herself wearing a Chewbacca mask, and before she knew it, the clip had more than 140 million views. It was so popular that CEO Mark Zuckerberg invited her to Facebook's Menlo Park offices, and she also appeared on shows like "Good Morning America" and "The Late Late Show with James Corden."
Mobile live video made headlines again a few months later when House Democrats used Facebook Live as well as Periscope to stream their sit-in from the House floor after Speaker Paul Ryan shut off C-SPAN's cameras. It showed that live video doesn't have to be about exploding watermelons or Chewbacca masks; it can also be a way to stream news events where traditional media outlets have little to no access.
Unfortunately, there's also a darker side to live video. At least a couple of police shootings were captured on Facebook Live: one of Antonio Perkins in Chicago and another of Philando Castile in Falcon Heights, Minnesota. The latter video was briefly deleted due to a "glitch," according to Facebook, but was soon reinstated with a graphic content warning. Facebook Live was also used to broadcast the aftermath of the ambush that killed five Dallas police officers during a protest over those aforementioned police shootings.
For better or worse, it's clear that Facebook Live is now a tool for reporting the news. Indeed, Facebook even teamed up with ABC for live coverage of both the Democratic and Republican national conventions, so users could see what was going on without needing to fire up their TV sets. Facebook is even looking into airing scripted shows and sports broadcasts on the platform -- further evidence that the company is more media-driven than previously thought.
This all dovetails with Facebook's increasing role in news dissemination. Even though Zuckerberg continues to deny it, Facebook has all the markings of a media company. Sure, it doesn't produce any content, but millions of people use the site every day to get information. A Pew Research Center study published this year found that around 44 percent of American adults get news from Facebook. Seeing as several media organizations have partnered with the firm to produce so-called "instant articles" -- stories that are stored on Facebook's servers rather than their own -- it's clear that the company is at least aware of its role as a news hub.
For evidence that Facebook is indeed a media arbiter, consider the time its algorithm automatically censored the iconic "napalm girl" photo due to nudity. After realizing its importance, the company reinstated it, explaining that it's difficult for an algorithm to differentiate between child porn and an image of historical or cultural significance. The company faces the same issue with live video: When is violence permissible? These are questions that traditional tech companies don't have to answer but media companies do.
Also, we learned earlier this year that Facebook had been using a team of human editors to curate the trending topics list you see on the right side of your News Feed. Obviously this indicates a certain amount of editorial decision-making, despite Facebook's arguments to the contrary. There were also critics who said this team of editors was suppressing conservative news in favor of left-leaning stories.
Then, in August, Facebook largely disbanded that team, leaving trending topics to be curated by algorithms. Unfortunately, just a few days later this led to a fake news article about Megyn Kelly getting top billing on the site. Months later, a 9/11 truther story appeared in the trending topics section as well. Because Facebook's algorithm tends to favor stories with high engagement -- those that gain more Likes and clicks naturally float to the top of the feed -- articles with sensationalist headlines get more traction, and those are more likely to be clickbait or even outright false, despite Facebook's efforts to limit them.
Fake news would continue to plague Facebook's reputation for much of the year, especially as speculation increased that the rise of News Feed falsehoods had an impact on the outcome of the election. After initial statements that downplayed the role of fake news, Zuckerberg did eventually come forward and state that Facebook was taking steps to eradicate it, like cutting off advertising to fake news sites, making them easier to report and having third-party fact-checkers give them a second look.
As more people look to Facebook as their source of information rather than traditional media outlets, it's time for the company to take its role as a media entity more seriously. Right now it's mostly relying on AI and algorithms to filter content, but it's clear by now that human beings are still needed to judge what is real and what is not. Considering fake news has the potential to influence elections and sway public opinion, Facebook can no longer treat its responsibility as a media arbiter as an afterthought.
Check out all of Engadget's year-in-review coverage right here.