During a fireside talk at Def Con this year, Matthews and Goerzen presented the idea that the media is like a vulnerable computer system. Instead of injecting malicious code, nefarious individuals and groups are planting and manipulating coverage. From fake Antifa pranksters getting on Fox News to insist horses are racist, to the rise of algorithm- and ad-manipulating bots pushing articles on social media, the system looks like it's in need of at least an update.
But who decides how that's carried out? The talk wasn't about finding a quick and easy solution (there isn't one), but more of a way to start that conversation. Views and talking points expressed by the audience ranged from the idea that the media created this situation with its focus on sensationalist stories (if it bleeds, it leads) to the notion that people would gravitate to well-fact-checked news if it were available.
The reality is far more complex than a quick fix, and Data and Society knows this. It's spent years researching how the media has been manipulated and how coverage is shaped.
"I, personally, would like to see us think about places we can introduce friction or different types of amplification knobs and controls into the systems that we're using," Matthews said. The idea is that just because something seems to be trending or popular on social media or elsewhere doesn't automatically make it newsworthy, especially if it's not factual.
The truth is the cornerstone of journalism. But with the consolidation of media companies and withering budgets, it's tough for journalists to spend the amount of time needed to really dig into a subject. Matthews and Goerzen talked about bringing expertise to articles, but even reporters who have been covering a topic for years are having to pump out articles faster and faster to keep up with the constantly moving stream of the internet.
Not only are journalists doing more with less, they're doing it in fewer locations. With slashed budgets come fewer bureaus. If only one outlet has a person in a region, there are fewer voices reporting on the news happening there.
Plus, as more and more people get their news via the Facebook and Twitter algorithms, it's tougher for any news that doesn't conform to a person's "like" data to break through. That gives a tech company the power of a gatekeeper, a role typically held by news outlets. But the consolidation of media companies means that role is in the hands of fewer people: the list of who controls what's reported is shrinking to a smaller number of more powerful entities. Which raises the question: How does something actually make it in front of people's eyeballs, even when it doesn't conform to their worldview?
Throw in fears of regulatory oversight, and the conversation became as interesting as it was frustrating. Again, this talk won't solve the issues. But it starts a conversation, maybe one that continues after Def Con. Goerzen will keep researching the issue and hopes to write a paper about what he finds.
Up close, the idea of the media being treated as an operating system that's full of vulnerabilities is appealing: we find areas where the system is weak and patch them.
The problem is, you can't patch people, and all the issues around journalism stem from human failings. Greed and shortsightedness killed the local newsroom and bureaus. Confirmation bias keeps people from reading anything that doesn't fit their worldview. And it's true that if it bleeds, it leads: people want to see those articles at the expense of well-reported pieces, and the trolls and hackers who target the media are aware of all of this.
We are all complicit. But if a roomful of hackers with varying backgrounds can have a civil discussion about how we can fix the situation, there's a chance those conversations can start happening outside of Las Vegas convention rooms. The truth depends on it.