We've got great news this week for nation-state employees tasked with using social media to spark a class war in previously stable democracies! Facebook is patenting technology to decide if its users are upper, middle or working class -- without even using the usual marker for social class: an individual's income (the patent considers this a benefit).
Facebook's patent plan for "Socioeconomic Group Classification Based on User Features" uses different data sources and qualifiers to determine whether a user is "working class," "middle class," or "upper class." It uses things like a user's home ownership status, education, number of gadgets owned, and how much they use the internet, among other factors. If you have one gadget and don't use the internet much, in Facebook's eyes you're probably a poor person.
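To make the patent's logic concrete, here's a minimal sketch of feature-based scoring of the kind the application describes. The feature names, weights, and thresholds below are all invented for illustration; Facebook's actual model is not public.

```python
# Hypothetical sketch of the patent's feature-based class scoring.
# All weights and cutoffs are made up for illustration.

def classify_socioeconomic_group(user):
    """Score a user dict on a few proxy features and bucket them."""
    score = 0
    if user.get("owns_home"):
        score += 3
    score += {"none": 0, "high_school": 1, "college": 2, "graduate": 3}.get(
        user.get("education", "none"), 0)
    score += min(user.get("device_count", 0), 5)          # gadgets owned
    score += min(user.get("internet_hours_per_week", 0) // 10, 3)

    if score >= 9:
        return "upper class"
    if score >= 4:
        return "middle class"
    return "working class"

user = {"owns_home": False, "education": "high_school",
        "device_count": 1, "internet_hours_per_week": 5}
print(classify_socioeconomic_group(user))  # prints "working class"
```

Note what's absent from the sketch, just as in the patent: income never appears. One gadget and light internet use are enough to land the hypothetical user in the bottom bucket.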
Facebook's application says the algorithm is intended for use by "third parties to increase awareness about products or services to online system users." Examples given include corporations and charities.
The patent essentially tells us that Facebook intends to add class to its advertising preferences. You know, the ones that until very recently allowed advertisers to filter by race and ethnicity, a problem Facebook dragged its feet on fixing and one that still hovers in the realm of "temporarily disabled."
The patent's illustrations tell us a story. It's a tale of upper-class employees indoctrinated, sheltered, and inoculated against outside thinking. Facebook has literally been sitting there asking "what does it mean to be middle class," defining it for itself, and getting excited about who it can sell class information to.
I love it when a company tells us more about itself like this. "Working class" is at the bottom of its graphs. Its idea of "middle class" is a homeowner in the cash-soaked city of Palo Alto; its artwork suggests that home ownership in San Jose is surely a lower-class indicator. As if to say, who would buy a home there?
Looking at the patent's details, I can also imagine how this sets Facebook up to assign particular locations (like restaurants) or objects (brand names or items) to a socioeconomic class.
Meaning, it's not a stretch to think that Facebook will be able to decide from check-ins which restaurants, phones, and brands of shoes are "lower class."
Oops -- I meant to write "working class." Really.
Open season on opportunity hoarding
Leave it to Facebook to race ahead of Black Mirror. In light of Facebook's recent announcement that it will focus on more "concrete local issues," it's not too hard to imagine how this could be used to keep working-class renters out of a particular neighborhood, to facilitate predatory lending, or to make sure that lower classes don't see your dating service ads.
This is not a new problem; Facebook is just 'improving' it. After reading the 2014 White House report on data, privacy, algorithms, and discrimination, Google engineer Jeremy Kun wrote:
Here's a not-so-imaginary example of the problem. A bank wants people to take loans with high interest rates, and it also serves ads for these loans. A modern idea is to use an algorithm to decide, based on the sliver of known information about a user visiting a website, which advertisement to present that gives the largest chance of the user clicking on it.
The bank then makes a tasty little ad offering much-needed money for little up front, and the algorithm takes over. Target the lower class, the working class -- which are often people of color, too -- because data shows poor people (those least in a position to take high-interest loans) are more likely than the general population to be so desperate that they'll take on a bad loan.
Kun explains, "So an algorithm that is 'just' trying to maximize clickthrough may also be targeting black people, de facto denying them opportunities for fair loans."
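Kun's scenario can be sketched in a few lines. Everything below is invented for illustration: the ad names, the neighborhoods, and the predicted clickthrough rates. The point is that the selector never sees race at all; a proxy (neighborhood) is enough to route predatory ads to poorer users.

```python
# Toy illustration of an ad selector that "just" maximizes predicted
# clickthrough. The CTR table is fabricated; it stands in for a model
# trained on historical click data.

ESTIMATED_CTR = {
    # (ad, neighborhood) -> predicted clickthrough rate (invented numbers)
    ("fair_loan", "wealthy_zip"): 0.040,
    ("payday_loan", "wealthy_zip"): 0.005,
    ("fair_loan", "poor_zip"): 0.010,
    ("payday_loan", "poor_zip"): 0.030,
}

def pick_ad(neighborhood):
    """Return whichever ad has the highest predicted CTR for this user."""
    ads = ["fair_loan", "payday_loan"]
    return max(ads, key=lambda ad: ESTIMATED_CTR[(ad, neighborhood)])

print(pick_ad("wealthy_zip"))  # prints "fair_loan"
print(pick_ad("poor_zip"))     # prints "payday_loan"
```

No line of that code mentions race or intends harm, which is exactly the "absence of intent" problem: the disparate impact falls out of the objective function.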
According to Kun, under US law, a practice can be considered discriminatory "even in the absence of intent." In comments he added:
E.g. if people in poorer neighborhoods are disproportionately shown ads for predatory loans because an algorithm decided it has higher clickthrough. They have fewer chances for fair loan treatment because they aren't exposed to normal loan offers. Is that direct or indirect? In either case, it's illegal. Even if a practice is not racist by intention, it can have a disproportionate adverse impact on a particular racial class, and so it's still considered illegal.
Facebook's algorithm may not be making illegal decisions itself, but it is absolutely making it easier for humans to do so. Welcoming them to it, actually. Not that Facebook lets the law deter it from making a buck. I mean, c'mon.
It's not too hard to think of ways Russia's state-sponsored propaganda trolls will use this new feature either.
Stay classy, Facebook
Well, you're saying, this is just an updated version of Facebook's patent to help lenders discriminate against borrowers based on social connections, isn't it? The one that inspired the parody app "Unfriend the Poors." And maybe that's not such a ridiculous idea after all, in a reality where you'll probably want to unfriend your mom to raise your credit score before you click on that Facebook ad.