Last week, counselling charity Samaritans launched Radar, a new social media service that remotely listens in on Twitter conversations and warns you when someone you follow might need emotional support. Radar is already tracking over a million Twitter accounts, and while the idea is a virtuous one, the service has sparked a huge online backlash, with many calling for it to be shut down. But why?
Radar scans chosen Twitter feeds for key words and phrases -- like "depressed," "help me" and "hate myself" -- and sends an email alert to the follower who signed up whenever it spots a red flag. Samaritans argues that because a person's tweets are already public, the service is merely catching something you may have missed. However, because Radar doesn't require the person being monitored to give permission, it can also serve as the perfect tool for online trolls to stealthily catch people when they are at a particularly low point.
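Samaritans hasn't published Radar's matching logic, but the mechanism described above amounts to a keyword filter. As a purely illustrative sketch (the phrase list and function names here are hypothetical, not Radar's actual code):

```python
# Purely illustrative sketch of a keyword filter like the one described;
# Radar's real logic is unpublished, and this phrase list is hypothetical.
RED_FLAG_PHRASES = ["depressed", "help me", "hate myself"]

def scan_tweet(text):
    """Return the red-flag phrases found in a tweet, if any."""
    lowered = text.lower()
    return [phrase for phrase in RED_FLAG_PHRASES if phrase in lowered]

def should_alert(text):
    """An alert email would go out whenever at least one phrase matches."""
    return bool(scan_tweet(text))

print(should_alert("Some days I just hate myself"))  # True
print(should_alert("Lovely weather this morning"))   # False
```

Nothing about this requires the monitored account's cooperation: the tweets are public, so only the follower receiving the alerts ever interacts with the service.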
Samaritans is consistently lauded for its telephone hotlines, which let callers speak to a real person when something is weighing on their mind. People may also call for advice on behalf of a friend or family member. Unfortunately, Radar assumes that you are a well-meaning person with nothing but good intentions, an assumption that doesn't translate well to Twitter. As the GamerGate controversy has highlighted many times over, social media can also serve as a platform for spreading hate and harassing other users. Running on auto-pilot, Radar can effectively tell an ill-meaning follower exactly when to pounce.
Then there's the question of privacy. We already know that Samaritans views personal tweets available in the public domain as fair game, but Radar is a lot more complex than that. As The Register points out, Radar isn't simply reading tweets on your behalf: it indexes each individual message, pushes it to a third-party server in order to process keywords and then stores it for future matching.
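The Register's description suggests a pipeline in which every monitored tweet is retained server-side whether or not a keyword matches. A hypothetical sketch of that data flow (Radar's real architecture is unpublished; every name below is invented for illustration):

```python
# Hypothetical sketch of the data flow described above; Radar's real
# architecture is unpublished, and all names here are invented.
stored_tweets = []  # stands in for the third-party server's store

def process_tweet(tweet_id, text, phrases=("depressed", "help me")):
    stored_tweets.append((tweet_id, text))  # stored for future matching...
    return [p for p in phrases if p in text.lower()]  # ...match or not

process_tweet(1, "Lovely weather today")  # no keyword hit, still stored
process_tweet(2, "I feel so depressed")   # keyword hit
print(len(stored_tweets))  # 2: both tweets retained either way
```

The privacy concern is the storage step, not the matching: even an innocuous tweet ends up copied to someone else's server under this model.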
UK and European data protection laws already protect citizens from services that interfere with people's rights, but Samaritans believes the app's approach doesn't contravene regulations. "Samaritans Radar has been in development for over a year and has been tested with several different user groups who have contributed to its creation, as have academic experts on suicide through their research," the charity said in a recent statement. "In developing the App we have rigorously checked the functionality and approach taken, including an impact assessment against data protection and data processing principles." It also notes that it is continuing to work with regulators and "will take action as needed to address these concerns appropriately going forward."
When machines are doing all the work, expect false positives. A tweet containing lyrics from a sad song or a quote from a film can trigger Radar's filter. Not only is that wasteful for the people who signed up for alerts; a flood of erroneous emails could also mask a real issue when one arises.
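The weakness is easy to demonstrate: a literal substring match can't tell a cry for help from a lyric, and it can even fire on unrelated text that happens to contain a flagged string. Another purely hypothetical sketch (Radar's actual filter is unpublished):

```python
# Hypothetical substring filter; Radar's actual matching is unpublished.
RED_FLAGS = ["depressed", "help me", "hate myself"]

def triggers_alert(tweet):
    lowered = tweet.lower()
    return any(phrase in lowered for phrase in RED_FLAGS)

# A genuine plea and a Beatles lyric look identical to the filter:
print(triggers_alert("I can't cope, someone help me"))         # True
print(triggers_alert("Help me if you can, I'm feeling down"))  # True
# A plain substring match even fires inside unrelated words:
print(triggers_alert("Can you help measure the shelf?"))       # True
```

Distinguishing those cases reliably is a hard natural-language problem, which is why an automated service of this kind will inevitably send some alerts in error.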
As of November 2nd, 3,000 people had activated the Samaritans Radar app, which is reportedly now tracking over 1.64 million Twitter accounts. The people behind these accounts now have invisible crosshairs on their backs, and the only way for them to rid themselves of potential targeting is to manually opt out by asking Samaritans to add their name to a whitelist.
The backlash was enough for information policy activist Adrian Short to call on Twitter to take action. He has created a Change.org petition asking the company to bar the charity from accessing Twitter users' data, arguing that Radar breaches people's privacy, makes them more vulnerable online and is "making Twitter a less comfortable and useful place for people with emotional and mental health problems."
Samaritans has yet to fully address the individual concerns leveled at Radar, saying only that it may incorporate changes further down the line. If the campaign to have the service taken down keeps gathering steam, however, the charity won't be able to stay quiet for long.