
House bill would limit Section 230 protections for 'malicious' algorithms

Don't count on the measure reaching the President's desk.

Energy and Commerce Committee Chairman Frank Pallone, Jr. (D-NJ). (Tom Williams/CQ-Roll Call, Inc via Getty Images)

Congress is once again hoping to limit Section 230 safeguards under certain circumstances. Rep. Frank Pallone and other House Democrats are introducing a bill, the Justice Against Malicious Algorithms Act (JAMA), that would make internet platforms liable when they "knowingly or recklessly" use algorithms to recommend content that leads to physical or "severe emotional" harm. The lawmakers are concerned that online giants like Facebook knowingly amplify harmful material, and argue companies should be held responsible for that damage.

The key sponsors, including Reps. Mike Doyle, Jan Schakowsky and Anna Eshoo, pointed to whistleblower Frances Haugen's Senate testimony as evidence of Facebook's algorithm abuse. Her statements were proof Facebook was abusing the Communications Decency Act's Section 230 "well beyond congressional intent," according to Eshoo. Haugen alleged that Facebook knew its social networks were harmful to children and spread "divisive and extreme" content.

The bill would only apply to services with more than 5 million monthly users, and wouldn't cover basic online infrastructure (such as web hosting) or user-specified searches. JAMA will go before the House on October 15th.

As with past proposed reforms, there are no guarantees JAMA will become law. Even if it passes the House, an equivalent measure would still have to clear a Senate that has been hostile to some Democratic bills. The parties have historically disagreed on how to change Section 230 — Democrats believe it doesn't require enough moderation of hate and misinformation, while Republicans have claimed it enables censorship of conservative viewpoints. The bill's vaguer concepts, such as "reckless" algorithm use and emotional damage, might also raise fears of over-broad interpretations.

The bill could still send a message even if it dies, though. Pallone and the other JAMA backers argue the "time for self-regulation is over" — they're no longer convinced social media heavyweights like Facebook can apologize, implement a few changes and carry on as before. That won't necessarily lead to a more strictly regulated social media space, but it could put more pressure on social networks to make far-reaching policy changes.