Facebook will train its content-removal AI with police camera footage

Facebook wants to make its AI better at detecting videos of mass shootings.

Facebook will use footage from police body cameras to train its automatic detection systems to spot and remove footage of mass shootings. The company announced Tuesday that it will partner with law enforcement in the US and UK to obtain footage from their firearms training programs. Data from those videos should help Facebook's systems detect real-world, first-person footage of violent events.

According to the Financial Times, Facebook will provide the UK's Metropolitan Police with body cameras at no cost. In exchange, Facebook will have access to that footage, which it will also share with the UK Home Office. The company is reportedly in talks with US law enforcement about a similar partnership.

Facebook was criticized this spring after the Christchurch shooting in New Zealand, when footage of the attack remained available on Facebook for weeks after the event. Part of the problem, Facebook said, was that it did not have enough first-person footage of violent events to train its machine learning systems. Facebook hopes footage from the police training programs will allow it to improve its automatic detection systems. Making those systems more accurate may also reduce the likelihood that Facebook mistakenly removes violent scenes from fictional content, like movies or video games.

Facebook says some of the changes it has made to combat hate and terrorism predate the Christchurch shooting. "But that attack, and the global response to it in the form of the Christchurch Call to Action, has strongly influenced the recent updates to our policies and their enforcement," Facebook wrote in a press release. Tomorrow, the company is expected to testify alongside Google and Twitter before a US Senate committee on how it's tackling violent and extremist content online.