The first part of the report details Facebook's failings and proposes measures that would result in drastic changes at the social network. For one, it recommends criminal prosecution against Facebook if it fails to act against harmful and illegal content.
It also recommends bans on micro-targeted political ads, the creation of a new designation for Facebook that's somewhere in between platform and publisher, new powers for the UK Electoral Commission to combat fake news on social media, and a comprehensive overhaul of election advertising legislation. Finally, it issued another demand that Mark Zuckerberg "come to the committee to answer questions to which Facebook has not responded adequately to date."
A professional global Code of Ethics should be developed by tech companies, in collaboration with this and other governments, academics, and interested parties, including the World Summit on Information Society, to set down in writing what is and what is not acceptable by users on social media, with possible liabilities for companies and for individuals working for those companies, including those technical engineers involved in creating the software for the companies.
The committee also expressed concern about Cambridge Analytica, specifically that it had worked for the UK government with a "secret clearance." It also points out that the firm had ties with a Malta-based company that was essentially selling Maltese passports (and, by extension, access to Europe). It notes that investigative journalist Daphne Caruana Galizia, assassinated by a car bomb last year, was investigating that very passport scheme.
Finally, the DCMS urged an investigation into Russia's ties to disinformation on social media. Committee chair Damian Collins said that early inquiries into Russian disinformation on Facebook soon led to questions about interference in Brexit and other UK elections. "And we noticed an aggressive campaign against us even asking these questions. It underlined the need to persist, which we have done," he told the Observer.
It said that the millionaire backer of Leave.EU had ties to Russian companies and officials, and urged the National Crime Agency (NCA) to follow the money that paid for ads on Facebook and other sites. It also criticized statements by Facebook's UK Policy Director Simon Milner that the company wasn't aware of Russian interference in the Brexit referendum. "We deem Mr. Milner's comments to the Committee to have been disingenuous and typical of Facebook's handling of our questions," the report states.
The DCMS group worked together with the Senate Intelligence Committee in Washington DC, which will launch its own hearing on foreign interference on social media. Meanwhile, the UK government is expected to propose new regulations for tech companies later this year, especially around election advertising. If it implements new rules proposed by the committee, "social media companies can [no longer] hide behind the claim of being merely a 'platform,'" the report states.
Update: Facebook has responded to the DCMS committee's report with the following comment:
The Committee has raised some important issues and we were pleased to be able to contribute to their work.
We share their goal of ensuring that political advertising is fair and transparent and agree that electoral rule changes are needed. We have already made all advertising on Facebook more transparent. We provide more information on the Pages behind any ad and you can now see all the ads any Facebook Page is running, even if they are not targeted at you. We are working on ways to authenticate and label political ads in the UK and create an archive of those ads that anyone can search. We will work closely with the UK Government and Electoral Commission as we develop these new transparency tools.
We're also investing heavily in both people and technology to keep bad content off our services. We took down 2.5 million pieces of hate speech and disabled 583 million fake accounts globally in the first quarter of 2018 — much of it before anyone needed to report this to Facebook. By using technology like machine learning, artificial intelligence and computer vision, we can detect more bad content and take action more quickly.