A spokesperson cited YouTube's policy prohibiting the monetization of videos with "dangerous and harmful" content as the reason for the decision. "We have strict policies that govern what videos we allow ads to appear on, and videos that promote anti-vaccination content are a violation of those policies," they told BuzzFeed News. "We enforce these policies vigorously, and if we find a video that violates them, we immediately take action and remove ads."
YouTube will now replace those ads with an information panel linking to the Wikipedia page for "vaccine hesitancy," which describes it as "one of the top ten global health threats of 2019." The platform received a lot of flak after the original report was published, especially since Facebook had already started "exploring additional measures" to fight the spread of anti-vaccine disinformation by then. Facebook was under tremendous pressure to squash the rampant spread of anti-vax sentiments in Groups, seeing as it may have contributed to a measles outbreak in the US.
The original report also found that the platform tends to add anti-vaccine videos to the "Up Next" section and to auto-play them right after pro-vaccination content. Because pro-vaccination videos aren't as common -- people who've never had measles or any other vaccine-preventable illness won't be shooting videos extolling the benefits of vaccination, after all -- YouTube's algorithm lines up anti-vax content instead. The company told TechCrunch that it's going to make changes to its Up Next algorithm to prevent anti-vax videos from spreading.