YouTube's algorithms couldn't detect graphic bestiality images

Some videos featuring bestiality thumbnails have been on the site for months.

YouTube might have pulled 8.3 million videos violating its guidelines in the fourth quarter of 2017, but the platform is still plagued with various kinds of disturbing content. A new BuzzFeed report, for instance, has revealed that YouTube has been hosting images depicting bestiality for quite a while now. Certain search terms, including "girl and her horse" and "lovely smart girl playing baby cute dogs on rice," surface videos that use thumbnails of women who seem to be engaging in sexual acts with horses and dogs.

The videos apparently feature women brushing, bathing and otherwise taking care of animals, with the camera moving around to capture upskirt and crotch shots. An unnamed YouTube employee told BuzzFeed News that the platform's thumbnail-monitoring tech still isn't as powerful as its video algorithms. He said it probably didn't catch the bestiality thumbnails because they lack the usual signifiers associated with typical porn, like human skin.

The employee also told the publication that the thumbnails are similar to ones made by a Cambodian content farm, which was kicked off the website in 2017. That would explain why multiple videos share the same explicit image. He said the use of bestiality shows "how the Cambodian content farm tactics are evolving." Another reason YouTube might have missed the images is that the videos aren't monetized, though their uploaders are likely planning to make money off the content in the future. And if YouTube doesn't stop them, they could make a lot: some of the videos have amassed millions of views in the months they've been up on the website.

According to the publication, at least one of the offending accounts (SC Daily) also uploaded these bizarre videos masquerading as kid-friendly content. BuzzFeed said YouTube has already started removing the videos featuring bestiality thumbnails, but a quick check shows that they're still around. Some have been up on the website for over half a year, though most were uploaded only a few days ago. We even found one using a kid-themed profile and channel name.

In its first Community Guidelines enforcement report, YouTube said most of the videos it took down were flagged by its algorithms. Clearly, it has to beef up the algorithm monitoring its thumbnails. For now, you can report any disturbing content you find by opening the video, clicking the ellipsis icon and choosing Report.
