Twitter's auto image cropping had a bias toward white people and women

It didn't show signs of the male gaze, however.


Last October, Twitter promised to re-evaluate its image cropping algorithm after users complained that it was biased. On Wednesday, the company released the results of that study. When people first started documenting problems with the algorithm, Twitter said its initial investigations had not shown evidence of racial or gender bias. Today's research paints a different picture.

Comparing the algorithm's treatment of men and women, as well as of white and Black individuals, the company found an 8 percent difference in favor of women and a 4 percent difference in favor of white people. What it didn't find was evidence that the algorithm reflected a built-in male gaze. Testing it against images of women, Twitter found the algorithm cropped only about three photos in every 100 to a location other than the face, and when it did so, it didn't focus on specific body parts. Instead, it zeroed in on elements like numbers on sports jerseys.

The company said one potential reason for the algorithm's cropping choices could be its preference for high-contrast images but noted that's not an excuse. "Machine learning-based cropping is fundamentally flawed because it removes user agency and restricts user's expression of their own identity and values, instead imposing a normative gaze about which part of the image is considered the most interesting," the company said. "One of our conclusions is that not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people."

To that end, the company recently rolled out full-sized image previews on Android and iOS, and it says it plans further changes to how the platform handles media in the future.