Bug Bounty Shows Twitter's Auto-Crop Is Racist

Image: Report Teller

Twitter hosted an open contest to uncover algorithmic bias in its image-cropping algorithm, and today it announced the results. Back in March, Twitter disabled automatic photo cropping after users reported that the algorithm preferred light-skinned and conventionally attractive faces. Twitter then held a bug bounty to investigate the extent of the problem.

The top-placed entry (@hiddenmarkov) showed that the image-cropping algorithm favors faces that are smooth, light-skinned, slim, young, and feminine.

Test faces submitted by the first-place winner (@hiddenmarkov)

The second-place (@halt_ai) and third-place (@RoyaPak) entries showed that the algorithm was biased against white or grey hair, and that it preferred English text over Arabic.

“When we think about biases in our models, it’s not just about the academic or the experimental […] but how that also works with the way we think in society,” said Chowdhury. “I use the phrase ‘life imitating art imitating life.’ We create these filters because we think that’s what beauty is, and that ends up training our models and driving these unrealistic notions of what it means to be attractive.”

– Rumman Chowdhury, director of Twitter's META team

The award for the most innovative entry in the bug bounty went to @OxNaN, who found that Twitter's auto-crop algorithm favored emojis with lighter skin tones over emojis with darker skin tones.


