Twitter hosted an open contest to uncover algorithmic bias in its image-cropping algorithm, and today it announced the results. Back in March, Twitter disabled automatic photo cropping after users reported that the algorithm preferred light-skinned faces. Twitter then held a bug bounty to investigate the extent of the problem.
The top-placed entry, from @hiddenmarkov, showed that the image-cropping algorithm favors faces that are smooth, light-skinned, slim, young, and feminine.
“When we think about biases in our models, it’s not just about the academic or the experimental […] but how that also works with the way we think in society,” said Chowdhury. “I use the phrase ‘life imitating art imitating life.’ We create these filters because we think that’s what beauty is, and that ends up training our models and driving these unrealistic notions of what it means to be attractive.”
-Rumman Chowdhury, director of Twitter’s META team
The award for the most innovative entry in the bug bounty went to @OxNaN, who found that Twitter's auto-crop algorithm favored emoji with lighter skin tones over emoji with darker skin tones.