If you use Twitter regularly, you will have noticed that the app crops images to show a preview, which is why many users add captions like "open the full image for a surprise" or "see the full image". Now the image-preview algorithm itself stands accused of racial and gender bias. The issue surfaced when user Colin Madland pointed out that Zoom failed to detect the face of a Black colleague. When he posted screenshots of the call on Twitter, the preview crop repeatedly showed his own face, while the rest of the image was cut out. This kicked off a wider discussion, and people began experimenting, editing images and uploading them over and over to see how the algorithm behaves. Many concluded that it is biased against Black people, whose faces are hidden in previews, and against women, whose faces are hidden when posted alongside men.
The findings led to an online discussion of sorts, with some users joking about how the whole system works. Twitter users replicated the test with many other pictures to check how the preview crop behaves. As the discussion blew up with accusations of racist and sexist cropping, Twitter's team promised to look into it. The conversation started over the weekend, and people are still running their "experiments" to see what the Twitter algorithm picks up. We bring you some of these tweets so you get an idea of the issue.
Check Out the Tweets About the Image Cropping Algorithm:
Twitter Picks Mitch McConnell over Barack Obama
Trying a horrible experiment...
Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia
— Tony “Abolish (Pol)ICE” Arcieri 🦀 (@bascule) September 19, 2020
Image Shows a Male Over Female
let's see if the twitter image algorithm is sexist as well as racist pic.twitter.com/yBPLhxzcbE
— SҚЏLLԐҐФЙ (@NeilCastle) September 20, 2020
Another Example That Combines 4 Faces But Shows 1
Which face does Twitter show you in the thumbnail?
(I'm testing how racist the algorithms that control our thoughts may be. A totally normal thing to do in 2020. I'll try different versions of this, this one is an extreme example.)
I'll leave up all results. pic.twitter.com/WUxY4RVXHW
— James Houston (@1030) September 20, 2020
Same for Fictional Characters
I wonder if Twitter does this to fictional characters too.
Lenny Carl pic.twitter.com/fmJMWkkYEf
— Jordan Simonovski (@_jsimonovski) September 20, 2020
It Works on Dogs Too!
I tried it with dogs. Let's see. pic.twitter.com/xktmrNPtid
— - M A R K - (@MarkEMarkAU) September 20, 2020
One More!
alright gonna test this pic.twitter.com/AUjgKuSHTC
— Sevvy (@sevvyperior) September 19, 2020
All of these examples make the same point: in these tests, the algorithm tends to pick lighter-skinned faces, and male faces over female ones. The issue has got Twitter's attention, and the company says it is working on it.
How Does the Image Cropping Algorithm Work?
The ability to share pictures on Twitter has existed since 2011, and every day thousands of users post pictures of all kinds on the microblogging platform. As per Twitter's blog post, images are cropped so that people can see more tweets at a time. Twitter previously used facial recognition to choose which part of an image to showcase, but since not all photos contain faces, that approach had limitations. The newer approach is based on saliency: a neural network and related algorithms predict which parts of an image people are most likely to look at, and the crop is centred on the most salient region, roughly as in the sketch below.
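Twitter has not published its production model, but the general idea of saliency-based cropping can be sketched as follows. This is a minimal illustration, not Twitter's actual code: the toy_saliency_map function is a crude stand-in for the neural saliency model, and the crop sizes are arbitrary.

```python
import numpy as np

def toy_saliency_map(image: np.ndarray) -> np.ndarray:
    """Stand-in for the neural saliency model: here we simply use
    deviation from the mean brightness as a crude proxy for
    'what draws the eye'. The real model is a trained network."""
    gray = image.mean(axis=2)             # collapse RGB to grayscale
    return np.abs(gray - gray.mean())     # higher value = more "salient"

def saliency_crop(image: np.ndarray, crop_h: int, crop_w: int) -> np.ndarray:
    """Crop a crop_h x crop_w window centred on the most salient pixel,
    clamped so the window stays inside the image."""
    saliency = toy_saliency_map(image)
    cy, cx = np.unravel_index(np.argmax(saliency), saliency.shape)

    h, w = saliency.shape
    top = min(max(cy - crop_h // 2, 0), h - crop_h)
    left = min(max(cx - crop_w // 2, 0), w - crop_w)
    return image[top:top + crop_h, left:left + crop_w]

if __name__ == "__main__":
    # Synthetic "photo" with one bright patch the toy model should latch onto.
    rng = np.random.default_rng(0)
    img = rng.integers(0, 60, size=(400, 600, 3), dtype=np.uint8)
    img[250:300, 450:520] = 255            # the salient region
    preview = saliency_crop(img, crop_h=200, crop_w=300)
    print(preview.shape)                   # (200, 300, 3), centred near the patch
```

The bias concerns raised in the tweets above are about what such a model learns to treat as salient; the cropping step itself simply follows whatever region the model scores highest.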
Twitter's chief design officer Dantley Davis noted in tweets that the background colour of an image also plays an important role in how the preview is cropped. A Twitter communications official also tweeted that the company needs to do more analysis to figure out what is happening.