The first look a Twitter user gets at a tweet might be an unintentionally racially biased one.
Twitter said Sunday that it would investigate whether the neural network that selects which part of an image to show in a photo preview favors showing the faces of white people over Black people.
The trouble started over the weekend when Twitter users posted several examples of how, in an image featuring a photo of a Black person and a photo of a white person, Twitter's preview of the photo in the timeline more frequently displayed the white person.
The public tests got Twitter's attention, and now the company is apparently taking action.
"Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing," Liz Kelly, a member of the Twitter communications team, told Mashable. "But it’s clear from these examples that we’ve got more analysis to do. We're looking into this and will continue to share what we learn and what actions we take."
Twitter's Chief Design Officer Dantley Davis and Chief Technology Officer Parag Agrawal also chimed in on Twitter, saying they're "investigating" the neural network.
The conversation started when a Twitter user posted about racial bias in Zoom's facial detection. He noticed that Twitter's preview of a side-by-side image of him (a white man) and his Black colleague repeatedly showed his face.
After multiple users ran their own tests, one even showed that the preview favored lighter-skinned faces with characters from The Simpsons.
Twitter's promise to investigate is encouraging, but Twitter users should view these analyses with a grain of salt. It's problematic to claim instances of bias from a handful of examples. To really assess bias, researchers need a large sample size with multiple examples under a variety of circumstances.
Anything else is making claims of bias by anecdote – something conservatives do to claim anti-conservative bias on social media. These sorts of arguments can be harmful because people can usually find one or two examples of just about anything to prove a point, which undermines the authority of actually rigorous analysis.
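To make the distinction concrete, here is a minimal sketch (in Python, and in no way Twitter's actual methodology) of what a more rigorous test might involve: run many paired-image trials under varied conditions, count how often the crop favors the white face, and ask whether that rate could plausibly be explained by chance. The trial counts below are purely hypothetical.

```python
# A minimal sketch, not Twitter's methodology: a simple one-sided binomial
# test over many paired-image trials.
from math import comb

def binomial_p_value(successes: int, trials: int, p_null: float = 0.5) -> float:
    """Probability of seeing at least `successes` hits in `trials` attempts
    if the true rate were `p_null` (the null of 'no preference between faces')."""
    return sum(
        comb(trials, k) * p_null**k * (1 - p_null)**(trials - k)
        for k in range(successes, trials + 1)
    )

# Hypothetical numbers for illustration: out of 200 paired images,
# the crop centered on the white face 120 times.
trials, white_face_shown = 200, 120
p = binomial_p_value(white_face_shown, trials)
print(f"Observed rate: {white_face_shown / trials:.2f}, one-sided p-value: {p:.4f}")
```

A handful of anecdotes can't support that kind of conclusion either way; a large, varied sample can.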
That doesn't mean the preview question isn't worth looking into, as this could be an example of algorithmic bias: when automated systems reflect the biases of their human makers, or make decisions that have biased implications.
SEE ALSO: People are fighting algorithms for a more just and equitable future. You can, too.

In 2018, Twitter published a blog post that explained how it used a neural network to make photo preview decisions. One of the factors that causes the system to select a part of an image is higher contrast levels. This could account for why the system appears to favor white faces. The decision to use contrast as a determining factor might not be intentionally racist, but more frequently displaying white faces than Black ones is a biased result.
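Contrast-driven selection is easy to illustrate. The sketch below is not Twitter's neural network; it's a hypothetical heuristic that scores candidate crop windows by local pixel-intensity contrast and keeps the highest-scoring one, which shows how such a criterion can systematically pull previews toward higher-contrast regions of an image.

```python
# Illustrative sketch only: Twitter's cropper is a trained saliency model,
# not this heuristic. This shows how ranking crop windows by local contrast
# pulls the preview toward higher-contrast regions.
import numpy as np

def pick_preview_window(gray: np.ndarray, win: int = 64) -> tuple:
    """Return the (row, col) top-left corner of the win x win window with the
    highest pixel-intensity standard deviation (a crude contrast measure)."""
    best, best_rc = -1.0, (0, 0)
    rows, cols = gray.shape
    for r in range(0, rows - win + 1, win // 2):       # stride of half a window
        for c in range(0, cols - win + 1, win // 2):
            contrast = gray[r:r + win, c:c + win].std()
            if contrast > best:
                best, best_rc = contrast, (r, c)
    return best_rc

# Toy image: flat, low-contrast left half; noisy, high-contrast right half.
img = np.full((128, 256), 128.0)
img[:, 128:] += np.random.default_rng(0).normal(0, 40, (128, 128))
print(pick_preview_window(img))   # picks a window on the high-contrast (right) side
```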
There's still a question of whether these anecdotal examples reflect a systemic problem. But responding to Twitter sleuths with gratitude and action is a good place to start no matter what.