TikTok no longer considers "Asian women" a dirty phrase. The video-sharing app has fixed an issue that censored the term in its automatic captions, calling it an "error."
TikTok introduced automatic captions in early April, testing it in the U.S. and Japan with the intent to eventually roll it out more widely. This helpful new feature automatically detects and transcribes what video creators are saying, providing captions they can then review and edit for accuracy. Captions are a vital accessibility tool for people who are deaf or hard of hearing, and are also generally appreciated by anyone scrolling through their For You page without earbuds.
However, TikTok users soon noticed the automatic captions making questionable calls regarding the words it considered inappropriate. While the phrases "white men", "white women", "Black men", "Black women", and "Asian men" all passed TikTok's language filter with no problems, the phrase "Asian women" did not. Instead, TikTok's automatic captions censored the phrase so it appeared as "a**** w****."
The issue was fixed and "Asian women" uncensored earlier this week after TikTok was made aware of the problem. It's unclear when the issue began, but the auto caption feature has been available since April 6. TikTok says the problem was fixed within hours of the company being notified on April 26.
"We care deeply about supporting and elevating underrepresented groups on TikTok and worked quickly to resolve this error," a TikTok spokesperson told Mashable.
The fix means you won't be able to replicate the issue in any new videos you create, and it won't be a problem when captions are rolled out in more countries. However, videos captioned before the fix are still censored. (The phrase "Asian woman," singular, was never censored.)
TikTok's censorship of "Asian women" is particularly unfortunate in light of the current surge in anti-Asian hate crimes. Advocacy group Stop AAPI Hate documented 503 hate incidents against Asian people in the U.S. during the first two months of 2021, with violence against Asians being stoked by racist rhetoric surrounding the coronavirus pandemic. The actual number of anti-Asian hate crimes is believed to be significantly higher, as incidents often go unreported.
Asian women are also uniquely vulnerable to such violence, as they are highly sexualised and fetishised. In March, six Asian women were among eight killed in the Atlanta massage parlor mass shootings. Several TikTok users have speculated that such sexual connotations imposed upon Asian women may be the reason why the app censored the phrase.
TikTok currently has a resource page linking to information on how to support the Asian American community.
This isn't the first time tech companies have censored the term "Asian." In March, Apple updated its iOS 14.5 Beta so that web searches containing the word "Asian" were no longer blocked by its adult content filter — an issue that had persisted for over a year. Fortunately, TikTok was a bit quicker to fix the problem.
UPDATE: April 30, 2021, 8:53 a.m. AEST. This post has been updated to clarify when TikTok learned of the issue.