How TikTok’s hate speech detection tool sparked debates about racial bias on the app
“That’s why I’m fucking angry. We are tired,” popular Black influencer Ziggi Tyler said in a recent viral video on TikTok. “Anything related to Black people is inappropriate content,” he continued in the video.
Tyler was voicing his frustration over something he discovered while editing his profile in the app’s Creator Marketplace, the part of the app that connects popular account holders with brands that pay them to promote products or services. Tyler noticed that when he entered phrases about Black content in his Marketplace creator bio, such as “Black Lives Matter” or “Black success,” the app flagged his content as “inappropriate.” But when he entered phrases like “white supremacy” or “white success,” he received no such warning.
For Tyler and many of his followers, the incident seemed to fit a larger pattern of suppression of Black content on social media. They said it was evidence of what they believed was the app’s racial bias against Black people. Some urged their followers to leave the app, while others tagged TikTok’s company account demanding answers. Tyler’s original video about the incident has received more than 1.2 million views and more than 25,000 comments; his follow-up video has received nearly 1 million views.
“I won’t sit here and let this happen,” Tyler, a 23-year-old college graduate from Chicago, told Recode. “Especially on a platform where all these pages are saying, ‘We support you. February is Black History Month.’”
A TikTok spokesperson told Recode that the issue was a bug in its hate speech detection system that it was actively working to fix, and that it did not reflect racial bias. According to the spokesperson, TikTok’s policies do not restrict posting about “Black Lives Matter.”
In this case, TikTok told Recode that the app mistakenly flagged phrases like “Black Lives Matter” because its hate speech detector was triggered by a combination of words containing “Black” and “audience” — the word “audience” contains “die” within it.
“Our TikTok Creator Marketplace protections, which typically flag phrases associated with hate speech, were incorrectly set to flag phrases without regard to word order,” a company spokesperson said in a statement. “We recognize and are very sorry for how frustrating this was, and our team is quickly fixing this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27B views on our platform.” TikTok said it has reached out to Tyler directly, but that he has not responded.
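TikTok has not published its filter’s code, but the failure mode it describes — matching trigger words as substrings, with no regard for word order or word boundaries — can be sketched in a few lines. This is a hypothetical illustration only: `HATE_TERMS`, `naive_flag`, and `boundary_flag` are invented names, and the trigger list is not TikTok’s real one.

```python
# Hypothetical sketch of an order-insensitive substring filter and why it
# misfires. Not TikTok's actual implementation.

HATE_TERMS = {"black", "die"}  # illustrative trigger terms only


def naive_flag(text: str) -> bool:
    """Flag text if every trigger term appears as a substring,
    ignoring word order and word boundaries."""
    lowered = text.lower()
    return all(term in lowered for term in HATE_TERMS)


def boundary_flag(text: str) -> bool:
    """A stricter check that matches whole words only, so 'audience'
    no longer triggers on the 'die' embedded inside it."""
    words = set(text.lower().split())
    return HATE_TERMS <= words


# The substring check flags an innocuous phrase, because
# "audience" contains "die" as a substring...
assert naive_flag("Black audience") is True
# ...while whole-word matching does not.
assert boundary_flag("Black audience") is False
```

The asserts show the gap between the two approaches: the naive version treats “Black audience” exactly like a phrase that actually pairs “Black” with “die,” which matches the behavior TikTok described.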
But Tyler said he believes TikTok’s explanation to Recode is insufficient, and that the company should have caught problems in its hate speech detection system sooner.
“No matter what the algorithm is and how it got started, somebody had to program that algorithm,” Tyler told Recode. “And if [the problem] is the algorithm, and the marketplace has been available since [2020], why wasn’t this a conversation you had with your team, knowing there’s been racial controversy?” he asked.
Tyler is not the only one who is frustrated; he is just one of many Black creators who have recently protested against TikTok, saying they go unrecognized and underserved. Many of these Black TikTokers are participating in what they call a “#BlackTikTok strike,” refusing to create original dances for a hit song because they are angry that Black artists on the app are not properly credited for the viral dances they first choreographed and that other creators imitate.
These problems tie into another criticism that TikTok, Instagram, YouTube, and other social media platforms have faced for years: the algorithms that recommend and filter the posts everyone sees often carry inherent racial and gender biases.
A 2019 study showed that leading AI models for detecting hate speech were, for example, 1.5 times more likely to flag tweets written by African Americans as “offensive” compared to other tweets.
Findings like this have sparked an ongoing debate about the benefits and potential harms of relying on algorithms, especially AI models, to automatically detect and moderate social media posts.
Major social media companies such as TikTok, Google, Facebook, and Twitter, while acknowledging that these algorithmic models can be flawed, are still making them a key part of their rapidly expanding hate speech detection systems. They say they need a less labor-intensive way to keep up with the ever-growing volume of content online.
Tyler’s TikTok video also points to the tension around the lack of transparency in how these apps moderate content. During the Black Lives Matter protests in June 2020 in the US, some activists accused TikTok of censoring certain popular #BlackLivesMatter posts; for a while, the app displayed zero views on them even though they had billions. TikTok denied this and said the problem was a technical glitch that also affected other hashtags. And at the end of 2019, TikTok executives reportedly discussed curbing political discussion on the app, according to Forbes, in order to avoid political controversy.
A TikTok spokesperson acknowledged the broader frustration about Black representation on TikTok, and said that earlier this month the company launched an official @BlackTikTok account to help cultivate the Black TikTok community on the platform, and that in general its teams are committed to developing recommendation systems that reflect inclusivity and diversity.
But for Tyler, the company still has a lot of work to do. “This situation is just the tip of the iceberg, and you have all these issues below the water level,” Tyler said.