Long before he joined TikTok, Charles Conley loved superheroes.
Growing up, Batman was his favorite among many. But one thing always bothered him about the comics and blockbusters he devoured: “When it came to people of color, we got relegated to the background.”
As a Black man, that sense of exclusion was part of what led Conley to get involved with cosplay, or creating and wearing fictional characters’ costumes. He eventually joined TikTok in October 2020 to show off his creations and discuss social justice issues.
But Conley’s posts on the short-form video app were regularly met with racist comments. Viewers would call him “monkey” and the N-word, he said. In a private message, a user told him to kill himself. He began to suspect that trolls were mass-reporting his videos — a common concern — in an effort to get them taken down. Videos in which he called out other accounts’ racist posts would get flagged for “harassment and bullying” or “hate speech” and removed from the app, he said, while the posts he was criticizing stayed up.
Eventually, his account was permanently banned for multiple unspecified community guidelines violations. TikTok says it suspends and bans accounts for severe or repeated rule-breaking.
Conley made a second account but has struggled there too. He recently filmed a response to a viral video of a white user piercing their ears with a livestock-tagging gun, but his commentary was taken down for “dangerous acts” even though it didn’t include any such content beyond what had been in the original clip — which remained online. (The viral video eventually disappeared after The Times asked TikTok about the two posts; around the same time, Conley’s response was restored.)
At this point, Conley is ready to call it quits.
“It’s so draining,” he said. “Having an application … actively sabotaging you and not backing you up, or saying that you are the perpetrator of these transgressions or aggressions — it gets beyond tiring.”
Conley is not the first Black TikToker to say that he feels over-scrutinized and under-protected by the platform. Since at least the Black Lives Matter protests of summer 2020, users of color have complained that TikTok — the most downloaded app in the world last year — handles their accounts and content in ways that seem unfair and racially biased.
But what sets Conley and the other Black TikTokers who spoke to The Times for this story apart is what they plan to do about it: get off TikTok for good.
Some are moving to Fanbase, an Instagram-esque platform that lets users charge their followers for access to bonus content. Others are trying out Clapper, a TikTok look-alike that has already found favor among right-wing “TikTok rejects.” The two apps have become the focal points of a small but vocal cohort of Black TikTokers looking to direct simmering dissatisfaction with TikTok into an actual exodus.
“We need a new app,” said one such user in a recent TikTok. “Shout out to Fanbase! Go to Fanbase … and Clapper.”
“Let’s start jumping ship, y’all!” a different user wrote, encouraging followers to join Fanbase.
It won’t be an easy transition. TikTok has an estimated 66 million users in the U.S. alone, each of whom represents a potential fan — or customer — for someone like Conley. Compared with that virtual nation of users, Clapper and Fanbase barely represent a midsize city. On the Google Play store, where TikTok has more than 1 billion downloads, the two apps have closer to 100,000 and 10,000, respectively. And although TikTok was downloaded from Apple’s App Store 12 million times in August, Clapper and Fanbase trailed far behind at about 50,000 and 20,000 downloads, according to app analytics firm Sensor Tower.
(A Clapper representative said it has been downloaded 1.5 million times total since launching last summer; Fanbase declined to share the size of its user base.)
“It’s kind of hard, just completely leaving a platform that everybody has access to and moving to an app that only a select market has,” Conley said.
But if Black users do make good on their threats to depart for friendlier pastures, the shift will hardly have come out of nowhere.
Built up and broken down
The frustration over TikTok’s moderation has less to do with any single error than with what many see as a recurring pattern. As the MIT Technology Review has noted, TikTokers from marginalized groups keep finding themselves subject to strange and seemingly targeted censorship; when the media picks up on a complaint and asks TikTok to explain, the company tends to blame one-off technical errors.
Last summer, amid the racial justice protests that erupted after George Floyd’s murder, TikToks tagged with #BlackLivesMatter and #GeorgeFloyd seemed to be getting zero views. The company attributed the issue to a glitch, noting that other, unrelated hashtags were similarly affected.
But even months after the protests, Black creators continued to complain that their videos were being taken down without explanation, listed lower than expected in search results and receiving far fewer views than usual.
In July, the app faced more such criticism after Ziggi Tyler, a popular Black user, went viral for a series of videos showing TikTok blocking him from using phrases such as “Black success” and “pro-Black” — but not “white success” or “pro-white” — in his bio. Again, TikTok said a technical error was at fault.
“I had inescapable proof of what we’ve all been talking about for, like, two years,” Tyler said in an interview.
It’s hard to know for sure whether bias in the TikTok algorithm is the problem in cases such as these, Kalinda Ukanwa, an assistant professor at USC who studies algorithmic bias, said in an email. “However,” she added, “the perception that TikTok’s algorithms are biased has been persistent.”
Other flashpoints have erupted around culture and community issues, with white commenters harassing Black creators, racist videos going viral, digital filters changing users’ skin tone, and non-Black users adopting Black affects. Viral dance challenges, a hallmark of the app, have proved particularly fraught, with white influencers accused of taking credit for moves created by Black choreographers.
TikTok is well aware of these complaints. On June 23, the company published a lengthy “commitment to diversity and inclusion” emphasizing that the platform would not tolerate racism and was working hard to “elevate and support Black voices and causes.”
To that end, TikTok launched a “diversity collective” to help guide the company’s approach to inclusivity, an accelerator program for Black creators and a program aimed at developing partnerships with Black entrepreneurs.
“We care deeply about the experience of Black creators on our platform,” a TikTok spokesperson told The Times. “We’re committed to seeing that our policies and practices are fair and equitable.”
Nevertheless, many Black TikTokers remain suspicious of TikTok’s moderation efforts.
Dea, a comedian with more than 200,000 TikTok followers, initially created two accounts on the app: one for her public speaking and one for her comedy. The latter account has been suspended on and off for weeks; she only recently regained full access to it.
In July, she jumped on a trend in which women pretend to light fires in order to get attractive firefighters to come rescue them. Some users garnered more than 1 million views by responding to the firefighters’ original video — but when Dea joined in, her post was taken down for “dangerous acts.”
After Dea appealed, TikTok restored her post, but the experience left her feeling too nervous to leave it up, and subsequent suspensions have her looking for an alternative: “It’s super toxic, it’s racist, and we’re never going to be able to progress with this app.”
Another Black TikToker, cosmetologist Sharly Parker, said she started having problems with TikTok during the 2020 election campaign, when her more political posts would get flagged for violating platform rules. Posts would get marked as containing nudity even if she was fully dressed, she said, or remain stuck at zero views all day.
Two videos reviewed by The Times were taken down for violations that were never explained to her and were not obvious. Another post she made featuring a song that used the N-word was removed for “hate speech” violations, even though a screen recording shows that the song itself — and several thousand other videos featuring it — remained up at the time. (The song appears to have since been removed.)
There have been financial effects too. Parker said TikTok has repeatedly blocked her from joining its “Creator Fund” monetization program for community guidelines-related reasons, and her fans frequently report being unable to access her livestreams — the only part of the app she makes money from.
“It can be depressing,” she said of the frequent takedowns. “It builds you up to break you down.”
A TikTok spokesperson said the platform does not moderate user content on the basis of race.
Moderation is a tricky job, and one that no social platform gets right 100% of the time. It requires balancing the competing interests of different users, lawmakers, advertisers and employees, and forces companies to grapple with thorny questions of free speech and public safety. TikTok does it with a combination of human and automated systems, both of which can be skewed by unintentional biases.
But for each of the users with whom The Times spoke — and the many other Black TikTokers who’ve publicly discussed their struggles with the app — the problem is rooted less in any single moderation decision or technical gaffe than in a gradual accumulation of errors, some big and some small.
A Black exodus
The bottom line for TikTok is clear: Black users are thinking about leaving.
For some, that means ramping up their presence on mainstream platforms such as Instagram and YouTube (both of which have faced their own accusations of racial bias). Others are betting that lesser-known alternatives such as Fanbase and Clapper can be havens for Black creators.
Dea, the comedian, says she’s moving to Clapper, Fanbase and Instagram.
“I was like, ‘You know what, I’m done,’” Dea said. “I made the decision to leave the app for a while and just get my ducks in a row, concentrate on my other platforms.”
Parker, the cosmetologist, also joined Fanbase. Now she posts all her TikTok content there too — and hasn’t had “a single thing taken down.”
At this point, she said, there’s nothing TikTok could do to keep her around long term. Her bio now reads “Im off this racist app” and directs her followers to find her elsewhere.
For other Black creators, it’s not that simple.
Singer, actress and comedian Nicque Marina said she began posting TikToks about the Black Lives Matter movement and her experiences as a Black woman for the first time in fall 2020. It was then that she began to notice that the number of views on all of her posts, including her comedy and pop culture ones, “started to steadily drop to literally fractions” of what they’d been before — a change she attributed to her newfound outspokenness about race.
Because of the opaque nature of many social media algorithms, it can be hard to pinpoint the cause of any single bump or drop in engagement. But explanations have been floated as to why TikTok might discriminate against racial justice content, from the app’s user base being heavily white, to its algorithm over-indexing for creators’ races, to the company not hiring enough Black staff.
Ukanwa, the USC assistant professor, said one possibility is that the algorithm was “trained to flag content that involves race” as an anti-bullying measure but isn’t nuanced enough to distinguish between good and bad discussions of it. Another factor could be parent company ByteDance’s aversion to political content.
To avoid being “shadowbanned,” or subjected to artificially deflated viewership, Marina now avoids talking about some subjects on TikTok, reserving them for her YouTube followers (TikTok has denied using shadowbans). Yet she is sticking around, finding TikTok’s editing tools and the audience of more than 1 million followers she has cultivated there too valuable to abandon.
“I have no intention of being bullied off of anywhere,” she said. “Especially as a Black, female creator, I am not giving up a seat at a table that I’ve rightfully earned.”
It’s a common sentiment. LaToya Shambo, chief executive of Black Girl Digital, an influencer marketing agency focused on Black women, said that although many Black creators don’t feel appreciated or supported by TikTok, it’s hard to opt out of TikTok’s enormous user base when the apps being suggested as alternatives are much less popular.
“It’s a challenge when you are making money and you are seeing success on the platform,” Shambo said. “Some people are like, ‘I’m leaving. Fine, this is it.’ And some people are like, ‘You know what, not today.’”
Conley, the cosplayer, is in the former camp. Although his original account was restored soon after The Times reached out to TikTok about his complaints, he’s still in the process of moving over to Fanbase and Clapper. The transition is a headache — phasing out his TikTok presence; posting the same videos across multiple platforms; backing everything up to Google — but he sees it as a necessary act of resistance.
“I’m trying to keep my head up, and trying to keep fighting,” Conley said. “I don’t want to be defeated by an app.”