TikTok was aware of risks kids and teens face on its platform, legal document alleges

TikTok was aware that its design features were harmful to its young users and that publicly promoted tools meant to limit children’s time on the site were largely ineffective, according to documents and internal communications disclosed in a lawsuit filed by the state of Kentucky.

The details are among redacted portions of the Kentucky lawsuit that contain internal communications and documents discovered during a more than two-year investigation into the company by various states across the country.

The lawsuit was filed in Kentucky this week, along with separate complaints filed by attorneys general in dozens of states as well as the District of Columbia. TikTok is also facing a lawsuit from the Department of Justice and is itself suing the department over a federal law that could ban the app in the United States by mid-January.

The redacted information, inadvertently revealed by the Kentucky Attorney General’s Office and first reported by Kentucky Public Radio, touches on a range of topics, most notably how much TikTok knew about the time young users were spending on the platform and how sincere it was when it rolled out tools aimed at curbing overuse.

Beyond the use of TikTok among minors, the complaint alleges that the short-form video-sharing app prioritized “beautiful people” on its platform, and noted internally that some of the content moderation metrics it posted were “wildly misleading.”

The unredacted complaint, seen by The Associated Press, was sealed by a Kentucky judge on Wednesday after state officials filed an emergency motion to seal it.

When reached for comment, TikTok spokesperson Alex Haurek said: “Publishing information that is under a court seal is highly irresponsible on the part of The Associated Press. Unfortunately, this complaint misquotes and takes outdated documents out of context to misrepresent our commitment to community safety.

“We have strong safeguards in place, which include proactive removal of suspected underage users, and we have voluntarily launched safety features such as default screen time limits, family pairing, and privacy by default for minors under the age of 16,” Haurek said in a prepared statement. “We stand by these efforts.”

TikTok use among young users

The complaint alleges that TikTok determined how long it took young users to get hooked on the platform and shared the findings internally in presentations aimed at increasing user retention rates. A “habit moment,” as TikTok calls it, occurs when a user watches 260 or more videos within the first week of having a TikTok account. The complaint says this can happen in under 35 minutes, since some TikTok videos run as short as 8 seconds.

Kentucky’s lawsuit also cites a spring 2020 presentation from TikTok that concluded the platform had “already hit a ceiling” among young users. At that point, the company estimated that at least 95% of smartphone users under the age of 17 were using TikTok at least monthly, the complaint notes.

TikTok tracks metrics on its young users, including how long they spend watching videos and how many of them use the platform each day. The complaint says the company uses the information it gleans from these measurements to feed its algorithm, which tailors content to people’s interests and drives user engagement.

TikTok has conducted its own internal studies to find out how the platform affects users. The lawsuit cites one group within the company, called “TikTank,” which noted in an internal report that compulsive use was “rampant” on the platform. It also quotes an unnamed executive as saying that kids watch TikTok because the algorithm is “really good.”

“But I think we need to be aware of what that might mean for other opportunities. When I say other opportunities, I literally mean sleeping, eating, moving around the room and looking into someone’s eyes,” the unnamed executive said, according to the complaint.

Time management tools

TikTok has a 60-minute daily screen time limit for minors, a feature it rolled out in March 2023 with the stated aim of helping teens manage their time on the platform. But Kentucky’s complaint says the limit, which users can easily override or disable, was intended more as a public relations tool than anything else.

The lawsuit says TikTok measured the success of its time limit feature not by whether it reduced the time teens spent on the platform, but by three other metrics — the first of which was “improving public confidence in the TikTok platform through media coverage.”

Reducing screen time among teens was not included as a measure of success, the lawsuit said. In fact, it claimed that the company planned to “reconsider the design” of the feature if the time limit caused teens to reduce their TikTok use by more than 10%.

TikTok conducted an experiment and found that the time limit prompts shaved just a minute and a half off the average time teens spent on the app, from 108.5 to 107 minutes per day, according to the complaint. Despite the negligible effect, TikTok has not tried to make the feature more effective, officials in Kentucky say. They claim that the feature’s ineffectiveness was, in many ways, by design.

The complaint says a TikTok executive named Zhu Wenjia approved the feature only on the condition that its impact on TikTok’s “core metrics” was minimal.

TikTok, including its CEO Shou Chew, has publicly promoted the app’s various time management tools, including videos the platform sends to users encouraging them to log off. But a TikTok executive said in an internal meeting that those videos are “useful” talking points, even though “they’re not completely effective.”

TikTok has prioritized beautiful people on its platform

In a section detailing the negative effects TikTok’s face filters can have on users, the state of Kentucky claims that TikTok’s algorithm “prioritized beautiful people” despite knowing internally that content on the platform could “perpetuate a narrow standard of beauty.”

The complaint alleges that TikTok changed its algorithm after an internal report indicated that the app was displaying a “high volume of…unattractive topics” in the app’s main “For You” feed.

“By changing TikTok’s algorithm to show fewer ‘unattractive topics’ in the For You feed, Defendants actively took steps to promote narrow beauty standards even though doing so may negatively impact their younger users,” the complaint says.

TikTok “leakage” rates

The lawsuit also targets TikTok’s content moderation practices.

It cites internal communications in which the company notes that its moderation metrics are “wildly misleading” because “we’re good at moderating the content we catch, but those metrics don’t take into account the content we miss.”

The complaint notes that TikTok knows it has significant “leakage” rates, meaning content that violates the site’s community guidelines but is not removed or moderated, and that the company does not disclose them. Other social media companies face similar issues on their platforms.

For TikTok, the complaint notes that “leakage” rates include approximately 36% of content that normalizes child sexual abuse and 50% of content that glorifies minor sexual assault.

The lawsuit also accuses the company of misleading the public about its moderation and allowing some popular creators who were deemed “high value” to post content that violated the site’s guidelines.
