TikTok is aware of the risks children face on its platform, the lawsuit says

TikTok was aware that its design features are harmful to its young users and that the tools it publicly touted to limit children's time on the site were largely ineffective, according to internal documents and communications disclosed in a lawsuit filed by the state of Kentucky.

The details are among redacted portions of Kentucky's lawsuit, which includes internal communications and documents that came to light during a more than two-year investigation of the company by various states across the country.

Kentucky's lawsuit was filed this week alongside separate complaints brought by attorneys general in a dozen states and the District of Columbia. TikTok is also facing another lawsuit from the Justice Department and is itself suing the Justice Department over a federal law that could ban the app in the US by mid-January.

The redacted information — which was accidentally disclosed by the Kentucky attorney general's office and first reported by Kentucky Public Radio — touches on a number of issues, most notably the extent to which TikTok knew how much time young users were spending on the platform and how sincere the company was in introducing tools to curb excessive use.

Beyond underage use of TikTok, the complaint alleges that the short-form video sharing app prioritizes “beautiful people” on its platform and has internally determined that some of the content moderation metrics it publishes are “largely misleading.”

The unredacted complaint, seen by The Associated Press, was sealed Wednesday by a Kentucky state judge after state officials filed an emergency motion to seal it.

When reached for comment, TikTok spokesman Alex Haurek said: “It is highly irresponsible of the Associated Press to publish information that is under a court seal. Unfortunately, this complaint selects misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”

“We have strict security measures in place, including proactively removing suspected underage users, and we have voluntarily implemented safety features such as default screen time limits, family pairing and default privacy for minors under 16,” Haurek said in a prepared statement. “We stand by these efforts.”

TikTok usage among young users

The complaint alleges that TikTok quantified how quickly young users become hooked on the platform and shared the results internally in presentations aimed at increasing user retention rates. The “habit moment,” as TikTok calls it, occurs when users have watched 260 or more videos in their first week on TikTok. That can happen in less than 35 minutes, since some TikTok videos run as short as 8 seconds, the complaint says.

The Kentucky lawsuit also cites a Spring 2020 presentation from TikTok that concluded the platform had already “reached a ceiling” among young users. At that time, the company's estimates showed that at least 95% of smartphone users under 17 were using TikTok at least monthly, the complaint says.

TikTok tracks metrics for young users, including how much time they spend watching videos and how many of them use the platform daily. The company uses the information gleaned from this tracking to feed its algorithm, which tailors content to people's interests and drives user engagement, the complaint says.

TikTok conducts its own internal studies to find out how the platform affects users. The lawsuit cites a group within the company called “TikTank” that noted in an internal report that compulsive use was “widespread” on the platform. It also quotes an unnamed executive who said kids watch TikTok because the algorithm is “really good.”

“But I think we need to be clear about what this could mean for other opportunities. And when I say 'other opportunities,' I literally mean sleeping, eating, moving around the room, and looking into someone's eyes,” the unnamed executive said, according to the complaint.

Time management tools

TikTok has introduced a daily screen time limit of 60 minutes for minors. The feature was launched in March 2023 with the stated aim of helping teenagers manage their time on the platform. But Kentucky's complaint argues that the limit — which users can easily bypass or disable — was intended more as a public relations tool than anything else.

The lawsuit says TikTok measured the success of the time limit feature not by whether it reduced the time teens spent on the platform, but by three other metrics – the first of which was “improving public trust in the TikTok platform through media coverage.”

Reducing screen time among teenagers was not counted as a measure of success, the lawsuit says. In fact, it claims the company planned to “rethink the design” of the feature if the time limit caused teens to reduce their TikTok usage by more than 10%.

TikTok conducted an experiment and found that its time limit prompts shaved just a minute and a half off the average time teens spent on the app — from 108.5 to 107 minutes per day, the complaint says. But despite the lack of movement, TikTok did not try to make the feature more effective, Kentucky officials say. They claim the feature's ineffectiveness is in many ways intentional.

The complaint says that a TikTok executive named Zhu Wenjia only agreed to the feature if its impact on TikTok's “core metrics” was minimal.

TikTok – including its CEO Shou Chew – has talked about the app's various time management tools, including videos that TikTok sends users to encourage them to leave the platform. But a TikTok executive said in an internal meeting that these videos were “useful” talking points but “not entirely effective.”

TikTok has “prioritized beautiful people” on its platform

In a section describing the negative effects that TikTok's face filters can have on users, Kentucky claims that TikTok's algorithm “prioritizes beautiful people” despite knowing internally that content on the platform “could perpetuate a narrow beauty standard.”

The complaint alleges that TikTok changed its algorithm after an internal report found that the app was displaying a large “amount of… unattractive topics” in the app's main “For You” feed.

“By changing TikTok's algorithm to display fewer 'unattractive topics' in the For You feed, defendants have taken active steps to promote a narrow beauty standard, even though this may negatively impact their young users,” the complaint says.

TikTok’s “leakage” rates

The lawsuit also targets TikTok's content moderation practices.

It cites internal communications in which the company states that its moderation metrics are “largely misleading” because “we are good at moderating the content we collect, but these metrics do not take into account the content we miss.”

The complaint notes that TikTok knows it has significant “leakage” rates, meaning content that violates the site's community guidelines but is not removed or moderated, yet does not disclose them. Other social media companies face similar problems on their platforms.

For TikTok, the complaint states that “leakage” rates include approximately 36% of content that normalizes pedophilia and 50% of content that glorifies minor sexual assault.

The lawsuit also accuses the company of misleading the public about its moderation practices and of allowing some popular creators who were considered “high-quality” to post content that violated the site's policies.
