TikTok fined £12.7m by UK watchdog over misuse of children's data

The social media giant said it has "invested heavily" to protect children after the ICO ruled that 1.4 million children gained access to the app without parental consent in 2020.

TikTok has been fined £12.7m by the UK privacy regulator for failing to protect children’s data, in a fresh blow to the Chinese-owned app as it faces governmental bans worldwide. 

Announcing the fine, the Information Commissioner's Office (ICO) said TikTok allowed 1.4 million children under the age of 13 to use the app in 2020, despite its own rules requiring users to be 13 or over to create a TikTok account.


In a statement, John Edwards, the UK Information Commissioner, said that children's data may have been used to track and profile them, with the risk that they could be shown harmful or inappropriate content.

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws,” Edwards said. “As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data.

“That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll. TikTok should have known better. TikTok should have done better.

“Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

‘Special category data’

The regulator slashed the potential fine, which it first announced in September last year, after deciding not to pursue an initial finding that the company had unlawfully used ‘special category data’. Special category data includes ethnic and racial origin, political opinions, religious beliefs, sexual orientation, trade union membership, and genetic, biometric or health data.

However, the ICO upheld its findings that TikTok failed to ensure that users under 13 had permission from their parents or carers to use the platform. It also did not carry out adequate checks to identify and remove these children from its site, despite concerns being raised to senior members of staff.

TikTok ramps up ‘safety team’

A TikTok spokesperson said the company disagreed with the ICO's decision but was pleased the fine had been reduced from the possible £27m set out by the ICO last year.

The spokesperson said: “TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.

“While we disagree with the ICO’s decision, which relates to May 2018 – July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

Regulatory scrutiny

TikTok, owned by China-based ByteDance, has soared in popularity in recent years, as users and advertisers flock to its short-form video-sharing format. But that success has sparked heightened scrutiny from regulators worldwide.

The penalty arrives as the video-sharing app faces mounting pressure over data concerns, with calls for it to be banned in the US. The UK has recently banned the app on government phones, with similar bans announced by Australia, New Zealand, Canada, the European Parliament and the European Commission.