An FTC decision has fined the TikTok video app $5.7 million for violating the U.S. Children’s Online Privacy Protection Act (COPPA).

A major FTC decision announced today will see the TikTok video app fined $5.7 million for violating U.S. laws protecting the privacy of children under 13, and the settlement will also change how the app operates for those children. In an update released today, all users will be required to verify their age; children under 13 will then be directed to a separate, more limited in-app experience that protects their personal information and prevents them from publishing videos to TikTok.

On the same day, TikTok began promoting its new safety video series, designed to help its community stay informed about the app’s privacy and security tools. The timing could hardly be better.

The Federal Trade Commission began investigating TikTok back when it was known as Musical.ly, and the decision itself is a settlement with Musical.ly.

The industry’s self-regulatory group, the Children’s Advertising Review Unit (CARU), referred Musical.ly to the FTC last spring for violating U.S. child privacy law by collecting personal information from users under the age of 13 without parental consent. (The complaint, filed by the Department of Justice on behalf of the Commission, is here.) Musical.ly, technically, no longer exists: it was acquired by the Chinese company ByteDance in 2017, and the app was shut down in mid-2018 while its user base was merged into TikTok.

But its regulatory problems followed it to its new home.

Under the U.S. Children’s Online Privacy Protection Act (COPPA), operators of apps and websites aimed at users under 13 may not collect personal data such as email addresses, IP addresses, geolocation information, or other identifiers without parental consent.

But the app required users to provide an email address, phone number, username, first and last name, a short bio, and a profile photo, the FTC says. The app also allowed users to interact with others by commenting on their videos and sending them direct messages. In addition, user accounts were public by default, meaning a child’s profile, username, photo, and videos could be viewed by other users, the FTC explained in its press release.

The Commission also noted there had been reports of adults trying to contact children on Musical.ly, and that until October 2016 the app included a feature that let users see other users within a 50-mile radius.

“The operators of Musical.ly, now known as TikTok, knew many children were using the app, but they still failed to obtain parental consent before collecting the names, email addresses, and other personal information of users under 13,” said FTC Chairman Joe Simons in a statement. “This record penalty should be a reminder to all online services and websites that target children: we take enforcement of COPPA very seriously, and we will not tolerate companies blatantly ignoring the law.”

COPPA, of course, is becoming somewhat complex to enforce for apps like TikTok that sit in a gray area between being adult-oriented and child-oriented. More to the point, tween and teen favorites such as Snapchat, Instagram, YouTube, and TikTok are often coveted by younger children under 13, and parents often give in.

But some parents are caught off guard by these apps. According to the FTC, Musical.ly received “thousands of complaints” from parents because their children under 13 had created accounts.

In addition to the $5.7 million fine, the FTC’s settlement with Musical.ly includes an agreement that will change how the TikTok app operates.

The FTC says TikTok is now considered a “mixed audience” app, which means an age gate must be implemented. Instead of locking users under 13 out of the TikTok service entirely, younger users will be directed to a different app experience that prevents TikTok from collecting the personal information prohibited by COPPA.

TikTok is also complying with the decision by making significant changes to the app. It will now prevent children under 13 from filming and publishing videos in the TikTok app, and it will remove all existing videos of children under 13.

Instead, users under 13 will only be able to like content and follow other users. They will be able to create and record videos on their device, but not post them to the public TikTok network. Nor will they be able to share in-app videos with their friends, even if they use TikTok via a private account.

Because TikTok already has a large number of young children using the app, it is pushing an app update today that displays the new age gate to both new and existing users. Children will then have to enter their date of birth to be directed to the appropriate experience.
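The age gate described above amounts to a date-of-birth check at sign-in that routes the user to one of two experiences. A minimal sketch of that logic follows; the function name, the experience labels, and the flow are hypothetical illustrations, not TikTok’s actual implementation:

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA requires parental consent below this age


def age_gate(birth_date: date, today: date) -> str:
    """Route a user to an app experience based on their stated date of birth.

    Hypothetical sketch: "limited" means no posting, messaging, or public
    profile, and no collection of COPPA-protected personal information.
    """
    # Compute age in whole years, accounting for whether the
    # birthday has occurred yet this calendar year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return "limited" if age < COPPA_AGE_THRESHOLD else "full"


# A user born in mid-2010 is under 13 on the settlement date:
print(age_gate(date(2010, 6, 1), date(2019, 2, 27)))  # limited
print(age_gate(date(2000, 1, 1), date(2019, 2, 27)))  # full
```

As the article notes below, the weakness of any such gate is that it trusts the date of birth the user types in.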

This probably won’t have much impact on the number of children using TikTok, however. Today’s kids already know you have to lie to age pop-ups to get into an age-restricted app. That’s how they create accounts on Facebook, Instagram, Snapchat, and the rest.
